YouTube has announced a stopgap measure against conspiracy theory videos on the platform: adding Wikipedia links that debunk the theories, CEO Susan Wojcicki said Tuesday. Last month, YouTube said it would start alerting users to videos from news organizations that receive government funding. Wikipedia links will now also appear alongside videos promoting conspiracy theories and other topics likely to spark heated debate.
The new features, which will be rolled out in the coming months, are part of a range of new initiatives that YouTube is considering to eradicate misinformation from the platform.
"When there are videos that are focused around something that's a conspiracy - and we're using a list of well-known internet conspiracies from Wikipedia - then we will show a companion unit of information from Wikipedia showing that here is information about the event", Wojcicki said according to a report by The Verge.
For example, on a video questioning whether humans ever actually landed on the Moon, YouTube might provide a link to Wikipedia's page on the Apollo 11 lunar-landing mission in 1969.
Still, Wikipedia can be edited by anyone, and its editors have often waged protracted partisan battles over controversial topics.
Though music and gaming videos are far more popular on YouTube, the company has made addressing the criticism around news and science videos a top priority this year.
In breaking news situations, the web's speed becomes a double-edged sword for tech companies trying to combat fake news and disinformation: Will YouTube be able to keep up with the spread of conspiracy theory videos during the next major news event?