Even without Alex Jones, harmful conspiracy theory videos were running rampant on YouTube. Now, the company says it’s going to take action.
In a blog post published on Friday, YouTube said it would be making changes to its recommendations algorithm to explicitly deal with conspiracy theory videos. The company says the update will reduce the suggestion of “borderline content and content that could misinform users in harmful ways.”
YouTube clarified what kind of videos fit that description by providing three examples: “videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”
The company clarified that this content doesn’t necessarily violate its community guidelines. That means the videos may still exist on YouTube, but the site’s algorithm will no longer recommend them to users.
In order to deal with this sort of problematic content, YouTube says it relies on “a combination of machine learning and real people.” Human evaluators and experts will train the recommendation system to evaluate these videos. At first, the changes will only be visible on a small number of videos in the U.S.
YouTube says that, overall, less than 1 percent of videos will be affected by this change. But with the platform’s massive video archive and hundreds of hours of new content being uploaded per minute, that still amounts to a lot of videos.
The video site, which is the second most trafficked website in the world, according to Alexa, has long been criticized for its recommendation engine. The company has already made changes in an attempt to combat misinformation. For example, YouTube adjusted its search algorithm to center trusted news sources for breaking news queries in September.
YouTube recommendations continued to be a problem, however.
The Washington Post recently discovered hateful content, as well as Supreme Court Justice Ruth Bader Ginsburg health conspiracies, being recommended on YouTube. Motherboard reported on a 9/11 newscast that was being suggested to YouTube users en masse last week.
Just yesterday, BuzzFeed News published an investigation into YouTube’s recommendation algorithm. BuzzFeed found that YouTube would eventually recommend conspiracy theory and hate videos from far-right commentators for the most basic of current events searches.
A Pew study published in November found that an increasing number of Americans are researching topics on YouTube and going to the service for news. The study also found that the site’s recommendation engine plays a large role in what videos its users consume.
Omitting flat Earthers, 9/11 truthers, and bogus MDs from YouTube recommendations would be a big step toward fixing one of the platform’s many problems.