YouTube's recommendation algorithm is responsible for many of the rabbit holes viewers fall into – critics have portrayed it as an effective megaphone for spreading conspiracy theories. In January 2019, YouTube announced it would reduce recommendations of "borderline content." Now, a new study suggests those efforts have borne some fruit: according to the research, conspiracy videos are 40 percent less likely to appear in your recommendations than before the crackdown.
The study comes from Berkeley researchers who evaluated eight million recommendations over a 15-month period. To judge how effective YouTube's efforts have been, they trained an algorithm to determine whether a video contains conspiratorial content based on its description, comments, and transcript. The results were mixed.
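To make the methodology concrete, here is a minimal sketch of the *kind* of text classifier the study describes – scoring a video's description, comments, and transcript for conspiratorial language. The keyword list, scoring rule, and threshold below are illustrative assumptions, not the researchers' actual model:

```python
# Hypothetical sketch only: the term list and threshold are invented for
# illustration and are NOT the Berkeley researchers' classifier.
CONSPIRACY_TERMS = {"hoax", "cover-up", "false flag", "crisis actor"}

def conspiracy_score(description, comments, transcript):
    """Fraction of flagged terms found in the video's combined metadata text."""
    text = " ".join([description, transcript] + list(comments)).lower()
    hits = sum(term in text for term in CONSPIRACY_TERMS)
    return hits / len(CONSPIRACY_TERMS)

def is_conspiratorial(description, comments, transcript, threshold=0.25):
    """Label a video based on whether its score clears an assumed threshold."""
    return conspiracy_score(description, comments, transcript) >= threshold
```

A real system would use a trained statistical model rather than a fixed keyword list, but the input signals – description, comments, transcript – are the same ones the study relied on.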
On the plus side, the researchers found that YouTube has largely succeeded in scrubbing recommendations for videos claiming the US government carried out the 9/11 terrorist attacks or that the Earth is flat – two topics YouTube singled out when it first announced its plans to tackle misinformation. The study also found that between June and December 2019, the share of conspiracy recommendations first dropped by 50 percent, and at its lowest point was down 70 percent.
However, the researchers also found that those numbers did not account for the popularity of the videos the recommendations came from. Adjusting for that, conspiracy recommendations bottomed out in May 2019 and are now only 40 percent less common than before. And while YouTube has succeeded in curbing some conspiracy theories, others remain very common – including claims that aliens built the pyramids and, in the same vein, climate change denial, the researchers told The New York Times.
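The gap between the two headline figures comes down to weighting. A raw rate treats every recommendation equally; weighting by the view count of the source video lets a few high-traffic videos dominate the result. The numbers below are made up purely to illustrate that the two measures can diverge:

```python
# Illustrative toy data only – the study's real dataset was ~8M recommendations.
# Each pair is (is_conspiracy_rec, view_count_of_source_video).
def conspiracy_rate(recs, weighted=False):
    """Share of recommendations flagged as conspiratorial.

    If weighted, each recommendation counts in proportion to the
    popularity (views) of the video it was served from.
    """
    if weighted:
        total = sum(views for _, views in recs)
        flagged = sum(views for is_c, views in recs if is_c)
    else:
        total = len(recs)
        flagged = sum(1 for is_c, _ in recs if is_c)
    return flagged / total

recs = [
    (True, 1_000_000),  # conspiracy rec served from a very popular video
    (False, 10_000),
    (False, 5_000),
    (True, 500),
]
```

Here the raw rate is 0.5, but the weighted rate is far higher because one popular source video carries most of the weight – the same mechanism by which the study's adjusted figure (40 percent down) ended up less flattering than the unadjusted one (70 percent down).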
The other problem is that while the research shows a marked decrease in conspiracy recommendations, it sheds little light on how that affects radicalization. The study was also limited because it only examined recommendations served without logging into a YouTube account – which isn't how most people browse the platform. Without cooperation from YouTube, it is difficult to replicate personalized recommendations with any accuracy, meaning any research that tries to judge precisely how much YouTube contributes to radicalization involves a degree of guesswork.
YouTube has close to 2 billion users, a growing number of whom use the platform as their primary news source. Measures such as reducing conspiracy recommendations and giving users more direct control over what the algorithm shows them are steps in the right direction, but there is still work to be done.