Over the years, YouTube has gone from a simple user-generated content hub to a global tech giant. It now hosts not only countless amateur creators, but also millions of channels from top-tier official sources. In that way, the site has come to rival even the likes of Facebook and Twitter as a worldwide news source. In the interest of maintaining neutrality, YouTube has repeatedly claimed that it does not rank videos by their content.
In principle, that should mean a user’s viral cat video gets no more and no less promotion than Rihanna’s official channel. Confusingly, YouTube also supports (and even encourages) paid sponsorships, and violating its terms of service, by posting nudity, copyrighted music, and so on, results in demonetization. However, a recent study suggests that YouTube is biased toward mainstream news after all.
The study shows that YouTube is more likely to push pro-vaccine videos than anti-vaccine ones
A team from Virginia Tech investigated just how unbiased the watch-recommendation feature really is. That feature covers the specific, related videos surfaced alongside whatever you are currently watching or searching for. The team set up bots to collect and analyze hundreds of thousands of recommended videos. They found that for a high-profile topic like the COVID-19 pandemic (and the upcoming vaccines), the results included pro-vaccine, anti-vaccine, and neutral content.
Surprisingly, however, the recommendations skewed toward pro-vaccine content over the other two categories. This suggests that YouTube, probably intentionally, is trying to support the scientific consensus on the ongoing pandemic. After all, the mainstream, best-supported stance on COVID-19 is that a vaccine is essential. While at odds with the company’s official claim of neutrality, this is still essentially beneficial worldwide.
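To make the methodology concrete, here is a minimal Python sketch of the kind of bot-driven audit described above. It is an illustration only, not the researchers’ code: the crawler (`fetch_recommendations`) and the stance labeler (`classify_stance`) are hypothetical stand-ins mocked with a toy dataset, but the overall shape, crawling the recommendation graph from a seed video and tallying stances, roughly mirrors the approach described above.

```python
from collections import Counter, deque

# Hypothetical stand-ins for the study's real tooling: a tiny fixed
# recommendation graph and hand-assigned stance labels, so the sketch
# runs on its own without touching YouTube at all.
TOY_GRAPH = {
    "seed": ["v1", "v2", "v3"],
    "v1": ["v2", "v4"],
    "v2": ["v5"],
    "v3": ["v4", "v5"],
    "v4": [],
    "v5": ["v1"],
}
TOY_STANCES = {"v1": "pro", "v2": "pro", "v3": "neutral",
               "v4": "anti", "v5": "pro"}

def fetch_recommendations(video_id):
    """Mock crawler: return the 'Up next' videos for video_id."""
    return TOY_GRAPH.get(video_id, [])

def classify_stance(video_id):
    """Mock labeler: tag a video as pro, anti, or neutral on vaccines."""
    return TOY_STANCES.get(video_id, "neutral")

def audit(seed, max_videos=1000):
    """Breadth-first crawl of the recommendation graph, tallying stances."""
    seen, queue, tally = {seed}, deque([seed]), Counter()
    while queue and len(seen) <= max_videos:
        current = queue.popleft()
        for rec in fetch_recommendations(current):
            if rec not in seen:
                seen.add(rec)
                tally[classify_stance(rec)] += 1
                queue.append(rec)
    return tally

print(audit("seed"))  # Counter({'pro': 3, 'neutral': 1, 'anti': 1})
```

At the scale of hundreds of thousands of videos, a tally like this is what lets researchers say the recommendations skew one way or another.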
The algorithm is meant to encourage impartial journalism, but often ends up promoting harmful content
The watch-recommendation feature is driven by the site’s algorithm, an automated service that points users toward similar or related videos. That means that just by watching a video about the Armenia-Azerbaijan war, you might be recommended content on Azerbaijani cuisine. While that seems harmless, it also means that searching for videos about space can surface videos by flat-earth conspiracy theorists, which may seriously mislead credulous viewers.
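To see why a purely similarity-driven recommender behaves this way, consider a minimal sketch, assuming (as a deliberate simplification, not YouTube’s actual system) that relatedness is scored by tag overlap. A flat-earth video that shares a “space” tag with legitimate astronomy content ranks highly, because nothing in the score reflects truthfulness:

```python
# Toy catalog: each video carries topic tags but no quality signal.
CATALOG = {
    "ISS tour":          {"space", "nasa", "orbit"},
    "Mars rover update": {"space", "nasa", "mars"},
    "Flat earth proof":  {"space", "earth", "conspiracy"},
    "Cooking pasta":     {"food", "recipe"},
}

def jaccard(a, b):
    """Tag overlap between two sets: |a & b| / |a | b|."""
    return len(a & b) / len(a | b)

def recommend(watched, k=2):
    """Rank every other video purely by tag similarity to `watched`."""
    seed_tags = CATALOG[watched]
    scored = sorted(
        ((jaccard(seed_tags, tags), title)
         for title, tags in CATALOG.items()
         if title != watched),
        reverse=True,
    )
    return [title for _, title in scored[:k]]

# The flat-earth video outranks the cooking one simply because it
# shares the "space" tag; truthfulness never enters the score.
print(recommend("ISS tour"))  # ['Mars rover update', 'Flat earth proof']
```

Any real system is far more elaborate, but the blind spot is the same: topical similarity says nothing about accuracy.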
What’s more, this feature may indirectly promote blatant lies. Imagine the consequences of an impressionable viewer being pushed pro-ISIS propaganda, for instance. Simply by posting a video on a trending topic, no matter how harmful the content, such uploaders gain huge reach thanks to the algorithm.
The same findings don’t extend to other conspiracy theories
YouTube, like most other businesses, has a clear interest in ending the pandemic as soon as possible. That may be why it seems to have stepped in to influence public awareness here in particular. When it comes to climate change, however, the results remain uninfluenced, despite the crisis the planet is now in and the scientific consensus behind man-made climate change. Furthermore, baseless videos challenging the moon landings, promoting flat-earth theories, and the like continue to thrive.
In the end, YouTube remains a force for easy-to-access online information. Whether that information is truthful, false, or just random depends on the company’s interests. Still, let’s give it credit for pushing the right message for once. Hopefully, other tech giants can do even more.