YouTube is the biggest video streaming and hosting platform in the world. With over 2 billion users watching over 3 billion hours of content every month, the effect YouTube has had on daily life and global culture is massive. However, the platform has always been riddled with controversy, thanks to some questionable choices and a recommendation algorithm so complicated that not even those within YouTube understand it properly.
While there is content for people of all ages on YouTube, the company pays special attention to catering to children. Younger viewers tend to have more time on their hands and thus consume content for prolonged periods, which drives up YouTube's watch-time numbers.
For content creators, YouTube's monetization policies prompted a quick pivot to child-friendly content, which got monetized more easily. However, this shift backfired for the platform, as the upsurge in child-friendly content came at a cost: the videos were being targeted by covert pedophile rings, ultimately creating an unsafe environment for children.
Furthermore, many people felt that the children in these videos were being 'used' and 'exploited' by their guardians as a means of earning money. On top of that, children were being recommended violent and sexual videos, because people had found ways to exploit the YouTube algorithm and bypass its safety measures.
The end result was YouTube putting a full stop to monetization on videos targeted at children. Despite the damage control, another problem is on the rise that YouTube just isn't ready to deal with yet.
YouTube is recommending white supremacist propaganda to young adults and teenagers
Recently, people have started to notice a trend on YouTube. In spite of YouTube stating that its algorithm closely scrutinizes every video pushed onto the platform, there has been an unnerving rise in videos with subtle racist undertones.
Credible-looking (or moderate-posing) content ties to popular culture or current events, then introduces particular framings or phrasings where the only search results are conspiracy theories and extremist content. From there, the kids go down the rabbit hole.
— anildash.com (@anildash) January 28, 2020
These start off as normal, seemingly educational pieces of content with ties to history and current events. However, YouTube's algorithm then recommends more such videos to younger viewers, who in turn go down a rabbit hole of videos that appear harmless on the surface but carry serious racist and sexist connotations.
So a kid goes looking for info on their favorite comic book movie or singer, sees someone saying “that thing is bad, for this reason”, not realizing it’s thinly-veiled racism/sexism. Then every time the hateful rhetoric gets more overt, it’s couched as being “ironic” or a joke.
— anildash.com (@anildash) January 28, 2020
The end result? Planned radicalization of the youth. There is serious evidence of white supremacist organizations trying to recruit young adults and teenagers into their ranks by exploiting the YouTube algorithm and bombarding them with white supremacist propaganda. If you're worried about your kids falling victim to such videos, The Washington Post published a detailed piece on the subject a few months ago.
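To make the feedback loop concrete, here is a deliberately crude toy model of a greedy, engagement-maximizing recommender. Everything in it is invented for illustration: the one-dimensional "extremity" score, the engagement formula, and the neighborhood structure bear no relation to YouTube's actual (and closely guarded) system. It only shows how small, "similar-looking" recommendation steps can compound into a drift toward extreme content.

```python
# Toy "rabbit hole" model: a greedy recommender that chases predicted
# engagement can drift toward ever more extreme content. All numbers
# and formulas here are invented for illustration only.

# Each video is a point in [0, 1]: 0 = mainstream, 1 = extreme.
videos = [i / 100 for i in range(101)]

def predicted_engagement(video, history):
    """Assumed (not measured) engagement model: content close to the
    viewer's recent taste scores well, and slightly-more-provocative
    content gets a bonus -- a common criticism of watch-time metrics."""
    recent = history[-5:]
    taste = sum(recent) / len(recent)
    closeness = 1.0 - abs(video - taste)   # must still feel familiar
    escalation = max(0.0, video - taste)   # rewards pushing further
    return closeness + 1.5 * escalation

history = [0.05]  # the viewer starts on harmless mainstream content
for _ in range(20):
    # Only recommend from a local neighborhood of the last watched video.
    candidates = [v for v in videos if abs(v - history[-1]) < 0.16]
    history.append(max(candidates, key=lambda v: predicted_engagement(v, history)))

print([round(v, 2) for v in history])
# Each step looks like a small, "similar" recommendation, yet the
# history drifts steadily from 0.05 toward the extreme end of the axis.
```

Under these made-up assumptions, no single recommendation looks like a leap; the escalation only becomes visible across the whole watch history, which is exactly what makes the pattern hard to spot from inside any one session.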
YouTube Kids app deemed not safe for children either
However, YouTube’s troubles aren’t limited to racist propaganda targeted at young adults. The YouTube Kids app, which has been under fire in the past for not properly filtering content for children, is once again struggling to keep inappropriate videos out.
Despite YouTube saying that it manually reviews every single video published to its Kids platform, there have been multiple recent incidents of parents finding their kids watching disturbing sexual or violent content. These videos feature famous cartoon characters such as Mickey Mouse, but take bizarre violent or sexual turns midway through.
YouTube’s manual review team clearly doesn’t have enough time to watch each video in its entirety, so videos are approved for the Kids platform based on their opening minutes alone, as the sketch below illustrates. Once a video is posted, it is the algorithm that takes over and recommends it.
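A simple sketch shows why judging a video by its opening is fragile. The scene labels, durations, and is_inappropriate() check below are hypothetical stand-ins, not a description of YouTube's real review pipeline; the point is only that a check sampling across the whole duration catches a mid-video bait-and-switch that an opening-only check misses.

```python
# Toy illustration of why approving a video from its opening alone fails.
# The scene labels and is_inappropriate() check are hypothetical stand-ins,
# not a description of YouTube's real review pipeline.

def is_inappropriate(scene: str) -> bool:
    # Stand-in for a reviewer's (or classifier's) judgment of one scene.
    return scene in {"violent", "sexual"}

# A bait-and-switch upload: a harmless cartoon opening, a disturbing
# stretch in the middle, then back to cartoon. One label per second.
video = ["cartoon"] * 120 + ["violent"] * 30 + ["cartoon"] * 90

def review_opening(video, seconds=60):
    """Approve based on the first minute only -- the failure mode."""
    return not any(is_inappropriate(s) for s in video[:seconds])

def review_sampled(video, samples=12):
    """Spot-check scenes spread across the full duration instead."""
    step = max(1, len(video) // samples)
    return not any(is_inappropriate(s) for s in video[::step])

print(review_opening(video))   # True  -- approved; the turn is missed
print(review_sampled(video))   # False -- rejected; the turn is caught
```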
The people posting these videos have yet again managed to exploit the site's algorithm, which is arguably the biggest problem with the platform right now. The trouble is that even those within the company do not fully understand the algorithm, since a level of secrecy is kept between departments to prevent internal abuse. That secrecy, however, leads to incoherent management, which results in a dysfunctional algorithm.
Even if YouTube reacts and manages some degree of damage control once again, that won't really solve the problem in the long run. The issue will never be resolved with short-term changes; a serious change in direction for YouTube as a whole is needed.
YouTube now plays a massive part in shaping global culture and the political climate, and the platform needs to take more responsibility for the way it handles itself, because the effects of any misstep are far greater than most people realize.