What you didn’t know about the YouTube algorithm: the exploitation of minors

Opinion piece by Taleen El Gharib, Staff Writer

April 27th, 2020

With over 2 billion monthly active users as of 2019, YouTube is one of the biggest and most influential social platforms in the world. By the company’s own account, it reaches roughly 95% of the world’s internet users, and with such a huge audience, the site has built a recommendation algorithm to surface the content each user is most likely to want.

For example, a few clicks on sports-related content will fill the Recommended section with other sports-related videos that might interest the user. Great, right?

To do this, YouTube tracks how users engage with each video they watch: what they skip, what they leave unwatched, how long they spend on a given video, and which channels they visit most often. The goal is to keep users coming back regularly, confident that the recommended videos will match their interests.
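To make the idea of “tracking engagement” concrete, here is a purely illustrative sketch of how watch time, skips, and channel habits could be combined into a recommendation score. Every field name, weight, and channel name below is invented for the example; this is not YouTube’s actual system.

```python
# Toy illustration only: a simplified engagement score of the kind a
# recommender *might* compute. Names and weights are invented.

from dataclasses import dataclass

@dataclass
class WatchEvent:
    video_id: str
    channel: str
    watch_seconds: float   # how long the user actually watched
    video_seconds: float   # total length of the video
    skipped: bool          # user clicked away almost immediately

def channel_affinity(history: list[WatchEvent]) -> dict[str, float]:
    """Fraction of the user's total watch time spent on each channel."""
    total = sum(e.watch_seconds for e in history) or 1.0
    affinity: dict[str, float] = {}
    for e in history:
        affinity[e.channel] = affinity.get(e.channel, 0.0) + e.watch_seconds / total
    return affinity

def score_candidate(channel: str, history: list[WatchEvent]) -> float:
    """Higher score = more likely to be recommended (illustrative weights)."""
    affinity = channel_affinity(history)
    completions = [
        e.watch_seconds / e.video_seconds
        for e in history
        if e.channel == channel and not e.skipped
    ]
    avg_completion = sum(completions) / len(completions) if completions else 0.0
    # Favour channels the user watches often and tends to finish.
    return 0.6 * affinity.get(channel, 0.0) + 0.4 * avg_completion

history = [
    WatchEvent("a1", "GymnasticsHQ", 540, 600, False),
    WatchEvent("a2", "GymnasticsHQ", 300, 320, False),
    WatchEvent("b1", "CookingDaily", 15, 900, True),
]
print(score_candidate("GymnasticsHQ", history))  # high: watched often, finished
print(score_candidate("CookingDaily", history))  # low: skipped quickly
```

The point of the sketch is simply that a user’s own behaviour feeds back into what they are shown next, which is exactly the mechanism the rest of this piece is concerned with.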

With this in mind, and given how closely its algorithm follows user activity, the platform also promises to keep its users safe. YouTube’s policies include a section on Child Safety, in which the site states that it does not allow any content that “endangers the emotional and physical well-being of minors”, defined as anyone under the age of 18. YouTube also reports offending content to the National Center for Missing and Exploited Children, a private non-profit organisation that works with law enforcement agencies. One would expect that child safety, and user safety in general, is taken seriously by a video-sharing platform willing to take legal action against those who violate its basic community guidelines.

So, how are child predators still able to use YouTube’s algorithm for the sexual exploitation of minors?

Typing something as simple as “gymnastics” into the search bar presents the user with endless gymnastics-related videos to choose from, as well as a Recommended panel to the right of the chosen video. However, after clicking through several other gymnastics videos within that panel, one eventually begins to see seemingly random videos of minors, typically young girls. The videos themselves are nothing out of the ordinary: children filming themselves doing things children normally do.

Nonetheless, it is odd that the algorithm analyses user activity in such a way that the Recommended section ends up filled almost exclusively with videos of minors, and this loophole in the system gives child predators an easy gateway to target children. Predators have used the algorithm to find such videos and then share, within the comment section, snippets of moments where minors appear in unintentionally compromising positions, creating a large, easily accessible web of inappropriate content that clearly violates community guidelines.

The issue has been raised several times by concerned users, angered by YouTube’s failure to guarantee the most basic user right: safety. YouTube’s inability to protect child users from such easily identifiable violations casts doubt on both its reliability and its intentions. Although some of the videos have been taken down, or have had their comments disabled after violations were found in the comment section, many of these recurring violations remain on the platform. YouTube’s new, stricter policies were due to take effect on January 1st, 2020, yet the algorithm that caused concern in the first place appears to persist.

Even under the new, stricter policies, predators have developed their own way of communicating, subtle enough that a casual user would not notice or report it. Because it is now common to leave timestamps in the comments that jump the viewer straight to a particular moment in a video, predators post strings of timestamps that are easily recognised by other child predators.

What is even more appalling is that they are able to post, seemingly without any trouble, links that redirect users to child pornography sites. Although this behaviour has been reported several times (the biggest backlash against YouTube’s algorithm came back in 2017), it still persists three years later. The fact that YouTube can build such a sophisticated algorithm yet fails to apply the same ingenuity to protecting its users raises concern about how hard it is really trying to control these violations, as well as about its possible involvement in other activities that breach its own policies. For a scandal with the potential to shatter the business, it remains remarkably rarely discussed, despite its severity.

Considering the extent to which violations like these may be kept under wraps, it is fair to question the integrity of such hugely popular social media platforms. Why is YouTube not doing something about it? Could it possibly be involved? How safe are we?
