Disturbing Results: YouTube’s Algorithm Recommends Harmful Content to Young Viewers



A new study of YouTube’s recommendation system has found that it exposes kids to disturbing videos, including content about school shootings and guns. The study was conducted by the non-profit watchdog group TTP (Tech Transparency Project), which found that YouTube’s algorithm steered boys who are keen on video games toward content depicting school shootings and gun use, as well as videos showing how to build custom firearms.

What does YouTube recommend to children?

To conduct the study, the researchers created four YouTube accounts posing as two 9-year-old boys and two 14-year-old boys. These accounts watched playlists of popular video game titles such as Grand Theft Auto, Halo, Lego Star Wars, and Roblox. The researchers then tracked what YouTube recommended to these accounts over a 30-day period.

The results showed that YouTube recommended videos featuring guns and shootings to all of the gamer accounts, and the accounts that clicked on those videos received even more content on the same topic. The recommended videos included scenes of mass shootings and other gun violence, graphic depictions of the damage firearms can do to the human body, and how-tos for converting pistols into fully automatic weapons.

The research also noted that many of the recommended videos appeared to violate YouTube’s own rules. Oddly, some of them showed a little girl firing a gun or offered guides on modifying illegal weapons. What is more, most of those videos were ads, which means YouTube is earning money by suggesting dangerous videos to kids.

In response to the research, a YouTube spokesperson pointed to the YouTube Kids app and the platform’s in-app supervision features, saying they are designed to make viewing safer for tweens and teens. The spokesperson acknowledged that the report raises some points worth considering and added that YouTube is willing to work with academic researchers. At the same time, he criticized the study for providing no concrete information about its methodology or the total number of videos recommended to the test accounts, and for lacking details on how YouTube’s Supervised Experiences feature was used.

YouTube’s recommendations have never worked well

In fact, this is not the first time researchers or ordinary users have complained about YouTube’s recommendation system. Many have previously raised concerns that YouTube promotes dubious content that may not directly break YouTube’s rules but still should not be online. In response, YouTube has made efforts to demote such videos and has even disabled sharing for some of them.

YouTube has also come under fire for recommending radical and extremist content. In some cases, its algorithm has led viewers of innocent or non-partisan videos down a rabbit hole of extreme ideology and hate speech.

At one point, YouTube was thought to be the biggest platform for spreading conspiracy theories. Users watching videos on certain topics, such as vaccines or political events, were directed to misleading content, and as a result, millions of people received false information.

Observers have also said that YouTube pushes some violent and graphic content to the top of its recommendations, referring to real-life violence and disturbing material that can affect kids.

As we can see, YouTube should review the factors its recommendation system takes into account. It should also keep in mind that many kids watch YouTube videos on their parents’ phones and accounts. So it would be better to rethink the logic behind recommended videos rather than simply point to special features.

