Study Finds YouTube’s System Sends Gun Videos to 9-year-olds


20 May 2023

A new study has found that YouTube's tool to suggest videos can direct young users to content about guns and violence.

The study was based on an experiment carried out by the Tech Transparency Project. The nonprofit group studies social media services. Researchers from the group set up two YouTube accounts that simulated online activity that might be interesting to 9-year-old boys.

The two accounts contained the exact same information. The only difference was that one account chose only to watch videos suggested by YouTube. The other ignored the video service's suggested offerings.

The organization found the account that chose to watch YouTube's suggestions was flooded with graphic videos. These included videos about school shootings and instructions for making guns fully automatic.

Many of the suggested videos violate YouTube's own policies against violent or graphic content.

YouTube has technology tools that are meant to restrict some kinds of videos. But the study suggests that those tools are failing to block violent content from young users. The researchers involved in the study said the tools may even be sending children to videos that include extremist and violent material.

Katie Paul leads the Tech Transparency Project. She said, "Video games are one of the most popular activities for kids. You can play a game like 'Call of Duty' without ending up at a gun shop — but YouTube is taking them there."

Paul added, "It's not the video games, it's not the kids. It's the algorithms." An algorithm is a set of steps a computer follows to complete a process or solve a problem.

Social media companies use algorithms to predict what content users might be interested in based on past watch history. Algorithm tools suggest that content to users.
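The idea of ranking videos by past watch history can be shown with a toy example. The sketch below is a hypothetical illustration only, not a description of YouTube's actual system; the video names, topic tags, and scoring rule are all invented for this example.

```python
# A minimal, hypothetical sketch of a watch-history-based recommender.
# The catalog, tags, and scoring rule are invented for illustration and
# do not describe YouTube's real algorithm.

from collections import Counter

# Invented catalog: each video is labeled with simple topic tags.
CATALOG = {
    "gameplay_tips": {"gaming", "tutorial"},
    "speedrun_highlights": {"gaming", "esports"},
    "gun_range_review": {"firearms", "review"},
    "lego_build": {"toys", "tutorial"},
}

def recommend(watch_history: list[str], top_n: int = 2) -> list[str]:
    """Rank unwatched videos by how many tags they share with past views."""
    # Count how often each tag appears in the user's watch history.
    tag_counts = Counter(
        tag for video in watch_history for tag in CATALOG.get(video, set())
    )
    # Score every unwatched video by summing the counts of its tags.
    scores = {
        video: sum(tag_counts[tag] for tag in tags)
        for video, tags in CATALOG.items()
        if video not in watch_history
    }
    # Return the highest-scoring videos first.
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

if __name__ == "__main__":
    # A history of gaming videos pulls in more suggestions with matching tags.
    print(recommend(["gameplay_tips", "speedrun_highlights"]))
```

In this simplified version, whatever a user watches pulls similar content higher in the ranking, which is the feedback effect the study describes.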

The account that clicked on YouTube's suggested videos received 382 different gun-related videos in a single month. The account that ignored YouTube's suggestions still received some gun-related videos, but only 34 in total.

A spokeswoman for YouTube defended the platform's protections for children and noted that it requires users under age 17 to get a parent's permission before using the site.

YouTube says accounts for users younger than 13 are linked to a parental account. The company noted that it offers several choices for younger viewers that are "designed to create a safer experience for tweens and teens."

Activist groups for children have long criticized YouTube for making violent and troubling content easily available to young users. They say YouTube sometimes suggests videos that promote gun violence, eating disorders and self-harm.

YouTube has already removed some of the videos that the Tech Transparency Project identified. But others remain available.

Many technology companies depend on computer programs to identify and remove content that violates their rules. But Paul said findings from her organization's study show that greater investments and efforts are needed to block such material.

Justin Wagner is the director of investigations at Everytown for Gun Safety, a gun control activist group. He told the AP that without federal legislation, social media companies must do more to enforce their own rules.

He added, "Children who aren't old enough to buy a gun shouldn't be able to turn to YouTube to learn how to build a firearm, modify it to make it deadlier, or commit atrocities."

I'm Bryan Lynn.

Bryan Lynn wrote this story for VOA Learning English, based on reports from The Associated Press.

_______________________________________________________________

Words in This Story

simulate – v. to do or make something that behaves or looks like something real but is not

graphic – adj. extremely clear and detailed

promote – v. to urge people to like, buy or use something

modify – v. to change something in order to improve it

atrocity – n. an extremely violent and shocking attack