YouTube’s Recommendations Send Violent Gun Videos to 9-Year-Olds, Study Finds

Published: May 17, 2023

When researchers at a nonprofit that studies social media wanted to understand the connection between YouTube videos and gun violence, they set up accounts on the platform that mimicked the behavior of typical boys living in the U.S.

They simulated two nine-year-olds who both liked video games, especially first-person shooter games. The accounts were identical, except that one clicked on the videos recommended by YouTube and the other ignored the platform's suggestions.

The account that clicked on YouTube's suggestions was soon flooded with graphic videos about school shootings, tactical gun training videos and how-to instructions on making firearms fully automatic.

One video featured an elementary school-age girl wielding a handgun; another showed a shooter using a .50 caliber gun to fire on a dummy head filled with lifelike blood and brains. Many of the videos violate YouTube's own policies against violent or gory content.

The findings show that despite YouTube's rules and content moderation efforts, the platform is failing to stop the spread of frightening videos that could traumatize vulnerable children, or send them down dark roads of extremism and violence.

“Video games are one of the most popular activities for kids. You can play a game like ‘Call of Duty’ without ending up at a gun shop, but YouTube is taking them there,” said Katie Paul, director of the Tech Transparency Project, the research group that published its findings about YouTube on Tuesday. “It’s not the video games, it’s not the kids. It’s the algorithms.”

The accounts that followed YouTube's recommended videos received 382 different firearms-related videos in a single month, or about 12 per day. The accounts that ignored YouTube's recommendations still received some gun-related videos, but only 34 in total.

The researchers also created accounts mimicking 14-year-old boys who liked video games; those accounts likewise received similar levels of gun- and violence-related content.

One of the videos recommended to the accounts was titled “How a Switch Works on a Glock (Educational Purposes Only).” YouTube later removed the video after determining that it violated its rules, but an almost identical video popped up two weeks later with a slightly altered name; that video remains available.

Messages seeking comment from YouTube were not immediately returned on Tuesday. Executives at the platform, which is owned by Google, have said that identifying and removing harmful content is a priority, as is protecting its youngest users.

YouTube requires users under 17 to get their parent's permission before using its site; accounts for users younger than 13 are linked to the parental account.

Along with TikTok, the video sharing platform is one of the most popular sites for children and teens. Both sites have been criticized in the past for hosting, and in some cases promoting, videos that encourage gun violence, eating disorders and self-harm. Critics of social media have also pointed to the links between social media, radicalization and real-world violence.

The perpetrators behind many recent mass shootings have used social media and video streaming platforms to glorify violence or even livestream their attacks. In posts on YouTube, the shooter behind the 2018 attack on a school in Parkland, Fla., that killed 17 wrote “I wanna kill people,” “I’m going to be a professional school shooter” and “I have no problem shooting a girl in the chest.”

The neo-Nazi gunman who killed eight people earlier this month at a Dallas-area shopping center also had a YouTube account that included videos about assembling rifles, the serial killer Jeffrey Dahmer and a clip from a school shooting scene in a television show.

In some cases, YouTube has already removed some of the videos identified by researchers at the Tech Transparency Project, but in other instances the content remains available. Many big tech companies rely on automated systems to flag and remove content that violates their rules, but Paul said the findings from the Project’s report show that greater investments in content moderation are needed.

In the absence of federal regulation, social media companies can target young users with potentially harmful content designed to keep them coming back for more, said Shelby Knox, campaign director of the advocacy group Parents Together.

Knox’s group has called out platforms like YouTube, Instagram and TikTok for making it easy for children and teens to find content about suicide, guns, violence and drugs.

“Big Tech platforms like TikTok have chosen their profits, their stockholders, and their companies over children’s health, safety, and even lives over and over again,” Knox said in response to a report published earlier this year that showed TikTok was recommending harmful content to teens.

TikTok has defended its site and its policies, which prohibit users younger than 13. Its rules also prohibit videos that encourage harmful behavior; users who search for content about topics including eating disorders automatically receive a prompt offering mental health resources.
