Suicide instructions spliced into kids’ cartoons on YouTube and YouTube Kids

A girl watches a video on YouTube on a computer on February 27, 2013 in Chisseaux, near Tours, central France. (Photo: Alain Jocard/AFP/Getty Images)

Tips for committing suicide are appearing in children’s cartoons on YouTube and the YouTube Kids app.

The sinister content was first flagged by doctors on the pediatrician-run parenting blog pedimom.com and later reported by the Washington Post. An anonymous “physician mother” initially spotted the content while watching cartoons with her son on YouTube Kids as a distraction while he had a nosebleed. Four minutes and forty-five seconds into a video, the cartoon cut away to a clip of a man who resembles Internet personality Joji (formerly known as Filthy Frank). He walks onto the screen and simulates cutting his wrist. “Remember, kids, sideways for attention, longways for results,” he says, then walks off screen. The video then quickly flips back to the cartoon.

“I am disturbed, I am saddened, I am disgusted,” the physician wrote. “But I am also relieved that I was there to see this video with my own eyes, so that I could take the appropriate actions to protect my family.” Those actions included deleting the YouTube Kids app and forever banning it from the house.

That particular video was later taken down from YouTube Kids after the doctor reported it to YouTube. However, parents have since discovered that several other cartoons contain information about how to commit suicide, including the same spliced-in video clip. In a subsequent blog post, pediatrician Free Hess, who runs pedimom, reported another cartoon—this time on YouTube—with the clip spliced in at four minutes and forty-four seconds. That cartoon was also later taken down, but Hess captured a recording of it beforehand, which you can view on the blog.

In an emailed statement, a spokesperson for YouTube told Ars:

We work to make the videos in YouTube Kids family-friendly and take feedback very seriously. We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video. Flagged videos are manually reviewed 24/7 and any videos that don’t belong in the app are removed. We’ve also been investing in new controls for parents including the ability to hand pick videos and channels in the app. We are making constant improvements to our systems and recognize there’s more work to do.

Nadine Kaslow, a past president of the American Psychological Association and professor at Emory University School of Medicine, told the Post that simply taking down the videos isn’t enough. “For children who have been exposed, they’ve been exposed. There needs to be messaging—this is why it’s not okay.” Vulnerable children, perhaps too young to understand suicide, may develop nightmares or try harming themselves out of curiosity, she warned.

Suicide is the third leading cause of death among individuals between the ages of 10 and 24, according to data from the Centers for Disease Control and Prevention. However, more youths survive suicide attempts than die. Each year, emergency departments nationwide treat self-inflicted injuries in 157,000 youths between the ages of 10 and 24. Sixteen percent of high-school students reported seriously considering suicide in a nationwide survey.

Suicide tips stashed in otherwise benign cartoons are just the latest ghastly twist in the corruption of kids’ content on YouTube and YouTube Kids. For years, the video-sharing company has struggled with a whack-a-mole-style effort to keep a variety of disturbing and potentially scarring content out of videos targeting children. Videos have been found with adult content including foul language, depictions of mass shootings, alcohol use, fetishes, human trafficking stories, and sexual situations. Many contain—and attract clicks with—popular cartoon characters, such as Elsa from the 2013 animated Disney film Frozen. This chilling phenomenon has been referred to as Elsagate. Though YouTube has deleted channels and removed videos, Hess points out that it’s still easy to find a plethora of “horrifying” content aimed at children on YouTube Kids.

Last week, YouTube lost several advertisers, including Fortnite maker Epic Games, Disney, and Nestlé, over a “wormhole into a soft-core pedophilia ring.”