
Is YouTube safe for kids?

Every minute, 400 hours of content is uploaded to YouTube. With that kind of volume, some parents question whether the site can ever be truly safe for kids. In early 2019, one parent found a video on YouTube Kids with hidden instructions on how to commit suicide. Many parents feel that no matter how many parental controls the site rolls out, YouTube can never be completely safe. Others trust that YouTube does everything in its power to purge problematic material around the clock. What do you think?

#YouTubeNotSafe

In early 2019, one mother in Florida was horrified to find instructions on how to commit suicide spliced into a video on YouTube Kids, a section of YouTube dedicated to "family-friendly" content. The Daily Mail's Tim Stickings reported that the so-called "instructions" were hidden four minutes and 45 seconds into a seven-minute video.

In the clip a man signals with his hand on his wrist, giving specific advice on how to slit them to 'get results'.

CNN's Doug Criss added that the mother, Free Hess, first saw this video in July 2018. YouTube promptly took the video down, only for Hess to find it again in February 2019. Upon further investigation, Hess was shocked to find more dangerous content on YouTube Kids:

When Hess went to YouTube Kids and started exploring the site, what she saw there shocked her. She said she found videos glorifying not only suicide but sexual exploitation and abuse, human trafficking, gun violence and domestic violence. One video, inspired by the popular "Minecraft" video game, even depicted a school shooting.
#YouTubeIsSafe

But according to YouTube, the site does everything within its power to prevent content like this from appearing on YouTube Kids and to take it down immediately if it does. YouTube tells parents:

We use a mix of filters, user feedback and human reviewers to keep the videos in YouTube Kids family friendly. But no system is perfect and inappropriate videos can slip through, so we’re constantly working to improve our safeguards and offer more features to help parents create the right experience for their families.

CNN reported on YouTube's statement following the controversy over the "instructional" video. The YouTube team acknowledges that there are improvements to be made, and it says it is ready to do the work to make sure parents feel secure.

"We appreciate people drawing problematic content to our attention, and make it possible for anyone to flag a video," the statement said. "Flagged videos are manually reviewed 24/7 and any videos that don't belong in the app are removed.
#YouTubeNotSafe

Some say YouTube's algorithm is also to blame. The Verge's Julia Alexander reported on comments from Matt Watson, a former YouTube content creator, on the subject. According to Watson, it is incredibly easy for predators both to sexualize innocent content via video comments and to help other predators find explicit material. Watson explained:

“Youtube’s recommended algorithm is facilitating pedophiles’ ability to connect with each-other, trade contact info, and link to actual child pornography in the comments,” Watson wrote on Reddit. “I can consistently get access to it from vanilla, never-before-used YouTube accounts via innocuous videos in less than ten minutes, in sometimes less than five clicks.”

With 400 hours of video uploaded every minute and 1.3 billion people using the site, YouTube remains a never-ending maze of content, making it nearly impossible to expect zero foul play. From Elsagate in 2017 to the life-threatening "instructions" found in 2019, YouTube cannot guarantee safe content for kids.

#YouTubeIsSafe

Given the depth of content on the site, part of YouTube's monitoring strategy is leaning on parents to report the questionable content they find. According to The Conversation:

YouTube largely regulates itself...YouTube also uses computer programs to try to identify and classify problematic content, and complaints by users themselves...This last idea—of user complaints—depends on parents to be vigilant or to teach children to make these complaints themselves.

Although this might seem like a daunting task, there are a number of simple steps parents can take to monitor the content their kids consume. CNET recommends turning off the search function for kids to counteract YouTube's algorithm, setting a custom passcode, and whitelisting approved channels and videos.

In 2017, YouTube's global head of family and learning content, Malik Ducard, maintained that over the previous 30 days, "less than .005 percent of the millions of videos" on YouTube Kids had contained inappropriate content. YouTube may have an endless stream of videos to sift through, but parents need only approve the content they want their kids to see rather than search through everything to weed out the bad. If and when something inappropriate comes up, parents can report it immediately. As Ducard puts it, parents "are in the driver's seat."
