YouTube has finally announced that it will start cracking down on the countless videos that have amassed millions of views featuring family cartoon characters and superheroes in violent, sexual or other inappropriate scenarios, which target unsuspecting children.

As The Free Thought Project first reported in June, comedian Daniel Tosh raised concerns about objectionable content found on YouTube. Tosh revealed how the "Seven Super Girls" channel—while geared toward teenage content creators—likely also serves as eye candy for pedophiles looking to indulge in streaming videos of real kids in compromising situations.

It seems YouTube finally got the message and is preparing to take action, although that action seems clouded by the fact that regular users of YouTube must now flag content that is inappropriate before it can be age-restricted.

The company says it took action following allegations that YouTube engages in "infrastructural violence" toward children by creating and maintaining a platform where videos that appeal to children (cartoons and superheroes) can be hijacked, and violent videos can be produced using those same characters. Sometimes an entire legitimate video, such as a Peppa Pig episode, is dubbed over with objectionable commentary and re-released. YouTube now seeks to redirect such videos into an age-restricted section of its database.

As the Guardian noted on Monday, in a widely shared article, campaigning technology-focused artist and writer James Bridle detailed the vast industry of low-quality, algorithmically guided children's content created for YouTube. "Someone or something or some combination of people and things is using YouTube to systematically frighten, traumatize, and abuse children, automatically and at scale, and it forces me to question my own beliefs about the internet, at every level," Bridle wrote.

While YouTube's move to redirect content that could be damaging to children is a positive step, it leaves us scratching our heads and asking follow-up questions. For example, why doesn't the company simply remove the objectionable videos from its lineup, as it does when it selectively censors certain political and military videos? And will the creators have their channels demonetized so that they can no longer earn money from the violent content?

As TFTP's Matt Agorist described, while it may seem safe to assume that right-wing extremist videos are the only ones being targeted for demonetization, the truth is that alternative media outlets have been targeted as well. Activists on both the left and the right have seen their channels' ability to earn money removed, and libertarians have been targeted too. As a former Psychology Today editor told WSWS:

"This is political censorship of the worst sort; it’s just an excuse to suppress political viewpoints,” Robert Epstein, a former editor in chief of Psychology Today and noted expert on Google, told wsws."

Military content and activist-produced videos have also been targeted for removal. Instead of allowing such videos to be seen by the world, so that the citizenry of the world can create its own documentary record, the Anti-Defamation League was tapped to decide which videos are removed from YouTube and Google.

Applauding the decision to remove objectionable videos, ADL CEO Jonathan A. Greenblatt said:

"The fight against terrorist use of online resources and cyberhate has become one of the most daunting challenges in modern history...Google has been a leader in this area from the beginning. The reality is extremists and terrorists continue to migrate to and exploit various other social media platforms. We hope that those platforms can learn from and emulate what YouTube is doing to proactively identify and remove extremist content."

As good as YouTube and Google may be at removing objectionable videos from the internet, however, extremists only have to turn to other platforms, such as LiveLeak and Livestream, to get their videos out there.

The task of staying one step ahead of terrorists' ability to upload videos will likely be an impossible one. A more reasonable option would be to simply leave such videos up, or to move them to an age-restricted section of YouTube. That raises the question: why isn't YouTube, instead of deleting objectionable videos, simply moving them to an age-restricted section, as it is doing with kids' content? The answer may be given in one simple word: censorship.

As Agorist concluded:

"If ever there were a time for people to get over their political, religious, racial, sexual, or any other differences — it is now. While we may disagree on different issues, if any one group is allowed to be silenced, it is only a matter of time until everyone else is silenced too."