As the mainstream media catches up to the fact that YouTube has become a breeding ground for pedophiles, with hundreds of questionable videos drawing billions of views and thousands of sexual comments facing little oversight, the platform's moderators say its system for reporting abuse has not worked properly for more than a year.
A group of moderators told the BBC that YouTube has allowed at least 100,000 predatory accounts to leave inappropriate comments on videos without repercussions, because "YouTube's system for reporting sexualized comments left on children's videos has not been functioning correctly for more than a year."
The moderators, referred to as YouTube's "Trusted Flaggers," are charged with flagging inappropriate content and reporting users who violate the platform's policies. However, they claim that in many cases, the accounts they report face no consequences.
An investigation conducted by BBC reporter Elizabeth Cassin looked into the sexual, lewd and other inappropriate comments that are left on YouTube’s videos for children—even after those accounts are reported by moderators.
“Although the videos themselves are completely innocent, there are attempts from adults to collect personal information from children, and requests for them to remove clothing,” Cassin said. “These are a clear violation of YouTube’s child endangerment policies. So you might expect that comments like these would be removed immediately once reported—but no. It’s claimed that one key part of YouTube’s mechanism for reporting comments like these hasn’t been working properly for over a year, so some obscene comments directed at children have remained on the site.”
After making a list of comments they believed were in clear violation of YouTube’s child endangerment policies, Cassin’s team reported 28 accounts to YouTube. “Two weeks later, 23 of these accounts still remained on the site,” Cassin said.
In a statement replying to the accusations, YouTube said:
"We take child safety extremely seriously and have clear policies against child endangerment. We have systems in place to take swift action on this content with dedicated policy specialists reviewing and removing flagged material around the clock."
While YouTube claims that it is devoted to ending "child endangerment" on its platform, the adults who fill the comment sections of children's videos with inappropriate and suggestive language are not the only thing putting children in danger.
As The Free Thought Project reported in June, there are also dozens of videos on YouTube advertised as harmless videos for children that contain everything from violent, bloody scenes with Mickey and Minnie Mouse, to sexually suggestive scenes with Spiderman and Elsa.
There are also a number of videos that feature young girls suggestively eating cream pies, or being taped to their beds—while these may be advertised as harmless videos for children, they also act as “free candy” for sick pedophiles. These accounts have millions of subscribers and these videos have hundreds of millions of views, indicating that the creators are likely making an impressive profit.
All the while, the videos YouTube is actually removing, and the creators who are facing consequences, are the ones attempting to document war crimes committed by the United States. YouTube has begun labeling videos that show the U.S. launching drone strikes that have killed innocent civilians as "violent or graphic content," while turning a blind eye to the thousands of videos that show violent and graphic content and are marketed specifically to children.