
Friday, 1 March 2019

YouTube disables comments on videos with kids after reports of predatory behavior

YouTube will shut off comments on videos featuring “younger minors and videos with older minors at risk of attracting predatory behavior,” according to a statement from the company. It’s an effort to stamp out predatory behavior from viewers, which has included salacious comments left on videos featuring underage kids.

Last week, TechCrunch confirmed reports which first arose on Reddit about the existence of a soft-core pedophile ring that was communicating via YouTube’s comments section and disseminating videos of minors by gaming the company’s search algorithms.

YouTube creator Matt Watson flagged the problem in a subreddit, noting that he found scores of videos of kids where YouTube users are trading inappropriate comments and identifying timestamps to focus on below the fold. Watson denounced the company for failing to prevent what he describes as a “soft-core pedophilia ring” from operating in plain sight on its platform.

The reports brought condemnation from several businesses that advertise on YouTube (the company’s primary source of revenue). Disney, Fortnite maker Epic Games, McDonald’s and Nestlé reportedly all pulled advertising from the site in the wake of the scandal.

“Over the past week, we’ve been taking a number of steps to better protect children and families, including suspending comments on tens of millions of videos,” a Google spokesperson said in a statement emailed to TechCrunch. “Now, we will begin suspending comments on most videos that feature minors, with the exception of a small number of channels that actively moderate their comments and take additional steps to protect children. We understand that comments are an important way creators build and connect with their audiences; we also know that this is the right thing to do to protect the YouTube community.”

The rollout of the new moderating tools will take several months, according to YouTube.

And while the company acknowledged the severity of the changes and the impact they may have on YouTubers, it said it was taking action to prevent the exploitation of minors on the platform.

A small number of known channels will be able to keep their comments sections up, but will be required to actively monitor them beyond simply using YouTube’s own moderation tools, the company said.

YouTube also is speeding up the launch of a new classification tool that can detect and remove twice as many individual comments as in the past — accelerating the automation of content moderation (which could, itself, have unintended consequences).

In a related move designed to protect children from abhorrent content, YouTube has terminated the channel FilthyFrankClips and several other channels that were reportedly instructing children on how to slash their wrists.

According to The Washington Post, the clips from the channel spliced children’s videos with content on self-harm; the issue was first flagged in a report on the blog PediMom.

As we noted in our earlier reporting, this isn’t the first time that YouTube has been identified as a haven for pedophiles hiding in plain sight:

Back in November 2017, several major advertisers froze spending on YouTube’s platform after an investigation by the BBC and the Times discovered similarly obscene comments on videos of children.

Earlier the same month YouTube was also criticized over low-quality content targeting kids as viewers on its platform.

The company went on to announce a number of policy changes related to kid-focused video, including a pledge to aggressively police comments on videos of kids and to turn off comments altogether on videos found to contain inappropriate remarks about the children in them.

Some of the videos of young girls that YouTube recommended we watch already had comments disabled, which suggests its AI had previously identified a large number of inappropriate comments being shared, in keeping with its policy of switching off comments on clips containing kids when comments are deemed “inappropriate.” Yet the videos themselves were still being suggested for viewing in a test search that originated with the phrase “bikini haul.”

YouTube addressed its creators earlier today in a blog post telling them about the steps it was taking.



from TechCrunch https://ift.tt/2XqYLgp
