YouTube claims it keeps getting better at enforcing its own moderation rules


YouTube wants the world to know that it is enforcing its own moderation rules better than ever. The company says fewer people are seeing problematic videos on its site, such as videos containing graphic violence, scams, or hate speech, before they are taken down.

In the last months of 2020, up to 18 of every 10,000 views on YouTube were on videos that violated company policy and should have been removed before anyone watched them. That's down from 72 of every 10,000 views in the fourth quarter of 2017, when YouTube started tracking the figure.

But the numbers come with an important caveat: while they measure how well YouTube limits the spread of troubling clips, they are ultimately based on which videos YouTube believes should be removed from its platform, and the company still allows a number of plainly disturbing videos to stay up.

The stat is a new addition to the YouTube Community Guidelines enforcement report, a transparency report that is updated quarterly with details on the types of videos being removed from the platform. The new number is called the Violative View Rate, or VVR, and it tracks how many views on YouTube occur on videos that violate the guidelines and should be removed.

This figure is essentially a way for YouTube to measure how good it is at moderating its own site, based on its own rules. The higher the VVR, the more problematic videos spread before YouTube can catch them; the lower the VVR, the better YouTube is at eradicating banned content.
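As a back-of-the-envelope illustration of what the metric expresses (YouTube does not spell out its exact calculation here, so the simple ratio below is an assumption), a short Python sketch:

def violative_view_rate(violative_views: int, total_views: int) -> float:
    # Illustrative VVR: the share of all views that landed on rule-breaking videos.
    # A simplification; YouTube's own methodology is not detailed in the report.
    if total_views <= 0:
        raise ValueError("total_views must be positive")
    return violative_views / total_views

# Figures cited above: up to 18 per 10,000 views in late 2020 vs. 72 per 10,000 in Q4 2017.
print(f"{violative_view_rate(18, 10_000):.2%}")  # 0.18%
print(f"{violative_view_rate(72, 10_000):.2%}")  # 0.72%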

YouTube has created a chart showing how the figure has gone down since it started measuring the number for internal use:

Chart: YouTube's Violative View Rate since measurements began.
Image: YouTube

The steep decline from 2017 to 2018 came after YouTube began relying on machine learning to detect problematic videos rather than waiting for users to report them, Jennifer O'Connor, YouTube's product director for trust and safety, said during a briefing with reporters. The goal is "to get this number as close to zero as possible".

Videos that violate YouTube's advertising guidelines but not its general community guidelines are not included in the VVR because they do not warrant removal. So-called "borderline content" that brushes up against the rules without outright violating them is also excluded, for the same reason.

O'Connor said the YouTube team uses the figure internally to understand how well it is protecting users from troubling content. If the rate rises, YouTube can try to figure out what types of videos are slipping through and prioritize developing machine learning to catch them. "The North Star for our team is to protect users," said O'Connor.