YouTube claims its crackdown on borderline content really works

After changing its recommendation algorithm to reduce the spread of "borderline" content – videos that toe the line between what is acceptable and what violates YouTube's terms of service – YouTube says watch time on that kind of video from non-subscribers has dropped by 70 percent.

More than 30 changes have been made to the way videos are recommended since January 2019, according to a new YouTube blog post explaining how the company is trying to tackle borderline content. YouTube does not say exactly what has changed, nor does the blog post give a before-and-after picture of how many borderline videos were recommended. Instead, it outlines how external moderators work through specific criteria to determine whether a flagged video is borderline. That information is then used to train the machine learning tools YouTube relies on to monitor the platform.

"Each evaluated video receives up to nine different opinions and some critical areas require certified experts," the blog post reads. “For example, doctors provide guidelines for the validity of videos about specific medical treatments to limit the spread of medical disinformation. Based on the consensus input from the evaluators, we use well-tested machine learning systems to build models. "

Some of the criteria moderators work through were demonstrated in a recent interview YouTube CEO Susan Wojcicki gave on 60 Minutes. Wojcicki walked reporter Lesley Stahl through a couple of videos that might contain borderline content. One video that Wojcicki acknowledged was violent showed Syrian prisoners, but was allowed to stay up because it was uploaded by a group trying to expose problems in the country. Another video used imagery from the Second World War, and although many might consider it acceptable for its historical context, Wojcicki showed how hateful groups could use it to spread white supremacist rhetoric. That video was banned.

YouTube recently changed its hate speech policy to address issues like white nationalism, which is now considered a violation of YouTube's terms of service. With that in mind, one might believe that any supremacist statement could lead to a ban. That is not necessarily true. Wojcicki defended to Stahl YouTube's view that a video's content is judged in context, adding that if a video simply said "white people are superior" without any other context, it would be acceptable.

"Nothing is more important to us than ensuring that we live up to our responsibilities," the blog post adds. "We remain focused on maintaining that delicate balance that allows different voices to flourish on YouTube – including those that others will disagree with – while protecting viewers, creators, and the wider ecosystem from harmful content."

Part of YouTube's approach to the problem is surfacing more authoritative sources for topics like "news, science, and historical events where accuracy and authority are central." YouTube's teams are trying to do that by tackling three different but related issues: raising authoritative sources such as The Guardian and NBC in search results for news topics, providing more reliable information during breaking news events, and giving users additional context alongside videos.

That means that when people search for topics like "Brexit" or "anti-vaccination," the top results should show videos from reliable, authoritative news sources, even if engagement on them is lower than on other videos about the subject. By doing this during breaking news events such as mass shootings or terrorist attacks, YouTube says it has "seen that consumption on the channels of authoritative news partners has increased by 60 percent."
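The blog post doesn't describe how that ranking works internally. One common way to express the trade-off is to blend an engagement score with an authority score, so a trusted outlet can outrank a more-watched but less reliable upload. The snippet below is a hypothetical sketch of that idea; the weights, field names, and scoring function are assumptions, not published YouTube parameters.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    engagement: float  # normalized 0..1 (watch time, clicks, etc.)
    authority: float   # normalized 0..1 (source trust rating)

def rank_score(v: Video, authority_weight: float = 0.7) -> float:
    """Blend engagement with source authority. The 0.7 weight is an
    illustrative assumption."""
    return authority_weight * v.authority + (1 - authority_weight) * v.engagement

results = [
    Video("Brexit explained", engagement=0.4, authority=0.9),  # news outlet
    Video("BREXIT TRUTH!!!", engagement=0.8, authority=0.1),   # fringe channel
]
# The authoritative video wins despite lower engagement.
for v in sorted(results, key=rank_score, reverse=True):
    print(f"{rank_score(v):.2f}  {v.title}")
```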

It's good to see YouTube fighting this type of problematic content. The problem is that it is not clear from this new blog post – or from any other public interview Wojcicki and other executives have given – what those numbers actually translate to. A 70 percent decrease in people viewing borderline content from channels they are not subscribed to is significant; it acknowledges the rabbit hole effect that journalists, academics, and former YouTube engineers have cited for years. The question remains whether that still translates into a considerable number of viewing hours. YouTube's blog post doesn't say.

"Content that comes close to – but does not completely exceed – violating our Community Guidelines is a fraction of 1 percent of what has been viewed on YouTube in the US," the blog post says.

There are 500 hours of content uploaded to YouTube every minute. That is 720,000 hours of content every day; watching just one day's worth of uploads would take a single viewer 30,000 days. It is a lot of video, and much of it is watched in the United States. A decrease in the number of people viewing borderline content is good, but until YouTube releases specific numbers, it's hard to judge what that really means.
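For reference, the arithmetic behind those figures, with YouTube's stated 500-hours-per-minute upload rate as the only input:

```python
hours_per_minute = 500                       # YouTube's stated upload rate
hours_per_day = hours_per_minute * 60 * 24   # 720,000 hours uploaded daily
days_to_watch_one_day = hours_per_day / 24   # 30,000 days for one viewer
print(hours_per_day, days_to_watch_one_day)  # 720000 30000.0
```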
