YouTube's new policy puts educators, journalists, and activists in the crossfire

YouTube teams have removed more than 100,000 videos and 17,000 channels since the company changed its hateful content policies in June.

That number is about five times as high as in the company's previous quarter, according to a new YouTube blog post about the company's attempts to tackle a growing volume of hateful and dangerous videos on the platform. The company also doubled its removal of comments, taking down more than 500 million of them for being hateful. Some of these channels, videos, and comments are older and were only taken down after the policy change, according to the blog post, which could explain the spike in removal numbers.

YouTube relies largely on machine learning tools to catch hateful videos before they are widely seen. About "80 percent of those automatically flagged videos were deleted before they received a single view in the second quarter of 2019," the blog post reads.

"We have removed harmful content since the start of YouTube, but our investment in this work has accelerated in recent years," the blog post reads. "Over the past 18 months we have reduced views on videos that are later deleted for violating our policies by 80 percent, and we are constantly working to further reduce this number."

Yet it is difficult to judge what those numbers amount to without the right context. More than five hundred hours of video are uploaded to YouTube every minute. There were more than 23 million YouTube channels in 2018, according to analytics firm Social Blade, and YouTube has nearly two billion monthly logged-in users. Raw totals and a few percentage points, without that extra context, do not provide a clear picture of how much of the problem YouTube is now able to tackle.

The blog post shows how long YouTube's policy and product teams have been trying to combat hateful activity. A new timeline produced by YouTube, shown below, lays out the teams' various efforts to combat harmful, hateful, and disturbing content. It goes back to November 2016, when disturbing videos aimed at children were first surfaced by journalists.

Late 2016 and early 2017 are often seen as one of the earliest tumultuous periods in YouTube's battle to contain, combat, and prevent dangerous videos. In early 2017, journalists' reports also indicated that YouTube had become a hotbed for terrorist content. Between then and now, YouTube has faced global public scrutiny for allowing videos that are harmful to society to remain on the platform.

Part of YouTube's approach to this growing problem is an Intelligence Desk that monitors what people are watching. The desk was launched in January 2018 – the same month that vlogger Logan Paul uploaded a video of a dead body that was viewed millions of times before it was removed. The team "follows the news, social media, and user reports to detect new trends around inappropriate content."

"We are committed to continuing to reduce exposure to videos that violate our policies," the blog post says.

That includes updating policies when needed. The YouTube policy team is currently working on updating its harassment policy, in particular to address creator-on-creator harassment. Long a sore point among YouTubers, creator-on-creator harassment became a much larger conversation in June, when Vox personality Carlos Maza detailed conservative pundit Steven Crowder's use of homophobic language when talking about Maza. YouTube revoked Crowder's advertising privileges, but his channel remained up. CEO Susan Wojcicki addressed the controversy at Recode's Code Conference, reiterating that although the company did not agree with Crowder's language, the videos were considered acceptable.

(Disclosure: Vox and Recode are publications of Vox Media, which also owns The Verge.)

Removing harmful videos is just one step YouTube is taking to combat problematic content on its platform. The company plans to release more information in the coming months on three other steps: reducing the spread of such content, elevating more authoritative voices, and rewarding positive creators through advertising privileges.