Facebook has reportedly received complaints from political parties who said a major change to the news feed was pushing them toward negative, polarizing posts. Today, The Wall Street Journal published leaked internal reports from Facebook, written after it boosted “meaningful social interactions” on the platform. While Facebook framed the move as helping friends connect, internal reports said it had “unhealthy side effects on key areas of public content, such as politics and news”, calling these effects an “increasing liability”.
The news is part of a larger Wall Street Journal series based on internal Facebook research. Today’s report takes a closer look at the impact of a 2018 decision to prioritize posts with many comments and reactions. Facebook is said to have made the change after noting that comments, likes, and re-shares had declined in 2017 — something it attributed in part to people watching more professionally produced video. Publicly, CEO Mark Zuckerberg described it as a way to increase “time well spent” with friends and family rather than passive video consumption.
After the change, internal research yielded mixed results. Daily active users increased and users found content shared by close connections more “meaningful”, but re-shared content (which the change rewarded) contained “excessive” levels of “misinformation, toxicity and violent content”. People tended to comment on and share controversial content, and in the process made Facebook an angrier place in general.
One report flagged concerns from unnamed political parties in the European Union, including one in Poland. “Research in the EU shows that political parties ‘strongly feel that the change in the algorithm has forced them to be negative in their communication on Facebook, with the downstream effect of leading them into more extreme policy positions,’” the report says. Facebook apparently heard similar concerns from parties in Taiwan and India.
In Poland, “one party’s social media management team estimates that they have shifted the proportion of their posts from 50/50 positive/negative to 80 percent negative, explicitly as a function of the algorithm change.” And “many parties, including those that have shifted strongly to the negative, are concerned about the long-term effects on democracy.”
Unsurprisingly, news publishers — a frequent casualty of Facebook’s algorithm tweaks — weren’t thrilled with the change, either. Internal reports flagged a complaint from BuzzFeed CEO Jonah Peretti that the change was fueling “junky science” and racially divisive content.
Facebook regularly updates the news feed to promote different types of content, often responding to public concerns as well as financial considerations. (The “time well spent” movement, for example, stigmatized “mindless scrolling” on social media.) Lars Backstrom, vice president of Facebook engineering, told the Journal that “as with any optimization, there will be some ways it is exploited or taken advantage of.”
But the Journal writes that when Facebook researchers suggested solutions, Zuckerberg was hesitant to implement them if they threatened to reduce user engagement. Ultimately, however, Facebook would reduce the importance of commenting and sharing within the News Feed algorithm — giving more weight to what people said they actually wanted to see.