
More than 60% of people who joined extremist groups on Facebook had the pages recommended by algorithms

Facebook researchers learned as early as 2016 that 64 percent of all extremist group joins were due to the platform's own recommendation tools, but executives, including Joel Kaplan, killed every attempt to fix the problem, sources said.

Investigations at the social media giant in 2016 and again in 2018 uncovered a troubling trend linking the platform’s recommendations to extremist views on the site.

But despite researchers devising various solutions to address the problem of extremism, no action was taken.

People familiar with the matter told The Wall Street Journal that the decision to reject the proposed fixes was largely driven by Facebook VP for policy and former George W. Bush official Joel Kaplan, who hosted a party for Brett Kavanaugh when he was confirmed as a Supreme Court Justice amid sexual assault allegations in 2018.

The sources said executives, including Kaplan and Mark Zuckerberg, chose not to act on the findings in question because the company was already being criticized as biased against the right and they worried about appearing “paternalistic.”

They added that Kaplan, a Republican, had the power to shut down the plans.

Facebook researchers learned as early as 2016 that 64 percent of all extremist group joins were due to the platform's own recommendations, but executives, including Joel Kaplan (pictured), killed every attempt to fix the problem.

In 2016, the company conducted a survey that revealed a disturbingly high percentage of extremist content and groups on the platform.

At the time, Facebook researcher and sociologist Monica Lee wrote in a presentation that there was an abundance of extremist and racist content in more than a third of the major German political Facebook groups.

Lee also found that nearly two-thirds – 64 percent – of the time Facebook users joined extremist groups, the groups had been recommended by the site’s algorithms, meaning the company's own recommendation systems were ‘growing’ the problem of extremism among its users.

Most of the join activity came from the platform’s ‘Groups you should join’ and ‘Discover’ algorithms, she discovered.

Facebook then started new research in 2017 into how its social media platform polarized the views of its users.

The project was overseen by Chris Cox, Facebook’s then-chief product officer, who headed the Common Ground task force.

It revealed that the social media platform fueled conflict between its users and increased extremist views.

It also showed that much of the bad behavior among users came from small groups of people with the most extreme views, with more such accounts on the far right than the far left in the US.

A page shows Facebook recommending other extremist groups to users who are already in one such group. Research by the social media giant in 2016 and again in 2018 uncovered a troubling trend linking the platform’s recommendations to extremist views on the site

The relevant findings were released in an internal presentation the following year.

“Our algorithms exploit the human brain’s attraction to divisiveness,” a slide from the 2018 presentation read.

“If left unchecked,” it warned, Facebook would feed users more and more divisive content in an effort to gain their attention and increase time spent on the platform.

Cox and his team offered several solutions to the problem, including building a system for rooting out extreme content and suppressing clickbait around politics.

Another initiative called “Sparing Sharing” involved reducing the spread of content by so-called “hyperactive users” – people who are highly active on the platform and show extreme views on the left or right – the sources told the Journal.

But the efforts – and the investigation – were reportedly blocked by senior executives, including founder Mark Zuckerberg and Kaplan.

According to sources, Kaplan killed the attempts to change the platform as “paternalistic” and raised concerns that the changes would primarily affect right-wing social media users, the Journal reported.

Kaplan also reportedly pushed back by saying the changes would harm a hypothetical ‘Girl Scout troop,’ meaning the most committed users would be unfairly affected, people familiar with his comments told the Journal.

Kaplan (left) and Mark Zuckerberg (right) pictured together in 2018

“We are explicitly not going to build products that try to change people’s beliefs,” said a 2018 document, according to the Journal.

“We are focused on products that increase empathy, understanding and humanization of the ‘other side.’”

This came at a time when the company was already under fire for allegations that it was politically biased against the right.

However, it also came at a time when Kaplan publicly rallied behind Kavanaugh – who was sworn in by President Trump in 2018.

In August 2018, while the internal debate over extremist content was still fresh, Facebook employees were furious to learn that the executive, who had also served in George W. Bush’s Republican administration, threw a party for the conservative Supreme Court nominee to congratulate him on getting the role.

This came after Kaplan backed the judge despite Christine Blasey Ford’s accusation that he had sexually assaulted her in high school.

The social media giant is still facing questions about possible political bias.

In May, it emerged that Trump is considering setting up a committee to review complaints about anti-conservative bias and censorship on social media, including Facebook, Instagram, Twitter and Google.

Facebook hit back at the news, with a spokesperson telling The Wall Street Journal: “People on both sides of the aisle disagree with some of the positions we have taken, but we remain committed to seeking out external perspectives and communicating clearly why we make the decisions we make.”