Inside Facebook: how its secret algorithm works

Facebook HQ in Menlo Park, California.

On a whiteboard at Facebook's sprawling Menlo Park campus in California, Sara Su is drawing a diagram to explain exactly how its news feed algorithm decides what you see.

Until recently, telling all this to a journalist would have been unthinkable.

The algorithm looks for what users relate to – "signals", the company's news feed product manager calls them – which posts people and their friends 'like', share or comment on.

"We can use these signals to make a series of predictions," she says.

"From all these signals and predictions, we come up with a number that predicts how relevant this story is to you."

How the Facebook news feed works

SBS News

She is frank about why Facebook is revealing all this: "Trust can be lost in an instant, but it takes time to rebuild."

The revelation earlier this year that political consultancy Cambridge Analytica had harvested data from Facebook users saw latent resentment over fake news, electoral manipulation and the ethics of Facebook's business model erupt into a public relations crisis for the social network giant.

The scandals provoked anger around the world, alienated younger users and caused Facebook's share price to plummet by almost 20 percent.

That fall reflected the doubt: would people still willingly share their lives on Facebook?

"Trust is the real currency on which these platforms depend," says analyst Paul Verna of E-Marketer, which oversees Facebook's fortunes.

"Because without that, you can not build the social network, and you can not attract advertisers."

Facebook's response to the crisis has been a concerted push to better explain itself: opening up to the world about how things work, as part of its drive to regain user trust.

That push has never been more important, with the first US mid-term elections under President Donald Trump approaching.

'Meaningful social interaction'

Sara Su explains a change Facebook CEO Mark Zuckerberg ordered even before the Cambridge Analytica scandal: rethinking the news feed algorithm to shift it away from content that merely grabs attention and towards what Zuckerberg called 'meaningful social interaction'.

"[In the algorithm] We are pondering stories that are more likely to provoke conversations, "says Ms. Su.

"That will make them get a higher relevance score, and they'll start appearing earlier in the people's news."

Facebook CEO Mark Zuckerberg testifies before a US Senate Committee in April.

AAP

The openness is part of Facebook's public relations offensive, aimed at combating what Stanford University social media professor Jeff Hancock calls "folk theories": user decisions based on often incorrect interpretations of how things work.

"If people have an idea of ​​how the system works, but in fact the system works in a totally different way, then people are going to stop using that product, like Facebook news," Professor Hancock told SBS News. .

But the last fortnight has brought instructive news for Facebook about how long it can take to rebuild its reputation.

A new Pew Research Center survey shows that 53 percent of adult Facebook users in the United States still do not understand how the news feed works. That confusion has often fuelled criticism that Facebook has been too slow to remove false or hateful material.

It has also allowed conspiracy theories to thrive, such as a viral hoax telling users that the algorithm change limits their feed to posts from just 25 friends.

The Pew survey also reveals that almost three-quarters (74 percent) of adult American users say they have deleted their Facebook accounts or cut back their use in the past 12 months.

What Facebook knows about you

Having acknowledged its failure to detect and stop Russia's coordinated interference in the 2016 presidential election, part of Facebook's challenge has been accepting the enormous responsibility that comes with being a platform for 2.2 billion active users: deciding what they can and cannot say.

As part of its transparency drive, the company has published the guidelines its reviewers follow, so that people know how Facebook evaluates questionable material.

Separately, it points to significant spending on content review teams and machine learning tools capable of flagging and removing suspicious material.

Those investments are paying off, says Monika Bickert, Facebook's vice president in charge of content rules, who tells SBS News the company removed an astonishing 583 million fake accounts in the first quarter of 2018 alone.

Ms Bickert says Facebook sees misinformation as a "global problem", acknowledging the platform's use by groups inciting violence against Muslims in Sri Lanka and Rohingya Muslims in Myanmar.

Criticized for failing to respond to complaints from minority groups, Facebook has now hired more content moderators who understand local languages and can remove objectionable material.

The mid-term challenge is coming

As the US mid-term elections approach, senior Facebook figures warn that the platform faces increasingly sophisticated attempts to sow disinformation and division.

"The techniques that were used in the electoral interference in 2016 are probably just a percentage of what is to come," said Facebook co-founder Mike Krieger recently.

"But at least what I'm seeing, internal on Instagram, internal on Facebook, is that people are thinking about this a lot more than it was in 2016."

Indeed, Facebook has made several recent announcements about detecting and removing "inauthentic" Russian and Iranian pages and accounts.

"We are definitely improving to find this type of activity and stop it," says Ms. Bickert.

Pressed on whether users are responding to the company's transparency drive, she acknowledges that public opinion remains skeptical.

"In terms of communication with people, I think there is more we have to do," he says.

"We are trying to get involved every day, we are trying to convey the message that we take it seriously."

Facebook's harshest critics believe the company is reaping what it has sown, seeing its current problems as the consequence of pursuing growth at all costs.

But inside, there is still a firm belief that by acknowledging its failings and working to make amends for its mistakes, Facebook can recover.

"It is always difficult to read the negative press about the work you are doing," says Ms. Su, in front of her blackboard.

"But I think that really underscores the importance of this work for all of us on this team, and that helps us move forward.

"We are in it in the long term."