Facebook’s VP of Global Affairs on the new News Feed controls


Facebook has no shortage of critics: it spreads misinformation and hate speech, it is too polarizing, it is unraveling the fabric of society. The list goes on.

This morning Nick Clegg, Facebook’s VP of Global Affairs, published a lengthy Medium post addressing some of these criticisms and revealing some of the changes the company is making to give users more control over their experience. Specifically, the company is going to let Facebook users customize their feeds and how the algorithm surfaces content from other people. “People should be able to better understand how the ranking algorithms work and why they make particular decisions, and they should have more control over the content that is shown to them,” writes Clegg. “You should be able to talk back to the algorithm and consciously adjust or ignore the predictions it makes – to alter your personal algorithm in the cold light of day, through breathing spaces built into the design of the platform.”

There is a lot to discuss there. And to help us unpack the post, Clegg sat down with Platformer editor and Verge contributing editor Casey Newton yesterday for a special episode of Decoder.

In particular, Clegg doesn’t think Facebook is designed to reward provocative content, which is a new rebuttal to the company’s critics (and probably a surprise to anyone who has paid attention to their Facebook feed). “The reality is that it’s not in Facebook’s interest – financially or reputationally – to continually turn up the temperature and push users towards ever more extreme content,” Clegg writes in his post. “Bear in mind, the vast majority of Facebook’s revenue is from advertising. Advertisers don’t want their brands and products displayed next to extreme or hateful content – a point that many made explicitly last summer during a high-profile boycott by a number of well-known brands.”

Fundamentally, Clegg’s argument is that the backlash against Facebook is not rooted in fact or evidence, and that if it carries the day, we’ll never get the better version of the internet many of us want. In making it, he’s trying to reset the debate on Facebook’s terms.

We leave it up to you to decide how successful he is.

Okay, Casey Newton with Nick Clegg, VP of Global Affairs at Facebook. Here we go.

Below is a slightly edited excerpt from their conversation. This post will be updated with a full transcript of the interview on Friday, April 2.

Most of us are not math or computer science majors. And so there is a kind of fear and uncertainty about what is going on in the background. One of the things Facebook is doing right now is giving us some new ways to change what we see in the News Feed. So what are some of these new controls?

So some of the controls are old. We’ve had them for a while, but we’re going to make them a lot more prominent. You have always been able to switch to a chronological feed, but to be honest, it was not easy for people to find. So we are now going to have a feed filter bar. When you scroll to the top of your feed, it is there. It will always be there, and you can switch between the feed as it currently exists, organize it chronologically, or, crucially – and this is new – create your own feed of favorites: favorite friends, groups, Pages, and so on. And you can manage that for yourself, if you want, and switch between those three – the feed as it is, the chronological feed, and your new favorites feed – in a much, much easier way.

It will be much more visible. It will be right there when you scroll to the top of your feed. There are also other new controls, which we’re announcing this week. You can manage who can comment on your posts with much greater granularity than before – something that was not available until now. And we’re also going to expand on something that already existed for ads, namely, “Why am I seeing this?” You can tap the three dots and see, “Why am I seeing this ad?” We are now going to expand that to suggested content. So when you’re shown something – say, a cooking video – you can tap the three dots and see why you’re seeing it.

So I think, collectively, that’s a start. I’m not going to pretend that those changes on their own will answer all the questions people have about how social media works and how they interact with Facebook. But I feel they’re important steps in a better direction, where users get more control and we are more open and transparent about things, and we’ll be following up with additional steps – more transparency and more control – in the coming months.
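To make those three views concrete, here is a minimal sketch of what switching between them amounts to. Everything in it – the names, the fields, the structure – is our own illustration, not Facebook’s actual code.

```python
# Illustrative sketch only; names and structure are hypothetical.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Post:
    author: str
    created_at: datetime
    rank_score: float  # relevance score produced upstream by the ranking system

def ranked_feed(posts: list[Post]) -> list[Post]:
    # The default view: algorithmically ranked, highest score first.
    return sorted(posts, key=lambda p: p.rank_score, reverse=True)

def chronological_feed(posts: list[Post]) -> list[Post]:
    # "Most Recent": newest first, no algorithmic ranking.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)

def favorites_feed(posts: list[Post], favorites: set[str]) -> list[Post]:
    # "Favorites": only posts from sources the user picked, newest first.
    return chronological_feed([p for p in posts if p.author in favorites])
```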

Is that also a suggestion that this feed filter bar you’re introducing might be just the start – that more filters may be coming to it over time? Is the idea that users gain more and more control over how the things they see are arranged?

Yes. Look, in an ideal world, you want to push more and more forcefully in the direction of people being able to personalize their feeds. And if people want to see more or less of certain forms of content, from certain Pages or groups, there is at least the conceptual possibility of exploring whether people can, if you like, turn the dial up or down on certain classes of content. That’s exactly the kind of work we want to do. Now, exactly how granular that is, exactly which dials apply to which types of content – all of that has yet to be worked out. But that is exactly the direction we are going.
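A rough sketch of what such a dial could amount to, assuming a simple per-category multiplier applied on top of a base ranking score. The categories, values, and function below are hypothetical; Facebook has not described an implementation.

```python
# Hypothetical "dial" on content classes: a user-set multiplier applied to
# the base ranking score. Categories and values are illustrative only.
DEFAULT_DIAL = 1.0  # neutral: no adjustment

def adjusted_score(base_score: float, category: str,
                   dials: dict[str, float]) -> float:
    return base_score * dials.get(category, DEFAULT_DIAL)

# A user who dials politics down and their groups up:
dials = {"politics": 0.3, "groups": 1.5}
print(adjusted_score(1.0, "politics", dials))  # 0.3 – turned down
print(adjusted_score(1.0, "groups", dials))    # 1.5 – turned up
print(adjusted_score(1.0, "pets", dials))      # 1.0 – untouched
```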

So the conventional wisdom about how the feed works now, I think for a lot of people, and certainly among the people most critical of Facebook, is that it rewards the most polarizing and outrageous content. And this is something that you really take on in this piece and push back against. I suspect that if there is one sentence in your piece that most people will object to, it is when you write, “Facebook’s systems are not designed to reward provocative content.” At the same time, if we look at lists of pages that get the most engagement, they are usually pages that seem to push really polarizing content. So how does Facebook reconcile this?

First off, of course, I accept that we need to provide more and more data and evidence about what specific content is actually popular on the news feed. And although Facebook’s critics often talk about sensational content dominating the news feed, we naturally want to show, as I think we can, that many of the most popular posts on the news feed are light-hearted. They are feel-good stories. We want to show people that the vast majority of the posts people see on the news feed are about pets, babies, holidays, and the like – not incendiary topics. In fact, I think on Monday one of the most popular posts in the US was a mother bear with three or four cubs crossing a road. I saw it myself. It’s lovely. I highly recommend you take a look at it. And I think we can and will do more to substantiate that.

But beyond that, I try to wrestle with this as thoroughly as I can in a 5,000-word piece. First, the signals used in the ranking process are much more complex, much more sophisticated, and involve far more checks and balances than is implied by this cardboard-cutout caricature that we somehow spoon-feed people incendiary, sensational stuff. And I’d love to go into the details if you want, but it uses literally thousands of signals, from the device you’re using to the groups you’re a member of and so on. We look at the evidence. We make more and more use of survey data, and we will do more of that in the future, to ask people what they find most meaningful. There’s been a big shift in recent years anyway toward rewarding content that is more meaningful – your connections with your family and friends – rather than things that are just grossly engaging: pages of politicians and personalities and celebrities and sports pages and so on.
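As a toy illustration of what combining ranking signals with survey-informed weights can look like, here is the general shape of a weighted scoring function. The three signals and their weights are invented for this example; the real system, per Clegg, uses thousands of signals and is far more complex.

```python
# Toy weighted-signal ranker. Signal names and weights are invented to show
# the shape of the idea, not Facebook's actual model.
def rank_score(signals: dict[str, float], weights: dict[str, float]) -> float:
    return sum(weights[name] * value for name, value in signals.items())

weights = {
    "predicted_meaningful": 3.0,   # informed by "worth your time" surveys
    "friend_affinity": 2.0,        # strength of tie to the poster
    "predicted_engagement": 1.0,   # likely clicks, comments, shares
}

post_signals = {
    "predicted_meaningful": 0.7,
    "friend_affinity": 0.4,
    "predicted_engagement": 0.9,
}
print(round(rank_score(post_signals, weights), 2))  # 3.8 – meaning outweighs raw engagement
```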

So that shift is already underway. But in terms of incentives, this is the bit that we may not have been vocal enough about. First, the people who pay for our lunch – advertisers – don’t like their content appearing next to inflammatory, obnoxious material. And if you needed even more proof, last summer a number of major advertisers boycotted Facebook because they felt we were not doing enough on hate speech. We have got much better at reducing the prevalence of hate speech. The prevalence of hate speech is now down to, what, 0.07, 0.08 percent of the content on Facebook. So for every 10,000 pieces of content you see, seven or eight might be bad. I wish we could get it to zero. I don’t think we’ll ever get it to zero, but we have a tremendous incentive to keep pushing it down.
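A quick check of that prevalence arithmetic:

```python
# 7-8 violating views per 10,000 content views is 0.07-0.08 percent prevalence.
for bad_views in (7, 8):
    total_views = 10_000
    print(100 * bad_views / total_views)  # prints 0.07, then 0.08
```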

But even beyond that, if you build a product that you want to survive for the long term – where people will still be using it in 10 years, in 15 years, in 20 years – there really is no incentive for the company to give people the kind of sugar rush of artificially polarizing content that keeps them on board for maybe an extra 10 or 20 minutes. We want people around for 10 or 20 years, not 10 or 20 extra minutes. And so I don’t think our incentives point in the direction that many people assume.

That said, of course it’s true – as any newspaper sub-editor will tell you – that provocative content grabs attention. That’s why tabloids from time immemorial have used eye-catching images and cage-rattling language on their front pages. Emotions like fear, anger, and jealousy provoke emotional reactions. They always have, in every medium. So of course emotional content elicits an emotional response in people. We can’t reprogram human nature, and we don’t want to deny that, which is why our CrowdTangle tool really gets into that and shows which content gets the most engagement. But as you know, there is a world of difference between what gets the most engagement – in other words, what draws the most comments and shares – and the content most people actually see. Those are quite, quite different. If you look at what most people see – if you look at eyeballs instead of comments and shares – you get a very different picture.
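That distinction is easy to demonstrate with invented numbers: rank the same two posts by interactions and by impressions, and you can get two different winners. The figures below are made up purely for illustration.

```python
# Invented numbers illustrating the engagement-vs-reach gap: ranking by
# interactions (CrowdTangle-style) and by impressions ("eyeballs") can
# crown very different winners from the same set of posts.
posts = [
    {"title": "Outrage bait",    "interactions": 90_000, "impressions": 1_000_000},
    {"title": "Bear cubs video", "interactions": 20_000, "impressions": 9_000_000},
]

top_by_engagement = max(posts, key=lambda p: p["interactions"])
top_by_reach      = max(posts, key=lambda p: p["impressions"])

print(top_by_engagement["title"])  # Outrage bait
print(top_by_reach["title"])       # Bear cubs video
```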