Is Facebook ready for 2020?

This week on the interview episode of The Vergecast, Nilay Patel chats with Alex Stamos, director of Stanford's Internet Observatory and former chief security officer at Facebook.


Stamos was at Facebook during the Cambridge Analytica scandal, so the discussion covers much of what was happening at Facebook then and how the company has changed since. They also address the trade-offs that large platforms must make between issues such as end-to-end encryption, cooperating with law enforcement, and protecting users from bad actors, as well as the threats platforms face in the coming elections. Below is a lightly edited excerpt of the conversation.

Nilay Patel: Of course I want to talk to you about Facebook. But we just had Michael Bennet, the senator from Colorado, on the show, and he wrote a book about election security. When you think of Facebook, it was the center of election interference in 2016, basically Russians posting memes on Facebook. Do you think we are ready for 2020 now? Because Bennet really didn't think we were ready.

Alex Stamos: Yes, so I've met Senator Bennet and we talked a lot about things like this. We just released a report from our group at Stanford. You can go to electionreport.stanford.edu if you want to see it, but we have about 40 recommendations for how Congress, technology companies, and individuals can prepare for 2020. If we look at 2016, there were actually three or four different types of interference by the Russians. You have what you're referring to, the online meme wars, which were mostly on Twitter and Facebook. You have the campaign to break into Podesta's email and into the DNC, and then leak the information in a way that changed the overall information environment to Hillary Clinton's detriment. There is the open propaganda campaign, so Russia Today and Sputnik and the like. And then there were direct attacks on the election infrastructure.

So I think our response as a society has been different for each of those four lanes. The sort of meme lord stuff, I think, is actually what we are best prepared for, in the sense that the responsibility there sits fairly neatly with the tech platforms, and they've done things. The big difference between now and 2016 is that in 2016, organized government propaganda was nobody's job at the tech companies. So I sort of inherited this as an issue that I had to lead the team working on, because we had an intelligence team whose job was to look for governments doing bad things online. That was based on a very traditional idea of what government interference online looks like: account takeovers with malware, suppression of dissidents. It didn't include, you know, hot takes and edgelording by people posing as Black Lives Matter activists who were actually in St. Petersburg. So much has changed. That is now a whole trust and safety subfield that is being invented in places like Google, Twitter, and Facebook, and there are people whose job it is to do that.

And then the government has responded. There are people within the government whose job it is to work on these issues. I think if you took a big-picture look at 2016 as a society, we had this big blind spot because it really wasn't anybody's job to follow the Internet Research Agency and the other kinds of online propaganda outlets, because that wasn't considered a traditional part of cybersecurity. And now, at the NSA and Cyber Command, there are people working on this; there is a foreign influence team at the FBI; there are people working on it at DHS; and they work together with the people at the companies. So whether or not the protections turn out to be great, this is at least a field that people are focused on, and that just wasn't true three years ago.

How is Facebook doing? How prepared are they for the elections?


So I think they are reasonably well prepared for what happened in 2016. They have the broadest set of advertising transparency. Unlike Google, Facebook treats issue ads as political ads. I think this is a very important step because, based on our review at Facebook during our investigation, about 80 percent of the Russian ads they found were not illegal under US law because they did not deal with elections. So Facebook actually takes a much broader definition than Google of what counts as an unacceptable political ad. I think Facebook has the largest team.

I think the most difficult thing for Facebook will be trying to predict how the non-Facebook products will be used. Instagram has some of the same problems as Twitter, because you can have a pseudonymous identity on Instagram. The fact that Instagram is primarily images offers some advantage, but not tons. As you know, the Russian troll factories have professional meme farms. Like, graphic designers using Illustrator all day long to create memes. So "is Instagram ready?" is actually a big question. My guess is that Instagram is far behind where Facebook.com is. And then there is the use of WhatsApp. WhatsApp is the number one source of disinformation in Southeast Asia. Will WhatsApp, with its end-to-end encryption, be used in the same way in the United States? It seems unlikely in 2020, but after 2020, as people move to those platforms, I think they will become a problem.