An easy thing Facebook could do in Myanmar

In March, human rights investigators from the United Nations found that Facebook had played a role in spreading hate speech in Myanmar, fueling the ethnic violence that has driven more than 650,000 Rohingya Muslims to flee Myanmar's Rakhine state for neighboring Bangladesh. The report, which landed amid growing concern about the ways social networks can incite violence, contained some of the most serious charges leveled against Facebook to date.

Chastened by the UN's findings, Facebook commissioned a separate investigation – one it published the evening before the US midterms, when few people would notice. The report, conducted by the nonprofit Business for Social Responsibility (BSR), is a 62-page document describing the dimensions of Facebook's challenge in Myanmar and offering some ideas for mitigating it.

After the dust from the midterms had more or less settled, I read the report. And while I spend more time reading hot takes than nonprofit white papers, I was struck by how little a report billed as a "human rights impact assessment" does to assess Facebook's impact on human rights in Myanmar.

The authors say they spoke with about 60 people in Myanmar for their report, but they do not examine any specific instances of hate speech on the platform or the harm that resulted. Their analysis stays at a high level – who can really say, it seems to shrug. Its approach to understanding the situation on the ground in Myanmar appears to be largely anecdotal, and its conclusions are the same ones anyone could have drawn from reading a news story about the issue this spring.

"Although the actual relationship between content posted on Facebook and offline damage is not fully understood, Facebook has become a means for those who seek to spread hatred and cause harm and posts have been linked to offline violence," the authors write in One of many cases where the passive voice serves paper over their refusal to investigate.

I went into the report hoping it would illuminate the link between hate speech posted to social media and real-world violence. We are starved for knowledge about how platform-specific mechanics like share buttons and encryption contribute to lynch mobs. Instead, the authors mostly survey Myanmar's current political dynamics, and ultimately offer Facebook a checklist of tasks that would let the company keep operating with minimal disruption to its business.

Most reports generated by consultants are destined to molder for eternity inside a neglected desk drawer, and BSR's contribution on the Myanmar situation deserves a similar fate. (The nonprofit did not respond to a request for comment Friday afternoon.)

Fortunately, this week brought another report on Facebook and Myanmar – and this one I found far more useful. It comes from the UN Office of the High Commissioner for Human Rights. Unlike BSR, the UN report asks why Facebook would enter Myanmar – or any other country riven by conflict – without first understanding how it would moderate content on the platform. The authors write:

Before entering any new market, particularly those with volatile ethnic, religious or other societal tensions, Facebook and other social media platforms, including messaging systems, should conduct in-depth human rights impact assessments for their products, policies and operations, based on the national context, and take mitigating measures to reduce risks as much as possible.

Instead, Facebook launched a Myanmar-specific version of its service in 2015 and added the country to its Free Basics program a year later. Soon the company had 20 million users there – even though its moderators, who did not speak Burmese, had very little insight into what was happening on the platform, thanks in part to the local language's quirks with Unicode.

The UN report takes a broad view of the situation in Myanmar; the specific effects of social media are confined to a few pages near the end of an exhaustive document. And yet Facebook serves as context for much of what the authors write: in a 444-page report, Facebook is mentioned 289 times.

Like BSR, the UN acknowledges that the free expression Facebook enables has contributed positively to Myanmar. But it also suggests that Facebook make examples of the hate speech it has removed from the platform available, at least to a subset of researchers, so that its role can be better understood. That has privacy implications that shouldn't be taken lightly. But there is likely a middle ground.

In the meantime, BSR and the UN agree on one thing, and it's an easy one: Facebook should provide country-specific data about hate speech and other violations of its Community Standards in Myanmar. We may never be able to say with certainty how much social networks contribute to ethnic violence – but we ought to be able to monitor surges in hate speech on our largest social network. Dehumanizing speech is so often the precursor to violence – and Facebook, if it took its role seriously, could serve as an early warning system.

Democracy

Google in China: When "Don't Be Evil" met the Great Firewall

Mark Bergen gives Google's Dragonfly dilemma the feature treatment:

Interviews with more than 18 current and former employees suggest that the company's difficulties stem in part from a failure to learn from mistakes made a decade earlier, when it first confronted the realities of China's economic and political power. That history is familiar to many at Google's headquarters in Mountain View, California, but largely unknown beyond it. In a September interview, Downey, 42, elaborated. "There's this utopian idea: technology will come in, and people will take these tools, change their government and get their freedom," he says. "We tried that experiment and it didn't work."

Google's Sundar Pichai: 'Technology doesn't solve humanity's problems'

Here's a somewhat clumsy interview with Google's CEO, in which he likens the censorship laws of an authoritarian, autocratic regime to Europe's right-to-be-forgotten laws:

One of the things that's not well understood, I think, is that we operate in many countries where there is censorship. When we follow "right to be forgotten" laws, we are censoring search results because we're complying with the law. I'm committed to serving users in China. Whatever form it takes, I actually don't know the answer. It's not even clear to me that search in China is the product we need to do today.

#GoogleWalkout update: Collective action works, but we need to keep working.

Here's the response from the Google Walkout organizers to the company's concessions so far. In short: a good start, but more action is needed:

Organizer Stephanie Parker said of the response: "We demand a truly equitable culture, and Google leadership can achieve this by putting employee representation on the board and giving full rights and protections to contract workers, our most vulnerable workers, many of whom are black and brown women."

The long history behind the Google Walkout

Marie Hicks looks at the (inspiring!) history of collective action in the tech industry:

This may seem like a new development, but the Google Walkout is steeped in a long history – both of women being marginalized and discriminated against in tech, and of women claiming their power to force change. As I wrote my book Programmed Inequality, I found plenty of evidence of discrimination throughout the history of computing, but I also saw that when women in tech fight back – especially by organizing or taking their labor elsewhere – the effects are enormous. The forced exodus of women from Britain's burgeoning early computing industry, for instance, contributed to British computing's premature decline. When those women put their talents to work elsewhere, they built multibillion-dollar software companies. Their experience shows that undervalued workers often have an unexpected amount of power – and it offers a blueprint for what could come in the US if the movement started by Google employees continues.

Facebook to end forced arbitration for sexual harassment claims

Here's more positive fallout from the walkout, from Doug MacMillan:

Facebook is ending its policy of requiring that employees' sexual harassment claims be settled in private arbitration, one day after Google rolled back a similar policy under mounting pressure from employees.

The rule change, which will allow Facebook employees to pursue those claims in court, was announced in an internal post to employees on Friday, a company spokesman said. The social-networking giant also updated its interoffice dating policy to require any executive at the director level or above to disclose if they are dating someone at the company.

Amazon execs addressed concerns about Amazon Rekognition and ICE at an all-hands meeting

Amazon employees are still unhappy about Rekognition, the controversial facial recognition technology the company sells to law enforcement agencies. But executives aren't budging, reports Davey Alba:

Andy Jassy, CEO of the company's cloud computing arm, Amazon Web Services, brushed aside employee criticism of how Amazon has aggressively marketed its Rekognition product to law enforcement agencies across the country, as well as to US Immigration and Customs Enforcement (ICE). "I think we need people who have very wide-ranging opinions, which is great, but we feel really good and really strongly about the value that Amazon Rekognition provides our customers of all sizes and all types, in law enforcement and outside of law enforcement," Jassy said. He added that he believed it was the government's responsibility to help specify rules around the technology.

The White House used a doctored video to tell a lie

Bijan Stephen gets two more experts to say that yes, the Acosta video was doctored. And here's another from the AP.

Gab cries foul as Pennsylvania attorney general subpoenas DNS provider

Pennsylvania's attorney general sent a subpoena to Epik, Gab's new DNS provider. It raises some (legitimate, in my view) First Amendment concerns.

PayPal bans accounts belonging to the Proud Boys and antifa groups

In other deplatforming news, PayPal shut down accounts on both the far left and the far right today.

Elsewhere

Former Instagram leader Systrom talks about 'unhealthy' internet incentives

Kevin Systrom still won't tell anyone why he really left Instagram, and it's driving me nuts. But he did say that his next company won't be a social one. Which I hope is a lie! If Systrom comes back with a SaaS solution for integrating Kubernetes with your CRM system or whatever, I will flip every table in my office.

The most engaging Facebook publishers from October 2018

CNN and The New York Times overtook Fox News to claim the top two spots in Facebook publisher rankings last month, NewsWhip reports.

Facebook, Amazon, and Google: A pocket guide to breaking them up

Kaitlyn Tiffany talks with Columbia Law professor Tim Wu about his new book, which argues for breaking up the big tech companies. He wants to start with Facebook:

WU: Well, I think because they don't face serious competition, there are a couple of problems. They've been able to get away with unchecked privacy violations, they've been careless in how they've treated advertisers. They have sloppy policies, they've been manipulative, they've breached privacy too often. A lot of that is because they haven't been subject to effective competition. They're … it's not too big to fail, it's too big to be tolerated.

Vox's Matt Yglesias was doxxed on Twitter over his Tucker Carlson comments. Twitter let it happen.

Today in Twitter failing to live up to its promises: a colleague of mine gets doxxed, and the company doesn't respond to the initial reports of harassment.

Amid plenty of bad news, Snapchat has an early lead in wooing AR developers

Kerry Flynn reports that Snap recently hosted more than 50 augmented reality developers at Lens Fest, a three-day workshop for creative types.

In defense of TikTok, the joyful, slightly cringe-inducing spiritual successor to Vine

Julia Alexander gives us a tour of TikTok, an app she calls Vine's "spiritual successor."

It's easy to roll your eyes at TikTok. That's what we tend to do with any new app that draws a largely young audience – people initially rolled their eyes at Vine, too. It can be hard to embrace an entire ecosystem of young creators whose communities are built around content designed purely for their own entertainment. TikTok thrives on outrageous stunts, new music and niche interests – but that's also what makes it such an inviting place to hang out.

YouTube unbans streamer who killed a Red Dead Redemption 2 feminist

I'd love to hear from subscribers interested in content moderation about this one. A guy posts a clip of himself playing a popular new Western game in which he attacks and eventually kills a non-playable suffragette character. YouTube banned him, then unbanned him. It never really said why, Patricia Hernandez reports:

It's unclear how exactly YouTube decided to terminate the channel, or what the appeals process for this whole debacle was. Did YouTube only take another look, for example, because people with clout raised a ruckus over it? If so, do smaller channels have any way to challenge unfair platform decisions that is as fast and effective if they don't have a big enough megaphone to broadcast it? A YouTube spokesperson didn't give The Verge those answers, but Wyatt noted on social media that he often looks into things people flag to him on Twitter, and urged people to reach out to the official Team YouTube account "with as much supporting documentation as possible!"

Fan fiction site AO3 is grappling with a free speech debate of its own

Archive of Our Own, or AO3, is a prominent home for fan fiction online. It's having a fierce debate about sex and content moderation, reports Elizabeth Minkel:

Across the internet, platforms and their users are wrestling with what digital speech should be protected and with the potential connections between rhetoric and action. On Facebook, Twitter, Reddit, and YouTube, tech companies struggle with where to draw the lines of free expression and how to measure and enforce those limits. Restricting speech within fiction, though, is a bit more nuanced: do TV shows about serial killers encourage people to commit murder? Does depicting fictional rape breed rapists in reality?

When it comes to fan fiction, the arguments are usually about sex, not violence. When fan-fic readers and writers make moral arguments about depicting certain sex acts, they're really talking about obscenity, and all its legal precedents. And because the fight for the legal legitimacy of fanworks – which, when they're noncommercial, are protected under fair use – has been such a struggle, it's even harder to moderate content within fan fiction when you're still having to defend the cultural belief that fan fiction has value in and of itself.

Launches

Facebook is launching a TikTok competitor app called Lasso

Facebook's would-be TikTok killer has now shipped in the App Store.

Vine successor Byte launches next spring

The eventual return of Vine has seen enough stops and starts to make it the Chinese Democracy of social networks. The latest twist: what was once known as V2 is now called Byte, and founder Dom Hofmann says we can look forward to seeing it next year.

Takes

Facebook stopped Russia. Is that enough?

Max Read, echoing other commentators, says Facebook's real problem these days is that homegrown trolls are now running the Russian playbook to spread polarizing misinformation – and it's unclear what the company can do about it:

The purveyors of fake news and the Russian trolls represent, essentially, an engineering problem – they are bad actors whose badness is defined in specific and identifiable ways – and Facebook is very good at solving engineering problems. But Americans exercising our American prerogative to share material accusing the Democratic presidential candidate of masterminding a satanic sex ring is a different kind of problem altogether. Facebook can't stop the free flow of our bullshit without dramatically changing its operating philosophy (by making judgments about the truthfulness of the content users post), its business practices (by hiring a vast army of new employees to make those judgments), and maybe even its entire design (by leaving freely available attention on the table). You can't put 80 percent of the country on one communications platform, reward them for posting outrageous content, and expect everyone to suddenly start fact-checking their status updates.

So I sent my mother the new Facebook Portal

Josh Constine's mom loves her new Facebook Portal, and she couldn't care less about all the privacy stuff tech journalists are always running their mouths about:

"Who should I be worried about? Oh Facebook see? No, I'm not worried about Facebook to see. They are going to look at my great art collection and say they will come steal it? No, I have never thought of that. "It's my 72-year-old mother, Sally Constine's answer, if she's worried about her privacy now, because she has a Facebook Portal video chat device.

Facebook Portal non-review: Why I didn't put Facebook's camera in my home

Joanna Stern, on the other hand, didn't feel good about bringing a Portal into her own house. (She tested it at work instead.)

I've had one of Facebook's new video-calling gadgets, the Portal+, in my home for the last week. And by "in my home," I mean in the basement, in a closet, in a box, in a bag that's inside another bag, covered with old coats.

I just couldn't bring myself to set up Facebook's camera-equipped screen in the privacy of my family's home. Can you blame me, given the last 16 months?

And finally …

Mark Zuckerberg trolls Harvard students in a Facebook meme group

This story is basically just an aggregation of a great tweet by Taylor Lorenz. (Not Taylor Swift, as I accidentally called her earlier this week.) Anyway, it appears that Zuckerberg, as part of his embrace of Facebook Groups, has decided to join Harvard's undergraduate meme community – Harvard Memes for Elitist 1% Tweens. When a student asked where fellow dropout Bill Gates was, Zuckerberg replied "Hold on let me get him" – and then tagged him in.

Gates has yet to respond.

Talk to me

Send me tips, comments, questions and Myanmar travel tips: casey@theverge.com.