By now you have probably read plenty about the disgraced financier Jeffrey Epstein, his death in a Manhattan jail, and the attendant conspiracy theories that consumed social networks this weekend. President Trump led the charge, retweeting a conspiracy theory that sought to implicate former President Bill Clinton.
At the core of the online fiasco is Twitter, which to a large degree programs the political conversation and much of the press. Twitter is magnetic during big news stories; news junkies flock there for the latest information. But early on there is often a big gap between the attention focused on the platform and the information available about the developing story. That gap gets filled by speculation and, via the worst users, rumor-mongering and conspiracy theories.
On Saturday, Twitter's trending algorithms scraped up, compiled, ranked, and then placed the worst of this garbage in the trending module on the right-hand side of the site. Despite being a fairly arbitrary and mostly worthless metric, trending topics on Twitter are often interpreted as a rough signal of a given subject's importance.
This hands-off approach to editorial intervention in the news cycle, coupled with algorithms that promote the most popular posts, has by now become a familiar villain. It played an important role in the promotion of anti-vaccine zealots on Facebook, for example, and in the growth of Alex Jones's audience on YouTube. The Epstein case was a conspiracy theorist's dream even before he apparently hanged himself in his jail cell; in the early hours after his death, when little information was available, Twitter was a perfect Petri dish for surfacing and amplifying shameful conspiracy theories.
As Warzel points out, Twitter amplified those conspiracies through its trending algorithm, which has long outlived its usefulness. Brian Feldman explained why in 2018:
The first problem with "trending" is that it selects and highlights content with no regard for accuracy or quality. Automated trending systems are not equipped to make judgment calls; they can determine whether things are being shared, but they cannot determine whether that content should be shared further. (…)
This is the other, conceptual problem with "trending": it is eminently gameable, yet the platforms that use the term never make the rules clear. "Trending" carries an air of authority – videos and topics handed down from on high, scientifically determined to be trending – when in reality it is a gamed list of content shared or tweeted obsessively by people who love Justin Bieber. Or Logan Paul. Or who believe in crisis actors.
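Feldman's point about judgment-free selection is easy to see in a toy version of such a system. Here is a hypothetical sketch in Python – not Twitter's actual algorithm, and the function name is invented – of a ranker that "trends" whatever is shared most, with no notion of accuracy at all:

```python
from collections import Counter

def trending_topics(shares, top_n=5):
    """Rank topics purely by share count in a recent window.

    `shares` is a list of topic names, one entry per share event.
    Note what is missing: nothing here asks whether a topic is
    accurate, newsworthy, or a hoax. Volume is the only signal,
    which is exactly why a small, obsessive group can game it.
    """
    counts = Counter(shares)
    return [topic for topic, _ in counts.most_common(top_n)]

# A hoax shared obsessively by a small group outranks real news:
print(trending_topics(["#hoax"] * 900 + ["#realnews"] * 100))
```

The sketch makes the gameability concrete: any party that can generate share events in bulk controls the ranking.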
Removing algorithmically generated trending modules would deprive bad actors of an easily gamed path for delivering hoaxes to platform users. A more modest approach would be to build editorial teams that monitor trending hashtags and remove obvious hoaxes and conspiracy theories.
But what if that were illegal?
That is the question I had after reading about the vague but disturbing White House plan to have the Federal Communications Commission and the Federal Trade Commission police speech on social networks. Brian Fung has a partial sketch:
The draft order, a summary of which was obtained by CNN, calls for the FCC to develop new regulations clarifying how and when the law protects social media websites when they decide to remove or suppress content on their platforms. Although still in its early stages and subject to change, the Trump administration's draft order also calls for the Federal Trade Commission to take those new policies into account when it investigates or files lawsuits against misbehaving companies. Politico first reported the draft's existence.
If put into effect, the order would reflect a significant escalation by President Trump in his frequent attacks on social media companies over an alleged but unproven systemic anti-conservative bias on the part of technology platforms. And it could lead to a significant reinterpretation of a law that, its authors have said, was meant to give technology companies broad freedom to handle content as they see fit.
Fung talks to experts who variously describe the plan as "terrible" and as making "no sense." No one seems to think the FCC or the FTC wants to do this work, or could do it practically or constitutionally. It is another troubling idea from the Trump administration that leaves all of us wondering whether to take it seriously, literally, or not at all.
I believe that you cannot have editorially neutral platforms without also hosting Nazis and other purveyors of hate speech and abuse. I also believe that restricting platforms from moderating content beyond what is legally required would threaten their businesses – Nazis tend to drive users and advertisers away.
It would be heartening if Twitter took this moment to retire trending topics and take other concrete steps to slow the spread of conspiracy theories. But with the White House rattling its saber so aggressively, that now seems less likely.
Tony Romm has a readout of Friday's meeting between tech companies and the White House, part of the administration's search for a mass-shooting prevention technology that involves neither gun control nor confronting white supremacy:
Top Trump administration officials expressed an interest in tools that could anticipate mass shootings or identify would-be shooters by scanning social media posts, photos, and videos during a Friday meeting with tech giants including Facebook, Google, and Twitter.
The technology could serve as an early-warning system for potential attacks, White House officials suggested during the brainstorming session, perhaps by harvesting information from social sites to identify deadly incidents before they occur, according to three people familiar with the matter who were not authorized to discuss a private White House meeting.
Max Fisher and Amanda Taub investigate how YouTube has changed politics in Brazil:
Teachers describe classrooms made unruly by students who quote from YouTube conspiracy videos or who, encouraged by right-wing YouTube stars, secretly record their instructors.
Some parents look to "Dr. YouTube" for health advice but get dangerous misinformation instead, hampering the country's efforts to fight diseases like Zika. Viral videos have incited death threats against public health advocates.
And in politics, a wave of right-wing YouTube stars ran for office alongside Mr. Bolsonaro, some winning by historic margins. Most still use the platform, governing the world's fifth-largest democracy through internet-honed trolling and provocation.
Jo Becker reports on how immigration fueled a populist surge in Sweden – one propelled by a disinformation campaign likely funded by the Russian government:
Russian and Western entities that traffic in disinformation, including an Islamophobic think tank whose former president is now Mr. Trump's national security adviser, have been crucial linkers to the Swedish sites, helping to spread their message to susceptible Swedes.
At least six Swedish sites have received financial support through advertising revenue from a Berlin-based company that sells auto parts in Russia and Ukraine, and whose online sales network oddly contains buried digital links to a range of far-right and other socially divisive content.
Mike Isaac reports that Facebook tried to buy Houseparty, but abandoned the bid over antitrust concerns:
But weeks into the discussions, Facebook's business development team ended the talks with Houseparty, the people said. Houseparty's executives were told that a deal would attract unwelcome federal government scrutiny to Facebook, they said. Houseparty was later purchased by Epic Games, maker of the video game Fortnite.
Oh boy. From Kevin Poulsen:
Nearly 10 percent of the unverified accounts that President Trump has retweeted since his inauguration are currently suspended from Twitter for various violations of platform policy, ranging from hateful conduct to running fake sock-puppet accounts, according to a Daily Beast analysis.
The most recent instance came last week, when Twitter hit the brakes on "LYNNTHO06607841" just hours after Trump retweeted the account's proclamation that "DEMOCRATS ARE THE REAL ENEMIES OF AMERICA!" The account trafficked in all-caps conspiracy theories about prominent Democrats, including a recent tweet claiming that Bill and Hillary Clinton "torture and sacrifice children" to obtain "a drug that can only be found in the human skull."
April Glaser reports on how Telegram has become a popular refuge for white supremacists banned from more polite social networks:
While social networks such as Facebook and Twitter have grown more aggressive against hate speech over the last year, one of the less-discussed places where white supremacists, violent men's groups, anti-PC agitators, and trolls of various stripes have landed is Telegram. At the same time, pro-democracy activists in Hong Kong have relied on Telegram to coordinate protests against new restrictions from the Chinese government, demonstrating that the app's intended purpose is alive and well. The app has also made a cameo in Puerto Rican politics, where the leak of nearly 900 pages of sexist, dismissive, and homophobic messages from a Telegram group chat led to popular protests and the resignation of the governor – illustrating what politicians and their confidants will still say when they think no one is listening.
Telegram supports private messages, private group chats, and audio calls – all of them encrypted – for its hundreds of millions of users. But the ability to create public pages and groups has provided a platform for fallen stars of the right. "Instagram is going to ban me in a few minutes," wrote Laura Loomer, a popular right-wing social media figure known for her anti-Muslim and conspiracy-laden rhetoric, after Facebook gave Loomer and others a heads-up before suspending their accounts for breaking its rules. "Sign up for my Telegram," she said. On Telegram, Loomer posts several times a day to more than 11,800 subscribers. (On Instagram, she had more than 115,000 followers.)
Jeremy W. Peters, Michael M. Grynbaum, Keith Collins, Rich Harris, and Rumsey Taylor investigate the link between right-wing TV hosts and the recent wave of mass shootings:
An extensive New York Times review of popular right-wing media platforms found hundreds of examples of language, ideas, and ideologies that overlapped with the mass shooter's written statement – a shared vocabulary of intolerance that stokes fears of immigrants of color. The programs, on television and radio, reach millions of people. In the four years since Mr. Trump electrified Republican voters with slashing remarks about Muslims and Mexicans, demonizing references to immigrants have become more widespread in the news media, the Times review found.
Funny how this problem seems to crop up on every social network at scale. Nilesh Christopher reports on TikTok's hate speech problem, and links it to real-world violence:
Videos found on TikTok contain hateful language posted by users who identify themselves as belonging to upper castes while they celebrate and sing the praises of their community. These quickly escalate into threats of physical violence, with members of some communities claiming dominance over other castes.
"We must sever not the fingers but the heads of those who dare to touch us (our community)," says a user in one video, identifying himself as part of the Nadar community. Nadars traditionally held a low status on the caste ladder but have risen through entrepreneurship. The video has been liked by thousands of users, and more than 89 videos of people lip-syncing to the speech have followed.
A good April Glaser piece about an emerging trend: young people who no longer want to work for Silicon Valley giants because of ethical concerns:
"Students don't feel like (working at Facebook) has the same cachet," a San Francisco-based tech recruiter with 15 years of experience, who asked not to be named because Facebook is currently one of his clients, told me in an interview. "It doesn't seem like the kind of name students want to have on their résumés right out of school, and because they have options, there are very few reasons to go to Facebook, especially with the feeling that the brand is somewhat compromised now." Finally, he added, students are receiving very attractive compensation packages worth millions of dollars from other technology companies that don't carry such negative headlines.
After Ninja, the world's most popular Fortnite player, moved from Twitch to Mixer, Twitch began promoting other channels on his dormant page, including one that was simply broadcasting porn. Ninja seemed understandably annoyed about the situation in a video he posted on Twitter, and forgive me, but I find the whole situation hilarious.
Emine Saner interviews YouTube CEO Susan Wojcicki about the past few months:
For all her careful, frustratingly corporate answers, Wojcicki is in an almost impossible position. Beyond the gigantic task of trying to police an endless flood of content, she has to contend with the fact that removing videos from far-right commentators turns them into free-speech martyrs. She must also keep "creators" – many of whom earn a good living on the site – happy. I have no reason not to believe Wojcicki when she says "responsibility is my number one priority." The question is whether the task is beyond her control – and whether Google will tolerate changes that lead to lower profits.
Elizabeth Dwoskin talks to YouTube moderators who say they lack adequate tools to remove bad content from the site:
Creators who violate YouTube's rules face consequences: their channels or videos can be stripped of advertisements, or their content can be removed entirely. But unlike at rivals such as Facebook and Twitter, many YouTube moderators cannot delete content themselves. Instead, they are limited to recommending whether a piece of content is safe to run ads against, flagging it to higher-ups who make the final decision.
An office complex known for its extremely wealthy residents might be getting another one, Noah Buhayar and Natalie Wong report:
Facebook Inc. is negotiating a much larger lease at Hudson Yards in Manhattan, which would give the social media company space in three buildings at the $25 billion mega-project, according to people with knowledge of the matter.
The deal could encompass around 1.5 million square feet (139,000 square meters), about 50 percent more than previously reported, said the people, who asked not to be identified discussing private negotiations.
AND LAST BUT NOT LEAST.
Zheping Huang reports on a product that we all need to keep an eye on:
China's ByteDance Ltd. has launched a search engine that looks strikingly similar to Google's clean, well-organized homepage but delivers heavily sanitized results, in keeping with one of the world's most severely censored internet regimes.
ByteDance, the maker of popular apps including the viral short-video service TikTok, is the most serious threat yet to rival Baidu Inc., which has held a near-monopoly on internet search in China since Google's departure from the market in 2010 amid government censorship. ByteDance has yet to display sponsored products or ads in the search feed, and the results draw largely on content from its own Toutiao news app.
Imagine if Twitter had this feature, which Telegram just launched:
When an administrator turns on slow mode in a group, you will only be able to send one message per interval of their choosing. A timer will show how long you have to wait before sending your next message.
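The mechanics described above are simple enough to sketch. Here is a hypothetical illustration in Python of per-user slow mode – not Telegram's actual implementation; the class and method names are invented for the example:

```python
class SlowMode:
    """Sketch of a slow-mode rate limiter: each user may send
    at most one message per `interval` seconds."""

    def __init__(self, interval_seconds: float):
        self.interval = interval_seconds
        self.last_sent = {}  # user_id -> time of last accepted message

    def seconds_until_allowed(self, user_id: str, now: float) -> float:
        """What the user's countdown timer would show; 0 means they may post."""
        last = self.last_sent.get(user_id)
        if last is None:
            return 0.0
        return max(0.0, self.interval - (now - last))

    def try_send(self, user_id: str, now: float) -> bool:
        """Accept the message only if the user's interval has elapsed."""
        if self.seconds_until_allowed(user_id, now) > 0:
            return False
        self.last_sent[user_id] = now
        return True

# With a 30-second interval, a second message 10 seconds later is
# rejected, and the timer shows 20 seconds remaining:
group = SlowMode(30)
group.try_send("alice", 0.0)                      # accepted
print(group.try_send("alice", 10.0))              # False
print(group.seconds_until_allowed("alice", 10.0)) # 20.0
```

The appeal for a site like Twitter is obvious: a per-user cooldown throttles exactly the obsessive rapid-fire posting that feeds pile-ons and gamed trends, without censoring any particular message.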
Why is LinkedIn more civil than other social networks? I don't know – maybe because it's terrible? But John Herrman says the site's career focus has had a moderating effect on our worst impulses:
"You talk on LinkedIn the same way you talk in the office," says Dan Roth, LinkedIn's editor in chief. "There are certain boundaries around what is acceptable." Criticism of other users' posts, he says, is usually measured – "there's a certain range in the tone," he said – and users will often make the platform's many implicit standards explicit when they find it necessary. "If you read the comments," Mr. Roth said, "when somebody goes outside the boundaries, you have other members who say, 'Hey, bring this back.'"
"This is something that your boss sees, your future boss, people you want to work with in the future," said Mr. Roth. "It's as close to a permanent record as you can get."
Natalie Martinez writes that Facebook has facilitated the spread of the racist, anti-immigrant ideologies at the core of recent mass shootings:
Although the company has policies that would seem to prohibit most if not all "invasion" content, Facebook still allows it to exist and spread on its platform. In March, Facebook claimed that a post by white supremacists about a "Muslim invasion" of the UK did not violate its community standards. A year after leaked documents revealed that Facebook had allowed praise of white nationalism and separatism on its platform following the August 2017 "Unite the Right" rally in Charlottesville, VA, the company implemented a so-called white nationalist ban. But the auditors Facebook hired to oversee its goal of "advancing civil rights on our platform" criticized the ban as "too narrow."
And finally …
Brian Feldman reports on the rise of the high school bathroom as an unlikely hub for content creation:
On TikTok, the school bathroom is a hangout spot. Users film videos of themselves and their friends acting silly or mugging for the camera. In one video, titled "bathroom party," six boys and girls sheepishly exit a gender-neutral bathroom (the last one flipping off the videographer). One of my favorite clips features two students who realized they could get into the crawl space above their bathroom's ceiling. One of them pushes open a ceiling tile, drops down, and says, matter-of-factly, "I'm gay."
But there are also plenty of videos in which students recontextualize the social space of the bathroom and play with how it is perceived. Earlier this year, a trend emerged comparing what happens in the boys' and girls' locker rooms, the general gist being that girls complain about gym class in their locker room while the boys turn theirs into a Mad Max state of anarchy.
My high school bathroom was a dank and foreboding pit, so I'm happy to see today's teenagers putting theirs to good use.
Talk to me
Send me tips, comments, questions and bathroom TikToks: firstname.lastname@example.org.