Twitter collaborates with academic researchers to decide whether to ban white supremacists from its platform
- Academic researchers will analyze whether Twitter should ban white supremacists and nationalists
- They will examine what role Twitter plays in making these conversations better or worse
- Comes as Twitter faces criticism for its failure to curb hate speech
Twitter says it is examining whether white supremacists should be allowed on its platform, amid rising calls to crack down on extremist content.
The social media giant is studying how white nationalists and supremacists use its platform, in order to decide whether those groups should be banned or allowed to continue posting so that other users can discuss their views, according to Motherboard.
The move comes as Twitter faces criticism over the abundance of extremist content shared on its site and the few measures it has taken to curb hateful rhetoric.
Twitter is investigating how white nationalists and supremacists use its platform to help decide whether the groups should be banned or allowed to continue posting
Researchers will examine what role Twitter plays in the deterioration or improvement of conversations about white nationalism and white supremacy.
From there, the company hopes to have a better idea of whether to ban these groups.
'Is it the right approach to deplatform these individuals? Is the right approach to try to engage with these people? How should we think about this? What actually works?' Vijaya Gadde, Twitter's head of trust and safety, told Motherboard.
Last month, Twitter CEO Jack Dorsey and Gadde met with President Donald Trump to discuss the 'health' of public conversations on the site.
Twitter has become notorious for its characteristically slow responses to urgent problems on the site, such as abuse, trolling and hateful content.
For that reason, many are unsurprised that the company is only now tackling the issue of white supremacy and white nationalism, years after this type of content first appeared on Twitter.
'The idea that they are now seriously looking at this issue, as opposed to in the past, indicates how desensitized they have become on this platform,' Angelo Carusone, president of Media Matters, told Motherboard.
Twitter CEO Jack Dorsey (pictured) has been repeatedly criticized for his company's characteristically slow response to urgent issues such as abuse, trolling and intimidation
Similarly, Heidi Beirich, director of the Intelligence Project at the Southern Poverty Law Center, told Motherboard that white supremacists have demonstrably continued to thrive on Twitter.
'Twitter has David Duke on there; Twitter has Richard Spencer,' she told Motherboard.
'They have some of the biggest ideologues of white supremacy, people whose ideas have inspired terrorist attacks, on their site, and it's outrageous.'
Twitter has taken some steps to combat extremism: last year it joined Facebook, YouTube, Spotify, LinkedIn and others in banning right-wing conspiracy theorist Alex Jones and his Infowars show from its platform.
Still, Twitter and various other social media platforms have yet to fully reckon with the amount of extremist content they host.
YouTube has also become a popular destination for white nationalist and white supremacist content, but it has so far declined to ban either form of content from its site.
So far, Facebook, which banned such posts in March, is the only major social media platform to take a stand against white nationalism and white separatism.
Posts with statements like 'I am a proud white nationalist' and 'Immigration is tearing this country apart' will be banned immediately.
If a user tries to publish a post on these topics, they are instead redirected to a nonprofit organization called Life After Hate, which helps people involved in these extremist groups safely leave them.
WHAT IS TWITTER'S POLICY ON HATE SPEECH?
Twitter says it does not tolerate behavior that harasses, intimidates, or uses fear to silence other users of the social network.
Twitter users who violate these rules may have their content removed or their access to their account suspended.
What does Twitter prohibit?
According to the company, it removes all tweets that do the following:
- Threaten physical violence
- Promote violence against people based on their race, ethnicity, national origin, sexual orientation, gender, gender identity, religious affiliation, age, disability, or serious illness
- Reference mass murder, violent events or specific means of violence where such groups are the primary targets or victims
- Incite fear about a protected group
- Repeatedly use slurs, epithets, or racist and sexist tropes
- Contain content intended to degrade a specific user
Abusive Twitter users can target individuals or specific groups in a number of ways, for example by using the @-mention function or by tagging them in a photo.
How does Twitter enforce these rules?
According to the company, the first thing it does when an account or tweet is flagged as inappropriate is check the context.
Twitter says: 'Some Tweets may appear offensive when viewed in isolation, but may not be when viewed in the context of a larger conversation.
'While we accept reports of violations from anyone, sometimes we need to hear directly from the target to make sure we have the right context.'
Twitter says that the total number of reports received about an individual post or account has no bearing on whether something is removed.
However, it may lead Twitter to prioritize the order in which it reviews flagged tweets and accounts.
What happens if you violate Twitter policy?
'The consequences for violating our rules vary depending on the severity of the violation and the person's previous record of violations,' Twitter says.
Penalties range from asking a user to voluntarily remove an offending tweet to suspending the account entirely.