Twitter asks academics to help eliminate toxic tweets with algorithms

Twitter is partnering with academic researchers to assess the seriousness of its 'echo chamber' problem and study whether or not the site helps reduce discrimination

Twitter is finally taking stock of the quality of the conversation that occurs on its platform.

The social networking giant is partnering with a group of researchers from academic institutions around the world to assess the seriousness of Twitter's 'echo chamber' problem and study whether or not the site helps reduce discrimination.

The measure is part of Twitter's ongoing effort to curb harassment and toxic behavior among its users.



In March, Twitter launched a call for proposals from outside experts so that it could get an idea of the health of the Twittersphere by measuring abuse, spam and manipulation.

"After months of reviewing fantastic proposals from experts around the world, we have selected two partners who will help us measure our work to serve the public conversation," the Twitter Security unit wrote in a tweet.

'… We were overwhelmed by the thoughtful and intelligent ideas they shared, and we are looking for ways to partner with others in different ways.'

The firm said it created a review committee made up of Twitter staff to choose from the more than 230 proposals submitted to the company.

Twitter ultimately decided to form two groups of academic researchers, each focusing on a separate area.

The research gathered by the academic groups will be used to create a system of metrics that can evaluate the state of public conversations on the site.

"Partnering with external experts who can provide thoughtful analysis, an external perspective and a rigorous review is the best way to measure our work and accountability to those who use Twitter," the firm wrote in a blog post.

The first team is composed of researchers from Leiden University and Delft University of Technology in the Netherlands, Syracuse University, and Bocconi University in Italy.

It is responsible for looking at "echo chambers and uncivil discourse", specifically seeking to create a metric that can assess how communities form around political discourse on Twitter and the challenges that can arise from discussions in those groups.

"In the context of growing political polarization, the spread of misinformation, and increases in incivility and intolerance, it is clear that if we are going to effectively evaluate and address some of the most difficult challenges arising on social media, academic researchers and tech companies will need to work together much more closely," said Dr. Rebekah Tromble, assistant professor of political science at Leiden University and leader of the group, in a statement.

Past findings from Leiden University have indicated that echo chambers, which form when people in a conversation share the same views, can increase "hostility and promote resentment towards those outside the conversation".


Twitter CEO Jack Dorsey (pictured) announced in March that the company was looking for ways to better measure the 'health of public conversation' on the site.


One set of indicators will evaluate the number of people who encounter and interact with different points of view on the site.

The group will also develop an algorithm that distinguishes between incivility, which Twitter noted can sometimes be constructive in political discussion, and intolerant discourse.

In contrast to mere incivility, intolerant discourse can include things such as hate speech, racism and xenophobia.

The second group, which includes researchers from the University of Oxford and the University of Amsterdam, will analyze how people use Twitter and whether or not it diminishes prejudice and discrimination.

The idea is that by exposing people to different points of view, presumably through Twitter, they are less likely to transmit prejudices offline.

HOW DOES TWITTER ACT AGAINST OFFENDING ACCOUNTS?

Twitter can take action against offending accounts at the tweet level, the direct-message level and the account level, according to the company's website.

The company said it will take action against accounts when their behavior violates the 'Twitter Rules', or in response to 'a valid and properly scoped request from an authorized entity in a given country'.

Twitter can tell a user to remove a tweet that violates the site's terms, hide a tweet while it is 'pending removal', or even make a tweet 'less visible' on the site by limiting how often it appears in search results, replies or timelines.

Twitter takes a variety of measures to prevent offending accounts from using the site. In its most recent purge, Twitter required suspicious accounts to verify a phone number.


The company can also stop a user from sending direct messages to another user by removing the conversation from the inbox of the user who reported the incident.

If certain accounts violate Twitter's policies, the company may make certain media unavailable, or place an account in read-only mode, removing its ability to post tweets, retweets or likes 'until calmer heads prevail'.

Twitter can also ask a user to verify the ownership of the account, usually by asking them to verify an email or phone number linked to the account.

In extreme situations, Twitter can permanently suspend an account globally, meaning the violator will not be able to create a new account.

According to Twitter, this group will create a set of text classifiers for language "commonly associated with positive sentiment, cooperative emotionality and integrative complexity" that can be adapted to communication on the platform.
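Twitter has not published the classifiers themselves, so as a purely illustrative sketch, a bag-of-words Naive Bayes model, a common baseline for this kind of text classification, might look like the following. The labels ("cooperative", "hostile") and training samples here are invented for the example and are not Twitter's.

```python
import math
from collections import Counter, defaultdict


def tokenize(text):
    """Very naive tokenizer: lowercase whitespace split."""
    return text.lower().split()


class NaiveBayesTextClassifier:
    """Tiny bag-of-words Naive Bayes classifier (illustrative only)."""

    def __init__(self):
        self.word_counts = defaultdict(Counter)  # label -> word frequencies
        self.label_counts = Counter()            # label -> number of samples
        self.vocab = set()

    def train(self, samples):
        for text, label in samples:
            self.label_counts[label] += 1
            for word in tokenize(text):
                self.word_counts[label][word] += 1
                self.vocab.add(word)

    def predict(self, text):
        total = sum(self.label_counts.values())
        best_label, best_score = None, float("-inf")
        for label in self.label_counts:
            # log prior + log likelihoods with add-one (Laplace) smoothing
            score = math.log(self.label_counts[label] / total)
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for word in tokenize(text):
                score += math.log((self.word_counts[label][word] + 1) / denom)
            if score > best_score:
                best_label, best_score = label, score
        return best_label


# Toy labeled tweets, invented for illustration.
samples = [
    ("thanks so much great point well said", "cooperative"),
    ("i appreciate your perspective lets work together", "cooperative"),
    ("you are an idiot shut up", "hostile"),
    ("this is garbage and so are you", "hostile"),
]

clf = NaiveBayesTextClassifier()
clf.train(samples)
print(clf.predict("thanks i appreciate that great point"))  # prints "cooperative"
```

A production system would of course use far larger labeled corpora and more sophisticated features, but the basic shape, scoring a tweet's words against per-label language statistics, is the same idea as classifying text for "positive sentiment" or "cooperative emotionality".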

While the measure should give Twitter a better idea of the types of conversations and behaviors that exist on its platform, some argue that it is not a solution to some of its biggest problems around harassment and hate speech.

Victims of harassment believe the company takes too little action, or none at all, when offensive accounts are reported.

Nor will the effort immediately give Twitter new tools to combat harassment, though the company is likely to use the data unearthed by researchers to improve its safety features.
