For instance, Harvey said, "If you send the same message to four people, and two of them blocked you, and one reported you, we could assume, without ever seeing what the content of the message was, that was generally a negative interaction". This realization presents a unique challenge for the team: how can they proactively address troll-like behavior that distorts and detracts from the public conversation on Twitter but doesn't actually violate any policy? And the more users the platform gains, the greater the odds that trolls are among them.
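Harvey's example can be sketched as a simple content-blind heuristic. This is a hypothetical illustration, not Twitter's actual system: the `Delivery` type, field names, and the 50% threshold are all assumptions made for the sketch.

```python
from dataclasses import dataclass

@dataclass
class Delivery:
    """One copy of a message sent to one recipient (content is never inspected)."""
    recipient: str
    blocked_sender: bool = False
    reported_sender: bool = False

def looks_negative(deliveries, threshold=0.5):
    """Flag a mass-sent message as likely negative based only on how
    recipients reacted to it, never on the message text itself."""
    if not deliveries:
        return False
    bad = sum(d.blocked_sender or d.reported_sender for d in deliveries)
    return bad / len(deliveries) >= threshold

# Harvey's scenario: four recipients, two blocked the sender, one reported them.
deliveries = [
    Delivery("a", blocked_sender=True),
    Delivery("b", blocked_sender=True),
    Delivery("c", reported_sender=True),
    Delivery("d"),
]
print(looks_negative(deliveries))  # 3 of 4 reacted negatively -> True
```

The point of the sketch is that the classifier's inputs are entirely behavioral: the message body never appears anywhere in the computation.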
Twitter acknowledges there's a long way to go, but says its tech will learn more (and make mistakes) over time. Past efforts to fight abuse "felt like Whac-A-Mole", he added. Criticism of its approach led to new rules meant to reduce hate speech, and more rules announced in February targeted specifically at people encouraging self-harm. The goal was to create a method that would punish users who hurt the social experience but who should still be allowed on the platform.
Other new behavioral signals that Twitter will use include whether the same person signs up for several accounts simultaneously or behavior that suggests a coordinated attack, such as multiple accounts disrupting a conversation with the same hashtag.
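The coordinated-attack signal described above might work something like the following sketch. The function name, the five-account minimum, and the ten-minute window are invented for illustration; Twitter has not published how it detects coordination.

```python
from collections import defaultdict
from datetime import datetime, timedelta

def coordinated_hashtag_signal(tweets, min_accounts=5, window=timedelta(minutes=10)):
    """Return hashtags pushed by many distinct accounts within a short time
    window -- a crude proxy for a coordinated disruption of a conversation.

    tweets: iterable of (account, hashtag, timestamp) tuples.
    """
    by_tag = defaultdict(list)
    for account, tag, ts in tweets:
        by_tag[tag].append((ts, account))

    flagged = set()
    for tag, events in by_tag.items():
        events.sort()  # sort by timestamp
        for i in range(len(events)):
            # Count distinct accounts tweeting the tag within `window`
            # of the i-th event.
            accounts = {acct for ts, acct in events[i:]
                        if ts - events[i][0] <= window}
            if len(accounts) >= min_accounts:
                flagged.add(tag)
                break
    return flagged

base = datetime(2018, 5, 15, 12, 0)
tweets = [(f"acct{i}", "spam", base + timedelta(minutes=i)) for i in range(6)]
tweets += [("x", "ok", base), ("y", "ok", base)]
print(coordinated_hashtag_signal(tweets))  # only "spam" crosses the bar
```

Again, the signal is purely behavioral: it looks at who tweeted when, not at what any tweet says.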
According to Twitter's Safety account, these "signals" can be identified by an algorithm and are tied to behavior, not the content of the tweets themselves. Accounts can be affected even when their tweets have not been found to violate any of Twitter's rules.
"No. It's important to remember this is about behaviour, not content", she said. Twitter said it would also look at how these accounts interacted with and were linked to those users who violate Twitter's rules.
Gasca and Harvey don't say whether flagged troll tweets would be demoted for everyone, or only for the specific users they've been known to target. "The result is that people contributing to the healthy conversation will be more visible in conversations and search", writes Twitter.
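A minimal sketch of how several behavioral signals might combine into a demotion decision, consistent with the article's description (punish ranking, not ban). The signal names, weights, and threshold are all hypothetical; Twitter has not published its scoring model.

```python
# Hypothetical signal names and weights, invented for illustration.
SIGNAL_WEIGHTS = {
    "mass_signup": 2.0,             # several accounts created simultaneously
    "blocked_often": 1.5,           # frequently blocked by message recipients
    "interacts_with_violators": 1.0 # linked to accounts that break the rules
}

def behavior_score(signals):
    """Score an account from behavioral signals alone; tweet content is
    never an input, matching the content-blind approach described above."""
    return sum(SIGNAL_WEIGHTS.get(s, 0.0) for s in signals)

def demote_in_search(signals, threshold=2.5):
    """Accounts above the threshold stay on the platform, but their tweets
    would rank lower in conversations and search rather than be removed."""
    return behavior_score(signals) >= threshold

print(demote_in_search({"mass_signup", "blocked_often"}))      # True
print(demote_in_search({"interacts_with_violators"}))          # False
```

The design mirrors the stated goal: a score that reduces visibility for accounts hurting the conversation, without suspending users who haven't broken a rule.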
Despite the progress made, the company acknowledges that there is much more to be done to promote healthy conversations on Twitter, but it has committed to keep learning and improving its tools.