Twitter moves to combat savagery, tells users to be nice and think twice before replying

Twitter has rolled out improved prompts on iOS and Android that will encourage users to pause and reconsider a potentially harmful or offensive reply — such as insults, strong language, or hateful remarks — before tweeting it. 

The bird app, which has often faced criticism over abusive user behaviour, tested the feature last year.

The upgraded feature is better at spotting "strong language" and now takes into account your relationship with the person you're replying to.

The social media platform said the tests showed that the prompts reduced offensive replies.

On Wednesday, Twitter said it would roll the prompts out to accounts with English-language settings on iOS and Android.

"Starting today, we're rolling these improved prompts out across iOS and Android, starting with accounts that have enabled English-language settings," the company informed.

"Early tests revealed that if prompted, 34 per cent of people revised their initial reply or decided to not send their reply at all.

"After being prompted once, people composed, on average, 11 per cent fewer offensive replies in the future," Twitter said.

The company said the prompts are designed to consider the nature of the relationship between the account that tweets and the account that replies.

The blog post said: “If two accounts follow and reply to each other often, there’s a higher likelihood that they have a better understanding of preferred tone of communication.”

In recent years, tech companies have grappled with how to police offensive content, abuse and harassment on their platforms.

Twitter’s most recent statistics, for January to June 2020, show that the social media platform removed potentially offensive content posted by 1.9 million accounts, and suspended 925,700 accounts for violating Twitter rules.
