Twitter tests telling users their tweet replies may be offensive

Twitter is experimenting with a new moderation tool that will warn users before they post replies containing 'harmful' language. When users hit 'send' on a reply, they will be told if the words in their tweet are similar to those in posts that have been reported, and asked whether they would like to revise it. It's a limited experiment and will only show up for iOS users.

Kathy

Corwin Parker 3 weeks

"Your words may not fall in line with the way we demand you think. You might want to change your mind. We're not telling you how to think or what to say. Just a kind suggestion from your friendly Megacorp."

Boo 3 weeks

The Dems will be pissed. How dare anyone question their thoughts, while they question yours.

Violet 3 weeks

But just not to Trump though
