A year after it began testing prompts that ask users to rethink mean replies, Twitter is rolling out the feature to all iOS and Android users who use the app in English. The feature, similar to anti-bullying measures from other companies, detects "potentially harmful or offensive" replies and prompts users to edit their tweet before sending it.
The company says it has made improvements over the past year to reduce the instances where people see prompts unnecessarily. For example, its algorithms now take into account "the nature of the relationship between the author and replier," since people who know each other may be more likely to joke or communicate differently than strangers would. The system can also "better account for situations in which language may be reclaimed by underrepresented communities and used in non-harmful ways."
The prompts are one of many updates Twitter has made to reduce bullying and harassment and encourage "healthier conversations." The company notes that testing of the feature has had some success, with 34 percent of people who were prompted choosing to revise their initial reply. That may not sound especially high, but Twitter says the prompts may also have had downstream effects, leading people to post fewer offensive replies going forward.