Twitter clamps down on ‘dehumanising language’ aimed at religious groups
Platform bans talk of ‘rats’ and ‘viruses’ in connection with protected groups
Twitter is clamping down on abuse on its platform, updating its rules to ban “dehumanising language” aimed at religious groups.
Describing religious groups as “rats”, “filthy animals” or “viruses” will be among the types of tweets that will be removed under the new policy.
“Our primary focus is on addressing the risks of offline harm, and research shows that dehumanising language increases that risk,” Twitter said. “As a result, after months of conversations and feedback from the public, external experts and our own teams, we’re expanding our rules against hateful conduct to include language that dehumanises others on the basis of religion.”
Tweets sent before the new rule took effect will be removed once reported to Twitter, but will not directly result in offenders having their accounts suspended.
Although the company is starting with religious groups, it may extend the rule to language directed at other protected groups, Twitter said.
The social network said there were additional factors it needed to consider before expanding the rule to other groups, including marginalised users reclaiming terminology, taking context into account, and how much weight to give a group's historical marginalisation when evaluating the severity of harm.
Twitter has been canvassing feedback on its service, collecting more than 8,000 responses from around the globe. It found people wanted potential violations to be more clearly laid out, the rules to be consistently enforced, and Twitter to narrow down what was considered when applying the rules.
“We’ll continue to build Twitter for the global community it serves and ensure your voices help shape our rules, product, and how we work,” Twitter said.
The social network has been heavily criticised for how it handled abuse cases in the past, with some groups feeling the new rules it implemented at various stages over the past couple of years to try to deal with the ongoing issue were not enough.