Twitter is updating its hate-speech rules to ban posts that liken religious groups to rats, viruses or maggots, among other dehumanising terms. Source: BBC.
Over time, the ban would be extended to cover some other groups, it said.
But a public consultation had indicated users still wished to use dehumanising language to criticise political organisations and hate groups.
Tech companies have struggled to strike a balance between free expression and protecting users from attack. Twitter said it had taken "months of conversations" to decide on the policy.
"Our primary focus is on addressing the risks of offline harm - and research shows that dehumanising language increases that risk," the company said in a blog post.
Twitter's hateful conduct policy had already banned users from spreading scaremongering stereotypes about religious groups - such as claiming all adherents were terrorists.
In addition, it had prohibited the use of imagery that might stir up hatred, including photos edited to give individuals animal-like features or add "hateful symbols" such as the yellow Star of David badges associated with the Holocaust.
One UK-based civil rights campaign group said the latest move was "very belated".
"Twitter's known that it's had a problem with people using hate speech to target, harass and abuse people on the basis of their religious background for a long time," said Matthew McGregor, Hope Not Hate's campaigns director.
"So, it's been incredibly disappointing to see Twitter drag its feet over this.
"At the same time, their move today is welcome. But I think a lot of campaigners will want to see the extent to which this policy is implemented."
Twitter said it would respond to user reports as well as employ machine-learning tools to automatically flag suspect posts for review by human moderators.
Offenders risk having their accounts suspended, although posts made before the rule came into effect will not be penalised.