Facebook is hiring an additional 3,000 people to delete posts that it considers hate speech, the company announced Tuesday.
The social media giant said it has removed an average of 66,000 posts per week over the past two months.
How does Mark Zuckerberg’s company define hate speech?
“Our current definition of hate speech is anything that directly attacks people based on what are known as their ‘protected characteristics’ — race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, or serious disability or disease,” said Richard Allan, who is the company’s vice president for European policy.
“There is no universally accepted answer for when something crosses the line. Although a number of countries have laws against hate speech, their definitions of it vary significantly.”
Facebook declines to call these removals censorship, insisting that the company is “living up to the values in our Community Standards.”
“When we remove something you posted that you believe is a reasonable political view, it can feel like censorship. We know how strongly people feel when we make such mistakes, and we’re constantly working to improve our processes and explain things more fully,” Allan said.
Zuckerberg said in May that the “reviewers” removing posts are working for the public good.
“These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation. And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it — either because they’re about to harm themselves, or because they’re in danger from someone else,” he said.