UN Investigator Says Mark Zuckerberg Didn’t Do Enough To Prevent Genocide In Myanmar
United Nations investigators believe Facebook is not doing enough to tamp down social media posts that promote hatred of the Rohingya, a minority group that has been driven out of Myanmar, Gizmodo reported Thursday.
“I think there has been meaningful and significant change from Facebook, but it’s not nearly sufficient,” Christopher Sidoti, a member of a UN team that found Facebook failed to prevent people from using the platform to incite genocide in Myanmar, told reporters in late March.
Facebook had only two Burmese-speaking employees reviewing content in 2015, when the country cracked down on the Rohingya people.
“At the height of the situation in 2017, Facebook was largely passive,” Sidoti said, adding: “Facebook’s actions can only be described as minimal. It was as though the approach was apologize after the fact rather than try to prevent it in the first place.”
More than 740,000 Rohingya are in refugee camps in Bangladesh after they were forced out of Myanmar during military crackdowns in 2016 and 2017.
“The role of social media is significant … Although improved in recent months, the response of Facebook has been slow and ineffective,” according to a 2018 UN report that Sidoti contributed to. “The extent to which Facebook posts and messages have led to real-world discrimination and violence must be independently and thoroughly examined.”
Facebook has since added more than 100 employees who speak the country’s native language to help moderate content, Gizmodo noted. Company CEO Mark Zuckerberg has meanwhile wrestled with complaints that executives are not doing enough to moderate content on their platform.
Australia passed a law Thursday that requires internet companies to stop the spread of violent material. Executives face up to three years in jail, or fines of up to 10 percent of the platform's annual turnover, if they fail to remove certain kinds of content. (RELATED: Facebook Expands Ad Monitoring Technology Ahead Of Upcoming Elections)
Australia's law comes after Facebook and Google struggled to remove video of the March shootings at two New Zealand mosques, in which at least 49 people died. A 28-year-old Australian man calling himself Brenton Tarrant linked himself to the shootings by posting a manifesto discussing his fear of white genocide.
Facebook has not yet responded to The Daily Caller News Foundation’s request for comment.
Content created by The Daily Caller News Foundation is available without charge to any eligible news publisher that can provide a large audience. For licensing opportunities of our original content, please contact email@example.com.