A New Zealand public official accused Facebook Sunday of enabling a genocide in Myanmar as the Silicon Valley giant struggles to explain how a shooter was able to use one of the company’s features to live-stream his killing spree.
Facebook is a “morally bankrupt” company that “cannot be trusted,” New Zealand privacy commissioner John Edwards said on Twitter. The social media platform “enabled genocide” in Myanmar, he added, referring to reports suggesting Facebook didn’t do enough to tamp down heated rhetoric in the Southeast Asian country.
“[They] allow the live streaming of suicides, rapes, and murders, continue to host and publish the mosque attack video, allow advertisers to target ‘Jew haters’ and other hateful market segments, and refuse to accept any responsibility for any content or harm. They #DontGiveAZuck,” Edwards said in a follow-up tweet, according to the New Zealand Herald.
Edwards deleted the tweets late Sunday night because of what he said was the “volume of toxic and misinformed traffic they prompted.”
I have deleted the tweets promoting my discussion about Mark Zuckerberg’s interview because of the volume of toxic and misinformed traffic they prompted. Here is the actual conversation with @SusieFergusonNZ on @NZMorningReport https://t.co/YcCmnFvT7r
— John Edwards (@JCE_PC) April 8, 2019
Christopher Sidoti is one of a team of United Nations investigators who believe Facebook is not doing enough to tamp down social media posts that promote hatred of the Rohingya people in Myanmar. More than 740,000 Rohingya are in refugee camps in Bangladesh after they were driven out of Myanmar during military crackdowns in 2016 and 2017.
“I think there has been meaningful and significant change from Facebook, but it’s not nearly sufficient,” Sidoti told reporters in late March. At the time of the genocide, Facebook employed only two people who spoke Burmese. (RELATED: UN Investigator Says Mark Zuckerberg Didn’t Do Enough To Prevent Genocide In Myanmar)
Australia passed a law Thursday that requires internet companies to stop the spread of violent material. Executives face up to three years in jail, or fines of up to 10 percent of the platform’s annual turnover, if they fail to remove certain kinds of content.
Australia’s law comes after Facebook and Google struggled to remove video images of shootings at two New Zealand mosques in March. At least 49 people died in the shootings. A 28-year-old Australian man calling himself Brenton Tarrant linked himself to the shootings by posting a manifesto discussing his fear of white genocide.
Content created by The Daily Caller News Foundation is available without charge to any eligible news publisher that can provide a large audience. For licensing opportunities of our original content, please contact email@example.com.