Facebook CEO Mark Zuckerberg tried to clarify his controversial comments about Holocaust deniers Wednesday afternoon, hours after he was quoted saying some deniers who post on Facebook aren’t “intentionally getting it wrong.”
Zuckerberg made his comments during an interview that Recode’s Kara Swisher published Wednesday morning. He cited Holocaust denial as an example of controversial misinformation that Facebook would allow to remain on the platform. Facebook has said that it allows conspiracy theories to remain on the site, but limits their reach so fewer people see them.
“At the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong,” Zuckerberg told Swisher. “I don’t think that they’re intentionally getting it wrong.”
He went on to compare posting conspiracy theories to simply misspeaking.
“It’s hard to impugn intent and to understand the intent. I just think, as abhorrent as some of those examples are, I think the reality is also that I get things wrong when I speak publicly,” he said.
His comments drew immediate condemnation on social media, in the press, and among civil rights activists.
“Holocaust denial is a willful, deliberate and longstanding deception tactic by anti-Semites that is incontrovertibly hateful, hurtful, and threatening to Jews,” Jonathan Greenblatt, CEO of the Anti-Defamation League, said in a statement to CNNMoney. “Facebook has a moral and ethical obligation not to allow its dissemination.”
Within hours, Zuckerberg emailed Swisher to say his comments had come out wrong.
“I personally find Holocaust denial deeply offensive, and I absolutely didn’t intend to defend the intent of people who deny that,” he wrote in the email.
Zuckerberg’s comments came one week after Facebook confirmed it would allow Infowars, a site that traffics in conspiracy theories, to remain on its platform. Facebook said the site, which has, among other things, called the Sandy Hook school shooting a hoax, does not violate its community standards.
Even as Zuckerberg clarified his comments, Facebook said it will, in the coming months, begin taking down content it determines is contributing to imminent violence. The violence must be about to occur, not just speculative, according to Facebook. The company will rely on third-party partners to make this distinction.
“Reducing the distribution of misinformation—rather than removing it outright—strikes the right balance between free expression and a safe and authentic community,” Facebook said in a statement. “There are certain forms of misinformation that have contributed to physical harm, and we are making a policy change which will enable us to take that type of content down.”
Under the new policy, Facebook will remove content that is flagged, escalated, and confirmed by local partners as false and potentially contributing to violence. Facebook will begin working with these local groups, which it did not identify, first in Sri Lanka and then in Myanmar. Facebook said it is still assessing whether it will roll out the policy in the United States.
The company offered the example of recent posts in Sri Lanka that claimed Muslims were poisoning food given to Buddhists. It consulted with a local organization to confirm the misinformation could lead to violence, then removed the posts.