Facebook said on Thursday it has permanently banned several far-right and anti-Semitic figures and organizations, including Nation of Islam leader Louis Farrakhan, Infowars host Alex Jones, Milo Yiannopoulos and Laura Loomer, for being “dangerous,” a sign that the social network is more aggressively enforcing its hate speech policies under pressure from civil rights groups.
Facebook removed the accounts, fan pages, and groups affiliated with these individuals after reevaluating the content they had previously posted, or after reexamining their activities outside of Facebook, the company said. The removal also covers at least one of the organizations run by these people, Jones’ Infowars.
“We’ve always banned individuals or organizations that promote or engage in violence and hate, regardless of ideology. The process for evaluating potential violators is extensive and it is what led us to our decision to remove these accounts today,” Facebook said in a statement.
None of the people banned was immediately available for comment.
Jones, for example, recently hosted Gavin McInnes, the leader of the Proud Boys whom Facebook designated as a hate figure in December. Yiannopoulos, another alt-right social media star, publicly praised McInnes this year, and Loomer appeared with him at a rally. Jones and Yiannopoulos have been temporarily banned before by Facebook and other social media platforms including Twitter.
But Facebook and its counterparts have largely resisted permanent bans, holding that objectionable speech is permissible, so long as it doesn’t bleed into hate. Facebook has also been wary of offending conservatives, who have become vocal about allegations that the company unfairly censors their speech.
The bans are likely to be welcomed by civil rights activists, who have long argued that these individuals espouse violent and hateful views and that Silicon Valley companies should not allow their platforms to become a vehicle for spreading them.
Angelo Carusone, president of Media Matters, an organization that has long advocated for more enforcement against white supremacists, said Facebook has been lax in enforcing its policies against hate speech on these accounts because the company doesn’t want to deal with the right-wing blowback. “The reality is, people are getting killed. There are mass shootings and mass murders that are clearly being connected to ideas like white genocide, which are fueling radicalization,” Carusone said. “The conditions have changed. When you have these massive catalyzing moments that are connected to real-life consequences, it puts pressure on Facebook and others to look in the mirror.”
Facebook has recently signaled that it is willing to take a stronger stance against white nationalism and white supremacy, in particular. In March, the company said it would begin banning posts, photos and other content that reference white nationalism and white separatism, revising its rules in response to criticism that a loophole had allowed racism to thrive on its platform.
Facebook said it began to reexamine the figures last year, and that some of the activities and posts it reevaluated took place within the past one to two years.
It is also banning Paul Nehlen, who described himself as a “pro-White Christian Candidate” when he ran for Congress and was kicked off the Breitbart News website last year over ties to neo-Nazis and racist comments about Meghan Markle, as well as Paul Joseph Watson, a far-right YouTube personality and an editor of Infowars, according to the Infowars site.
Some of the figures, such as Nehlen and Loomer, have been banned from Twitter already.
Madihha Ahussain, special counsel for anti-Muslim bigotry with the advocacy group Muslim Advocates, said that individuals like Loomer, Jones and Yiannopoulos have used social media platforms to broadcast dangerous hate speech and conspiracies targeting Muslims, Jews and others.
“We applaud Facebook for taking this positive step toward removing hate actors from the company’s platforms,” she said. “As we saw in Christchurch, New Zealand – where a white nationalist was able to live-stream the slaughter of 50 people at two mosques – online platforms like Facebook have been used to target communities and spread hate.”
(c) 2019, The Washington Post · Elizabeth Dwoskin