OPINION: White supremacists and social media: the double standard with jihadis

It seems all too familiar: an attack, the initial shock and an outpouring of support from all sides, followed by a search for reason.

To U.S. citizens looking at New Zealand, where a white supremacist attacked two mosques in Christchurch on March 15, killing 50 people during prayer while live streaming on Facebook, it is painfully familiar. The difference is that it took New Zealand less than a week to make concrete policy changes to prevent a future attack: Prime Minister Jacinda Ardern announced a ban on military-style semi-automatic rifles.

Americans know gun law reform is needed, and the numbers back it up: 67 percent favor banning assault weapons and 84 percent support background checks for private gun sales, including at gun shows, according to the Pew Research Center. The Christchurch tragedy rightfully reignited that debate at full throttle. However, an equally important debate revolves around terrorists’ use of social media.

The government and public tend to use the term terrorism when referring to one group: militant Muslims. Militant Islamic terrorist organizations pose a threat to the U.S., but another group poses a far greater one: white supremacist terrorists. These people are not patriots looking out for the good of the country.

Since 9/11, white supremacist groups have killed more people in the U.S. than jihadis. In a study of 573 “extremist-related fatalities” from 2002 to 2018, the Anti-Defamation League found that right-wing extremists killed 80 percent of the victims. In 2017, the ADL found white supremacists responsible for 18 of 34 extremist-related deaths, compared to nine by jihadis. In 2018, white supremacists killed 50 people.

Militant radical Muslim terrorist attacks receive almost four times more press coverage than their white supremacist counterparts, according to a study from the University of Alabama. Fewer than one-fifth of FBI terrorism investigations, 900 of over 5,000, focus on domestic terrorism. The White House has ignored the simmering, now rising threat of white supremacy and instead attempted to institute a ban on Muslims entering the country. Prominent congressman Steve King publicly questioned how “white supremacist” became an offensive term.

It should come as no surprise that the relative lack of negative media coverage and government prosecution, along with outright encouragement from members of the House of Representatives, emboldens white supremacists. They have taken to the internet to spread messages of hate on platforms not subject to government regulation.

When the Islamic State began exploiting social media platforms to recruit and radicalize potential new members, Facebook, Twitter and Google faced a dilemma. They could stick to the message of freedom of speech and expression or seek out and eliminate terrorist content on their respective platforms.

Facing external pressure from advertisers and boycotts, they chose the latter, but not until ISIS had already built a massive social media presence. Thirteen social media companies joined the Global Internet Forum to Counter Terrorism, which has been somewhat effective in removing jihadi content and reducing its influence. However, some content will always slip through the cracks because of the sheer volume of posts made every day.

Social media platforms are making the same mistakes with white supremacists that they made with Muslim extremists. White supremacists and Muslim extremists, while fundamentally opposed, are strikingly similar. They copy each other’s tactics and rationales. Each believes it must resort to violence to protect its race or religion from an outside invasion, real or imagined. Each exploits social media algorithms to spread messages of hate and intolerance under the guise of free speech.

“The rise of the alt-right is both a continuation of a centuries-old dimension of racism in the U.S. and part of an emerging media ecosystem powered by algorithms,” behavioral scientist Jessie Daniels wrote in a research paper for the American Sociological Association.

“Algorithms deliver search results for those who seek confirmation for racist notions and connect newcomers to like-minded racists, as when (Charleston church shooter) Dylann Roof searched for ‘black on white crime’ and Google provided racist websites and a community of others to confirm and grow his hatred,” Daniels wrote.

Roof directly inspired the Christchurch shooter, according to the rant the shooter posted online, which some called a manifesto.

YouTube is a haven for white supremacists, just as it was for ISIS. Despite recent crackdowns, white supremacist rhetoric still thrives, including content from the group that advertised at App State last year.

Over 4,000 people viewed the live stream of the Christchurch attack before it was reported, and Facebook took it down only after 29 minutes. Facebook argued that its moderating artificial intelligence could not discern whether the video was real or fake.

Social media platforms are responsible for the content posted on their sites. The balance between protecting free speech and allowing the spread of terrorist rhetoric has tipped too far toward the latter. Facebook, Twitter and Google must correct this dangerous imbalance. White supremacists are terrorists. It is time the public, government and internet treated them as such.