Social media: Where voices of hate find a place to preach

CHARLOTTESVILLE, Va. — On Twitter, David Duke, former Grand Wizard of the Ku Klux Klan, sometimes tweets more than 30 times a day to nearly 50,000 followers, recently calling for the “chasing down” of specific black Americans and claiming the LGBTQ community is in need of “intensive psychiatric treatment.”

On Facebook, James Allsup, a right-wing advocate, posted a photo comparing migrant children at the border to Jewish people behind a fence during the Holocaust with the caption, “They present it like it’s a bad thing #BuildTheWall.”

On Gab, a censorship-free alternative to Twitter, Patrick Little, a 2018 candidate for U.S. Senate, claims ovens are a means of preserving the Aryan race. And Billy Roper, a well-known voice of neo-Nazism, posts “Let God Burn Them” as an acronym for Lesbian Gay Bisexual Transgender.

Facebook, Twitter and other social media companies offer billions of people unparalleled access to the world. Users are able to tweet at the president of the United States, foster support for such social movements as Black Lives Matter or inspire thousands to march with a simple hashtag.

“What social media does is it allows people to find each other and establish digital communities and relationships,” said Benjamin Lee, senior research associate for the Centre for Research and Evidence on Security Threats. “Not to say that extreme sentiment is growing or not, but it is a lot more visible.”

Social media also allows something else: a largely uncensored collection of public opinion and calls to action, including acts of violence, hatred and bigotry.

Months before the violent 2017 Unite the Right rally in Charlottesville, Virginia, people associated with the far-right movement used the online chat room Discord to encourage like-minded users to protest the city’s efforts to remove long-standing Confederate statues — particularly one of Gen. Robert E. Lee.

Discord originally was a chat space for the online gaming community, but some participants used the platform to discuss weapons they might brandish at the Charlottesville rally. Some discussed guns and shields, and one suggested putting a “6-8 inch double-threaded screw” into an ax handle.

Multiple posts discussed the logistics of running a vehicle into the expected crowds of counterprotesters.

Charlottesville, Virginia, is home to Confederate statues and monuments built in the early 1900s — some of which would later become the focal point of the 2017 Unite the Right rally. (Kianna Gardner/News21)

Flowers and gifts sit under the memorial wall for Heather Heyer in Charlottesville, just around the corner of what is now named the Honorary Heather Heyer Way. (Kianna Gardner/News21)

Heather Heyer was killed and others were injured when James Alex Fields Jr. of Ohio rammed his car into an unsuspecting group of demonstrators. He has pleaded not guilty to multiple charges, including murder in Heyer’s death and hate crimes.

“They (the right) said it was a free speech rally, it was never meant to be such,” said Jalane Schmidt, a University of Virginia associate professor and counterprotester. “What had been happening in internet chat rooms came to pass in real life.”

The clash in Charlottesville attracted hundreds of members of the far-right community. The event garnered global attention, brought the violent side of America’s political divide into focus and prompted criticism and questions about social media’s role in inciting hate.

The far-right’s use of social media also prompted some companies to ban users. Since the Unite the Right rally, Facebook, Twitter, Spotify, Squarespace, PayPal, GoDaddy, YouTube and others have collectively suspended hundreds of users associated with the far right.

Members of the far-right are calling it an “act of war” on their free speech rights — an unjustified censoring of conservative viewpoints.

Jalane Schmidt, a Black Lives Matter activist and University of Virginia professor, said some of her students were surrounded by far-right protesters on campus during the first night of the 2017 Unite the Right rally in Charlottesville, Virginia. (Kianna Gardner/News21)

News21 monitored the daily social media activity of various far-right users, including white nationalists and neo-Nazis, from June 10 to June 24. Those tracked had more than 3 million followers combined. Reporters recorded and compiled more than 2,500 posts on popular platforms, such as Twitter and Facebook, and emerging social media platforms, including Gab and VK.

About half the posts were directed at specific demographics or communities, from black Americans and Latinos to Jewish people and LGBTQ members. The posts varied in sentiment from describing gays as “ill” to referring to black Americans as “chimps” and “sh*tskins.”

Most of the posts followed current events. When families were separated at the U.S.-Mexico border, nearly every user tracked expressed anti-Latino sentiment.

Almost all used coded terminology and symbols, allowing them to communicate with others who understand their unique online language. One common device is the use of parentheses or asterisks to show that something or someone is Jewish or associated with Jewish people: “Our people can and will achieve the goals we desire — no matter how hard *they* try to stop it.”

By the end of the two weeks monitored by News21, the 2,500 posts resulted in more than half a million “likes” from social media followers and were shared nearly 200,000 times.

“Social media companies have succeeded in sort of negotiating a place for themselves in the world where they are not the publishers,” said Lee, who researches digital media and the far-right. “And somehow we all sort of sat down and accepted it up until the point we didn’t, and now they’re running to catch up.”

Lee said the purging of online extremists began when government officials noted the Islamic State terrorist group had been recruiting new members via the internet and social media. Since then, the eradication of users has expanded to include the far right.

“It wasn’t until after Charlottesville they (social media sites) started to understand that there was a risk on their site for keeping them (far right groups) there,” said Goad Gatsby, an anti-racism activist in Richmond, Virginia. “All these social media companies saw all these organizations as a way to generate revenue without any risk.”

Gatsby and others consistently receive online threats from extremists via social media. Some threats have become incidents of “doxing” — the publishing of private or identifying information about a person online, usually meant to provoke physical harm to that person.

“A lot of people have had to go into hiding because they were targeted in their own neighborhoods because of who they were as activists online,” Gatsby said.

According to Gatsby, doxing occurs regularly, raising concerns over whether social media companies are capable of curbing real-world violence.

“They’re confronting the question of whether or not it (social media) really is the Wild West and whether the people that are in control of these platforms are like a local sheriff,” said Bob Wolfson, a former regional director of the Anti-Defamation League (ADL).

Wolfson said social media executives appear to tolerate most individuals using their platforms so long as they don’t “spew hate.”

Gerhard Lauck, of Fairbury, Nebraska, has spent years providing a platform for far-right organizations through his website-hosting company, Zensurfrei. (Kianna Gardner/News21)

“And when the bad guys come, we’re going to tell them, ‘You can come and drink. You can come and rent our hotels. You can come and talk to the locals, but you’re going to have to check your guns,’ ” Wolfson said.

On April 11, 2018, the Facebook account Altright.com released a post to its thousands of followers expressing concern over increasing internet censorship. Within hours, Altright.com was suspended from Facebook indefinitely.

The account was one of several linked to Richard Spencer — one of the most prominent leaders of modern white nationalism and a primary organizer of the Unite the Right rally last summer.

A few days after his Facebook account was removed, Spencer acknowledged the suspension in a single tweet: “The alt-right is being recognized as THE grounded, authentic anti-war movement in the U.S. For our enemies, that’s unacceptable.”

Later, his online white nationalist magazine, The Alt Right, was kicked off GoDaddy, a popular website-hosting company. The decision was made in response to a letter from the Lawyers’ Committee for Civil Rights Under Law, which claimed Altright.com incited violence, particularly against racial and ethnic minorities.

The committee compared Altright.com to the Daily Stormer — a neo-Nazi website GoDaddy removed in 2017 after a Daily Stormer editor belittled the killing of Heather Heyer in Charlottesville, calling her a “drain on society.”

In its two-week review of the social media activities of prominent far-right figures, News21 found that nearly half the posts were directed at Latinos, black Americans, Muslims, Jewish people, LGBTQ members, women and other groups. About one-third of those referred to Jewish people or Latinos, followed closely by Muslims and the LGBTQ community, which accounted for about a quarter of the posts.

This word cloud contains the words most commonly used by prominent people on the right (Alex Jones, Billy Roper, Christopher Cantwell, David Duke, James Allsup, Jason Kessler, Pamela Geller, Patrick Little, Peter Brimelow, Richard Spencer) when News21 tracked their posts on various social media platforms (Facebook, Gab, Twitter, YouTube, VK and WrongThink) from June 10 to June 24.

Many posts followed real-life events. When families were being separated at the U.S.-Mexico border, Roper, the neo-Nazi, posted a tweet stating “#KeepFamiliesTogether Deport them all, along with any who support them. With a catapult.”

On Facebook, Allsup, the white nationalist, compared the separations to the treatment of Jewish people during the Holocaust, wondering what the problem was and calling immigrant parents “deadbeat parents.”

As of August 2018, many of these users still are online.

Researchers cite multiple possibilities as to why some users get to stay and others go, including the hiding of hateful rhetoric in coded language; the adoption of neutral-sounding identities, such as “white civil-rights advocate”; and the lack of a concise definition of hate and hate speech.

In December 2017, Twitter announced it would start enforcing stricter rules on “hateful comments,” but Duke, the former KKK leader who calls the Holocaust a hoax, remains on the site.

Over the two weeks that News21 tracked his social media activity, Duke posted more than 230 tweets — 30 percent of them directed at Jewish people.

“I can’t look into the minds of the people making these decisions to allow or not allow prominent people to have platforms,” said Mark Pitcavage, a senior research fellow with the ADL’s Center on Extremism. “You can’t get much more obvious than David Duke.”

Pitcavage, a researcher for the league’s hate symbols database, which identifies various codewords and symbols commonly used by extremists, suggested the development of the far-right’s online vocabulary may be one reason some hate groups remain online.

“By the nature of who is doing this type of moderating, a lot of stuff is going to be missed,” Pitcavage said. “A lot of times the message has to be really blatant for that person looking at it to understand what the objectionable part of it is.”

Some common terms, such as snowflake (someone who’s sensitive or feminine) or normies (those who lean left), may be predictable in meaning and are generally less offensive. However, other obscure symbols, such as 88 (which stands for “Heil Hitler” because H is the eighth letter of the alphabet), could be instrumental in identifying hateful social media users.

Pitcavage said the ambiguous terminology allows users to bypass automated algorithm searches and average social media users — two ways Facebook, Twitter and other large platforms flag or report hate. It also allows users to remain in the mainstream undetected while still communicating with those who understand the language.

“With shorthand, where everyone knows what it means, it creates this degree of commonality,” Pitcavage said.

In August 2017, Facebook CEO Mark Zuckerberg posted on his page: “There is no place for hate in our community. That’s why we’ve always taken down any post that promotes or celebrates hate crimes or acts of terrorism — including what happened in Charlottesville.” Later, in April 2018, Zuckerberg reiterated that stance, saying “we do not allow hate groups on Facebook, overall.”

But it wasn’t until almost a year after Charlottesville that Facebook removed Jason Kessler, the primary organizer of the Unite the Right rally, which featured torch-wielding white nationalists marching through town shouting “Jews will not replace us!” and “Blood and soil!”

Kessler identifies as a white civil-rights advocate, which researchers say may have contributed to his remaining on major social media platforms and to Facebook’s delay in suspending him.

“It’s a much more acceptable way of framing your ideology,” said Lee, of the Centre for Research and Evidence on Security Threats.

“It’s this idea of saying, ‘We’re not about this idea of race anymore,’ and it takes the ethnic stuff, the racial stuff and the white out of white supremacy.”

Hundreds of members from the far-right community descended upon the University of Virginia campus with tiki torches on the first night of the two-day Unite the Right rally in August 2017. (Kianna Gardner/News21)

Main organizer of the 2017 Unite the Right rally, Jason Kessler, primarily blames the Charlottesville, Virginia, police for last year's violence. His complaint that the police didn't keep the groups of protesters separated is shared by many who were part of the rally. (Kianna Gardner/News21)

Kessler, who goes by the Mad Dimension on Twitter and Gab, remained mostly self-censored across all social media platforms during News21’s tracking period.

“Anybody who goes back and looks at the content I put out on my Facebook and on Twitter, I don’t use ethnic slurs and I don’t take cheap shots at other races,” Kessler said. “I try to be constructive, that I am advocating certain policies.”

In a June interview with News21, Kessler predicted social media companies would consider removing him as he prepared for another Unite the Right rally, which was Aug. 12 in Washington, D.C.

About one month after the interview, Kessler set his Gab account to private about the same time his Facebook page was suspended. In the weeks before the suspension, Kessler posted about the upcoming “white civil rights rally.”

“Any time whites say they want something for white people and stand up for white interests, no matter how watered down it is or how radical it is, it’s always called ‘hate,’ ” said Jeff Schoep, leader of the National Socialist Movement (NSM).

The NSM also identifies as a white civil-rights organization on its website. Schoep said the group has been kicked off every mainstream platform, including Facebook, Twitter and YouTube.

Jeff Schoep, commander of the National Socialist Movement, walks through the historic Packard Automotive Plant in Detroit. Schoep said the National Socialist Movement exists to protect the survival of the white race. (Megan Ross / News21)

The Southern Poverty Law Center, an advocacy group that tracks hate and bigotry toward marginalized communities, identifies the NSM as the largest neo-Nazi organization in the United States. Other suspended organizations include the American Nazi Party, League of the South and Vanguard America.

But not all those kicked off social media sites accept their suspension.

One white nationalist, Jared Taylor, sued Twitter for alleged ideological bias in February 2018 after the company banned his personal account and the account of his nonprofit organization, American Renaissance. The two Twitter accounts combined had about 70,000 followers, Taylor said.

Twitter claimed the accounts were associated with extremist groups and promoted violence, which violated the company’s terms of service. Taylor says that’s inaccurate.

“They didn’t give a name, they never cited any of our tweets as being extremist or likely to promote violence or anything of the sort,” Taylor said. “It was just preposterous.”

American Renaissance, founded by Taylor in 1990, promotes “race realism,” which Taylor defines as the recognition that race is a biological fact, not a “sociological optical illusion.”

One recent article published by American Renaissance said the mission of Western women is to be homemakers and build up their husbands, not to hold corporate positions. Another elaborated on the “white fight,” or the struggle to preserve a shrinking white majority.

“Jared has a lot of opinions others find appalling, but he expresses them in a respectful manner,” said Noah Peters, one of the lawyers representing Taylor in Taylor v. Twitter. “You can be as genial and polite as you want, if they don’t like what you’re saying, they’ll kick you off.”

Jared Taylor sued Twitter in February 2018 for ideological bias after the company removed his organization, American Renaissance, from its platform. American Renaissance was one of many kicked off during the "Twitter purge." (Kianna Gardner/News21)

Taylor isn’t the first conservative user to sue Twitter on the basis of political censorship. But his is the first case so far against a private social media company that hasn’t been dismissed in court. On June 15, Judge Harold Kahn of San Francisco County Superior Court ruled Taylor could proceed with his lawsuit, saying the case “goes to the heart of free speech principles that long precede our Constitution.”

“We want to change the rules,” Taylor said. “We want to make it impossible for these companies, simply on pure whimsy, to decide to shut people up that they disagree with. That is the right they claim.”

Twitter declined to comment on the case and how the company determines who gets kicked off and who doesn’t. Facebook, Gab, YouTube, Google and others did not respond to requests for comment.

“The only free speech that matters is their version of free speech,” Peters said.

Taylor is among those who contend freedom of speech is absolute, but others point out that these social media companies are private, which raises questions about what they are obligated to publish.

“There is this kind of lingering question, which is, what obligation are they under to provide services to people?” Lee said. “Freedom of speech is freedom to express yourself, but it’s not freedom to force other people to publish what it is you have to say.”

But sites built on the promise of First Amendment principles are available as alternatives to mainstream platforms.

“Anybody that wants to say any damn thing on the internet is going to be able today to find a place to be hosted,” said Wolfson of the ADL. “They’re going to find someone that is more sympathetic to their message.”

Gab, which describes itself as being “dedicated to preserving individual liberty, the freedom of speech, and the free flow of information on the internet,” is one of many emerging alternative platforms.

The censorship-free company launched in 2017 and today claims about 400,000 users. According to the company’s annual report, users post more than 1.5 million times per month.

Some far-right Gab users tracked by News21 were explicitly hateful. Christopher Cantwell, who hosts the alt-right radio show Radical Agenda, said in one post, “When you search for black lives matter and murder, all you get is a bunch of stories of police taking out the trash.”

Patrick Little, who is rumored on Gab to be running for president in 2020, posts pictures of himself holding a campaign sign that reads “Expel the Jews by 22, vote Little, win Big” and refers to Adolf Hitler as a “saint.”

He detailed his removal from mainstream platforms like Twitter and YouTube on his Gab account, saying they shut him down for “truth-telling.”

News21 reporters Ashley Mackey, Ashley Hopko, Danny Smitherman, Storme Jones, Tessa Diestel, Renata Cló and Anya Zoledziowski contributed to this report.

Kianna Gardner is a Don Bolles fellow, Ashley Mackey is a Knight Foundation Fellow, Danny Smitherman is a Donald W. Reynolds Fellow, Storme Jones is an Ethics and Excellence in Journalism Fellow, Renata Cló is a Hearst Foundation Fellow.
