Tens of thousands of Americans — many of them living here in Southern California — searched for specific information about conspiracy theories, joining armed or violent extremist groups, and making Molotov cocktails and other explosives, even napalm, before and after the November 2020 presidential election, a new report shows.
Between September and December, the Anti-Defamation League and Moonshot CVE, a London-based company that attempts to disrupt online extremism with counter-messaging, tracked more than 34,000 internet searches nationwide for violence or violence-inciting disinformation connected to the 2020 election.
In a joint report released this week, titled “From Shitposting to Sedition: Tracking and countering conspiracy theories, disinformation and violence around the 2020 US presidential election,” the ADL and Moonshot detail how they tracked a large national appetite for conspiracy theories, political violence and membership in extremist groups and redirected people to content that countered falsehoods and extremist narratives.
At-risk searches defined
The report offers a snapshot of at-risk searches not just state by state, but also county by county. Of California's 58 counties, Los Angeles had the highest rate of at-risk searches, with 30.1 per 10,000 residents. San Diego was in seventh place with 19.52 searches per 10,000 residents, Riverside 10th with 14.95 searches, Orange 12th with 11.03 and San Bernardino 15th with 10.19.
When it came to the most-searched topics, 53.5% of the searches in Orange County had to do with conspiracy theories. In Los Angeles County, 32.77% of the searches related to targeted violence. In San Bernardino County, 36% of the searches sought armed groups, and in Riverside County the largest share of searches, 31.9%, was for information about armed groups as well. Southern Californians were also looking for information online about anti-government topics, and a small percentage in all counties wanted to find out more about political violence.
Moonshot and ADL essentially redirected people from around the country who searched for such topics to counternarratives, many of them short videos. For example, when someone searched "how to join the three percenters," a far-right anti-government group, they were directed to videos that explored why some might feel hostile toward the federal government. When an internet search turned to advice on making a Molotov cocktail, Moonshot redirected the user to videos that explained how rhetoric is used to normalize violence.
And those who entered the phrase "kill Mitch McConnell," now the Senate minority leader, were sent to a video of a news anchor asking viewers to reflect on their mental health and well-being. According to the report, those who searched for these terms and others ended up collectively watching more than 33 hours of the content to which they were redirected.
Value of counternarratives
The Redirect Method, a strategy that tackles online extremism, is like talking a “jumper off a ledge,” said Tim Zaal, a former skinhead who now preaches nonviolence at the Museum of Tolerance in Los Angeles.
“Some are going to jump and some aren’t,” he said. “But the counternarrative has always been something I suggest and subscribe to when it comes to countering violent ideologies.”
While the Redirect Method can be effective, it also needs to be implemented on social media platforms, said Zaal, though he acknowledged it can be difficult to do because the platform algorithms are geared toward what the individuals want to see, hear and experience.
“If you have a counternarrative you can see for yourself and decide,” he said. “If the counternarrative also includes stories of former extremists — people who have gone down that path — it gives people an opportunity to know what happens if you go down that path. You can’t force anyone to do anything. But you can deter someone by giving them a different side of the story.”
The Redirect Method can find success with those who are new to and curious about extremist ideologies, said Brian Levin, director of the Center for the Study of Hate and Extremism at Cal State San Bernardino.
“The problem is not everyone who is running down that slippery slope is doing so at the same speed or intensity,” Levin said.
“Especially at a time when people are hostile to facts like something as simple as wearing a mask (during the coronavirus pandemic),” he added, “how are we going to persuade them to take in other facts where the impact is not immediate or apparent — like the value of not being bigoted?”
What is disturbing is not just that some want to learn more about conspiracy theories and armed groups, Levin said, but also the degree to which aggression, bigotry and conspiracy theories have become part of mainstream America.
“These trends in (online) searches are similar to what we see manifested in locations where there is an increase in hate crimes and hatching violent plots,” he said. “In November 2016, we saw that the hate crimes that were happening in the real world and the extremist language and bigoted epithets online went hand in hand.”
Meanwhile, Levin said, the data produced by ADL and Moonshot is not yet refined enough to identify who in particular is likely to be dangerous.
“You can’t do contact tracing with this kind of information,” he said.
The most telling part of the report is that in more than half of the counties in the United States, someone searched for information about joining an armed group, said Micah Clark, director of product for Moonshot CVE.
“It helps us understand the appetite, what people are seeking out and what they are wondering about,” Clark said. “It’s also meaningful that we’re not just providing access to counter content, but also resources like mental health services. If someone is going down a bad path, we can provide them with the opportunity to talk with someone.”
Clark added that Moonshot vetted mental health service providers to make sure they had the tools to help individuals before directing people to them.
According to Clark, the system does not treat all those searching for such content the same way. The system can discern who is headed down a more dangerous path, he said.
“For example, someone looking for Oath Keepers Los Angeles (a far-right anti-government militia organization), might just be curious,” he said. “But if someone types keywords like ‘Oath Keepers,’ ‘Telegram’ and ‘join,’ you get more clarity about their intent. Someone who’s looking for ‘neo-Nazi punk music’ is low risk. But if they are looking for a specific band or artist, you know they are already involved in the movement and are looking for specific things. Often, the level of knowledge required to conduct the search informs the degree of risk involved.”
Here are a few insights Moonshot gained from analyzing social media platforms such as Gab, Parler, MeWe, 4chan, 8kun, MyMilitia and Zello in addition to Google Search, according to the report.
• The QAnon community began talking about suicidal ideation soon after the presidential election results were confirmed on Jan. 6. (QAnon is a disproven far-right conspiracy theory alleging that a secret cabal of Satan-worshipping, cannibalistic pedophiles running a global sex-trafficking ring plotted against former President Donald Trump, who has been battling the cabal.)
• QAnon movement followers are now aligning with anti-vaccine conspiracies in an attempt to remain relevant.
• After the Proud Boys, a white nationalist organization with strong roots in Southern California, were discussed in the first presidential debate, searches looking to engage with the group increased by 127%.
• Search activity into “how to make explosives/Molotov cocktails/napalm” peaked before and after Election Day.
• Election fraud narratives were propagated by armed groups and militias as early as June 2020.
‘Good speech’ helps
Moonshot’s Redirect Method shows it is possible to intervene online with effective counternarratives, said Ryan Greer, ADL’s director of national security and government relations.
“We’ve accepted that online extremism is a cancer that has infected the Internet,” he said. “Now we know it is possible to flood the Internet with good speech. What was also helpful was for us to see where the geographical breakdown is so we can work with local officials and other organizations on the ground to counter those threats and issues.”
The threat of domestic terrorism is not over and much work remains to be done, Greer said.
“We are working with state and local officials as well as others who can be more credible to extremists, people who can act as credible messengers and take the temperature down because they know their communities,” he said.
Nationally, Clark said, the most remarkable finding was seeing the mental health crisis experienced by the QAnon community.
“It was a perfect storm of Q going silent, the election not aligning with their conspiracy combined with the continuing toll of COVID-19 on every aspect of our lives,” he said. “You had people saying they lost a job, their kids won’t talk to them or their wife left them. It’s hard to say what came first — their mental health issues or buying into the QAnon conspiracy theories.”
Looking ahead, Clark said, those who laid siege to the U.S. Capitol on Jan. 6 “have built a social movement that will outlast the political moment we are in.” The most resilient and insidious parts of the movement, he said, have deep roots in conspiratorial beliefs, misinformation and white supremacy — all long-term problems that will require a comprehensive strategy.
Banning or de-platforming extremists on social media is a necessary step and a tactic, but not a long-term solution, Clark said.
“It really breaks down the momentum of a group and it might force the movement to re-calibrate and reset,” he said. “But it does nothing to rehabilitate the individuals who are in these groups and this report shows that many people who are in these groups have significant vulnerabilities that have contributed to their participation. We need to find more ways to help these individuals make meaning in their lives in less destructive ways.”
Source: Orange County Register