August 2, 2021

Decoder Newsletter: How Does Social Media Handle Extremism?

Margaret Sessa-Hawkins

We’ve known for a while that social media networks can contribute to extremism and radicalization. A few reports out this week shed new light on the issue. First, both the Center for Countering Digital Hate and the Anti-Defamation League looked at how platforms are addressing antisemitism (spoiler: not well). Then, Politico reported that Jihadist content is rife on a platform founded by Trump insiders that was meant to promote free speech. How can platforms address these issues? A new quarterly harms report from the Real Facebook Oversight Board has some suggestions. We look at these issues, as well as what’s happening with the Big Lie and health disinformation. The Decoder is a weekly roundup of news about digital deception. Know someone who might be interested? Ask them to sign up!

  • The Big Lie I: This past week marked the first meeting of the Select Committee investigating the January 6th attack on the US Capitol. The New York Times reports Republican disinformation about the attack is continuing, and looks at how the media should handle reporting on such disinformation. Decode Democracy is urging the committee to ensure it fully investigates social media’s role in enabling the attacks. You can sign a petition urging the House Select Committee to hold social media companies accountable for their role here.  
  • The Big Lie II: The meeting of the Select Committee came shortly before the release of reports that Trump urged the Justice Department to ‘just say the election was corrupt.’ The reports further show the lengths Trump was willing to go to stop the rightfully elected president, Joe Biden, from taking office. In an attempt to clamp down on election misinformation, Twitter has permanently suspended nine accounts associated with so-called ‘election audits’ in different states for spreading lies about the election. In The Atlantic, Anne Applebaum profiles ‘MyPillow Guy’ Mike Lindell, concluding that he is a true believer, and also a threat to American democracy. In an effort to protect voting rights, Stacey Abrams’ Fair Fight Action and the Center for American Progress Action Fund will be bringing individuals to Congress to lobby lawmakers face-to-face in favor of national voting rights bills.
  • Antisemitism: A new report from the Center for Countering Digital Hate has found that social media companies are failing to act on antisemitism. The report found that platforms took no action on 84% of antisemitic posts that were reported to them by researchers, and that these posts had collectively been viewed 7.3 million times. Separately, the Anti-Defamation League’s 2021 Online Antisemitism Report Card found that a majority of social media platforms are not doing enough to curb antisemitism online.
  • Extremism: The pro-Trump social media network GETTR is being inundated with Jihadist content, according to Politico. The platform was started by members of the former president’s inner circle as a free-speech alternative to more traditional social media sites. Some of the older Jihadist posts have since been taken down, indicating that GETTR may be trying to moderate at least some of the material on its site.
  • Pool and YouTube: In The Daily Beast, Robert Silverman profiles far-right media personality Tim Pool, looking at how he has risen to YouTube prominence peddling disinformation. The article is a must-read: in-depth, well-researched and utterly fascinating. It also shines a spotlight on a platform often given a pass when it comes to disinformation and hate speech, with Silverman noting that Pool’s “prominence [is] made possible in no small part by YouTube promoting his work to its front page.”
  • Health Disinformation: In The New York Times, Sheera Frenkel and Tiffany Hsu write about how the “Disinformation Dozen” — known for being responsible for much of the health disinformation circulating online — are also using local media outlets to spread lies about COVID-19. The Real Facebook Oversight Board, a group critical of the social media network, also released a Quarterly Harms Report, which examines health disinformation on the site, particularly the Disinformation Dozen, and how Facebook’s algorithm amplifies health disinformation.
  • Language Disinformation: A coalition of Democrats is seeking information about non-English language disinformation on social media platforms. The group, led by Sens. Amy Klobuchar (Minn.) and Ben Ray Luján (N.M.), sent letters to the CEOs of Facebook, Twitter, YouTube and Nextdoor asking for information on their content moderation policies for misinformation in the top five languages on their platforms.
  • Instagram and Kids: Instagram has added new protections for teens, including limiting targeting options for advertisers. The changes will bring its parent company, Facebook, in line with regulations from the U.K.’s data protection authority. In The Verge, Casey Newton reviews some of the changes, looking at whether they will actually make a difference, and what else the company could do.
  • Research: A new study called the ‘Covid States Project’ has found that individuals who get their news from Facebook are more likely to be vaccine hesitant than those who turn to Fox News, although Facebook’s numbers did mostly align with the general public’s. The Global Disinformation Index has also released a report on popular brands whose ads appear next to anti-vaccine disinformation. Twitter is asking researchers and hackers to identify potential harms stemming from bias in its algorithms, and will offer a bounty to those who can do so.

Join the fight
against online
political deception.