Produced by Decode Democracy, the Decoder is a newsletter to help you track the most important news, research, and analysis of deceptive digital politics. Each week, we’ll send you coverage of the webs of entities that seek to manipulate public opinion, their political spending ties, and the actors working to safeguard our democracy. Know someone who might be interested? Ask them to sign up!
- Voter suppression: On Thursday Georgia Gov. Brian Kemp signed a wide-ranging voter suppression bill that will disproportionately affect Black voters. The bill limits drop boxes, makes it a crime for anyone who is not an election worker to give food or beverage to those waiting in line to vote, and expands voter ID requirements. The Georgia bill and others now pending have been spurred in part by false rhetoric about voter fraud in 2020.
- For the People Act: The Senate Rules Committee held its first hearing on S. 1, the Democrats’ flagship democracy reform bill, also known as the For the People Act. The hearing sets the bill up to be a flashpoint in the ongoing debate about the filibuster. Although, as we mentioned last week, many prominent Democrats have come out in favor of either killing or amending the filibuster, Politico reports that Democrats aren’t confident Chuck Schumer wants to end the practice. The House version of the bill, H.R. 1, passed in early March.
- Congressional hearing: Zuckerberg denied Facebook had a role in the Capitol riots, lawmakers focused more on misinformation and less on the bogeyman of conservative bias, and Jack Dorsey subtweeted during Thursday’s congressional hearing on extremism and disinformation. This is the 18th such hearing since 2017, and considering we haven’t seen any meaningful legislative action yet, expectations were low, and met. You can read the witnesses’ opening statements here. Protocol has a good rundown of takeaways, as does Politico, and Jonathan Fischer proposes an interesting idea for future hearings in Slate. We would like to remind everyone that, once again, YouTube CEO Susan Wojcicki escaped appearing.
- Section 230: Representatives Anna Eshoo (CA-18) and Tom Malinowski (NJ-7) have reintroduced their Protecting Americans from Dangerous Algorithms Act. The act, which Decode Democracy supports, removes platforms’ immunity from liability when they amplify content that leads to violence. In other Section 230 news, Issie Lapowsky and Emily Birnbaum interviewed Sen. Mark Warner about Section 230 and his controversial SAFE TECH Act in Protocol. For those having trouble keeping track, Decode Democracy maintains a legislative hub tracking federal legislation that addresses the role of social media companies in amplifying digital deception.
- Ban surveillance advertising: Last week, we announced that Decode Democracy had joined an initiative to ban social media’s surveillance advertising practices. In Wired, Gilad Edelman writes about the growth of support for this idea over the past year and why it has become so popular. TechCrunch also noted that the number of signatories on the campaign (there are 38) shows just how much momentum the movement is building. Rep. Eshoo recently announced that she and Rep. Jan Schakowsky (IL-9) will introduce a bill to ban surveillance advertising. In the Financial Times, Tim Bradshaw looks at how a payment-based business model could perhaps replace surveillance advertising.
- New approach to disinformation: The Center for Information, Technology and Public Life has created a Critical Disinformation Syllabus as a “provocation to disinformation researchers to rethink many of the assumptions of our nascent field.” The syllabus argues that disinformation is a key way in which white supremacy is reinforced, and that studies of it need to challenge this status quo.
- Extremist content: A new report from Avaaz finds that, despite its policies, Facebook allowed 237 pages and groups that spread violence-glorifying material to remain on its platform. Media Matters for America also reports that TikTok is prompting users to follow far-right accounts that its own rules should bar from the platform. New evidence from a case against one of the participants in the Capitol riots raises questions about how to police disinformation on Messenger, Protocol reports. The messages show coordination between the Oath Keepers and the Proud Boys leading up to Jan. 6.
- Public figures: According to internal moderator guidelines leaked to The Guardian, Facebook allows public figures to be harassed more than private individuals, with calls for their death permitted. Imran Ahmed of the Center for Countering Digital Hate pointed out the policy could be used by “hate actors who target women and minorities to dissuade participation by the very groups that campaigners for tolerance and inclusion have worked so hard to bring into public life.”
- “Digital totalitarianism”: Venezuela’s government has accused Facebook of “digital totalitarianism” after the company froze President Nicolás Maduro’s account for 30 days for spreading COVID-19 misinformation. The president was promoting an unproven Venezuelan remedy. The Information Ministry struck back by highlighting Facebook’s global power, calling it a “supranational” company trying to impose its “law on the countries of the world.”
- Suing Facebook: Reporters Without Borders is suing Facebook in France, citing a proliferation of anti-media hate speech on the platform. The group argues that Facebook has allowed disinformation and hate speech to flourish in violation of its own terms of service, and that under EU consumer law, companies cannot engage in deceptive commercial practices.
- QAnon updates: QAnon followers didn’t waste any time in claiming that a shooting in Colorado was a “false flag” event. Extremism experts are also warning of an anti-Chinese and anti-Jewish QAnon rebranding that combines vaccine fears with claims of a global plot to take over the world. Experts said the switch, with its focus on minorities, could lead to more violence. In BuzzFeed, Scaachi Koul talked to four former and current QAnon adherents about how their beliefs have changed since the inauguration. In the New York Times opinion pages, Matt Alt looks at whether Japan’s culture, as well as a strong national broadcasting law, may be hindering QAnon’s spread there.
- Do better: Twelve state attorneys general have signed an open letter to the CEOs of Twitter and Facebook urging them to do more to combat vaccine disinformation on their platforms. An NPR analysis also found that misinformation connecting vaccines and death is among the most engaged-with content online.
- Research: A new report from the Anti-Defamation League finds that, despite a barrage of self-regulation from platforms, levels of hate and harassment reported by users were roughly the same this year as last year. A new study in Nature suggests that Americans share disinformation online not necessarily because they lack the skills to tell it isn’t true, but simply because they aren’t taking the time to check. HKS’ Mossavar-Rahmani Center for Business and Government and the NYU Stern Center for Business and Human Rights have published a white paper recommending steps the Biden administration should take to counter misinformation.
We value your privacy and will not send you unwanted emails. If you wish to limit your emails to just our weekly Decoder news roundup, please email email@example.com.