March 22, 2021

Decoder Newsletter: Changes Aren’t Always Fixes

Margaret Sessa-Hawkins & Viviana Padelli

Produced by Decode Democracy, the Decoder is a newsletter to help you track the most important news, research, and analysis of deceptive digital politics. Each week, we’ll send you coverage of the webs of entities that seek to manipulate public opinion, their political spending ties, and the actors working to safeguard our democracy. Know someone who might be interested? Ask them to sign up!

  • Changes to groups: Facebook is changing the way groups operate. The company announced Wednesday that it will make groups that break the rules less discoverable and gradually increase restrictions on rule-breaking groups until they are banned completely. As TechCrunch pointed out, the problem with these policies is that they are “ultimately hand slaps for those who break Facebook’s rules.”
  • All change, no fix: Speaking of policy changes, Decode Democracy released our new report on Big Tech’s failure to police itself this past week. “All Change, No Fix” documents how a patchwork of changes across major social media platforms over the last 18 months has consistently failed to curb disinformation and protect our democracy. The Hill also delved into some of the highlights from the report, and Decode Democracy’s Daniel G. Newman highlighted key findings and explained how the For the People Act would help combat disinformation in Newsweek.
  • Pre-hearing activism: As noted in our report, Facebook’s policy change on groups is part of a pattern in which the company introduces new rules ahead of an important hearing. In this vein, the company also released a blog post containing “updates” on how it is dealing with misinformation across its platform in advance of Thursday’s hearing.
  • #BanSurveillanceAdvertising: Ahead of the hearing, numerous groups, including Decode Democracy, have come together to call for a ban on surveillance advertising. In a letter, the groups point out that surveillance advertising and extensive profiling by social media companies amplify hate, encourage illegal activity, and feed conspiracy theories, and that it’s time to end this destructive practice.
  • Anti-AAPI hate: In line with its policies on Dangerous Individuals and Organizations, Facebook has said it is removing posts that support either the recent shooting in Atlanta or the suspected shooter. The New York Times took a look at how anti-Asian activity on various online platforms has helped to spur real-world violence. A new report from Stop AAPI Hate found nearly 4,000 incidents of harassment targeting Asian Americans over the past year, fueled in part by racist scapegoating over COVID-19. Buzzfeed reported that Cherokee County Sheriff’s Capt. Jay Baker, who stated the shooting suspect had “a bad day,” had previously shared photos of t-shirts promoting this racist scapegoating on his Facebook page.
  • Twitter wants you: Twitter is asking for public feedback on how it should handle world leaders on its platform. The company notes that this is not the first time it has turned to the public for input on its policies.
  • For the People: Senate Democrats announced the introduction of a democracy reform bill, S.1, the “For the People Act,” on Wednesday. The bill is a counterpart to H.R.1, which was passed on March 3 along party lines. Decode Democracy President Daniel G. Newman called the bill “the type of bold legislation we need to bring our democracy back from the brink of collapse.” In the Daily Beast, Jessica Huseman, the editorial director of Votebeat, argues that while the bill will help “ensure voting rights for all,” it contains provisions that will cause incredible difficulty for election directors. The bill faces an uncertain future, with Stacey Abrams becoming the latest public figure to argue for a filibuster carve-out in order to pass it.
  • Another Section 230 bill: Rep. Jim Banks (R-IN) is introducing a bill to amend Section 230, reports The Hill. The bill would remove the liability shield in cases where platforms are judged to have knowingly distributed illegal material. In Techdirt, Mike Masnick lays out some of the problems with Banks’ interpretation of Section 230.
  • Cookies crumbling: Justice Department investigators have been looking into Google’s blocking of cookies, Reuters reports. The investigators are examining whether the policy stifles competition by allowing Google to block cookies for rivals while still using loopholes to collect its own data. State attorneys general, led by Texas, are also amending an antitrust complaint to target new privacy updates to Chrome. The complaint now alleges that “Google does not actually put a stop to user profiling or targeted advertising — it puts Google’s Chrome browser at the center of tracking and targeting.”
  • Upcoming hearings: The House Energy and Commerce Committee will hold a hearing Thursday, March 25, on social media’s role in spreading disinformation and extremism. The CEOs of Google, Facebook, and Twitter will all testify. And while Google will be represented, YouTube once again escapes scrutiny, despite being a huge vector for the spread of conspiracy theories.
  • Content moderation: The content moderation debate comes for everyone in the end. This week, it’s Substack. Loyal readers will recall that a few weeks ago, amidst discussion of social media forays into newsletters, we took a look at how Substack could potentially contribute to online hate and disinformation. And hey, what do you know: last week a post from Jude Ellison Sady Doyle criticized Substack for “giving massive advances … to people who actively hate trans people and women.” The blog made waves, and the company responded with a post maintaining that it is a transparent, neutral platform with content guidelines. The New Republic has a good analysis of the debate, looking especially at the implications of Substack approaching writers based solely on their followings. CJR also has a good article on Facebook’s Substack-like ambitions that touches on many of the same content moderation issues.
  • Access denied?: A new article in Protocol explores Facebook’s standoff with researchers at NYU who ran the Ad Observatory, and its implications for independent researchers. Facebook served the organization a cease-and-desist letter, arguing that the Ad Observatory was scraping data. It turns out the company was referring to advertiser accounts (not private user accounts), but the standoff continues.
  • Research: For Stanford’s Center for Internet and Society, Daphne Keller reflects that while there is almost unanimous demand for “transparency” in platform content moderation, there is little agreement on what that transparency would look like — and she provides some recommendations. The Wall Street Journal looks at how the pandemic has “supercharged” Google, Facebook, and Amazon’s advertising. A new report from OpenSecrets found that “dark money” spending topped $1 billion in the 2020 election, including $132 million in digital ad spending.

Join the fight against online political deception.