May 4, 2021

Decoder Newsletter: The Oversight Board Prepares Its Trump Card

Margaret Sessa-Hawkins & Viviana Padelli

Six weeks ago, we wrote in the Decoder that the Oversight Board’s decision on Facebook’s indefinite suspension of former president Donald Trump could be coming soon. Now the board has finally announced that it will release its ruling Wednesday morning. What will the decision mean for Facebook globally and for social media at large? We explore that question, along with new deplatforming legislation and ongoing failures to handle Covid-19 digital disinformation, in this week’s Decoder.

  • Decision at last: The Facebook Oversight Board announced Monday that it will release its long-awaited decision on the suspension of former president Donald Trump tomorrow. For the Columbia Journalism Review, Renee DiResta and Matt DeButts explore what the decision will mean for the concept of “newsworthiness.” David Kaye, of the UC Irvine School of Law, has a thread on what he is looking for from the decision, especially from a human rights perspective. In The New Yorker, Sue Halpern writes that no matter what the Oversight Board decides, Facebook has long engaged in “the normalization of deviance” to the harm of users and society, a practice that will continue. The Real Facebook Oversight Board, meanwhile, is reasserting its view that whatever decision comes down is a smokescreen meant to mask Facebook’s faults.
  • No deplatforming: On Thursday, Florida’s legislature approved a bill that would prohibit social media companies from deplatforming political candidates. Florida Governor Ron DeSantis is expected to sign it into law, but the measure is likely to face a near-immediate legal challenge on the grounds that it violates social media companies’ First Amendment rights.
  • Section 230 changes?: A recent ruling by a California appeals court in Los Angeles could have implications for how Section 230 is interpreted. The court ruled that Amazon could be held liable after a hoverboard sold by a third party on its website exploded. In The Washington Post, Reps. Anna Eshoo (CA-18) and Tom Malinowski (NJ-7) have an op-ed promoting their bill, which would remove Section 230’s protections when a company’s algorithm amplifies content that contributes to terrorist acts or violates civil rights statutes.
  • Algorithms: For the Columbia Journalism Review, Matthew Ingram analyzed last Tuesday’s congressional hearing on social media’s algorithms. His conclusion? The hearing was too much of a softball session. Dr. Joan Donovan, of Harvard’s Shorenstein Center, agrees. Protocol’s Issie Lapowsky also has a great thread of key moments from the hearing. Meanwhile, France is considering using algorithms to detect extremism online as part of stricter counter-terrorism laws.
  • Dark patterns: On Thursday, the FTC held a workshop on “dark patterns,” the manipulative designs companies use to impair customer choice (think having to navigate through multiple screens to avoid getting charged). In Digiday, Kate Kaye wrote about the workshop and the ongoing fight over dark patterns.
  • Antitrust and privacy: Sen. Amy Klobuchar went on The Verge’s Decoder podcast to talk about her new antitrust book and what she sees as the future of tech antitrust. Key takeaways included her views on how to regulate Big Tech, bipartisanship, and how antitrust legislation will affect innovation. European economists Gregory Crawford and Cristina Caffarra, along with Johnny Ryan, made the case last week that privacy should be considered when evaluating monopolistic harms. In the United States, a new Morning Consult poll found that 83 percent of voters say national data privacy legislation should be a priority for Congress this year.
  • Disinformation Dozen sequel: A new report from The Center for Countering Digital Hate (CCDH) has found that many of the so-called “disinformation dozen” responsible for spreading Covid disinformation are still active on social media. The report, a follow-up to an earlier “disinformation dozen” report, found that ten of these disinformation superspreaders remain on Facebook and Twitter, and nine on Instagram. The CCDH also found that some of the accounts were spreading antisemitic posts.
  • Facebook newsletters: On Thursday, Facebook announced that it is accepting applications for a newsletter service that looks to be a Substack competitor. Facebook says the program will prioritize journalists of color who cover local communities. In Scrawler, Ryan Lawler has written a pessimistic analysis of the idea. Former BuzzFeed tech reporter Craig Silverman’s advice to those considering applying is to be mercenary about it. The launch also once again raises questions about social media’s role in both the demise of local news and the rise of disinformation, something Emily Bell looks at in depth for the Columbia Journalism Review.
  • #ResignModi scandal: Facebook made headlines Wednesday for hiding the hashtag #ResignModi in India. After BuzzFeed News reported the story, the company claimed the hashtag was blocked in error, but given the timing and recent events there’s some well-earned skepticism of that explanation. In the New York Times’ “On Tech” column, Shira Ovide discusses the social media situation in India with Mishi Choudhary, a digital rights lawyer who is “furious” with the failures of both the government and American tech companies in the midst of the pandemic. Ovide notes that the episode once again shows tech companies wielding power equivalent to that of elected governments. In Platformer, Casey Newton discussed a recent Oversight Board decision that also dealt with criticism of Modi, looking at how it shows the board can be a force for good, as well as the importance of adequate human moderation.
  • The chaos machine: The NPR podcast Invisibilia is running a three-part series focused on a news website in Stockton, California. Is it an important investigative journalism outlet? Or an outlet for Russian trolls spreading lies about progressive leaders? The podcast is an interesting examination of what can happen when local news disappears and digital media takes over. Those interested in how widespread news deserts have become should also check out this nifty map.
  • For the People Act: A new poll from Equal Citizens has found strong support for the “For The People Act.” Meanwhile, in Popular Information, Judd Legum writes about the U.S. Chamber of Commerce’s opposition to the act. Legum notes that in a letter to the Senate, the Chamber made it seem as though “Senators that vote to protect voting rights and reform the electoral process may be deemed enemies of the business community.”
  • Research: A new report from the Institute for Strategic Dialogue looks at how Amazon’s recommendation algorithm can lead people down a rabbit hole of extremist reading. In Protocol, Issie Lapowsky spoke with Chris Bail, director of Duke’s Polarization Lab, about his new book exploring the role social media algorithms play in promoting polarization, and how that polarization can be countered. The International Centre for the Study of Radicalisation released a paper looking at the evolution of extremism in the first 100 days of Joe Biden’s presidency. The report finds that despite the change in administration, the US is still dealing with a “persistent domestic extremist threat,” and looks at how this threat is linked to digital disinformation. A new paper in Cartography and Geographic Information Science looks at the threat deepfake disinformation poses to maps. In The Washington Post, Dominik Stecula and Matt Motta write that viewers of Joe Rogan’s YouTube channel have become increasingly hesitant about taking a Covid vaccine over the past few months. Rogan has repeatedly expressed concerns about the vaccines on his program.

Produced by Decode Democracy, the Decoder is a newsletter to help you track the most important news, research, and analysis of deceptive digital politics. Each week, we’ll send you coverage of the webs of entities that seek to manipulate public opinion, their political spending ties, and the actors working to safeguard our democracy. Know someone who might be interested? Ask them to sign up! 

We value your privacy and will not send you unwanted emails. If you wish to limit your emails to just our weekly Decoder news roundup, please email info@decode.org.

Join the fight against online political deception.