November 15, 2021

Decoder Newsletter: Amid many reports, what have we learned?

Margaret Sessa-Hawkins & Viviana Padelli

This past week has been, in many ways, a week of reports. The Aspen Institute’s Commission on Information Disorder released its recommendations for addressing disinformation today, while Meta released two quarterly reports: one on its responses to the Oversight Board, and one on its community standards enforcement. So what have we learned from these reports? In broad strokes, that not enough is being done to address disinformation. That failure has disastrous real-world consequences, as we see in the climate disinformation spreading across social media and the COVID misinformation that continues to circulate.

We take a look at all these topics in this week’s Decoder, and, just because it’s Monday, also invite you to take a gander at this little piece of levity. The Decoder is a weekly roundup of news about digital deception. Know someone who might be interested? Ask them to sign up!

The Decoder will be taking a holiday break. We’ll be back soon.

  • Disinformation Report: The Aspen Institute’s Commission on Information Disorder has released its recommendations for addressing mis- and disinformation. The report argues that while the problem is often portrayed as too big to solve, there are concrete steps that can be taken to confront the crisis. Among these steps, the report recommends increasing transparency by safeguarding public interest research, requiring the disclosure of high-reach content, and creating standardized digital ad transparency measures. Speaking of transparency, Issie Lapowsky of Protocol reports that Facebook employees were instructed not to use the words ‘bias’ or ‘discrimination’ when talking about Facebook’s algorithm.
  • Climate Disinformation: During COP26, a global team of activists and researchers has been tracking the spread of climate disinformation. NPR has a story profiling their work and examining how climate disinformation shapes public policy. In the wake of COP26, more than 250 companies, including Virgin Media, O2, and Ben & Jerry’s, have signed an open letter calling for action on climate disinformation, including asking tech companies to implement comprehensive climate disinformation policies. For CNN Business, Clare Duffy reports on Facebook’s struggle with how to handle climate disinformation, and on how far the company is willing to go to tackle the problem.
  • Oversight Report: On Tuesday, Meta published its quarterly update on its responses to the Oversight Board. For Protocol, Issie Lapowsky writes that the report reveals a faulty system. While Facebook has said one issue is that it cannot keep pace with the 30-day deadline for responding to cases, it did not suggest ways to meet that deadline (hire more integrity workers?). Facebook also said it needs to improve its communication with the board, as the formal exchange of written recommendations and responses is not the most efficient channel.
  • Q3 Report: Meta also published its third-quarter Community Standards Enforcement Report for Facebook and Instagram on Tuesday. In the Washington Post, Elizabeth Dwoskin analyzes what the report revealed and whether it could placate lawmakers unhappy with the platforms (unlikely). CrowdTangle co-founder Brandon Silverman reminds everyone that these reports are about marketing, not transparency. NBC’s Brandy Zadrozny notes that the report shows the platform is fairly spammy, while the New York Times’ Davey Alba points out that the ‘spammy’ nature of popular content in the report obscures much of the harmful material.
  • Whistleblowing: Tuesday was also the date of ‘Whistleblowing Women,’ an event looking at how women are taking on Big Tech. The discussion can be streamed here. Pinterest whistleblower Ifeoma Ozoma also spoke to Justin Hendrix at Tech Policy Press about her Tech Worker Handbook, a guide for tech workers considering whistleblowing. Ozoma noted that “if you’re working for a tech company, you’re essentially working for a surveillance operation.” Speaking of Big Tech and opacity, The Verge, fed up with Big Tech spokespeople in particular offering information ‘on background,’ has announced a new editorial policy.
  • Advertising Changes: Meta has announced that beginning in the new year, it will remove certain targeting options for advertisers placing ads on ‘sensitive’ topics, including politics, health, and religion, on Facebook and Instagram. The New York Times reports that Meta has “faced a litany of complaints about advertisers abusing these targeting abilities.” Also in the New York Times, Nick Corasaniti writes that the changes will do little to stop specific targeting, as there are numerous ways around the new restrictions.
  • Deprioritizing Disinformation: A Washington Post fact-checker analysis of a new GOP attack ad looks at the practice of manipulating or deceptively editing video to create disinformation. For Time, Billy Perrigo and Vera Bergengruen report how a similar situation, a deceptively edited video of Nancy Pelosi from May 2019, led researchers to find a way to significantly reduce misinformation on the platform. The fix was deprioritized, however, after a meeting with Mark Zuckerberg.
  • COVID Disinformation: A new report from Decode Democracy finds that Facebook is allowing state lawmakers to spread coronavirus mis- and disinformation. The report notes: “Either Facebook is failing to enforce its own standards against posts that violate its COVID content policy, or it is actively allowing state lawmakers to post dangerous, false claims.” In other COVID disinformation news, NBC’s Ben Collins reports on new ‘detox’ treatments for those who want to clear a coronavirus vaccine from their system (which is not actually possible). For Coda Story, Isobel Cockerell reports that opponents of COVID vaccines are now also using social media to spread conspiracy theories about molnupiravir, a pill to treat COVID-19.
  • Research: A new CNN poll conducted by SSRS finds that three out of four adults believe Facebook is making society worse. About half of poll respondents know someone who they think began to believe a conspiracy theory because of social media. For Nieman Lab, David Markowitz looks at whether research shows we lie more because of technology. In the Washington Post, Ethan Porter and Thomas Wood report that while research shows fact-checks work, Facebook is not deploying them as well as it could.

Join the fight against online political deception.