September 27, 2021

Decoder Newsletter: Can Social Media Be Held Accountable?

Margaret Sessa-Hawkins & Viviana Padelli

Will social media companies ever be held accountable for the harms they have inflicted on users and society? This question gained renewed prominence last week following the Wall Street Journal’s publication of its explosive “Facebook Files,” a series of articles revealing just how egregiously the company has prioritized profits over people. Far from fading, the question stayed at the center of public discourse this past week, especially in light of the New York Times’ revelation that Facebook was planning to use its News Feed to push pro-Facebook stories on users. This week’s Decoder dives into the people and organizations working to hold social media companies accountable, what they’re doing, and the obstacles in their way. We also look at social media’s effect on the German election and an Epik hack that is making waves. The Decoder is a weekly roundup of news about digital deception. Know someone who might be interested? Ask them to sign up!

  • News Feed Propaganda: Last Tuesday the New York Times’ Sheera Frenkel and Ryan Mac reported that Facebook CEO Mark Zuckerberg had signed off on a project to show users pro-Facebook stories in their News Feed. The proposal, code-named Project Amplify, involved Facebook pushing positive stories, some of them written by the company itself, onto users to improve its image. In the Columbia Journalism Review, Matthew Ingram chronicles how this, combined with the company’s response to the Facebook Files, represents a stark change in how Facebook approaches negative reports and scandal.
  • Facebook Files Fallout: Speaking of the Facebook Files, the fallout from the revelations continues this week. Cristiano Lima reports for The Washington Post that a whistleblower claiming to be the source of the documents has come forward to members of Congress and may testify before the Senate Commerce Committee’s consumer protection panel by year’s end.
  • Instagram Fallout: Meanwhile, Facebook’s global head of safety, Antigone Davis, will testify before the panel about Instagram’s effect on teens’ mental health this Thursday. Ahead of the hearing, Facebook published a blog post disputing the WSJ’s characterization of its research on teen well-being and Instagram. The Times’ Ryan Mac questioned the motives behind Facebook’s decision not to release the full report. Monday morning, Instagram head Adam Mosseri announced that the company is pausing work on its Instagram for kids app.
  • Oversight: Facebook’s Oversight Board has asked the company to provide more information on its XCheck program in light of the Journal’s reporting. In its decision on the indefinite suspension of former president Donald Trump’s accounts, the board had already asked Facebook to ‘clearly explain the rationale, standards, and processes of review’ of XCheck. It is now renewing that request because Facebook failed to provide the information back in June and mischaracterized the XCheck program. The board says it will publish any new information in its quarterly transparency report in October. In Quartz, Scott Nover looks at what the back-and-forth says about the Oversight Board’s power and its role as Facebook’s “Supreme Court.”
  • Whistleblowers: The Facebook Files are shining a spotlight on whistleblowers, and there were two particularly good pieces on tech and whistleblowing this week. In OneZero, Ifeoma Ozoma writes about how employer-tied healthcare deters tech workers from blowing the whistle and holding their companies accountable. In The Information, Sarah Krouse looks at how Google spies on employees to identify whistleblowers.
  • Obstructing research: Over the past few weeks, the Decoder has chronicled Facebook’s efforts to block independent research. Now, Decode Democracy has a new timeline outlining Facebook’s years-long campaign to stymie such research. In Scientific American, NYU’s Laura Edelson has an op-ed on how the social media platform consistently works to hinder misinformation research. On Tuesday, the House Science, Space and Technology Committee will hold a hearing, at which Edelson will testify, on the obstacles researchers encounter when trying to study digital disinformation. Relatedly, The Markup reports that Facebook is rolling out an update that makes it harder for some researchers to collect data from the platform and risks breaking screen reader software for visually impaired users.
  • Bad ads: The Daily Beast reports that Facebook has still been running harmful “abortion reversal” ads, even after promising to look into them following a report from the Center for Countering Digital Hate. FWIW also reported that Facebook is allowing ads from Donald Trump’s PAC alleging that the 2020 election was rigged. Facebook did, however, release more information this past week on the posts it demotes in News Feed. Nieman Lab, meanwhile, looks at whether fact-checking can become a profitable endeavor for some organizations.
  • Epik Hack: A group of hackers claiming to be associated with the collective Anonymous recently released a trove of previously private data on users of Epik, a domain registrar favored by the far right. The data will be useful to extremism researchers, and the breach could also open the company up to action from the Federal Trade Commission, given its promises of security.
  • German election: While the Social Democratic Party eked out a victory in Germany, the narrow margin means the country is headed for a coalition government. Social media influenced the election in several ways: the New York Times notes that the harassment and abuse directed at candidates revealed the shortcomings of one of the world’s toughest laws against online hate speech, the BBC chronicles many of the conspiracy theories swirling around the election and their links to US conspiracy theories, and Politico Europe looks at Russia’s online interference in the race. In The Markup, Angie Waller and Colin Lecher used Citizen Browser data to show that promoted posts from Germany’s far-right party appeared on Facebook three times more often than opponents’ posts.
  • Research: A new study in the journal PLoS One finds that searching for credible information can help people let go of belief in misinformation. New research from the Gallup/Knight Foundation shows that younger people are more skeptical of news organizations but still consider news critical to democracy. A report from NewsGuard finds that children see Covid misinformation on TikTok within minutes of creating an account. And Data & Society has published a new report on how news media amplified hate group messaging from 2016 to 2018, and how organizations can avoid the practice going forward.
  • One request: We’d love to learn more about our subscribers and the issues that matter most to you. If you haven’t already, please click here to take our brief survey; it takes only about five minutes, and your responses are completely anonymous.

Join the fight against online political deception.