September 20, 2021

Decoder Newsletter: It’s All About Facebook

Margaret Sessa-Hawkins

Normally in the Decoder, we try to make sure we are recapping the news about digital disinformation across a variety of social media platforms. This week marks a slight departure as we turn our focus almost exclusively to Facebook. From the Wall Street Journal’s publication of the Facebook Files — a damning look at how the company disregards democratic values and user wellbeing for profits — to the MIT Technology Review’s exposé on troll farms thriving on the platform, to the company’s swing and a miss on tackling climate change disinformation, Facebook has dominated headlines recently. We take a look at the news, reactions, and whether the revelations could finally spur regulations. The Decoder is a weekly roundup of news about digital deception. Know someone who might be interested? Ask them to sign up!

  • AI Fueling Disinformation: A new report from the Anti-Defamation League, Avaaz, Decode Democracy, the Mozilla Foundation, and New America’s Open Technology Institute as part of The Coalition to Fight Digital Deception looks at how artificial intelligence is fueling the spread of digital disinformation. The report shows that social media companies’ reliance on AI to both moderate and curate content has contributed significantly to the spread of disinformation. To address the problem, the report calls on social media companies to direct more resources toward moderation and fact-checking, provide researchers with meaningful access to data, and provide users with more robust controls. It also recommends lawmakers enact policies that promote transparency and accountability for social media and technology companies.
  • Response: Facebook issued a response to the Wall Street Journal’s articles arguing that the reporting mischaracterized the situation. Former head of Facebook Elections Integrity Operations Yael Eisenstat hit back against this idea. Julia Carrie Wong, author of the excellent ‘Facebook Loophole’ series for The Guardian, argued that the response just further shows how ‘corrupted’ and ‘unfixable’ the social media network is, and that its harms must be mitigated until it is dismantled.
  • Reactions: Will Oremus has a good piece in the Washington Post on how much these reports reflect Facebook’s complex overlap between policy and politics. In light of the revelations, comparisons between Facebook and the tobacco industry, originally raised by a former director at Facebook, Tim Kendall, have been resurfacing from multiple sources. NYU’s Laura Edelson has a good thread defending why the Facebook files show once and for all that Facebook needs to be regulated. Speaking of regulation, The Markup has a great piece looking at why Facebook seems to be advocating for regulatory policies (but actually isn’t).
  • Instagram Reactions: One piece that generated quite a bit of response was the revelation that Instagram harms teen girls’ mental health. While Instagram issued a response to the article, Senator Ed Markey (D-Mass.), Rep. Kathy Castor (FL-14), and Rep. Lori Trahan (MA-03) still had some questions about this for Facebook CEO Mark Zuckerberg. In the UK, Member of Parliament Damian Collins said that Facebook should be fined for withholding the information on Instagram’s harm, while TechCrunch pointed out that the revelations don’t exactly cast Facebook’s ‘Instagram for kids’ idea in a particularly good light.
  • Feeding Trolls: An internal Facebook report obtained by the MIT Technology Review shows that content from troll farms reached more than a third of all Americans before the 2020 election. Strikingly, the report notes that the content reached people not because they chose to follow a page, but because Facebook’s content-recommendation system was pushing it into their news feeds.
  • Hate raids lawsuit: A new court case could set precedents in the world of social media. Gaming platform Twitch is suing two anonymous members of its platform who have been conducting ‘hate raids’ — using armies of bots to attack Black and LGBT+ users. In Protocol, Issie Lapowsky goes into some of the troubling aspects of the case, including potential problems with platforms suing users over terms of service breaches, as well as trying to undermine their anonymity.
  • Climate disinformation: Facebook claims that it will be instituting a series of new measures to tackle climate disinformation on its platform. Gizmodo breaks down why the measures might fall short. It was only three weeks ago, for example, that Facebook’s own report revealed that one of its most viewed pages frequently peddles climate disinformation.
  • Coordinated Authentic Behavior: In a blog post Thursday, Facebook announced that it is finally moving away from its longstanding policy of only targeting coordinated inauthentic behavior, and will also target coordinated authentic behavior. The company will now work to enforce policies against groups that coordinate to spread disinformation and hate speech but do not qualify as ‘inauthentic’. Activists have been advocating for just this change for quite some time.
  • Research: A new study in the Journal of the European Economic Association looks at motivated belief — the idea that we believe things because we want to, not because they are correct — as a driver of echo chambers. Nieman Lab has a good write-up on the study. A new study in PNAS looks at whether misinformation mitigation measures from richer Western industrialized countries could also work in developing countries, in an attempt to fill a fairly substantial misinformation research void. Again, Nieman Lab has an excellent write-up. A report out from NYU looks at how social media contributes to polarization, and what can be done about it.

Join the fight
against online
political deception.