July 19, 2021

Decoder Newsletter: Power and Transparency in Social Media

Margaret Sessa-Hawkins & Viviana Padelli

This week’s Decoder returns to the themes of power and transparency. Social media companies currently wield an incredible amount of power, at times rivaling that of elected governments. Yet there is almost no transparency into how they wield this power. In this Decoder, we look at how power and transparency have shaped social media companies’ role in the coronavirus pandemic, as well as the proliferation of hate speech on the platforms. We also examine ongoing antitrust cases, and what their impact could be. The Decoder is a weekly roundup of news about digital deception. Know someone who might be interested? Ask them to sign up!

  • Disinformation kills: President Joe Biden made a splash Friday when he stated that social media platforms are ‘killing’ people with misinformation. The statement came after the US Surgeon General, Dr. Vivek Murthy, released an advisory warning that health misinformation poses a significant risk to the public, noting that social media platforms have contributed to both the speed and scale of the misinformation problem. Renee DiResta, who has studied the anti-vaccine movement online for years, has a must-read thread on its evolution and on social media platforms’ complicity. On NBC.com, Harvard’s Joan Donovan and Jennifer Nilsen argue that consumer protection regulations are needed to protect individuals from the harms of social media. Coda Story also reports that Western vaccine myths are thriving in Africa, where many countries are seeing a surge in the virus.
  • Facebook’s response: Facebook, shockingly, did not agree with the assessment that its platform is hindering vaccine uptake. Facebook’s Vice President of Integrity argued that the administration is wrongly casting blame, and that Facebook users in fact can’t get enough of vaccines. In a rebuttal, Carl Bergstrom, a biology professor at the University of Washington, picks the blog post apart point by point to show why it is, to put it politely, utter malarkey.
  • No transparency: One of the points in Bergstrom’s rebuttal was that while Facebook stresses the need for facts and data in its blog post, Facebook itself makes that information hard to find. This was exemplified last week by the revelation from The New York Times’ Kevin Roose that numerous employees of CrowdTangle, a popular Facebook-owned analytics tool used by journalists, had been reassigned, a signal that Facebook may be looking to axe the tool. The article underscores Facebook’s allergy to transparency and its failure to address disinformation. As one Facebook employee tells Roose, “When transparency creates uncomfortable moments, their reaction is often to shut down the transparency.”
  • Transparency delays: Speaking of transparency, Facebook announced that the results of its study of the 2020 election will be delayed until 2022. Facebook has also delayed a brand safety audit that it agreed to after advertisers expressed concerns that their ads were appearing alongside, and effectively funding, hate speech. The company has yet to sign a contract with the Media Rating Council to perform the audit, meaning it might never happen.
  • Antitrust: Facebook has submitted a petition to the Federal Trade Commission asking that Lina Khan, its chair, recuse herself from the agency’s antitrust case against the company. Following in the footsteps of Amazon, the company argued that Khan’s history in academia and think tanks shows she has prejudged the case. In Protocol, Ben Brody looks at why the petitions could complicate matters, even if they don’t succeed. For those interested in antitrust and Big Tech, The Information has a database of current antitrust probes and cases against Big Tech.
  • Racism in Big Tech: A Google project lead has quit, saying that at the company she experienced some of the worst corporate racism of her career. Her departure came as platforms were slow to respond to the racist abuse piled on three Black football (soccer) players following England’s loss to Italy in the European Championship final. One of the players, 19-year-old Bukayo Saka, has urged Facebook and Twitter to deal with the problem of racism on their platforms, stating that no one should have to receive the type of ‘hateful and hurtful messages’ that he and his teammates have.
  • Black Twitter: Speaking of the interplay between race and technology, in Wired, Jason Parham has an oral history of how Black Twitter rose to become a cultural juggernaut. The story is a fascinating look at a powerful cultural force in social media. Check out the first part of the series here.
  • Oversight: Facebook has released its first quarterly update on the Oversight Board. Evelyn Douek has a thread highlighting and analyzing key points from the document. Facebook has also said that it will designate some group members as ‘experts’ tasked with stamping out disinformation. In Business Insider, Katie Canales called the move “reminiscent of Facebook’s creation of the Oversight Board” and said it amounts to the company “enlisting someone else to do its work for it.”
  • Research: A new Online Regulation Series handbook from Tech Against Terrorism analyzes online regulation around the globe, looking at legislation and regulatory proposals in 17 countries. In New Media and Society, Madhavi Reddi, Rachel Kuo, and Daniel Kreiss look at ‘identity propaganda’, examining how narratives strategically target and exploit identity-based differences. In light of the news about CrowdTangle, we also wanted to elevate this slightly older paper that uses the tool’s data to show how out-group animosity drives virality on social media.

Join the fight against online political deception.