September 13, 2021

Decoder Newsletter: The Difficulties of Content Moderation

Margaret Sessa-Hawkins

Work in the social media or digital disinformation spheres long enough and it quickly becomes clear that nearly everything is about content moderation in the end. This week’s Decoder looks at some of the challenges of and issues surrounding content moderation — from reports on outsourcing, to how to deal with emojis, to protecting kids. We also explore the role researchers can play in making content moderation more transparent and the obstacles they continue to face, along with Twitter’s new ‘communities’ feature, electoral disinformation, and the latest research. The Decoder is a weekly roundup of news about digital deception. Know someone who might be interested? Ask them to sign up!

  • Social Media & Research I: Davey Alba writes for the New York Times that Facebook sent flawed data to researchers, potentially costing them years of work. According to internal emails, while Facebook touted a plan to provide data on U.S. users who engaged with political pages, it provided only around 50 percent of the promised material. The error was discovered only after Fabio Giglietto, an associate professor and social media researcher at the University of Urbino in Italy, compared the data Facebook had released publicly on its most-viewed pages with the data it gave researchers and found a discrepancy. Dr. Rebekah Tromble has a thread on what needs to be done to prevent future data errors.
  • Social Media & Research II: In Wired, Will Knight writes about a new paper looking at how many academics studying artificial intelligence get funding from Big Tech — and how that affects the field. After Facebook forced it to shut down a project, AlgorithmWatch published an open letter to European Union lawmakers asking them to turn the Digital Services Act into an effective tool for granting public-interest researchers access to platform data. Organizations and individuals in agreement can also sign the letter.
  • Social Media & Research III: Another angle on academic ethics and Big Tech emerged this past week when a Northwestern professor wrote a legal brief arguing that Federal Trade Commission Chair Lina Khan should recuse herself from a Facebook antitrust case. While the professor, Dr. Daniel Rodriguez, disclosed that he had been retained by a law firm, he did not disclose that the law firm was employed by Facebook. The incident prompted nine organizations to co-sign a letter calling on Northwestern to implement better disclosure practices.
  • Electoral disinformation: A new report from Common Cause has found that social media companies are failing to enforce their policies on electoral disinformation as the 2020 election recedes — despite the continued prevalence of disinformation linked to the Big Lie. Media Matters for America also reports that Fox News is pushing conspiracy theories about the California recall election. Candidate Larry Elder is pushing similar theories, according to the Sacramento Bee. In the Huffington Post, Jesselyn Cook has an article on members of Trump’s inner circle paying teens for Instagram posts.
  • Content moderation: In the New York Times, Adam Satariano and Mike Isaac look at the consulting firm Accenture’s role in content moderation on Facebook. The piece touches on the trauma of content moderation and the human costs of outsourcing the work. For Bloomberg, Ivan Levingston reports that racist emojis are posing a problem for social media content moderators. Sen. Elizabeth Warren (D-Mass.) and Rep. Adam Schiff (D-Calif.) have both sent letters to Amazon raising concerns that the company’s algorithms are promoting coronavirus disinformation and asking for clarification on company policies.
  • Social media & kids: Salvador Rodriguez reports for CNBC that teachers are struggling with how to help students who were exposed to conspiracy theories while stuck at home during the pandemic. Meanwhile, in the Wall Street Journal, Rob Barry, Georgia Wells, John West, Joanna Stern, and Jason French report on TikTok showing inappropriate videos to minors. A new study from University College London, in England, has also found that extremist views and ‘conspiracism’ are rising among students.
  • Ivermectin story: A viral story about the antiparasitic drug ivermectin serves as a cautionary tale about how pushback against disinformation can sometimes be disinformation itself. While ivermectin is being touted as a treatment for COVID-19 despite being unproven, the viral story reported that gunshot wound patients were being turned away from a hospital because it was overloaded with ivermectin overdose patients. This was not true. In the Columbia Journalism Review, Matthew Ingram writes about how the mistake happened and what it says about news and disinformation. In the end, the story serves as a reminder that we’re all susceptible to disinformation, even if — and perhaps especially if — we think we’re not, and that, as Derek Thompson pointed out, we shouldn’t “miss the forest of vaccine denial for the tree of ivermectin.”
  • Research: A new study from the Reuters Institute and the University of Oxford has found that indifference, not hostility, is the biggest challenge in trying to increase trust in news. A new paper in PNAS looks at the efficacy of fact-checking around the world; co-author Ethan Porter has a thread summarizing it. There is also a new organization in the disinformation sphere: the Disinfo Defense League, which will work to disrupt online disinformation campaigns targeting communities of color.


Join the fight against online political deception.