August 23, 2021

Decoder Newsletter: The Role of Social Media In International Affairs

Margaret Sessa-Hawkins

Recently we’ve talked about how problematic it is that unelected social media heads wield power equivalent to that of elected governments. Perhaps never has that been more apparent than with the recent Taliban takeover of Afghanistan. This Decoder, we look at how social media platforms are handling the Taliban, and what that says about their policies going forward. We’re also looking at Facebook’s continued hostility toward independent researchers, and the spread of health disinformation on multiple platforms. The Decoder is a weekly roundup of news about digital deception. Know someone who might be interested? Ask them to sign up!

  • Housekeeping Notes: The Decoder will be off next week and back in your inbox after Labor Day. Also, Decode Democracy is looking for an Executive Director to help lead our work fighting disinformation and holding social media companies accountable. Click here to learn more about the role and apply to join us.
  • Afghanistan & social media: Reports about the Taliban using social media to target individuals began to emerge this week, starting with a Wall Street Journal report Tuesday that the Taliban was searching people’s phones for communications in English. In NBC News, April Glaser and Saphora Smith subsequently reported that many Afghans were deleting information from their social media accounts that could potentially link them to Westerners. Facebook announced that it was removing the ability to search the friends lists of accounts in Afghanistan, as well as allowing people to easily lock their accounts. Glaser also notes, though, that the help pages for many social media sites haven’t been translated into languages most Afghans can read.
  • Taliban on social media: Facebook and YouTube have also blocked the Taliban from their platforms. Twitter, however, still allows the group, which this week drew criticism from, of all places, former free-speech champion Parler. It has also been noted that we currently live in a world where a former United States president is banned from Twitter, but the Taliban is not. Even where they are banned, some Taliban members are still finding ways onto the sites.
  • Reactions: In The New York Times’ On Tech newsletter, Shira Ovide muses on how the decision over what to do about the Taliban has once again put tech CEOs in a position of power roughly equal to that of elected global officials. Both The New York Times and The Washington Post took a look at how the Taliban is using social media for propaganda. And for the Columbia Journalism Review, Jon Allsop wrote about the Taliban’s PR strategy writ large, and its evolution over time.
  • What we see on Facebook: Wednesday, Facebook put out a transparency report that it said showed the most-viewed content for the second quarter of the year. Criticism of the report, especially in light of the company’s recent systematic stamping out of independent research, was swift and extensive. In Gizmodo, Shoshana Wodinsky pointed out that the report is Facebook’s way of trying to convince people it isn’t a home for hate speech and disinformation. It’s rather unfortunate, then, that Brandy Zadrozny of NBC noticed that right-wing disinformation outlet Epoch Times had snuck subscription links into this most-viewed content. Dr. Rebekah Tromble has a thread looking at what was — and wasn’t — included in the report.
  • Facebook’s oopsies: Of course, it later turned out that Facebook had prepared an earlier report covering the first quarter but shelved it, “because of concerns that it would look bad for the company.” The most-viewed story in that report was a news article suggesting that the coronavirus vaccine was responsible for a Florida man’s death. For Decode Democracy, Viviana Padelli writes that Facebook’s continued war on transparency is also taking a toll on democracy. In Vox, Shirin Ghaffary writes that Facebook’s allergy to transparency is blocking researchers from fully studying coronavirus content on its platform.
  • Health disinformation I: The Oversight Board has ruled on a case involving potential health disinformation in Brazil. The case centered on a post from a local health authority that was anti-lockdown, but promoted vaccines and social distancing. In its response the board argued that Facebook should consider local context when making health decisions. It also said that while leaving the post up was correct, other measures — such as warning labels — should have been applied. This Twitter thread on the ruling is well worth a read, as it provides an interesting look at how health disinformation is handled on the platform.
  • Health disinformation II: The Washington Post also reports that the Biden administration asked Facebook for its data in order to help tackle health mis- and disinformation on the platform, and Facebook refused to share it. In Tech Policy Press, the Center for Countering Digital Hate hit back at a Facebook blog post aimed at undermining its reporting on the so-called ‘disinformation dozen’ spreading COVID-19 disinformation on the platform. In other platform news, Twitter has created a new process for reporting health disinformation, which will now work much like reporting harassment or other harmful content.
  • Research: A Tech Transparency Project investigation has found that militia groups on Facebook are rife with conspiracy theories. A new report from Public Citizen looks at how predictive algorithms exacerbate racial discrimination; the author has a good summary thread on Twitter. A new Pew survey has found that the share of Americans saying the government should act on false information online has increased since 2018, although the shift splits along partisan lines: among Republicans, support for such measures has actually gone down.

Join the fight
against online
political deception.