NEWS

Secrecy at All Costs: A Timeline of Facebook’s Obstruction of Research 

MapLight | September 24, 2021

For years, it’s been clear that Facebook is no friend to transparency. The company has gone to extreme lengths to limit researchers’ access to data and even hide its own internal research when it doesn’t like the results. Last week, the Wall Street Journal published the “Facebook Files,” documenting — with the help of a whistleblower — alarming examples of Facebook’s efforts to hide internal research showing its platforms cause grave harms to kids and teens, increase divisiveness in our society, and allow disinformation to run rampant and threaten our democracy.

Below, we document the recent history of Facebook curtailing access to data analytics tools, bullying researchers who expose the platforms’ shortcomings, delaying external audits, and hiding and undercutting internal research that may damage its public image.

  • Mid-2019: Facebook’s researchers are told to stop exploring racial bias in Instagram’s content moderation algorithms. According to NBC News, researchers at Facebook studying a new set of proposed rules for Instagram’s content moderation algorithms found that users identified as Black were about 50 percent more likely to have their accounts automatically disabled than their white counterparts. Rather than acting on the findings, Facebook told the researchers to stay quiet and to stop conducting any further research into racial bias.
  • May 2020: The Wall Street Journal reveals that Mark Zuckerberg and other senior Facebook executives shelved an internal research project to understand how the social media platform polarizes users. Jeff Horwitz and Deepa Seetharaman of the Wall Street Journal report that Zuckerberg and other senior executives shelved an internal research effort to understand how the platform shapes user behavior and how the company might address potential harms. Among the concerns underpinning the decision were claims that efforts to make conversations on the platform more civil were “paternalistic” and could disproportionately impact conservative users and publishers.
  • October 2020: As Election Day looms, Facebook sends New York University researchers a cease-and-desist letter demanding they shut down an ad transparency tool and delete all data collected. Just weeks before the 2020 presidential election, Facebook sends a cease-and-desist letter to two NYU researchers, demanding they shut down their Ad Observatory by November 30 and delete all the data collected. The company says that because the plug-in “scrapes” data from Facebook, it violates its terms of service.
  • April 2021: Facebook’s top executives disband the team for the Facebook-owned transparency tool CrowdTangle following the release of CrowdTangle data suggesting far-right commentators outperformed other users. Facebook moves CrowdTangle, a popular analytics tool used by journalists that had been running quasi-independently since being acquired by Facebook in 2016, under the social network’s integrity team. Facebook also reassigns some CrowdTangle employees to other divisions, hinting at an intention to axe the tool. The move becomes public only in July, when Kevin Roose of the New York Times breaks the news.
  • July 15, 2021: Researchers postpone the release of a Facebook election study. Protocol’s Issie Lapowsky reveals that the results of a study of Facebook’s impact on the 2020 election will not be released until 2022. Facebook and the researchers had previously said the results would be ready by the summer of 2021.
  • July 22, 2021: Facebook blocks a newsletter’s access to CrowdTangle shortly after the newsletter reveals that Donald Trump’s PAC was running Facebook ads despite Trump’s suspension from Facebook. Kyle Tharp, writer of the For What It’s Worth (FWIW) newsletter, announces that Facebook has shut off the newsletter’s access to the transparency tool. FWIW recently made headlines for publishing a story on how Donald Trump’s political action committee (PAC) is still running fundraising ads on Facebook, despite Trump’s suspension from the platform.
  • August 3, 2021: Facebook deplatforms researchers from New York University. Facebook disables the accounts of several researchers associated with NYU’s Ad Observatory project, effectively cutting off their ability to investigate how misinformation spreads on the platform. NYU’s Ad Observatory is a plug-in tool that allows Facebook users to voluntarily share limited and anonymous data about the political ads Facebook shows them, as well as some basic demographics, for research purposes.
  • August 5, 2021: The Federal Trade Commission calls Facebook out for disingenuously citing “privacy” as a rationale for shutting down NYU’s research. Issie Lapowsky outlines in Protocol how the Federal Trade Commission's acting director for the Bureau of Consumer Protection told Facebook not to use privacy as a “pretext to advance other aims” following the company’s decision to shut down the accounts of the NYU researchers working to understand ad targeting on the platform.
  • August 31, 2021: Tens of thousands of Facebook posts relating to the January 6th insurrection disappear from CrowdTangle. Politico reports that tens of thousands of Facebook posts relating to the January 6th insurrection at the U.S. Capitol Building have been missing from CrowdTangle since at least May. Facebook tells Politico the posts were accidentally removed from the tool, but provides no details on when the situation will be remedied.
  • September 4, 2021: Facebook criticizes a peer-reviewed study for failing to use impression data, but refuses to make such data available. Facebook criticizes a forthcoming peer-reviewed study from researchers at NYU and Université Grenoble Alpes in France finding that news publishers known for misinformation received six times more engagement than reputable sources during the election cycle. While Facebook faults the study for relying on engagement data, it fails to make impression data, which it argues would be a better measure, available to researchers.
  • September 10, 2021: The New York Times reports that Facebook sent flawed data to misinformation researchers, potentially costing them years of work. The error in the dataset, reported by the New York Times, was spotted by Fabio Giglietto, an associate professor and social media researcher at the University of Urbino, who compared the data provided to researchers with the “Widely Viewed Content Report” Facebook released in August.
  • September 14, 2021: The Wall Street Journal provides evidence of Facebook hiding internal research showing that Instagram is harmful to teen girls. Thanks to internal documents provided by a Facebook whistleblower, The Wall Street Journal reveals that the company has repeatedly hidden internal research showing that Instagram is harmful to teen girls. This revelation is part of a series of articles known as the Facebook Files, which expose Facebook’s failures to address the harms to users and democracy caused by its business model.
  • September 21, 2021: The Markup reports on changes to Facebook’s News Feed that block watchdogs’ browser-based tools to study misinformation. The Markup outlines how Facebook recently rolled out a News Feed change that blocks automated data collection of News Feed posts, effectively barring watchdog tools such as NYU’s Ad Observatory and The Markup’s Citizen Browser from collecting data on what content is being algorithmically served to Facebook users. According to The Markup, the change also risks damaging accessibility apps for visually impaired users.