Produced by Decode Democracy, the Decoder is a newsletter to help you track the most important news, research, and analysis of deceptive digital politics. Each week, we’ll send you coverage of the webs of entities that seek to manipulate public opinion, their political spending ties, and the actors working to safeguard our democracy. Know someone who might be interested? Ask them to sign up!
Decode Democracy: On Monday, MapLight launched a new campaign to fight online political deception: Decode Democracy. While MapLight will continue its efforts to increase transparency in government, Decode Democracy will focus specifically on holding technology and social media companies accountable for the damage they are doing to our democracy. It will do so by advocating for changes to public policy, conducting research into how digital deception harms democracy, and creating tools that empower researchers and journalists to counter the spread of false information. Starting tomorrow, you’ll receive regular updates about our work and ways to get involved in ending political deception online. Meanwhile, learn more about Decode Democracy’s approach to addressing online political deception in our new 90-second video, and be on the lookout for a report tomorrow outlining how hundreds of platform policy changes are failing to alleviate online disinformation.
Hooked on misinformation: Several great deep dives into the tension between Big Tech and civil society came out this past week. First up, in MIT’s Technology Review, Karen Hao profiled Joaquín Quiñonero Candela, director of AI at Facebook. The article looked at how Facebook got hooked on misinformation, and why the company’s Responsible AI team isn’t addressing the problem. In a thread, Timnit Gebru, who works on algorithmic bias in AI, addressed concerns that the article minimizes the problem of bias in the field, and argued that those in positions of power are unlikely to act on the issue without external pressure, as they have never experienced it firsthand. Separately, an article in The New York Times looks into Dr. Gebru’s recent firing from Google, and at how bias manifests in AI.
Dollars Vs. Democracy: In The Atlantic, Anne Applebaum and Peter Pomerantsev looked at how social media’s interests diverge from the best interests of American democracy. The article also explored several potential solutions and new ideas for tackling the problem. Speaking of the tension between profits and social good, a new article in Nieman Lab points out that search engines, as well as social media, can spread misinformation.
Addressing terrorism: Examining how social media contributed to the Capitol riots, a new in-depth article in Protocol looks at Big Tech’s reluctance to address disinformation and violence on its platforms. The article also reports that companies have been much more proactive about addressing international terrorism than domestic terrorism.
Regulation needed: In TechCrunch, Color of Change Vice President Arisha Hatch argues that rather than relying on tech companies to self-regulate on these issues, Congress needs to enact reforms that protect marginalized communities harmed by Big Tech’s policies both online and off.
Facebook’s polarization playbook: BuzzFeed reports that Facebook has created a playbook for how employees should respond to criticism that it is harming civil society and discourse. A former employee described the memo as “corporate gaslighting.”
Digital harassment: In the Washington Post, media columnist Margaret Sullivan wrote about how persistent digital harassment of women — especially women of color — who cover technology can have a silencing effect. For the Wilson Center’s Science and Technology Innovation Program, Nina Jankowicz, Jillian Hunchak, Alexandra Pavliuc, Celia Davies, Shannon Pierson, and Zoë Kaufmann look at how gendered disinformation is used against women in politics. In response to this rising threat of women being silenced online, the International Women’s Media Foundation has launched the Coalition Against Online Violence, which provides resources and support to individuals and newsrooms experiencing persistent harassment.
Health Misinformation: Facebook has announced that it will begin labeling all COVID-19 posts with a blurb directing users to official information in a bid to combat health misinformation. Separately, Facebook has been secretly studying vaccine hesitancy on its platform, the Washington Post reports. So far, the research has found that small groups are responsible for much of the health misinformation being spread. The Washington Post also reports that vaccine misinformation is thriving on social media. In Nature, Imran Ahmed of the Center for Countering Digital Hate argues that we need to treat and combat anti-vaxxers more like organized professionals, and less like lone conspiracy theorists.
Antitrust: Facebook announced Wednesday that it would be filing a motion to dismiss the antitrust lawsuits filed by the Federal Trade Commission and state attorneys general. Politico analyzed the motion and looked at whether it has any chance of succeeding. This throwback from The New York Times looks at the FTC’s case and some of the hurdles it will face in proving that Facebook has been acting in an anti-competitive manner.
Hearings: On Thursday, the House Judiciary Committee will hold its third hearing on tech antitrust, focused on how to address monopoly power. For those interested in more on antitrust, Tech Policy Press has a rundown of key developments. Axios also has a profile of Sen. Amy Klobuchar, focusing on her approach to antitrust.
Research: Protocol sent out a survey to see what tech workers think about tech, and one of its key findings was that 80% of tech employees think the industry is too powerful. The Markup has used data from its Citizen Browser project to show what users on different sides of the political spectrum see in their newsfeeds. In Big Data & Society, Kelley Cotter, Mel Medeiros, Chankyung Pak, and Kjerstin Thorson look at how Facebook’s ad targeting system prioritizes users with more social and economic weight. In Columbia Journalism Review, Brandi Collins-Dexter and Joan Donovan look at how white nationalist groups like the Proud Boys are using “keyword squatting” to promote the term 1776 online as a rejoinder to the 1619 Project, which examined Black Americans’ role in helping to create the nation.