Produced by MapLight, the Decoder is a newsletter to help you track the most important news, research, and analysis of deceptive digital politics. Each week, we’ll send you coverage of the webs of entities that seek to manipulate public opinion, their political spending ties, and the actors working to safeguard our democracy. Know someone who might be interested? Ask them to sign up!
H.R. 1: The House passed the For the People Act on Wednesday, a comprehensive voting rights and campaign finance reform bill. MapLight President and Co-Founder Daniel G. Newman said the bill “begins to address the online disinformation crisis fueled by conspiracy theories, lies, and political deceit.” The Washington Post has a good explainer on what the bill would do if enacted. As discussed in last week’s Decoder, state legislatures are pursuing numerous voter suppression measures, which underscores the For the People Act’s importance.
Privacy (Google edition): Google announced in a blog post Wednesday that as it phases out third-party cookies, it won’t “build alternate identifiers to track individuals as they browse across the web.” But (and this is a pretty big but) it will still target users on mobile devices, and still tailor ads based on users’ behavior on its own platforms. It will also introduce Federated Learning of Cohorts, which groups users into interest-based FLoCs (get it?) that advertisers can then target. Protocol has a good overview of what the new system means. The Electronic Frontier Foundation (which advocates for online privacy protections) has an in-depth criticism of the new system that’s worth a read.
Privacy (Facebook edition): In a blog post Thursday, Facebook applauded Virginia’s recently passed Consumer Data Protection Act and said it would like to see similar federal legislation. The move came as Washington state inched closer to passing its own privacy legislation, which Maureen Mahoney of Consumer Reports argued in a Seattle Times op-ed doesn’t go far enough. As Politico’s Digital Bridge pointed out, if tech companies like Facebook can convince more states to pass lighter-touch laws like Virginia’s, they could tilt the federal debate in their favor.
Trump returns?: YouTube CEO Susan Wojcicki announced Thursday during an event with the Atlantic Council that the platform will reinstate Donald Trump’s channel once the company “determine[s] the risk of violence has decreased.” She did not offer a timeline. Justin Hendrix of Tech Policy Press has a thread summing up her comments on the decision. For those wanting to keep track of who is deplatformed where, check out the Deplatforming Index from Harvard’s Shorenstein Center, the Media Manipulation Casebook, and the Technology and Social Change Project.
Political ad ban lifted: Facebook has lifted the political ad ban it put in place to curb the spread of misinformation around the election. Coincidentally, Duke’s Center on Science and Technology has a new paper finding that assessing the efficacy of political ad bans is really difficult.
Twitter and coronavirus: Twitter announced in a blog post last Monday that it will try to cut down on coronavirus misinformation with a system of strikes and labels. Separately, The Wall Street Journal reported Sunday that Russian intelligence agencies are running a campaign to undermine confidence in Western vaccines.
Racial disparities: The Markup reports that official information about COVID-19 is reaching fewer Black users on Facebook than users from other demographic groups, according to new data from its Citizen Browser project. The findings once again raise questions of racial inequality in the tech world, and come in the same week as reports that Facebook is being investigated for systemic racial bias in hiring, a Washington Post examination of how Google’s policies toward Historically Black Colleges and Universities led it to hire fewer Black engineers, and an NBC report that Google advised employees who complained about racism and sexism to seek mental health care.
Trending Twitter: In OneZero, Will Oremus interviewed the woman in charge of Twitter’s trending summaries team. They spoke about how the team chooses trends to summarize, how it handles diverse topics that require a breadth of expertise and background, and how it deals with mis- and disinformation.
Twitch transparency: Twitch has released its first transparency report. Wired looks at the key takeaways and explains why the report is long overdue.
EIP report: The Election Integrity Partnership has released its report on mis- and disinformation in the 2020 election. One of its chief findings was that one-off stories of voter concerns were picked up by verified individuals and woven into a metanarrative of a ‘stolen election.’ In The New York Times’ On Tech, Shira Ovide highlighted policies suggested by the report to help fix current disinformation issues and prevent further events like the Capitol riots. Casey Newton focused on YouTube’s role in the spread of disinformation in his Platformer substack (YouTube’s role is something the chairs of the Energy and Commerce Committee have also zeroed in on).
Wray testimony: FBI Director Christopher Wray testified before the Senate Judiciary Committee Wednesday about his department’s response to the January 6 attack on the Capitol. When asked by Senator Chris Coons about the role social media played, Wray responded, “I sometimes say terrorism today, and we saw it on the 6th, moves at the speed of social media.” Rather than focus on social media’s role in the riots, however, Wray pivoted to attacking encryption, arguing that law enforcement needs “lawful access” to communications. Wired also reports that TikTok played a key, and often overlooked, role in spreading disinformation.
Antitrust: Longtime Big Tech critic Tim Wu is joining the Biden administration as an adviser on technology and competition. The move is seen as a signal that the administration is seriously considering antitrust action. Acting FTC Chair Rebecca Kelly Slaughter also said in her opening remarks at the Future of Privacy Forum Thursday that breaking up companies may be a better solution in many cases than imposing regulation.
Research: A new study from NYU’s Cybersecurity for Democracy project (formerly the Online Political Transparency Project) adds to a growing body of evidence that far-right news sources get more engagement on Facebook. The study found that the more partisan a source was, the more engagement it got (led by far-right sources), and that while left and center sources were subject to a ‘misinformation penalty,’ far-right sources were not. UT Austin’s Center for Media Engagement looked at what Americans know about how Facebook and Google work — nearly one in four respondents thought that editors and journalists make news feed decisions that are in fact made by algorithms.