NEWS

Decoder Newsletter: Is Trump’s Disinformation Working?

Margaret Sessa-Hawkins | November 30, 2020

Produced by MapLight, the Decoder is a newsletter to help you track the most important news, research, and analysis of deceptive digital politics. Each week, we'll send you coverage of the webs of entities that seek to manipulate public opinion, their political spending ties, and the actors working to safeguard our democracy. Know someone who might be interested? Ask them to sign up!

  • Trump’s strategy: President Trump’s efforts to undermine the election keep suffering setbacks, but he is still managing to erode trust in the electoral process (a recent Economist-YouGov poll found that 80% of Republicans and 45% of independents do not believe Biden legitimately won the election). This is despite the fact that Republican officials in many states helped counter Trump’s messaging, and that Chris Krebs (the fired director of the U.S. Cybersecurity and Infrastructure Security Agency, and a conservative) further rebutted Trump’s claims in a 60 Minutes interview. 

  • Reforms: How did we get here? The Washington Post took a close look at Trump’s quest to delegitimize the results. In the New York Times, Zeynep Tufekci argues that partisan distrust of elections has long been increasing, and that in addition to addressing disinformation, we should look at simple electoral reforms to build trust in the democratic process. However, Nate Cohn, a Times political reporter, pushed back against that assessment. 

  • AAPI misinformation: As with many other minority voting groups, misinformation targeting Asian Americans and Pacific Islanders flourished both before and after the election, reports Terry Nguyen in Vox. Language diversity among those targeted has made it especially hard to track and combat. Of special concern is disinformation focused on voting processes.

  • Vaccine misinformation: Concerns about how vaccine misinformation could affect COVID-19 vaccination rates continue to emerge. To counter this, Alex Stamos of Stanford’s Internet Observatory argues that the Biden administration should launch an equivalent of rumorcontrol.gov that would post “detailed refutations of the day's misinformation with links to primary sources” (Joan Donovan of the Shorenstein Center agreed).

  • Facebook’s disinformation: Facebook had a whitelist of roughly 112,000 politicians and government officials whose posts were not to be fact-checked, Alex Heath reports in The Information. The company declined to scrap the list, despite evidence that false information was more believable if shared by a politician. 

  • Keeping the ‘nicer’ newsfeed: In an emergency decision taken to counter Trump’s attempts to spread misinformation about the election, Facebook gave increased algorithmic weight to high-quality journalism (as determined by a secret internal ranking), the New York Times reports. Now, some employees want the changes to become permanent.

  • Verified: Twitter has announced the relaunch of its verification system. One proposed change, TechCrunch reports, is removing verification from accounts that repeatedly violate its rules. Alex Howard, of Demand Progress, has further suggestions for fixing verification to curtail misinformation. Meanwhile, researchers at Cornell University and Washington University in St. Louis have found that when Twitter fact-checks Trump, it further polarizes Americans (a write-up is in the Washington Post).  

  • YouTube’s echo chambers: A new study has found that there are political news ‘echo chambers’ on YouTube. The researchers also found that “users of radical content on YouTube are actively seeking it & do so both on & off the platform.” Kate Starbird, of the University of Washington, wrote that the findings match her previous research showing that disinformation and activism campaigns use YouTube as a resource, relying on other social media sites to push the content.

  • Toxic comments: We’ve all been warned never to read the comments section, and now a new study is backing this adage up. The researchers found that the more polarized someone was politically, the more likely they were to comment on an article, and that the more frequently someone comments, the more toxic their comments are likely to be.

  • Democracy Action Plan: The Financial Times obtained a draft of the European Commission’s Democracy Action Plan, which is due to be published this week. According to the draft, the EU is preparing stringent regulations to address online political advertising -- including limiting microtargeting and profiling. Carnegie Europe will host an online discussion with European Commission Vice-President Věra Jourová and Richard Youngs on Friday to mark the plan’s launch. 

  • Is That a Fact?: The News Literacy Project is launching a new podcast called ‘Is That a Fact?’ which aims to “help listeners understand the impact of misinformation, policies — or the lack thereof — guiding social media companies, the importance of trust in the news, why facts matter more than ever and other topics central to the health of our democracy.”

  • Educating: How should teachers address misinformation? Education Week reports that many teachers simply don’t -- and provides a few strategies from those who do.