Decoder Newsletter: Is Facebook subject to oversight?

Margaret Sessa-Hawkins | February 01, 2021

Produced by MapLight, the Decoder is a newsletter to help you track the most important news, research, and analysis of deceptive digital politics. Each week, we'll send you coverage of the webs of entities that seek to manipulate public opinion, their political spending ties, and the actors working to safeguard our democracy. Know someone who might be interested? Ask them to sign up!

  • Oversight: Facebook’s self-appointed oversight board released its first content decisions this week, overturning four of the five cases it examined. On the Lawfare Blog, both Evelyn Douek and Jacob Schulz analyzed the decisions; Lawfare is also hosting a searchable tracker of all Oversight Board cases. In Mashable, Matt Binder took a closer look at one of the decisions -- a post that dealt with potential anti-Muslim rhetoric. The organization Muslim Advocates released a statement objecting to the decision. In Time, Roger McNamee and Maria Ressa argue that the Oversight Board is “a sham” that lends a sense of legitimacy to Facebook’s decision to regulate itself, rather than be regulated by elected officials.

  • The influence debate: The fact that only a select group of journalists and academics received an embargoed copy of the Oversight Board’s decisions before they were made public sparked a Twitter debate Sunday about embargoes and influence. Julia Angwin, Editor-in-Chief of The Markup, pointed out that embargoes released to a small group can often serve as a PR tactic. Ryan Gallagher highlighted that social media companies have put researchers in a tough position by restricting access to information. Blake Hounshell of Politico pointed out the benefits of embargoes, while Alex Howard of the Digital Democracy Project argued that embargoes should be made more widely available. Lastly, Dr. Joan Donovan called out those harassing individuals as part of the discussion.

  • Your feedback: The board also announced that it is accepting public comments on the Trump case, but you have to be signed in to Facebook or Instagram to leave them.

  • Groups: Facebook CEO Mark Zuckerberg announced on Wednesday that the company will no longer recommend political groups to users. In The Wall Street Journal, Jeff Horwitz reported on Facebook’s about-face on the policy, examining what the company knew and when. The move could also be a big blow to activism and civil rights groups. MapLight has pointed out that this is just another sign the company should be subject to Congressional regulation.

  • Fact Checking: Social media companies announced a few interesting policy changes this week. Twitter launched a fact-checking forum to help combat misinformation. Alex Howard has a thread offering some insights on the new tool. In a timely coincidence, the HKS Misinformation Review has published a study looking at biases in fact-checking.

  • Antisemitism: The Anti-Defamation League has released an ‘Online Holocaust Denial Report Card’. The report card focuses on only a cross-section of tech platforms, but it provides good insight into how the issue is being addressed. Twitch ranked highest, with a B grade, while Facebook and Instagram came in with a D despite announcing bans on Holocaust denialism last year (Facebook was the only platform to either not respond to ADL’s reported content or claim that the reported content did not violate its policies). Facebook has also found itself embroiled in a moderation policy debate centered on the use of the word ‘Zionist’.

  • Section 230: With a new administration, questions about the fate of Section 230 are always simmering near the surface. Biden’s Commerce Department nominee, Gina Raimondo, has said she’s in favor of reforming the law. In TechDirt, Mike Masnick writes that a not-yet-introduced bill (you can read the discussion draft) from Rep. Yvette Clarke and Rep. Mike Doyle would create issues without addressing the civil rights problems it is trying to solve. In an open letter, dozens of advocacy groups have also warned the incoming administration that changing or repealing the law could have dangerous -- if unintended -- consequences for human rights.

  • Regulation: Also with the incoming administration, questions of regulation beyond Section 230 continue to swirl. In a deep dive for The New York Times Magazine, Emily Bazelon looks at the fallout from social media’s decision to ban Trump, and her thorough analysis reinforces the point that if the government were regulating Big Tech, the industry wouldn’t be regulating itself. In The New York Times, Shoshana Zuboff traces a different path to the same conclusion: “the digital must live in democracy’s house.” In The Hill, Cameron Kerry argues that to fix social media we first need to fix privacy (via legislation). In The Guardian, Joan Donovan and Amed Khan make the case for using antitrust regulations to rein in Big Tech -- and argue that the Biden administration must avoid appointing tech-friendly individuals to key positions.

  • Spreading disinformation: Democrats are trying to get Republicans to censure Rep. Marjorie Taylor Greene, who has been spreading violent, incendiary rhetoric as well as several conspiracy theories. House Majority Leader Steny Hoyer plans to tell Minority Leader Kevin McCarthy that if Republicans don’t strip Greene of her committee assignments within the next three days, the matter will be brought to the House floor; Republicans have so far been reluctant to act against Greene despite her dangerous rhetoric. Greene’s espousal of conspiracy theories has once again brought to the fore questions of how to report on disinformation without giving it (or those spreading it) further oxygen. Wagatwe Wanjuki points out that reporting on white supremacy often inadvertently helps feed it, while Charlie Warzel wonders what it would be like if coverage of these individuals reflected the actual power they hold. Data & Society has a good tip sheet on reporting on disinformation.

  • Research: Twitter may not always have had the best relationship with researchers, but in an exciting development, the company has announced that it will no longer require researchers to pay for access to its public archive. Facebook will also be giving researchers access to targeting data for some political ads.

  • Chinese social media offers misinformation lessons: For Nieman Lab, Kaiping Chen looked at how misinformation about coronavirus differs on Chinese social media, and what lessons it offers for combating conspiracy theories. The German Marshall Fund of the United States found that social media engagement with deceptive sites reached record highs in 2020, despite content moderation. The Stanford Internet Observatory looked at moderation on Parler, as well as the platform’s national and international growth.

  • Events: Tomorrow, Harvard’s Berkman Klein Center will hold an event titled “White Surveillance and Black Digital Publics” with Dr. Apryl Williams and Dr. Allissa Richardson. Video will be available after the event.