Legislative Hub

MapLight is tracking federal legislation dealing with the role of technology and social media platforms in politics.

The Legislative Hub, meant as a one-stop shop for people with an interest in the topic, provides a curated overview of related bills introduced in the 116th Congress, organized by topic (i.e., Microtargeting, Section 230, Deepfakes, Democracy, Money in Digital Politics, Security, Foreign Interference, and Privacy).

For each bill, the Legislative Hub provides the bill’s name, a link to its official page in the Library of Congress, sponsors, cosponsors, a summary, and its status. The database is updated daily.

MapLight monitors all the bills in the Hub. Depending on the circumstances, MapLight may also issue an official statement voicing opposition to or support for a specific bill. When this happens, a link to the official statement is made available as a separate item after the bill summary. For more information about MapLight’s position, contact Viviana Padelli at viviana@maplight.org.

The Legislative Hub is part of MapLight’s Digital Deception Solutions project led by former Federal Election Commission chair Ann Ravel. For press inquiries, please contact Alec Saslow at alec@maplight.org.

By Laura Curlin, Laurie Moskowitz, Brad Neal, Viviana Padelli, Alec Saslow, and Bergen Smith


Microtargeting

Protecting Democracy from Disinformation Act (H.R. 7012)

Sponsor: House Rep. David Cicilline (D-RI)—introduced with Rep. Sean Casten (D-IL), Rep. Alcee Hastings (D-FL), Rep. Jahana Hayes (D-CT), Rep. Hank Johnson (D-GA), and Rep. Stephen Lynch (D-MA) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the Committee on House Administration in May 2020.

The “Protecting Democracy from Disinformation Act” would amend the Federal Election Campaign Act of 1971 to restrict the targeting categories political advertisers can use to only age, gender, and location. Contextual advertising and advertising lists that people have expressly consented to being on would be permitted. It would also require large social media platforms and digital advertising intermediaries to maintain public archives of all political advertising purchases, with information about the advertiser, how much was spent, who was targeted, and who was reached. Violations would be enforced through the Federal Election Commission’s existing authority, a private right of action, and criminal penalties for online platforms and ad intermediaries that knowingly and willfully violate the Act.

MapLight supports the bill as outlined in a statement by MapLight President and Co-Founder Daniel G. Newman.


Banning Microtargeted Political Ads Act (H.R. 7014)

Sponsor: House Rep. Anna Eshoo (D-CA) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the Committee on House Administration in May 2020.

The “Banning Microtargeted Political Ads Act” would amend the Federal Election Campaign Act of 1971 to prohibit online platforms (i.e., social media, ad networks, and streaming services) from targeting political advertisements based on users’ demographic or behavioral data. Violations would be enforced by the Federal Election Commission and through a private right of action. Targeting ads to recognized geographies—such as states, municipalities, and districts—would be permitted, while targeting by narrower geographies would not. Opt-in, consent-based advertising would also be permitted.

MapLight supports the bill as outlined in a statement by MapLight President and Co-Founder Daniel G. Newman.


Section 230

Ending Support for Internet Censorship Act (S. 1914)

Sponsor: Sen. Josh Hawley (R-MO) (All Cosponsors)
Status: There has been no movement on the bill since being read twice and referred to the Senate Committee on Commerce, Science, and Transportation in June 2019.

Under the “Ending Support for Internet Censorship Act,” large tech platforms—i.e., those with more than 30 million active monthly users in the U.S., more than 300 million active monthly users worldwide, or more than $500 million in global annual revenue—would receive Section 230 immunity only if they can prove to the Federal Trade Commission (FTC) that they do not moderate content “in a manner that is biased against a political party, political candidate, or political viewpoint.” For the purpose of this bill, the term “moderate” includes any actions taken through an algorithm or other automated process. The FTC immunity certification would be valid for a period of 2 years and could subsequently be renewed for an additional 2-year period.


Limiting Section 230 Immunity to Good Samaritans Act (S. 3983)

Sponsor: Sen. Josh Hawley (R-MO) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the Senate Committee on Commerce, Science, and Transportation in June 2020.
Companion: H.R. 8596

The “Limiting Section 230 Immunity to Good Samaritans Act” would require large tech platforms that want to maintain Section 230 immunity to update their terms of service to include a commitment to operate in “good faith.” This provision would open these companies to being sued for “selectively” enforcing their own terms of service or failing to uphold them, with a penalty of $5,000 per claim or actual damages, whichever is higher, as well as attorney’s fees. The bill would only apply to platforms with more than 30 million active monthly users in the U.S., more than 300 million active monthly users worldwide, or more than $500 million in global annual revenue.


Stopping Big Tech’s Censorship Act (S. 4062)

Sponsor: Sen. Kelly Loeffler (R-GA) (All Cosponsors)
Status: There has been no movement on the bill since being read twice and referred to the Senate Committee on Commerce, Science, and Transportation in June 2020.

The “Stopping Big Tech’s Censorship Act” would amend Section 230 of the Communications Decency Act to apply only when online platforms take reasonable measures to prevent or address “cyberstalking, sex trafficking, trafficking in illegal products or activities, child sexual exploitation,” and any other unlawful activity. As a result, internet users could challenge Section 230 protections if they think a platform is censoring free speech. The “Stopping Big Tech’s Censorship Act” would also require tech companies to “explain the practices and procedures” they use to censor content and provide an explanation to users when their content is restricted, labeled, or taken down.


PACT Act: Platform Accountability and Consumer Transparency Act (S. 4066)

Sponsor: Sen. Brian Schatz (D-HI)—introduced with Sen. John Thune (R-SD) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the Senate Committee on Commerce, Science, and Transportation in June 2020.

This bipartisan bill does four things. First, it requires online platforms to explain their content moderation practices in an “acceptable use policy” that is easily accessible to consumers, and to publish a detailed quarterly transparency report. Second, it introduces an obligation for large online platforms to put in place a formal complaint system that processes reports and notifies users of moderation decisions within 14 days. Third, it amends Section 230 to require large online platforms to remove court-determined illegal content and activity within 24 hours; the PACT Act also opens tech platforms up to civil lawsuits from federal regulators and authorizes state attorneys general to enforce federal civil law against them. These accountability requirements would be less stringent for small online platforms, depending on their size and capacity. Finally, the bill requires the Government Accountability Office to study and report on the viability of an FTC-administered whistleblower program for employees or contractors of online platforms, and NIST to develop voluntary standards for “good faith moderation practices.”


BAD ADS Act: Behavioral Advertising Decisions Are Downgrading Services Act (S. 4337)

Sponsor: Sen. Josh Hawley (R-MO) (All Cosponsors)
Status: There has been no movement on the bill since being read twice and referred to the Senate Committee on Commerce, Science, and Transportation in July 2020.

The “BAD ADS Act” would strip Section 230 protections from the biggest tech companies—i.e., platforms with more than 30 million users in the U.S. or 300 million worldwide, or with more than $1.5 billion in global revenue—that display behavioral advertising or provide user data to a third party knowing that the data will be used to create or display behavioral advertising. The bill defines “behavioral advertising” as targeting based on a user’s personal traits, past location information, personal information from a profile compiled to sell ads, or previous online or offline behavior. Behavioral advertising does not include targeting based on users’ locations or the content of the site they’re on.


Online Freedom and Viewpoint Diversity Act (S. 4534)

Sponsor: Sen. Roger Wicker (R-MS)—introduced with Sen. Lindsey Graham (R-SC) and Sen. Marsha Blackburn (R-TN) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the Senate Committee on Commerce, Science, and Transportation in September 2020.

The “Online Freedom and Viewpoint Diversity Act” would weaken online platforms’ liability protections when they “restrict access to or availability of material.” In particular, the safe harbor for content and account removals would continue to apply only if the defendant is able to prove that it took the decision “in good faith,” on the basis of “an objectively reasonable belief” that the content falls within a narrow set of categories (i.e., material that is “obscene, lewd, lascivious, filthy, excessively violent, harassing, promoting self-harm, promoting terrorism, or unlawful”). Importantly, these categories do not include electoral disinformation and other content aimed at discouraging voter participation, nor spam, fake accounts, conspiracy theories, Holocaust denial, white supremacist racial theories, or hate speech. This bill would also clarify that the definition of “information content provider” includes situations where a person or entity “editorializes or affirmatively and substantively modifies the content created or developed by another person or entity but does not include changes to the content’s format, layout, or basic appearance.”

MapLight opposes the bill. Learn more about MapLight’s position.


Online Content Policy Modernization Act (S. 4632)

Sponsor: Sen. Lindsey Graham (R-SC) (All Cosponsors)
Status: The bill, which was introduced and referred to the Senate Committee on the Judiciary in September 2020, has been fast-tracked for markup.

The “Online Content Policy Modernization Act” aims to narrow Section 230 liability protections, borrowing language from the “Online Freedom and Viewpoint Diversity Act” introduced in September by Senators Wicker, Graham, and Blackburn. But the bill also incorporates a copyright small-claims dispute provision based upon S. 1273, the Copyright Alternative in Small-Claims Enforcement Act of 2019 (“CASE Act”). If implemented, this provision could subject any American organization or individual to up to $30,000 in damages for actions such as posting a photo.

MapLight opposes the bill. Learn more about MapLight’s position.


Don’t Push My Buttons Act (S. 4756)

Sponsor: Sen. John Kennedy (R-LA) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the Senate Committee on Commerce, Science, and Transportation in September 2020.
Companion: H.R. 8515

The “Don’t Push My Buttons Act” would deprive online platforms of their Section 230 immunity if they use algorithms that attempt to optimize user engagement by funneling polarizing and insular information to users—unless a user opts in to such an algorithm.


See Something, Say Something Online Act of 2020 (S. 4758)

Sponsor: Sen. Joe Manchin (D-WV)—introduced with Sen. John Cornyn (R-TX) (All Cosponsors)
Status: There has been no movement on the bill since being read twice and referred to the Senate Committee on Commerce, Science, and Transportation in September 2020.

The “See Something, Say Something Online Act” would amend Section 230 of the Communications Decency Act to require online platforms to “take reasonable steps to prevent or address unlawful users of the service through the reporting of suspicious transmissions” to law enforcement. If platforms fail to comply, they “may be held liable as a publisher for the related suspicious transmission.” The bill also states that, should online platforms “fail to exercise due care in the implementation, filing of the suspicious transmission activity reports, and reporting of major crimes” (e.g., illegal drug sales, hate crimes, murder, or terrorism), Congress will look at removing the legal shield provided by Section 230. In addition, the bill would require the Attorney General to designate an agency within the Department of Justice, or establish a new agency, to act as a “clearinghouse” for these reports.


Stop Suppressing Speech Act of 2020 (S. 4828)

Sponsor: Sen. Kelly Loeffler (R-GA) (All Cosponsors)
Status: This bill was read twice and referred to the Senate Committee on Commerce, Science, and Transportation in October 2020.

The “Stop Suppressing Speech Act of 2020” would narrow the Section 230 immunity currently enjoyed by Big Tech companies when they moderate content on their platforms by removing the phrase “harassing, or other objectionable” and inserting “or harassing, that the provider or user determined to be unlawful, or that promotes violence or terrorism.” At the same time, the bill would extend the Section 230(c)(2) immunity to “any action taken to provide users with the option to restrict access to any other material, whether or not such material is constitutionally protected.”


Stop the Censorship Act of 2020 (H.R. 7808)

Sponsor: House Rep. Paul Gosar (R-AZ)—introduced with GOP Representatives Doug Collins (GA), Ralph Norman (SC), Lance Gooden (TX), Steve King (IA), Jim Banks (IN), Matt Gaetz (FL), Ted Yoho (FL), Tom Tiffany (WI), Ron Wright (TX), Glenn Grothman (WI), and Rick Crawford (AR) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the House Committee on Energy and Commerce in July 2020.

The “Stop the Censorship Act of 2020” would narrow the immunity for online platforms that moderate and remove problematic content by striking the open-ended phrase “otherwise objectionable” material, and replacing it with a new standard of immunity for content removal—“unlawful, or that promotes violence or terrorism.”


Don’t Push My Buttons Act (H.R. 8515)

Sponsor: House Rep. Paul Gosar (R-AZ)—introduced with Rep. Tulsi Gabbard (D-HI) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the House Committee on Energy and Commerce in October 2020.
Companion: S. 4756

The “Don’t Push My Buttons Act” would deprive online platforms of the immunity they’re afforded under Section 230 if they use algorithms that attempt to optimize user engagement by funneling polarizing and insular information to users—unless a user opts in to such an algorithm.


Protect Speech Act (H.R. 8517)

Sponsor: House Rep. Jim Jordan (R-OH)—introduced with GOP Representatives James Sensenbrenner (WI), Louie Gohmert (TX), Doug Collins (GA), Ken Buck (CO), Andy Biggs (AZ), Tom McClintock (CA), Debbie Lesko (AZ), Guy Reschenthaler (PA), Ben Cline (VA), Greg Steube (FL), and Tom Tiffany (WI) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the House Committee on Energy and Commerce in October 2020.

The “Protect Speech Act,” which is very similar to the “Online Freedom and Viewpoint Diversity Act,” would amend Section 230 of the Communications Decency Act to narrow online platforms’ liability protections when they block or screen offensive material. In particular, the bill would protect platforms from lawsuits over their moderation decisions only if they are able to show that they “acted in good faith” on the basis of “an objectively reasonable belief” that the content was “obscene, lewd, lascivious, filthy, excessively violent, promoting terrorism or violent extremism, harassing, promoting self-harm, or unlawful.” Furthermore, the bill would make Section 230 liability protections contingent on a platform’s publication of its content moderation practices, consistent application of its terms of service or use, and provision of a “timely notice” giving a specific explanation of why someone’s content was removed.


Limiting Section 230 Immunity to Good Samaritans Act (H.R. 8596)

Sponsor: House Rep. Ted Budd (R-NC)—introduced with GOP Representatives Rick Crawford (AR), Bill Flores (TX), Andy Biggs (AZ), Lance Gooden (TX), Dan Bishop (NC), and Michael Waltz (FL) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the House Committee on Energy and Commerce in October 2020.
Companion: S. 3983

The “Limiting Section 230 Immunity to Good Samaritans Act” would require large tech platforms that want to maintain Section 230 immunity to update their terms of service to include a commitment to operate in “good faith.” This provision would open these companies to being sued for “selectively” enforcing their own terms of service or failing to uphold them, with a penalty of $5,000 per claim or actual damages, whichever is higher, as well as attorney’s fees. For the purpose of this bill, large tech platforms are those with more than 30 million active monthly users in the U.S., more than 300 million active monthly users worldwide, or more than $500 million in global annual revenue.


Protecting Americans from Dangerous Algorithms Act (H.R. 8636)

Sponsor: House Rep. Tom Malinowski (D-NJ)—introduced with Rep. Anna Eshoo (D-CA) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the House Committee on Energy and Commerce in October 2020.

The “Protecting Americans from Dangerous Algorithms Act” would amend Section 230 of the Communications Decency Act to remove liability immunity for a platform if its algorithm is used to amplify or recommend content directly relevant to a case involving interference with civil rights (42 U.S.C. 1985), neglect to prevent interference with civil rights (42 U.S.C. 1986), or acts of international terrorism (18 U.S.C. 2333). Importantly, the bill would only apply to online platforms with 50 million or more users.

MapLight supports the bill as outlined in a statement by MapLight Digital Deception Project Director Ann Ravel.


CASE-IT Act: Curbing Abuse and Saving Expression In Technology Act (H.R. 8719)

Sponsor: House Rep. Greg Steube (R-FL) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the House Committee on Energy and Commerce in October 2020.

The “CASE-IT Act” introduces three key changes to Section 230 of the Communications Decency Act. First, it eliminates immunity for digital platforms that create, develop, post, or materially contribute to illegal online content (or induce another person to do so); knowingly permit or facilitate illicit sexual contact between an adult and a minor; and/or make indecent, obscene, or otherwise harmful content readily available to minors. Second, it makes Big Tech companies liable for the content they host on their platforms if their content moderation decisions “are not reasonably consistent with the First Amendment.” Third, it builds in a private right of action against Big Tech companies that are considered not to adhere to the First Amendment. The bill distinguishes between “Big Tech” and “Small Tech” by employing a market dominance test.


Deepfakes

Deepfake Report Act of 2019 (S. 2065)

Sponsor: Sen. Rob Portman (R-OH)—introduced with Senators Martin Heinrich (D-NM), Brian Schatz (D-HI), Cory Gardner (R-CO), Mike Rounds (R-SD), Joni Ernst (R-IA), Gary Peters (D-MI), and Margaret Hassan (D-NH) (All Cosponsors)
Status: The bipartisan bill, introduced in July 2019, was passed by unanimous consent in the Senate and is now in the House, where it was referred to the Subcommittee on Consumer Protection and Commerce.

The “Deepfake Report Act” would direct the Department of Homeland Security (DHS) Science and Technology Directorate to produce an annual report on the state of deepfake technology for five years after the bill’s enactment. The report would need to assess the underlying technology used to create deepfakes, how foreign and domestic actors could use deepfakes to threaten national security, and potential countermeasures the federal government could take. The bill also calls for congressional hearings to gather information and advice relevant to the annual report.


Bot Disclosure & Accountability Act of 2019 (S. 2125)

Sponsor: Sen. Dianne Feinstein (D-CA) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the Committee on Commerce, Science, and Transportation in July 2019.

The “Bot Disclosure Act” would increase transparency around bots (automated accounts run by software programs). Many bots are innocuous, or are even programmed to increase public awareness; however, bots can also be used to misleadingly amplify particular topics, a process known as astroturfing. Under this bill, the FTC would have to create regulations ensuring that social media companies require users to publicly disclose all bots. The bill would also make it illegal for candidates and political parties to purchase or use bots to spread their messages.

MapLight supports this bill as outlined by an opinion piece on CNN.com by MapLight Digital Deception Project Director Ann Ravel.


DEEPFAKES Accountability Act (H.R. 3230)

Sponsor: House Rep. Yvette Clarke (D-NY) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the Subcommittee on Crime, Terrorism, and Homeland Security in June 2019.

The “DEEPFAKES Accountability Act,” which is co-sponsored by twenty-eight other Democratic House members, would require anyone creating a piece of synthetic media imitating a person to label the video or audio with an irremovable digital watermark and a written disclaimer informing viewers that the content has been altered or generated. Failure to comply could result in criminal and civil penalties. In addition, the bill would give victims of synthetic media the right to sue its creators in court. Finally, the Act would create a Deep Fakes Task Force at the Department of Homeland Security to amplify efforts to detect, counter, and combat deepfake technology, and to work with technology companies to fight disinformation.


Democracy

For the People Act of 2019 (H.R. 1)

Sponsor: House Rep. John Sarbanes (D-MD)—introduced with Rep. Nancy Pelosi (D-CA) (All Cosponsors)
Status: The bill passed along party lines in March 2019 and is now sitting in the Senate, where it is unlikely to pass given the Republican majority.

This comprehensive legislative package contains numerous provisions to expand voting access, address gerrymandering, and regulate political spending. Among provisions that would tackle digital deception, the bill would apply existing advertising disclosure requirements to digital media; restructure the Federal Election Commission so that it is better able to enforce campaign finance law; and enable the Internal Revenue Service to better track dark money. This bill was the incoming Democratic majority’s first priority in the House in 2019. The bill incorporates the “Honest Ads Act.”

MapLight supports this bill as outlined by an opinion piece on CNN.com by MapLight Digital Deception Project Director Ann Ravel.


Preventing Unwarranted Communications Shutdowns Act (H.R. 8659)

Sponsor: House Rep. Anna Eshoo (D-CA)—introduced with Rep. Morgan Griffith (R-VA) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the Committee on Energy and Commerce, and in addition to the Committee on Rules in October 2020.

This bipartisan bill would limit the President’s authority to shut down the internet, a power granted by the Communications Act of 1934. Section 706 of that act gives the President the right to shut down wireless communications “upon proclamation by the President that there exists war or a threat of war, or a state of public peril or disaster or other national emergency, or in order to preserve the neutrality of the United States.” This bill would not entirely revoke the President’s ability to turn off the internet, but it would narrow the circumstances under which he could do so to threats to human life or national security. It would also require the President to notify the Pentagon, congressional leadership, and the FCC within 12 hours of shutting down the internet; should that notification not come, the shutdown would automatically cease. And even with notice, the order would expire in 48 hours unless an extension were approved by three-fifths of both the House and Senate, with at least one-quarter of the minority party in each chamber voting to do so.


Money in Digital Politics

DISCLOSE Act of 2019: Democracy is Strengthened by Casting Light on Spending in Elections Act of 2019 (S. 1147)

Sponsor: Sen. Sheldon Whitehouse (D-RI)—introduced with Democratic Senators Ron Wyden (OR), Charles Schumer (NY), Chris Van Hollen (MD), Patrick Leahy (VT), Dianne Feinstein (CA), Patty Murray (WA), Richard Durbin (IL), Jack Reed (RI), Tom Carper (DE), Debbie Stabenow (MI), Maria Cantwell (WA), Robert Menendez (NJ), Ben Cardin (MD), Sherrod Brown (OH), Bob Casey (PA), Amy Klobuchar (MN), Jon Tester (MT), Tom Udall (NM), Jeanne Shaheen (NH), Mark Warner (VA), Jeff Merkley (OR), Michael Bennet (CO), Kirsten Gillibrand (NY), Chris Coons (DE), Richard Blumenthal (CT), Brian Schatz (HI), Tammy Baldwin (WI), Chris Murphy (CT), Mazie Hirono (HI), Martin Heinrich (NM), Tim Kaine (VA), Elizabeth Warren (MA), Edward Markey (MA), Cory Booker (NJ), Gary Peters (MI), Tammy Duckworth (IL), Margaret Wood Hassan (NH), Kamala Harris (CA), Catherine Cortez Masto (NV), Tina Smith (MN), Doug Jones (AL), Jacky Rosen (NV), and Independent Senators Angus King (ME) and Bernard Sanders (VT) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the Senate Committee on Rules and Administration in April 2019.
Companion: H.R. 2977

The “DISCLOSE Act of 2019” aims to shed light on how secret money perverts our elections. More specifically, the bill requires all corporations to report campaign expenditures of $10,000 or more to the Federal Election Commission (FEC) within 24 hours and to disclose all campaign-related expenditures to their shareholders and members; strengthens the foreign money ban by prohibiting foreign nationals from participating in decision-making about corporate campaign spending; and mandates that groups running and funding a political advertisement state that they “approve this message.”


Honest Ads Act (S. 1356)

Sponsor: Sen. Amy Klobuchar (D-MN)—introduced with Senators Mark Warner (D-VA), and Lindsey Graham (R-SC) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the Committee on Rules and Administration in May 2019. This bill faces fairly long odds of passing in the Senate, though Facebook and Twitter have publicly stated that they support it.
Companion: H.R. 2592

The “Honest Ads Act”—first introduced in 2017—aims to prevent foreign interference in future elections and modernize campaign finance laws to account for the increased relevance of online political advertising. If passed, the bill would expand the definition of “electioneering communication” to include online political ads, thus subjecting them to the same rules as TV and radio ads. Furthermore, the bill would require online platforms reaching more than 50 million monthly viewers to disclose who paid for an advertisement and what targeting criteria they used, create a public database of political ads, and tighten restrictions to ensure that ads are not purchased by foreign entities.

MapLight supports this bill as outlined by an opinion piece on CNN.com by MapLight Digital Deception Project Director Ann Ravel.


Honest Ads Act (H.R. 2592)

Sponsor: House Rep. Derek Kilmer (D-WA)—introduced with Representatives Elise Stefanik (R-NY), John Sarbanes (D-MD), Brian Fitzpatrick (R-PA), Kathleen Rice (D-NY), Mike Gallagher (R-WI), Charlie Crist (D-FL), John Katko (R-NY), Thomas Suozzi (D-NY), Jaime Herrera Beutler (R-WA), Jimmy Panetta (D-CA), Sean Duffy (R-WI), Bradley Scott Schneider (D-IL), Mario Diaz-Balart (R-FL), Jim Costa (D-CA), David McKinley (R-WV), Dean Phillips (D-MN), Jeff Fortenberry (R-NE), Scott Peters (D-CA), Jenniffer González-Colón (R-PR), Adam Smith (D-WA), Don Bacon (R-NE), Kurt Schrader (D-OR), Pete Stauber (R-MN), Ben Ray Luján (D-NM), and Anthony Gonzalez (R-OH) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the Committee on House Administration in May 2019, but it was included in the “For the People Act of 2019” that was passed by the House in March 2019.
Companion: S. 1356

The “Honest Ads Act” aims to prevent foreign interference in future elections and modernize campaign finance laws to account for the increased relevance of online political advertising. If passed, the bill would expand the definition of “electioneering communication” to include online political ads, thus subjecting them to the same rules as TV and radio ads. The bill would also require online platforms reaching more than 50 million monthly viewers to disclose who paid for an advertisement and what targeting criteria they used, create a public database of political ads, and tighten restrictions to ensure that ads are not purchased by foreign entities.

MapLight supports this bill as outlined by an opinion piece on CNN.com by MapLight Digital Deception Project Director Ann Ravel.


DISCLOSE Act of 2019: Democracy is Strengthened by Casting Light on Spending in Elections Act of 2019 (H.R. 2977)

Sponsor: House Rep. David Cicilline (D-RI)—introduced with 172 other Democratic Representatives (All Cosponsors)
Status: There has been no movement on the bill since being referred to the Committee on House Administration, and in addition to the Committee on Ways and Means, in May 2019.
Companion: S. 1147

The “DISCLOSE Act of 2019” aims to shed light on how secret money perverts our elections. More specifically, the bill requires all corporations to report campaign expenditures of $10,000 or more to the Federal Election Commission (FEC) within 24 hours and to disclose all campaign-related expenditures to their shareholders and members; strengthens the foreign money ban by prohibiting foreign nationals from participating in decision-making about corporate campaign spending; and mandates that groups running and funding a political advertisement state that they “approve this message.” The DISCLOSE Act includes several provisions that the House passed in its ethics and government reform package, H.R. 1, which Senate Majority Leader Mitch McConnell has so far refused to bring up for a vote.


Security

Protecting Against Public Safety Disinformation Act (H.R. 7282)

Sponsor: House Rep. Lauren Underwood (D-IL)—introduced with Democratic representatives Donald Payne (NJ), Val Demings (FL), Bennie Thompson (MS), and Sheila Jackson Lee (TX) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the Subcommittee on Intelligence and Counterterrorism, and the Subcommittee on Cybersecurity, Infrastructure Protection, and Innovation in August 2020.

The “Protecting Against Public Safety Disinformation Act,” introduced in June 2020, would do four things. First, it would direct the DHS Office of Intelligence and Analysis to assess the impact of malign disinformation operations carried out by foreign actors on homeland security threats. Second, it would charge the DHS Science and Technology Directorate with developing research-based methods for identifying and countering these disinformation operations. Third, it would ensure that state and local public health officials, first responders, and emergency managers have access to those assessments and recommendations. Finally, it would require the DHS to report to Congress on its findings.


Foreign Interference

SHIELD Act: Stopping Harmful Interference in Elections for a Lasting Democracy (H.R. 4617)

Sponsor: House Rep. Zoe Lofgren (D-CA)—introduced with Democratic Representatives John Sarbanes (MD), Jamie Raskin (MD), Susan Davis (CA), G. K. Butterfield (NC), Marcia Fudge (OH), Pete Aguilar (CA), Jerrold Nadler (NY), Stephanie Murphy (FL), Donald McEachin (VA), Tom Malinowski (NJ), Derek Kilmer (WA) (All Cosponsors)
Status: The bill passed the House along party lines in October 2019 and has since been sent to the Senate, where it is unlikely to advance.

The “SHIELD Act” aims to combat foreign efforts to interfere in U.S. elections. The bill would require campaigns, parties, and political committees such as PACs and super PACs to report “illicit offers” of election assistance from foreign governments or individual actors to both the Federal Bureau of Investigation (FBI) and the Federal Election Commission (FEC); subject political advertisements sold online to the same stringent rules as ads sold on television, radio, and satellite; prohibit foreign nationals from participating in decision-making about contributions or expenditures by corporations, PACs, super PACs, and other entities, and close other avenues of foreign spending to influence elections, including digital advertising; and treat the offering of non-public campaign material to foreign governments and their agents as an illegal solicitation of support. The bill also incorporates the “Deceptive Practices and Voter Intimidation Prevention Act” (S. 1834), which prohibits deceptive practices about voting procedures.


FADE Act: Foreign Agent Disclaimer Enhancement Act (H.R. 8464)

Sponsor: House Rep. Abigail Davis Spanberger (D-VA)—introduced with Rep. John Katko (R-NY) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the House Committee on the Judiciary in September 2020.

This bipartisan bill would require that disclaimers attributing political content to a foreign principal be embedded on the face of a social media post itself. Platforms would also be responsible for ensuring that disclaimers remain with a post at all times, including when the post is subsequently shared. This requirement would apply only to platforms with 50,000,000 or more unique monthly United States visitors or users for a majority of months during the 12 months preceding the dissemination of the materials. The FADE Act would also clarify that these disclaimer requirements apply to any political communications directed at the United States, regardless of the foreign agent’s location.


Privacy

Social Media Privacy Protection and Consumer Rights Act of 2019 (S. 189)

Sponsor: Sen. Amy Klobuchar (D-MN)—introduced with Sen. John Kennedy (R-LA) (All Cosponsors)
Status: There has been no movement on the bill since being read twice and referred to the Committee on Commerce, Science, and Transportation in January 2019.

The “Social Media Privacy Protection and Consumer Rights Act of 2019” would require online platform operators to inform users, before they create an account or otherwise use the platform, that their personal data will be collected and used by the operator and third parties. The bill would also require platforms to let users specify their privacy preferences, although an operator may deny certain services or access to its platform if a user’s privacy settings make those services impossible to provide. Furthermore, platforms would need to provide users with a free copy of their personal data and notify them of a data breach no later than 72 hours after becoming aware of it. Violations of the bill’s privacy requirements by an operator or third party would be treated as unfair or deceptive acts or practices under the Federal Trade Commission Act, and the FTC could enforce the bill against nonprofit organizations and against common carriers regulated by the Federal Communications Commission under the Communications Act of 1934.


DETOUR Act: Deceptive Experiences To Online Users Reduction Act (S. 1084)

Sponsor: Sen. Mark Warner (D-VA)—introduced with Sen. Deb Fischer (R-NE) (All Cosponsors)
Status: There has been no movement on the bill since being referred to the Senate Committee on Commerce, Science, and Transportation in April 2019.

This bipartisan bill would prohibit online platforms from using so-called “dark patterns”: deceptive user interfaces designed to trick users into agreeing to give away personal data without fully understanding what they are consenting to.