The Shortlist: Seven Ways Platforms Can Prepare for the U.S. 2024 Election

An ounce of election risk mitigation is worth a pound of cure.
Tech Platforms’ Consequential Choices in the “Year of Elections”

In 2024, the “year of elections,”1Editorial, This Isn’t Just an Election Year. It’s the Year of Elections, WASH. POST (Dec. 31, 2023, 7:15 AM), the technology platforms that comprise today’s online information ecosystem are facing a watershed moment. With the U.S. election season underway and 82 other elections being held around the world this year, platforms’ election preparations and guardrails are poised to play a critical role in the production and spread of election information for more than four billion voters.2See Elections Everywhere All at Once, ATLANTIC COUNCIL (Dec. 6, 2023, 3:00 PM); Katie Harbath & Ana Khizanishvili, Insights from Data: What the Numbers Tell Us About Elections and Future of Democracy, INTEGRITY INSTITUTE (Oct. 30, 2023).

Protect Democracy has produced four recommendations for each of three platform categories: (1) social media platforms, (2) messaging platforms, and (3) generative AI platforms. These recommendations are meant to inform platforms’ preparations to safeguard the information ecosystem surrounding the U.S. general election. They are not intended to be comprehensive; rather, they are priority interventions that can be adapted to each platform’s nuances and implemented in the time remaining before November. Notably, we do not suggest that platforms ban large categories of content or avoid serving as sites of election information. Nor do we expect that platforms will be able to identify and act upon every piece of election-threatening content created with or published on their surfaces.

There is no way to fully address the digital threats surrounding the 2024 U.S. election cycle. Increased scrutiny by the Select Subcommittee on the Weaponization of the Federal Government and via litigation like Murthy v. Missouri has put pressure on platforms’ integrity measures and coordination with external stakeholders, including government agencies.3See Emily Brooks & Rebecca Klar, ‘Weaponization’ Subcommittee Members Spar Over ‘Twitter Files,’ THE HILL (March 9, 2023, 2:21 PM); Ann E. Marimow, Supreme Court Says White House May Continue Requests to Tech Companies, WASH. POST (Oct. 20, 2023, 6:14 PM). Moreover, even if platforms were to implement every guardrail and mitigation available, bad actors would still find ways to produce and distribute election-threatening content online.

Recognizing this, our recommendations offer a pragmatic and systemic approach to risk mitigation — the platform equivalent to an ounce of prevention being worth a pound of cure. They highlight proactive measures that do not censor, but rather lay safeguards along the path to scaled production or distribution.  

We recognize that neither silencing voices nor allowing a small set of users to dominate the online environment is healthy for democracy or online communities. And we believe the adoption of these recommendations would meaningfully reduce the volatility the online information environment threatens to inject into the 2024 election. By implementing these measures, platforms will serve as sites for democratic discourse and demonstrate that protecting our experiment in self-government is a priority worth optimizing for.

What’s Changed Since 2020

Today’s landscape of online platforms is far larger and more fragmented — though no less interconnected — than the field of platforms available during the 2020 election or any previous election cycle. Platforms vary widely in design, content formats, user base, and company size, but broadly fall into three major categories: (1) social media platforms, (2) messaging platforms, and (3) generative AI platforms. None of these categories operates in a silo; rather, their products are complementary and connected, together driving the dynamics of how content is created and spread online.

As the most significant content distribution platforms in the online ecosystem, social media and messaging platforms have been both channels for valuable election information and vectors for disinformation and election subversion narratives. Generative AI platforms, in contrast, are content production platforms. Their widespread availability has made it easier than ever to develop high-quality synthetic media across content formats (visual, audio, and text). This includes first- and third-party offerings that integrate generative AI capabilities into a range of products and surfaces, including social media and messaging platforms.4For example, Meta has announced the integration of generative AI across Facebook, Instagram, Messenger, and WhatsApp, including Meta AI. What’s New Across Our AI Experiences, META: NEWSROOM (Dec. 12, 2023, 12:05 PM). Examples of synthetic content being created and spread with the aim of influencing elections are already multiplying.5See Morgan Meaker, Slovakia’s Election Deepfakes Show AI is a Danger to Democracy, WIRED (Oct. 3, 2023, 7:00 AM); Ali Swenson & Will Weissert, New Hampshire Investigating Fake Biden Robocall Meant to Discourage Voters Ahead of Primary, AP NEWS (Jan. 22, 2024, 11:32 PM); Rishi Iyengar, How China Exploited Taiwan’s Election—and What It Could Do Next, FOREIGN POLICY (Jan. 23, 2024, 2:37 PM); Kate Lamb, Fanny Potkin & Ananda Teresia, Generative AI May Change Elections This Year. Indonesia Shows How, REUTERS (Feb. 8, 2024, 6:59 AM).

Alongside changes in the platform landscape, the risks facing American elections and democracy more broadly have shifted and escalated since 2020.6Deana El-Mallawany, Justin Florence & Jessica Marsden, The Triple Threat to Democracy in 2024, PROTECT DEMOCRACY (Jan. 10, 2024). The 2024 election faces a rise in threats and harassment directed at election officials,7William Brangham & Ian Couzens, Election Workers Face Violent Threats and Harassment Amid Dangerous Political Rhetoric, PBS NEWSHOUR (Nov. 16, 2023, 6:35 PM), experts’ concerns about an increased risk of violence,8Jennifer Dresden & Lilliana Mason, Violence and Democracy Impact Tracker: December 2023 Update, PROTECT DEMOCRACY (Dec. 12, 2023), and narratives proliferating online and offline that erode confidence in our electoral systems.9El-Mallawany, supra note 6.

Social media, messaging, and generative AI platforms will inevitably be critical sources and conduits of election information this cycle. As such, they can and should make choices that meaningfully enhance the degree to which the 2024 presidential election is free and fair. Of course, these choices require tradeoffs, both in resource investment and in short-term engagement on a platform. Fortunately, there is good precedent for platforms making tough choices that uphold their responsibility as hosts of election information and democratic engagement.

With limited time remaining and platforms’ 2024 product roadmaps already in flight, it is more essential than ever that platforms make such choices again. The work of implementing election-protection strategies that are pragmatic, achievable, and impactful has begun and must continue through Inauguration Day 2025. What follows are recommendations to help achieve those aims.

Summary of Recommendations

Full Platform Recommendations

With contributions and thanks to: Aaron Baird, Alexandra Chandler, Chris Crawford, Rachel Goodman, Brad Jacobson, Christian Johnson, Jess Marsden, and Cecelia Vieira.

About the Author

Nicole Schneidman

Technology Policy Strategist

Nicole Schneidman is a Technology Policy Strategist at Protect Democracy working to combat anti-democratic applications and impacts of technology, including disinformation.