Research Report

Digital Disinformation and Vote Suppression

Summary: Election officials, internet companies, the federal government, and the public must act to defend the 2020 elections against digital disinformation attacks designed to suppress the vote.

Published: September 2, 2020

Executive Summary and Recommendations

U.S. elections face extreme pressure in 2020. The Covid-19 crisis has created new challenges for election officials and pushed them to make last-minute changes to the voting process, typically with resources that were already stretched thin. Pandemic-related voting changes have become an election issue themselves, with political actors sowing confusion for the benefit of their party. Bad actors have circulated lies to trick certain groups out of voting — and thanks to social media, these deceptive practices can instantly reach huge numbers of people. Experts warn that foreign powers have learned from Russia’s 2016 election interference efforts and will try to covertly influence the American electorate this year. 

State and local election officials play a crucial role in defending U.S. elections against these threats and in protecting American voters from disenfranchisement due to disinformation. Internet companies and members of the public can also take action against deceptive practices, voter intimidation, and other forms of digital vote suppression. In all cases, accurate information from trusted official sources provides the best antidote to disinformation about voting. 

Summary Recommendations

Election officials should:

  1. Develop plans and procedures to publicize corrective information. Make written plans to push out correct information without repeating falsehoods. Establish channels of communication with the public and with key actors like community groups, candidates, and the media.
  2. Publicize official sources of accurate information to build public trust. Disseminate information on well-publicized sources like websites, emails, advertising, and social media accounts that are active and verified by the platform.
  3. Protect official sources from hacking and manipulation. Secure official websites and social media accounts from being used to trick voters by implementing cybersecurity best practices like tight access controls, multifactor authentication, and anti-phishing procedures.
  4. Monitor for disinformation. Actively watch for falsehoods about elections, set up ways for the public to report instances of digital disinformation, work with internet companies, and participate in information-sharing networks.
  5. Build relationships with communities and media. Perform early public outreach to communities, including in appropriate languages, to facilitate communication before an incident occurs. Build relationships with local and ethnic media.

Internet companies should:

  1. Proactively provide information about how to vote.
  2. Maintain clear channels for reporting disinformation.
  3. Take down false information about voting but preserve the data.
  4. Protect official accounts and websites.
  5. Push corrective information to specific users affected by disinformation. 

The federal government should: 

  1. Enact the Deceptive Practices and Voter Intimidation Prevention Act.
  2. Share intelligence about incidents of disinformation and help disseminate correct information.

Introduction

The Covid-19 pandemic has created unprecedented challenges for election administration. Many state and local officials have had to completely change their plans in order to address new obstacles like stay-at-home orders, exponentially higher demand for absentee ballots, and the potential shortage of poll workers, who tend to be older and thus are at higher risk for severe illness from the virus. Elections that have already taken place in 2020 have seen massive changes, including drastic increases in voting by mail or absentee ballot. Due to the pandemic, some officials have made significant last-minute changes to the voting process. In some cases, rules have changed back and forth even in the final hours before Election Day, such as when courts have intervened to block changes. [1] Jim Rutenberg and Nick Corasaniti, “How a Supreme Court Decision Curtailed the Right to Vote in Wisconsin,” New York Times, April 13, 2020, https://www.nytimes.com/2020/04/13/us/wisconsin-election-voting-rights.html; and Allan Smith, “Ohio Primary Called Off at Last Minute Because of Health Emergency,” NBC News, March 16, 2020, https://www.nbcnews.com/politics/2020-election/ohio-governor-calls-state-postpone-tuesday-s-primary-elections-n1160816.

This dynamic, even chaotic, environment has enormous potential to create confusion among voters. Key voting information — including election dates, polling locations, and mail-in voting rules — is suddenly subject to change. Voters may not learn about such changes in time to comply, or they may receive conflicting information and not know which sources to believe.

These factors leave voters more vulnerable to bad actors who use deceptive practices to spread false information in an attempt to trick people out of voting. In the United States, there is a long history of using such practices to keep certain voters away from the polls. And in recent years, the internet and social media platforms have increased the threat of vote suppression. For example, a deceptive tweet can reach millions of readers in a matter of minutes. [2] More Americans get news on social media than from print newspapers. In 2018, one in five adults said they often get news on social media. A. W. Geiger, “Key Findings About the Online News Landscape in America,” Pew Research Center, September 11, 2019, https://www.pewresearch.org/fact-tank/2019/09/11/key-findings-about-the-online-news-landscape-in-america/. Because of stay-at-home orders and quarantines due to Covid-19, voters are all the more dependent on online information. And even before the pandemic, national security experts warned repeatedly that foreign powers plan to attack American elections. [3] See, e.g., Ken Dilanian, “U.S. Intel Agencies: Russia and China Plotting to Interfere in 2020 Election,” NBC News, January 29, 2019, https://www.nbcnews.com/politics/national-security/u-s-intel-agencies-russia-china-plotting-interfere-2020-election-n963896 (“U.S. intelligence agencies assess that Russia and China will seek to interfere in the 2020 presidential election . . . .”). There is already evidence of the warning’s accuracy. Shane Harris and Ellen Nakashima, “With a Mix of Covert Disinformation and Blatant Propaganda, Foreign Adversaries Bear Down on Final Phase of Presidential Campaign,” Washington Post, August 21, 2020, https://www.washingtonpost.com/national-security/with-a-mix-of-covert-disinformation-and-blatant-propaganda-foreign-adversaries-bear-down-on-final-phase-of-presidential-campaign/2020/08/20/57997b7a-dbf1-11ea-8051-d5f887d73381_story.html.

As a result, the risk for voter disenfranchisement due to disinformation — lies spread for a political purpose — is perhaps higher in 2020 than ever before.

The United States must build systems now to defend the electorate from disinformation about voting procedures. The country cannot afford to wait until incidents unfold. The officials who run American elections are aware of the threats and have been working to safeguard their systems and keep voters informed. Many agencies need more resources, however, and every government must make hard choices about where to focus the resources it has.

This report builds on election officials’ efforts so far, along with recommendations from security and communications experts, to recommend the crucial steps to shield against deceptive voter suppression and to prepare for fast responses to disinformation attacks.

Deceptive Practices 

There is a long record of attempts to trick certain people out of voting. These deceptive practices have often involved the use of flyers, mailers, and robocalls. [4] See, e.g., Wendy Weiser and Vishal Agraharkar, Ballot Security and Voter Suppression: What It Is and What the Law Says, Brennan Center for Justice, 2012, 9, https://www.brennancenter.org/sites/default/files/2019-08/Report_Ballot_Security_Voter_Suppression.pdf. For example, during Texas’s Super Tuesday primary in March 2020, robocalls falsely told voters that they could vote “tomorrow.” [5] Alia Slisco, “Robocalls Spreading Super Tuesday Misinformation Throughout Texas,” Newsweek, March 3, 2020, https://www.newsweek.com/robocalls-spreading-super-tuesday-misinformation-throughout-texas-1490368. Similarly, in 2004, flyers distributed in Franklin County, Ohio, falsely told voters that Republicans should vote on Tuesday and Democrats on Wednesday due to high levels of voter registration. [6] Wendy Weiser and Adam Gitlin, Dangers of “Ballot Security” Operations: Preventing Intimidation, Discrimination, and Disruption, Brennan Center for Justice, 2016, 6, https://www.brennancenter.org/our-work/research-reports/dangers-ballot-security-operations-preventing-intimidation-discrimination. A related tactic is to intimidate voters with false reports of law enforcement presence, immigration enforcement actions, or election monitoring by armed individuals. [7] “In 2008 in Philadelphia, flyers posted near Drexel University incorrectly warned that police officers would be at polling places looking for individuals with outstanding arrest warrants or parking tickets.” Weiser and Gitlin, Dangers of “Ballot Security” Operations, 6.

These voter suppression tactics frequently target historically disenfranchised communities, including communities of color, low-income communities, and immigrant communities. [8] Common Cause and the Lawyers’ Committee for Civil Rights Under Law, Deceptive Election Practices and Voter Intimidation: The Need for Voter Protection, 2012, 4, https://lawyerscommittee.org/wp-content/uploads/2015/07/DeceptivePracticesReportJuly2012FINALpdf.pdf. For example, during Alabama’s U.S. Senate special election in 2017, residents of Jefferson County — where the largest city, Birmingham, is predominantly African American — received text messages with false information about polling site changes. [9] Sean Morales-Doyle and Sidni Frederick, “Intentionally Deceiving Voters Should Be a Crime,” The Hill, August 8, 2018, https://thehill.com/opinion/civil-rights/400941-intentionally-deceiving-voters-should-be-a-crime. And on Election Day in 2010, Maryland gubernatorial candidate Bob Ehrlich’s campaign manager targeted African American households with robocalls claiming that Governor Martin O’Malley had already been reelected, implying that his supporters could stay home instead of voting. [10] Weiser and Agraharkar, Ballot Security and Voter Suppression, 9; and Luke Broadwater, “Prosecutors: GOP ‘Robocall’ Plan to Suppress Black Votes Hatched on Hectic Election Day,” Baltimore Sun, November 29, 2011, https://www.baltimoresun.com/politics/bs-md-shurick-trial-20111129-story.html.

Deceptive election practices are most commonly deployed in the final days before an election, when there is little time to rebut them before voting begins. As a result, the scale and scope of voter suppression tactics for the 2020 election remain unknown, although recent history suggests disinformation will be a significant problem.

New Dangers Online

While dirty tricks in elections are an old phenomenon, in the 21st century deceptive practices have become more dangerous than ever before. The continued growth of the internet and social media platforms has made it easier and more affordable to reach huge numbers of people instantaneously and anonymously. Traditionally, deceptive practices involved narrow targeting by geography, such as with flyers on telephone poles in certain neighborhoods. Now, however, bad actors can use sophisticated microtargeting to surgically focus on certain demographics, and they can direct disinformation either toward disrupting a specific local election or toward a national audience. 

Amid rising polarization and mistrust of institutions in recent years, the Cold War–era concept of “disinformation” — the intentional spread of false information — has regained currency in American politics. [11] Caroline Jack, Lexicon of Lies: Terms for Problematic Information, Data & Society Research Institute, 2017, 3, https://datasociety.net/pubs/oh/DataAndSociety_LexiconofLies.pdf. Disinformation is often distinguished from misinformation, which is false information that is spread without bad intent. Although this report focuses on disinformation, the key remedy — disseminating correct information — is the same regardless of the intent of those who spread false information.

In recent U.S. elections, bad actors, both foreign and domestic, have attempted to stop certain people from voting by spreading false information online. Leading up to the 2016 U.S. presidential election, operatives of the Internet Research Agency, a Russian company tied to President Vladimir Putin, engaged in voter suppression by posing as Americans and posting messages and ads on social media. They used deceptive practices like directing people to vote by text, which is not a valid option anywhere in the United States. Operatives also targeted African Americans with messages recommending election boycotts or votes for third-party candidates. [12] Renee DiResta et al., The Tactics & Tropes of the Internet Research Agency, New Knowledge, 2018, 8, 85, https://disinformationreport.blob.core.windows.net/disinformation-report/NewKnowledge-Disinformation-Report-Whitepaper.pdf; Kurt Wagner, “These Are Some of the Tweets and Facebook Ads Russia Used to Try and Influence the 2016 Presidential Election,” Vox, October 31, 2017, https://www.vox.com/2017/10/31/16587174/fake-ads-news-propaganda-congress-facebook-twitter-google-tech-hearing (including examples recommending that Clinton voters vote by text); and Joseph Bernstein, “Inside 4chan’s Election Day Mayhem and Misinformation Playbook,” Buzzfeed, November 7, 2016, https://www.buzzfeednews.com/article/josephbernstein/inside-4chans-election-day-mayhem-and-misinformation-playboo#.hsq0K9jOjM (same).

In 2018, many social media accounts posted false voting information, including instructions to vote by text and claims that voters of one party were required to vote the day after Election Day. [13] Young Mie Kim, “Voter Suppression Has Gone Digital,” Brennan Center for Justice, November 20, 2018, https://www.brennancenter.org/our-work/analysis-opinion/voter-suppression-has-gone-digital. A candidate paid for a Facebook ad that falsely suggested that Kansans would need documentary proof of citizenship in order to register to vote. [14] Kim, “Voter Suppression Has Gone Digital.” Other messages conveyed threats about people bringing guns to the polls or law enforcement rounding people up at polling places. [15] Kim, “Voter Suppression Has Gone Digital.” Just days before the 2018 election, U.S. Immigration and Customs Enforcement had to publicly address rumors that had spread via flyers and social media by announcing that it would not, in fact, conduct enforcement operations at polling places. [16] Center for the Advancement of Public Integrity, Prosecuting Vote Suppression by Misinformation, 2019, 1, https://web.law.columbia.edu/public-integrity/Prosecuting-Vote-Suppression-By-Misinformation.

Digital disinformation about voting has spread online during the 2020 presidential primaries, even as many voters have relied more than ever on the internet for election information due to public health directives to stay home amid Covid-19. On Super Tuesday, there were instances of disinformation across several states, including messages directing certain voters to vote on Wednesday. One earlier tweet using the same tactic, targeting supporters of Kentucky gubernatorial candidate Matt Bevin, read, “Bevin supporters do not forget to vote on Wednesday, November 6th!” [17] Election Protection monitored social media for voting disinformation and found many messages directing certain voters to vote on Wednesday. For a preserved screenshot of the tweet quoted in the text, see https://monosnap.com/file/fgCcA3nsv4rdk1IkUIC8xB60TQWcR0. On the day of an August congressional primary in Florida, some voters received texts linking to a YouTube video falsely presented as a candidate’s announcement that he had dropped out. [18] Alex Marquardt and Paul P. Murphy, “Fake Texts and YouTube Video Spread Disinformation about Republican Primary Candidate on Election Day,” CNN, August 18, 2020, https://www.cnn.com/2020/08/18/politics/byron-donalds-fake-texts-florida-republican-primary/index.html.

Bad actors are spreading lies about Covid-19 to try to stop people from voting. For example, on Super Tuesday, some circulated messages on Twitter like one warning everyone over age 60 that “#coronavirus has been reported at ALL polling locations for #SuperTuesday.” [19] Adam Rawnsley, “Sick: Trolls Exploit Coronavirus Fears for Election Fun,” Daily Beast, March 3, 2020, https://www.thedailybeast.com/trolls-exploit-coronavirus-fears-for-super-tuesday-fun.

Foreign states have attempted to influence the 2020 elections as well, including through attacks intended to discourage voters from supporting specific front-runner candidates. [20] Young Mie Kim, “New Evidence Shows How Russia’s Election Interference Has Gotten More Brazen,” Brennan Center for Justice, March 5, 2020, https://www.brennancenter.org/our-work/analysis-opinion/new-evidence-shows-how-russias-election-interference-has-gotten-more. They have also circulated stories pushing the narrative that American elections are rigged. [21] Samantha Lai, “Russia’s Narratives about U.S. Election Integrity in 2020,” Foreign Policy Research Institute, May 25, 2020, https://www.fpri.org/fie/russia-election-integrity-2020/; and Samantha Lai, “Iran’s Narratives about U.S. Election Integrity in 2020,” Foreign Policy Research Institute, June 19, 2020, https://www.fpri.org/fie/iran-election-integrity-in-2020/. In late 2019, the FBI and Department of Homeland Security (DHS) warned state election officials that Russia could try to interfere in the 2020 election by discouraging voter turnout. [22] Kevin Collier, “Russia Likely to Focus on Voter Suppression in 2020, Feds Warn States,” CNN, October 13, 2019, https://www.cnn.com/2019/10/03/politics/russia-voter-suppression-warning/index.html.

Legal Remedies 

People who attempt to suppress the vote by spreading false information may be violating several federal and state laws. [23] See, e.g., 42 U.S.C. § 1985(3) (providing cause of action if “two or more persons . . . conspire . . . for the purpose of depriving, either directly or indirectly, any person or class of persons of the equal protection of the laws, or of equal privileges and immunities under the laws”). Many states prohibit various forms of voter intimidation and election interference and impose criminal or civil penalties. [24] See generally Weiser and Gitlin, Dangers of “Ballot Security” Operations, 1–4. For instance, Virginia imposes criminal penalties for knowingly communicating false election information to a registered voter: “It shall be unlawful for any person to communicate to a registered voter, by any means, false information, knowing the same to be false, intended to impede the voter in the exercise of his right to vote.” Va. Code Ann. § 24.2-1005.1(A). Wisconsin similarly prohibits “false representation[s] pertaining to a candidate or referendum.” Wis. Stat. § 12.05. States have criminally prosecuted operatives involved in spreading disinformation about voting. For example, the campaign manager and a political consultant for 2010 Maryland gubernatorial candidate Bob Ehrlich — both of whom were involved in robocalls designed to suppress Black votes — were convicted of offenses including fraud and failing to identify the source of the calls. [25] Gubernatorial candidate Bob Ehrlich’s campaign manager was convicted of election fraud and failing to identify the source of the calls. John Wagner, “Ex-Ehrlich Campaign Manager Schurick Convicted in Robocall Case,” Washington Post, December 6, 2011, https://www.washingtonpost.com/local/dc-politics/ex-ehrlich-campaign-manager-schurick-convicted-in-robocall-case/2011/12/06/gIQA6rNsaO_story.html. A consultant for the campaign was convicted of failing to identify the source of the calls. Luke Broadwater, “Julius Henson Sentenced to Jail in ‘Robocall’ Case,” Baltimore Sun, June 13, 2012, https://www.baltimoresun.com/news/bs-xpm-2012-06-13-bs-md-henson-sentencing-20120613-story.html.

These existing laws are important but insufficient. To be sure, people who spread disinformation should be held accountable, and enforcement should serve as a deterrent to future misconduct. But litigation and similar measures happen after the election, and therefore after any damage to the franchise has already been done. Additionally, existing laws against deceptive practices differ in breadth, and enforcement can be irregular. [26] Common Cause and the Lawyers’ Committee for Civil Rights Under Law, Deceptive Election Practices, 2–3. Below, this paper recommends federal legislation that would expressly prohibit deceptive practices and provide for clear sanctions and corrective action. [27] Deceptive Practices and Voter Intimidation Prevention Act of 2019, H.R. 3281, 116th Cong. (2019), included in the For the People Act of 2019, H.R. 1, 116th Cong. (2019).

Regardless of whether legal reforms are enacted, there are concrete actions that state and local election officials, internet companies, members of Congress, and ordinary citizens can take today that will help protect voters from the effects of deceptive practices, voter intimidation, and other forms of digital vote suppression.

Recommendations for Election Officials

State and local election officials are the most important actors in addressing the use of disinformation to suppress votes. They must establish themselves as trusted sources of accurate information for voters in their communities. The most urgent response to an instance of disinformation about voting is to disseminate correct information, as discussed immediately below. But officials cannot wait for incidents to occur. This paper describes crucial preparations that should be made long in advance.

1. Develop plans and procedures to publicize corrective information

The most important way to respond to disinformation is to correct it with the truth without helping to spread the lie, a task that requires long-term planning and infrastructure building. It is too late to formulate a plan on Election Day, or even during early voting periods. Instead, agencies should adopt written procedures that cover what to do and who should do it when deceptive practices are discovered. [1] Election officials should “[d]raft, review, and approve a communications plan prior to negative developments.” Edgardo Cortés et al., Preparing for Cyberattacks and Technical Problems during the Pandemic: A Guide for Election Officials, Brennan Center for Justice, 2020, 22, https://www.brennancenter.org/our-work/research-reports/preparing-cyberattacks-and-technical-problems-during-pandemic-guide. Relevant staff members should receive checklists and training that includes simulated incidents or tabletop exercises. [2] Advice on and examples of checklists can be found in related election infrastructure resilience guidance. Brennan Center for Justice, “Preparing for Cyberattacks and Technical Problems During the Pandemic: A Checklist for Election Officials,” 2020, https://www.brennancenter.org/sites/default/files/2020-06/2020_06_PreparingforAttack_Checklist.pdf; and Belfer Center for Science and International Affairs, Election Cyber Incident Communications Plan Template, 2018, https://www.belfercenter.org/sites/default/files/files/publication/CommunicationsTemplate.pdf.

The DHS Cybersecurity and Infrastructure Security Agency (CISA) offers standardized incident response plans, trainings, and simulated disinformation scenarios to assist state and local agencies. [3] U.S. Department of Homeland Security (hereinafter DHS), Cybersecurity and Infrastructure Security Agency, #Protect2020 Strategic Plan, 2020, 12, https://www.cisa.gov/sites/default/files/publications/ESI%20Strategic%20Plan_FINAL%202.7.20%20508.pdf. The Belfer Center for Science and International Affairs has outlined best practices for communicating with the public to counter false information. [4] Belfer Center, Election Cyber Incident Communications Plan Template, 11.

Establish channels of communication with key actors 

The goal of contingency plans is to prepare to most effectively correct false information with the truth without helping to spread lies. When officials find instances of disinformation about voting, they should proactively distribute accurate information to all the appropriate actors and channels, including:

  • election officials’ websites, social media accounts, and other channels
  • individuals staffing phone lines and reception areas at election offices
  • information-sharing networks like the voter assistance coalition Election Protection
  • media outlets that serve affected communities, including those in languages other than English
  • community groups and faith leaders
  • parties and candidates running in the relevant election

Officials should establish these channels of communication long before Election Day. Relationships must be built and maintained over time, not when an emergency is already underway. Establishing lines of communication early allows election officials to include points of contact in their plans. That way, when disinformation is detected, officials can simply go down the list of contacts and share corrections with the relevant individuals and groups. In addition, ongoing communication helps inform political actors of whom to contact to report incidents. Community groups, for example, may be the first to discover disinformation about voting, and information will be shared most efficiently if they know the right election official to report it to.

Election agencies should consider naming an individual official or a small team to lead information collection, which includes processing reports that are made to an official email address, phone number, or social media account. [5] “Every organization, political campaign, activists and those using these platforms for outreach should have a focused disinformation arm.” Stop Online Violence Against Women, A Threat to an American Democracy: Digital Voter Suppression, 2020, 21, http://stoponlinevaw.com/wp-content/uploads/2020/02/7.pdf. The same official or team should also monitor social media more broadly for false content or receive direct reports from staff or vendors who are monitoring digital content. [6] Belfer Center for Science and International Affairs, The State and Local Election Cybersecurity Playbook, 2018, 18, https://www.belfercenter.org/sites/default/files/files/publication/StateLocalPlaybook%201.1.pdf. And a single authority should be in charge of sharing information about incidents with appropriate networks and informing affected communities of the correct information. Without clear accountability, issues and incidents may fall through the cracks.

Do not repeat falsehoods

Messages from election officials correcting disinformation should not repeat falsehoods. Research shows that repeating falsehoods to debunk them can backfire and make people more likely to remember the false information. [7] “One of the most frequently used correction strategies, the myth-versus-fact format, can backfire because of repetition of the myth, leaving people all the more convinced that their erroneous beliefs are correct.” Norbert Schwarz et al., Making the Truth Stick & the Myths Fade: Lessons from Cognitive Psychology, Behavioral Science & Policy Association, 2016, 86, https://behavioralpolicy.org/wp-content/uploads/2017/05/BSP_vol1is1_Schwarz.pdf. If officials consider it absolutely necessary to include the original disinformation, they should structure the messages to present accurate and easy-to-understand information first, warn that the disinformation is false before mentioning it, and repeat the facts. [8] Schwarz et al., Making the Truth Stick, 92–93. People are more likely to remember the first and last things they hear, as well as information that is repeated.

As an example, during the Texas presidential primary on Super Tuesday, robocalls told voters to vote “tomorrow.” [9] Slisco, “Robocalls Spreading Super Tuesday Misinformation.” The official Twitter account of the Texas secretary of state alerted the public that false information was being circulated and provided correct information without repeating the falsehood. [10] Texas Secretary of State (@TXsecofstate), “Our office has received reports of robocalls stating misinformation about today’s primary election. To be clear, all eligible voters should vote today,” Twitter, March 3, 2020, 4:30 p.m., https://twitter.com/TXsecofstate/status/1234954361941479424?s=20.

2. Publicize official sources of accurate information to build public trust

Election officials must work to build resilience to disinformation long before Election Day. Accurate information from a trusted source provides the most effective shield against deceptive vote suppression. Accordingly, election agencies should build public trust in specific sources of information before disinformation attacks occur. [11] Belfer Center, The State and Local Election Cybersecurity Playbook, 18. They should proactively inform the public of key information like dates for voting and encourage voters to look up polling places and registration status well in advance.

State and local agencies should designate and publicize specific sources of information — at a minimum, a phone number and website — as authoritative. They should provide an official email address and maintain social media accounts on platforms popular in local communities. [fn. 12: University of Pittsburgh Institute for Cyber Law, Policy, and Security, The Blue Ribbon Commission on Pennsylvania’s Election Security: Study and Recommendations, 2019, 52, https://www.cyber.pitt.edu/sites/default/files/FINAL%20FULL%20PittCyber_PAs_Election_Security_Report.pdf.] When using traditional forms of outreach like mailers and advertising, agencies should direct voters to official digital sources. 

Wherever possible, election agencies and officials should ensure their social media accounts are designated as official by the associated platforms — for example, as a verified or “blue check” account on Twitter. [fn. 13: “Election officials should . . . secure ‘verified’ status for their official accounts on social media platforms.” Ad Hoc Committee for 2020 Election Fairness and Legitimacy, Fair Elections During a Crisis: Urgent Recommendations in Law, Media, Politics, and Tech to Advance the Legitimacy of, and the Public’s Confidence in, the November 2020 U.S. Elections, 2020, 20, https://www.law.uci.edu/faculty/full-time/hasen/2020ElectionReport.pdf.] The procedures and requirements for obtaining verified accounts vary depending on the platform. [fn. 14: For example, Facebook and Twitter both have verified account programs. “How do I request a verified badge on Facebook?” Facebook Help Center, accessed August 14, 2020, https://www.facebook.com/help/1288173394636262; and “Verified Account FAQs,” Twitter Help Center, accessed August 14, 2020, https://help.twitter.com/en/managing-your-account/twitter-verified-accounts.] Some local election officials have had difficulty getting verified due to issues such as low follower counts. Social media companies ought to accommodate local governments, but if they do not, involvement by state officials could help. For example, Arizona Secretary of State Katie Hobbs has worked with social media companies to get county administrators’ accounts verified. [fn. 15: Matt Vasilogambros, “How Your Local Election Clerk Is Fighting Global Disinformation,” Pew Stateline, July 20, 2020, https://www.pewtrusts.org/en/research-and-analysis/blogs/stateline/2020/07/20/how-your-local-election-clerk-is-fighting-global-disinformation.] 

Election agencies and officials should not allow their social media accounts to lie dormant, but rather should work actively to gain followers, including journalists and community groups. They should maintain uniform icons, logos, and other branding across all media — including websites and social media accounts, which should link to each other. [fn. 16: Jesse Littlewood, Vice President for Campaigns, Common Cause, personal communication with author, July 24, 2020.] 

The National Association of Secretaries of State (NASS) initiative #TrustedInfo2020 promotes election officials as reliable sources of voting information. [fn. 17: “#TrustedInfo2020,” National Association of Secretaries of State website, accessed August 17, 2020, https://www.nass.org/initiatives/trustedinfo-2020.] Election officials should participate by using the hashtag to help direct audiences to their websites and social media accounts. [fn. 18: According to Kathy Boockvar, secretary of the Commonwealth of Pennsylvania, “We should continue to focus on initiatives like #TrustedInfo2020 that promote trusted election sources because there’s so much misinformation out there that has the potential to cause disruption or to undermine the confidence of our voters.” Tim Lau, “Last Chance to Secure the 2020 Elections,” Brennan Center for Justice, July 27, 2020, https://www.brennancenter.org/our-work/research-reports/last-chance-secure-2020-elections.] 

Officials should also find ways to distribute information to relevant audiences. In California, for example, officials collected millions of email addresses during the voter registration process, then sent an email to that list directing voters to the official state elections guide. [fn. 19: Brian Fung, “States Launch ‘Trusted Information’ Efforts against Fake News on Social Media,” CNN, March 3, 2020, https://edition.cnn.com/2020/03/02/politics/state-efforts-against-social-media-misinformation/index.html. Facebook allows government officials to send “local alerts” within various geographic boundaries to users who follow the officials’ page. “Local Alerts Overview,” Facebook Business Help Center, accessed August 17, 2020, https://www.facebook.com/business/help/1064049677089136?id=1549080658590154.] Illinois officials bought YouTube ads pointing viewers to county election websites. [fn. 20: Vasilogambros, “How Your Local Election Clerk Is Fighting Global Disinformation.”] Officials should make information accessible to members of the community who speak languages other than English and to people with disabilities. [fn. 21: Wendy Weiser and Max Feldman, How to Protect the 2020 Vote from the Coronavirus, Brennan Center for Justice, 2020, 10, https://www.brennancenter.org/our-work/policy-solutions/how-protect-2020-vote-coronavirus.] Finally, officials should frequently review and update digital materials to help ensure their accuracy.

3. Protect official sources from hacking and manipulation

The potential for bad actors to hijack official sources of election information — either through spoofing or hacking — represents perhaps the gravest threat to the ability of election officials to disseminate accurate information and maintain public trust. Accordingly, election agencies should invest in security measures to protect those information sources from hacking and manipulation. 

Spoofing protections 

“Spoofing” occurs when bad actors set up look-alike websites with similar or disguised URLs to divert audiences from official sources. [fn. 22: Belfer Center, The State and Local Election Cybersecurity Playbook, 41; Alex Hern, “Unicode Trick Lets Hackers Hide Phishing URLs,” Guardian, April 19, 2017, https://www.theguardian.com/technology/2017/apr/19/phishing-url-trick-hackers; and Michael Archambault, “The Danger of Spoofed Websites: Learn to Tell the Difference,” PSafe dfndr blog, May 7, 2018, https://www.psafe.com/en/blog/the-danger-of-spoofed-websites-learn-to-tell-the-difference/.] In 2019, a government imposter sent a phishing scam to thousands of email addresses, using the web domain “uspsdelivery-service.com” to mimic the U.S. Postal Service website. [fn. 23: Bryan Campbell et al., “TA2101 Plays Government Imposter to Distribute Malware to German, Italian, and US Organizations,” Proofpoint blog, November 14, 2019, https://www.proofpoint.com/us/threat-insight/post/ta2101-plays-government-imposter-distribute-malware-german-italian-and-us. The official website of the Postal Service is usps.com.] Officials should regularly check for attempts to mimic their websites and consider using services that offer custom monitoring.
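To illustrate how such monitoring can work, a script can catch many look-alike domains by normalizing Unicode and mapping known confusable characters (such as Cyrillic letters that resemble Latin ones) before comparing against the official domain. The sketch below is illustrative only: the confusables table is a tiny sample, and `looks_like` is a hypothetical helper, not part of any real monitoring product.

```python
import unicodedata

# Tiny sample of Cyrillic-to-Latin look-alikes; real monitoring services
# use much larger confusables tables (e.g., Unicode TR39 data).
CONFUSABLES = {"а": "a", "е": "e", "о": "o", "і": "i", "ѕ": "s"}

def skeleton(domain: str) -> str:
    """Reduce a domain to a comparable 'skeleton': lowercase, Unicode
    NFKC normalization, then map known look-alike characters to ASCII."""
    normalized = unicodedata.normalize("NFKC", domain.lower())
    return "".join(CONFUSABLES.get(ch, ch) for ch in normalized)

def looks_like(candidate: str, official: str) -> bool:
    """Flag a candidate domain that is not the official domain but either
    matches it after normalization or embeds its registrable name."""
    if candidate.lower() == official.lower():
        return False  # the genuine domain itself
    cand, off = skeleton(candidate), skeleton(official)
    base = off.split(".")[0]  # e.g., 'usps'
    return cand == off or base in cand

print(looks_like("uspsdelivery-service.com", "usps.com"))  # True: embeds 'usps'
print(looks_like("usps.com", "usps.com"))                  # False: the real site
```

A check like this is only a first pass; flagged domains still need human review before any takedown report is filed.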

Election agencies should use a .gov domain for their websites rather than other options such as .com, which are easier to spoof. [fn. 24: “Election officials should obtain a .gov domain for an authenticated internet presence.” Ad Hoc Committee for 2020 Election Fairness and Legitimacy, Fair Elections During a Crisis, 20.] A survey by the computer security company McAfee found that fewer than 2 in 10 counties in anticipated 2020 battleground states do so. [fn. 25: McAfee, “Website Security Shortcomings Could Render U.S. Elections Susceptible to Digital Disinformation,” accessed August 21, 2020, https://www.mcafee.com/enterprise/en-us/assets/faqs/faq-election-2020.pdf.] Using a .gov domain offers protections like two-step verification and monitoring by federal agencies, including DHS, the General Services Administration, and the National Institute of Standards and Technology. [fn. 26: DHS, Cybersecurity and Infrastructure Security Agency, Leveraging the .gov Top-Level Domain, accessed August 21, 2020, https://www.cisa.gov/sites/default/files/publications/cisa-leveraging-the-gov-top-level-domain.pdf.]

Additionally, agencies should encrypt traffic to their official websites with Secure Sockets Layer (SSL) technology, which is signaled to users by an “HTTPS” prefix in the URL and, in most browsers, a lock icon. [fn. 27: McAfee, “Website Security Shortcomings.”] They should use search engine optimization to improve their rankings in search results. Officials should regularly monitor for search engine manipulation by copycat websites and report spoofing attempts to the relevant search engine company. [fn. 28: Google, “Report Phishing Page,” accessed August 17, 2020, https://safebrowsing.google.com/safebrowsing/report_phish/?hl=en.]
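As a quick internal check, agencies can lint the links they publish in mailers and social posts for the two properties discussed above: HTTPS and a .gov host. This is a minimal illustrative sketch under those assumptions; `vet_official_link` is a hypothetical helper and the example URLs are invented.

```python
from urllib.parse import urlparse

def vet_official_link(url: str) -> list:
    """Return a list of problems with a link slated for official
    voter-information materials; an empty list means it passes both checks."""
    problems = []
    parts = urlparse(url)
    if parts.scheme != "https":
        problems.append("not served over HTTPS (no SSL/TLS encryption)")
    host = parts.hostname or ""
    if not host.endswith(".gov"):
        problems.append("not on a .gov domain (easier to spoof)")
    return problems

# Invented example URLs for illustration:
print(vet_official_link("http://myvote-elections.example.com/polls"))  # flags both issues
print(vet_official_link("https://vote.example.gov"))                   # passes: []
```

A linter like this only checks the URL string; confirming that the certificate is valid and the page is genuine still requires loading the site.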

Bad actors may set up fake social media accounts to masquerade as election officials. [fn. 29: Belfer Center, The State and Local Election Cybersecurity Playbook, 46.] During the 2016 election cycle, for example, operatives of Russia fooled some Twitter users into thinking that their @TEN_GOP account was a mouthpiece of Tennessee Republicans. [fn. 30: Special Counsel Robert S. Mueller III, Report on the Investigation into Russian Interference in the 2016 Election: Volume I of II, U.S. Department of Justice, 2019, 22, https://www.justice.gov/storage/report.pdf.] Officials should regularly monitor platforms for these copycat accounts and report them to the relevant social media companies. [fn. 31: Distinct from vote suppression harms, impersonation violates the terms of service of many platforms. Facebook, “Impersonation and Hacked Accounts,” accessed August 21, 2020, https://www.facebook.com/help/532542166925473; and Twitter, “Report Impersonation Accounts,” accessed August 21, 2020, https://help.twitter.com/en/safety-and-security/report-twitter-impersonation.]

Hacking protections

Bad actors can hack websites, email accounts, and social media accounts to take control of official channels. For example, in 2019, hackers took over Twitter CEO Jack Dorsey’s account and tweeted racist messages from his handle. [fn. 32: Kevin Collier and Ahiza Garcia, “Jack Dorsey’s Twitter Account Was Hacked — and He’s the CEO of Twitter,” CNN, August 30, 2019, https://www.cnn.com/2019/08/30/tech/jack-dorsey-twitter-hacked/index.html.] Late in the day of a primary election in Knox County, Tennessee, hackers shut down the local election commission’s website for an hour, disrupting the publication of early results. [fn. 33: Miles Parks, “Not Just Ballots: Tennessee Hack Shows Election Websites Are Vulnerable, Too,” NPR, May 17, 2018, https://www.npr.org/2018/05/17/611869599/not-just-ballots-tennessee-hack-shows-election-websites-are-vulnerable-too.] And in a July 2020 scam, hackers took control of several prominent Twitter accounts, including those of politicians, and urged their followers to send them Bitcoin. [fn. 34: Sheera Frankel et al., “A Brazen Online Attack Targets V.I.P. Twitter Users in a Bitcoin Scam,” New York Times, July 15, 2020, https://www.nytimes.com/2020/07/15/technology/twitter-hack-bill-gates-elon-musk.html. Florida prosecutors have charged a 17-year-old hacker with masterminding the attack. Nathaniel Popper et al., “From Minecraft Tricks to Twitter Hack: A Florida Teen’s Troubled Online Path,” New York Times, August 2, 2020, https://www.nytimes.com/2020/08/02/technology/florida-teenager-twitter-hack.html.] Overall, the risk of cyberattacks — including ransomware attacks on election infrastructure — has increased in 2020 as Americans have shifted to working from home due to the Covid-19 pandemic. [fn. 35: David E. Sanger and Nicole Perlroth, “Russian Criminal Group Finds New Target: Americans Working at Home,” New York Times, June 25, 2020, https://www.nytimes.com/2020/06/25/us/politics/russia-ransomware-coronavirus-work-home.html.]

Officials should implement cybersecurity best practices, such as the Brennan Center’s detailed recommendations for election infrastructure and staff training. [fn. 36: Cortés et al., Preparing for Cyberattacks, 4–5.] Agencies should maintain tight controls on who can make changes to official websites. When employees leave, their credentials should be revoked immediately. Log-in credentials for website edits should include strong passwords that are changed regularly and multifactor authentication, and sessions should log out automatically after a period of inactivity. [fn. 37: There is published guidance on multifactor authentication. DHS, Cybersecurity and Infrastructure Security Agency, “Multi-Factor Authentication,” 2019, https://www.cisa.gov/sites/default/files/publications/cisa-multi-factor-authentication.pdf; see also Cory Missimore, “The Multiple Options for Multi-Factor Authentication,” ISACA Now blog, July 26, 2018, https://www.isaca.org/resources/news-and-trends/isaca-now-blog/2018/the-multiple-options-for-multi-factor-authentication (explaining different forms of multifactor authentication).] Agencies should install software updates immediately and consider using a web application firewall. [fn. 38: Center for Internet Security, “MS-ISAC Security Primers — SQLi,” accessed August 21, 2020, https://www.cisecurity.org/white-papers/sqli/.]
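Multifactor authentication often relies on time-based one-time passwords (TOTP), the six-digit codes produced by authenticator apps. A minimal sketch of the standard TOTP algorithm (RFC 6238, HMAC-SHA1 variant) shows why such codes change every 30 seconds and cannot be reused; this is purely illustrative, not a suggestion that agencies roll their own authentication.

```python
import hmac
import struct
import time
from hashlib import sha1

def totp(secret: bytes, for_time=None, digits: int = 6, step: int = 30) -> str:
    """RFC 6238 time-based one-time password (HMAC-SHA1 variant)."""
    # The moving factor is the number of `step`-second intervals since the epoch.
    counter = int((time.time() if for_time is None else for_time) // step)
    msg = struct.pack(">Q", counter)  # 8-byte big-endian counter
    digest = hmac.new(secret, msg, sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 published test vector: ASCII secret, T=59s, 8 digits -> "94287082"
print(totp(b"12345678901234567890", for_time=59, digits=8))  # prints 94287082
```

Because the code depends on the current 30-second window, an attacker who phishes one code cannot replay it later, which is what makes TOTP a meaningful second factor alongside a strong password.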

Election agencies should catalog social media accounts — including those that belong to the agency itself and those of individual officials — along with a list of individuals with log-in credentials for each account. [fn. 39: Belfer Center, The State and Local Election Cybersecurity Playbook, 18.] Officials should protect their accounts with strong passwords and multifactor authentication, and they should change passwords regularly, especially as an election approaches. Agencies should consider a comprehensive social media policy that directs staff to exercise caution when posting election information, even on personal accounts, to prevent them from sharing inaccurate information. [fn. 40: Center for Internet Security, “Social Media: The Pros, Cons, and the Security Policy,” March 2020, https://www.cisecurity.org/newsletter/social-media-the-pros-cons-and-the-security-policy/.] Staff should receive regular training on how to avoid phishing scams. They should never use personal email accounts for official business. [fn. 41: Area 1, Phishing Election Administrators, 2020, https://cdn.area1security.com/reports/Area-1-Security-PhishingForElectionAdministrators.pdf.]

Officials should monitor agency websites and social media accounts at least once a day for signs of hacking and prepare to recover compromised sites and accounts. They should identify points of contact at social media companies for troubleshooting problems like losing control of an account. [fn. 42: Belfer Center, The State and Local Election Cybersecurity Playbook, 18.] Some platforms offer enhanced protection for official accounts. [fn. 43: Facebook, for example, offers enhanced protection for official accounts through its “Facebook Protect” service. Facebook, “What is Facebook Protect?” accessed August 21, 2020, https://www.facebook.com/gpa/facebook-protect#tab-0-what-is-facebook-protect-.]

Election agencies should back up websites and other resources that voters confused by disinformation will need, like polling place lookup tools and voter registration checks. [fn. 44: Cortés et al., Preparing for Cyberattacks, 7. Google Project Shield is a free service that protects election websites from distributed denial-of-service (DDoS) attacks. Google Project Shield, “Protecting Free Expression from Digital Attacks,” accessed August 21, 2020, https://projectshield.withgoogle.com/landing.] They should have contingency plans for keeping voters informed if online tools are disabled. It is also useful to stress test web-based tools and databases ahead of time to learn how much traffic they can handle. [fn. 45: “Officials should ensure that state and county election websites undergo periodic independent load and vulnerability testing, as these websites will get heavier usage while the public practices social distancing” due to Covid-19. Cortés et al., Preparing for Cyberattacks, 21.]
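A basic load check can be scripted in-house before commissioning formal independent testing. The sketch below is a simplified illustration: `stress_test` is a hypothetical helper that accepts any zero-argument fetch callable, so it can be pointed at a staging copy of a lookup tool rather than the live site.

```python
import time
from concurrent.futures import ThreadPoolExecutor
from threading import Lock

def stress_test(fetch, n_requests=100, concurrency=10):
    """Run `fetch` n_requests times across `concurrency` worker threads
    and report success/failure counts plus worst-case latency in seconds.
    `fetch` is any zero-argument callable, e.g. a wrapper around
    urllib.request.urlopen aimed at a staging polling-place lookup URL."""
    results = {"ok": 0, "failed": 0, "max_latency": 0.0}
    lock = Lock()

    def one_request():
        start = time.monotonic()
        try:
            fetch()
            key = "ok"
        except Exception:
            key = "failed"
        elapsed = time.monotonic() - start
        with lock:  # guard shared counters across worker threads
            results[key] += 1
            results["max_latency"] = max(results["max_latency"], elapsed)

    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        for _ in range(n_requests):
            pool.submit(one_request)
    return results
```

Running this with gradually increasing `concurrency` against a staging server gives a rough sense of when latency degrades, which is exactly the kind of figure contingency plans should be built around.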

The Election Infrastructure Information Sharing and Analysis Center (EI-ISAC), a nonprofit membership organization for election officials, offers guidance on how to secure websites against hacking, as well as a service that checks domains for outdated software. [fn. 46: Center for Internet Security, “Cybersecurity Spotlight — Website Defacements,” accessed August 21, 2020, https://www.cisecurity.org/spotlight/cybersecurity-spotlight-website-defacements/. For more information about the services available to local governments that join the Center for Internet Security’s EI-ISAC, see Center for Internet Security, “EI-ISAC Services,” accessed August 21, 2020, https://www.cisecurity.org/ei-isac/ei-isac-services/.] CISA offers cybersecurity assessments that can help secure official web portals and protect against phishing attacks that may endanger website credentials. [fn. 47: DHS, Cybersecurity and Infrastructure Security Agency, “Cyber Resource Hub,” last modified July 24, 2020, https://www.cisa.gov/cybersecurity-assessments; and DHS, Cybersecurity and Infrastructure Security Agency, #Protect2020 Strategic Plan, 13.] 

4. Monitor for disinformation

Officials must actively monitor for disinformation that could suppress votes in their jurisdiction. This requires dedicating staff time to searching for it and creating channels for others to report incidents. Officials should also work with social media companies, notifying them of activity that may violate their terms of use, and participate in broader information-sharing networks.

Set up an ongoing monitoring operation

Election officials in every state — ideally at both the state and local levels — should monitor social media for false information about how to vote. [fn. 48: See, e.g., Wisconsin Elections Commission, Election Security Report, 2019, 49, https://elections.wi.gov/sites/electionsuat.wi.gov/files/2019-09/2019%20Elections%20Security%20Planning%20Report.pdf (“WEC staff has worked with officials at major social media companies to quickly communicate any attempts to misinform voters and to have the offending posts removed.”); and University of Pittsburgh, The Blue Ribbon Commission on Pennsylvania’s Election Security, 52 (“Relevant officials need to be ready to contact social media companies to alert them to [disinformation and] have a reliable and widely known set of social media accounts to rebut disinformation . . . .”).] They should conduct monitoring efforts in consultation with legal counsel. [fn. 49: Judd Choate, Colorado Director of Elections, personal communication with author, July 19, 2020.] Monitoring should focus on the relevant jurisdiction, of course, as well as on communities typically targeted by vote suppression. Search terms should cover voting logistics, including dates and locations for voting and details about voting by mail. Disinformation is also likely to mention candidates or parties standing in local elections.

There are services that can assist with monitoring efforts. The MITRE Corporation, for example, offers a tool designed to quickly analyze reports of disinformation and help election officials report incidents up their agency hierarchy. [fn. 50: Molly Manchenton, “SQUINT Sharpens Officials’ Perspective to Combat Election Distortion,” MITRE, February 2020, https://www.mitre.org/publications/project-stories/squint-sharpens-officials-perspective-to-combat-election-distortion.] Commercial social media monitoring or “listening” services — some available for free — can track mentions of elections or voting procedures in a jurisdiction.
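At its simplest, in-house monitoring can be a keyword filter that surfaces candidate posts for staff review. The sketch below is purely illustrative: the watch patterns are invented samples, and a real deployment would tune them to local dates, candidate names, and precincts.

```python
import re

# Hypothetical watch list; real monitoring would be tuned per jurisdiction
# and expanded with local candidate, party, and precinct names.
WATCH_PATTERNS = [
    r"\bvote\b.{0,40}\b(tomorrow|wednesday|thursday)\b",   # wrong-day tricks
    r"\b(text|phone|call)\b.{0,30}\byour vote\b",          # vote-by-text scams
    r"\bpolling (place|location)s? (closed|moved)\b",      # bogus closures
]

def flag_posts(posts):
    """Return posts whose text matches any watch pattern, for human review.
    Matching is a first-pass filter only: whether a hit is suppression or a
    debunking quote must be judged by a person."""
    compiled = [re.compile(p, re.IGNORECASE | re.DOTALL) for p in WATCH_PATTERNS]
    return [post for post in posts if any(rx.search(post) for rx in compiled)]

sample = ["Don't forget: vote tomorrow instead!", "Polls open at 7 a.m. today."]
print(flag_posts(sample))  # only the first post is flagged
```

Even a crude filter like this helps dedicate scarce staff time to the posts most likely to matter, while everything it flags still goes through the reporting channels described below.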

Set up channels for officials to receive reports of disinformation

Election officials should provide and publicize a clear line of communication for voters, journalists, internet companies, and officials in other jurisdictions to report deceptive practices. At a minimum, there should be an email address and phone number, but official social media accounts are also useful. For example, California’s secretary of state launched Vote Sure, an initiative to increase voter awareness of false and misleading information, and has directed the public to report disinformation to a dedicated email address. [fn. 51: California Secretary of State, “VoteSure Voting Resources” (“Report Misinformation” link), accessed August 21, 2020, https://www.sos.ca.gov/elections/vote-sure/.]

Report content that violates platform policies and check for takedowns 

Election officials should report instances of disinformation to the relevant internet company. While the platforms’ terms of use differ, most major social media companies have some form of prohibition against deception about voting. Election officials should establish contacts at key platforms in advance of the election and report disinformation to their specific point of contact — not through a “flagging” option, which is often an inadequate mechanism for addressing voter deception and leaves no way to monitor platform responses. On Facebook, for example, users can report content as “voter interference,” but that only triggers monitoring for trends in the aggregate; it does not lead to a manual review for takedowns. [fn. 52: Laura W. Murphy and Megan Cacace, “Facebook’s Civil Rights Audit: Final Report,” Facebook, 2020, 32–33, https://about.fb.com/wp-content/uploads/2020/07/Civil-Rights-Audit-Final-Report.pdf.] Officials should check online whether false posts, shares, or retweets have actually been removed, and if the disinformation is still live, they should follow up with their contacts.

Participate in information-sharing networks

Officials should participate in information-sharing networks with other election officials, federal government agencies, internet companies, and community groups. For example, the national nonpartisan coalition Election Protection, which operates the voter help line 1-866-OUR-VOTE, provides voter information and maintains contact with election officials across the country. Election Protection coalition members can alert affected communities to disinformation and push corrective information out to individuals.

Agencies participating in the EI-ISAC have access to a platform for sharing threat information in real time. [fn. 53: DHS, Cybersecurity and Infrastructure Security Agency, #Protect2020 Strategic Plan, 14.] At DHS, CISA plans to operate a switchboard during the election that will transmit reports of disinformation from election officials to internet companies and law enforcement. [fn. 54: DHS, Cybersecurity and Infrastructure Security Agency, #Protect2020 Strategic Plan, 22.]

Where appropriate, election agencies should report incidents to federal and local law enforcement.

5. Build relationships with communities and media

Election officials should do extensive public outreach to build trust with all the communities they serve, and those efforts should incorporate all commonly spoken languages in those communities. Cultivating ongoing relationships with communities will make it easier to communicate during an emergency.

Election agencies should consider designating a spokesperson to provide accurate information on how to vote, to promote that information online and on social media, and to make local media appearances. [fn. 55: There are lessons to be learned in the Belfer Center’s guide for official communications plans around cyber incidents, which provides samples for designating official roles, setting out processes, and crafting checklists. The designated official will benefit from media training. Belfer Center, Election Cyber Incident Communications Plan Template, 18.] This designated official should also build relationships with community groups. [fn. 56: See, e.g., Wisconsin Elections Commission, Election Security Report, 49.]

Agencies can build public trust and familiarity by working to increase the number of followers on official social media accounts, a process that could include advertising and engaging in conversation with community leaders, journalists, and other relevant figures. They should also consider recruiting high-profile surrogates, such as influencers with significant followings on particular social media channels, to amplify accurate information.

Local and ethnic media that serve frequently targeted communities are key partners in disseminating correct information in response to deceptive practices. Officials should build relationships with outlets and reporters to help establish themselves as trusted sources. [fn. 57: Belfer Center, Election Cyber Incident Communications Plan Template, 18.] Media partners should receive instruction in advance on how to avoid repeating falsehoods when reporting incidents. [fn. 58: The American Press Institute’s Trusted Elections Network is a resource for journalists covering the spread of false information around elections. American Press Institute, “Trusted Elections Network,” accessed August 21, 2020, https://www.americanpressinstitute.org/trusted-elections-network/. It includes advice about how journalists should cover misinformation, including by going to official sources. American Press Institute, “Trusted Elections Network Resource Guide,” accessed August 21, 2020, https://www.americanpressinstitute.org/trusted-elections-network/resource-guide/.]


Recommendations for Internet Companies

Many of the biggest social media platforms have banned disinformation about voting procedures and will take down posts with incorrect information when they find them. Facebook, for example, bans the misrepresentation of voting times, locations, methods, and eligibility, along with threats of violence related to voting or registering to vote. [fn. 1: Guy Rosen et al., “Helping to Protect the 2020 US Elections,” Facebook, October 21, 2019, https://about.fb.com/news/2019/10/update-on-election-integrity-efforts/. Despite Facebook’s policy of allowing politicians to make false claims in ads, the ban on voter suppression appears to apply to everyone (“We remove this type of content regardless of who it’s coming from . . . .”).] Other popular social networks like YouTube, Twitter, and Pinterest have similar policies, and Google bans ads that contain false information about voting. [fn. 2: Pinterest waited until January 2020 to ban vote suppression. Cat Zakrzewski, “The Technology 202: Pinterest Bans Misinformation About Voting and the Census,” Washington Post, January 29, 2020, https://www.washingtonpost.com/news/powerpost/paloma/the-technology-202/2020/01/29/the-technology-202-pinterest-bans-misinformation-about-voting-and-the-census/5e307ba288e0fa6ea99d60fc/; and Google, “Misrepresentation,” accessed August 21, 2020, https://support.google.com/adspolicy/answer/6020955?hl=en&ref_topic=1626336 (prohibiting “claims that are demonstrably false and could significantly undermine participation or trust in an electoral or democratic process,” such as “information about public voting procedures”).]

However, it is unclear how reliable these disinformation bans are. Enforcement can be uneven, and companies frequently change their policies. This section outlines the essential actions that all internet companies, including social networks, ad sellers, and search engines, should take to combat disinformation about voting.

1. Proactively provide information about how to vote

Social media platforms, search engines, and other web and mobile sites should point all users to accurate information about voting, including how to register and vote, relevant deadlines, and how to contact the appropriate election officials. [fn. 3: John Borthwick, “Ten Things Technology Platforms Can Do to Safeguard the 2020 Election,” Medium blog post, January 7, 2020, https://render.betaworks.com/ten-things-technology-platforms-can-do-to-safeguard-the-2020-u-s-election-b0f73bcccb8.] They should make this information available long before the election but increase its visibility as voting or registration deadlines approach.

Internet companies should direct users to reliable sources like official election sites or the “Can I Vote” resource page prepared by NASS. [fn. 4: National Association of Secretaries of State, “Can I Vote,” accessed August 21, 2020, https://www.nass.org/can-I-vote.] They should promote posts appearing on the official accounts of election agencies. Algorithms should prioritize official posts to make them more prominent in users’ feeds, in search results, and in lists of trending topics.

2. Maintain clear channels for reporting disinformation

Companies should make it easy for users, officials, and community groups to report false information about voting. Social media platforms should offer users a clear and accessible option to tag posts as voter suppression, and those reports should receive immediate review by an employee, who can judge the context of a post and whether it poses a risk of voter suppression. One problem with review by algorithms is that they frequently cannot distinguish between a message pushing disinformation and a message quoting and debunking a lie. For example, a post by a get-out-the-vote group warning community members that bad actors are circulating false “vote Wednesday” messages should not be treated the same as an attempt to trick people out of voting on Election Day. Human review is more likely to take contextual factors like these into account.
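The routing logic described above can be sketched as a toy triage function. It is illustrative only (the cue list, labels, and the "vote Wednesday" trigger are invented for this example) and exists to show why keyword matching should hand posts to humans rather than auto-remove them: the same false claim appears in both suppression and debunking posts.

```python
# Invented debunking cues; a real system would use far richer context signals.
DEBUNK_CUES = ("false", "scam", "hoax", "don't believe", "disinformation", "fact check")

def triage(post: str) -> str:
    """Route a post that may contain the false 'vote Wednesday' claim.
    Keyword matching alone cannot tell suppression from debunking, so both
    outcomes route to human review; only the priority differs."""
    text = post.lower()
    if "vote wednesday" not in text:
        return "ignore"
    if any(cue in text for cue in DEBUNK_CUES):
        return "human-review"           # likely quoting the lie to debunk it
    return "human-review-priority"      # looks like direct suppression

print(triage("Reminder: vote Wednesday!"))
# -> human-review-priority
print(triage("It's false that you should vote Wednesday; Election Day is Tuesday."))
# -> human-review
```

Note that neither branch removes anything automatically: the point of the sketch is that the algorithm's job ends at prioritization, and the takedown decision stays with a person.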

Internet companies should build relationships with state election officials to facilitate information sharing in both directions.

3. Take down false information about voting but preserve the data

Social media companies should ban disinformation that attempts to suppress the vote — and do so with clear standards that are transparent to users. They should remove such disinformation promptly but also provide an accessible appeals option as part of the review process. Users whose posts are removed should be able to quickly contact a company employee to make the case that the removal was a mistake.

Companies should monitor their platforms for repeat offenders and impose more severe consequences on them. YouTube, for example, bans vote suppression and has a schedule of escalating consequences for violations of its community guidelines. If a user gets three strikes within 90 days, YouTube will permanently delete that person’s account. [Google, “YouTube Help: Spam, Deceptive Practices & Scams Policies,” accessed August 21, 2020, https://support.google.com/youtube/answer/2801973; and Google, “YouTube Help: Community Guidelines Strike Basics,” accessed August 21, 2020, https://support.google.com/youtube/answer/2802032.]

Even when a social media company removes posts and accounts, it should preserve the data for analysis by researchers, which can help identify patterns in vote suppression activity and better prevent it in the future. These data may provide valuable information on, for example, deceptive foreign influence campaigns, such as the interference by Russian operatives in the 2016 election. [Mueller, Report on the Investigation into Russian Interference, 4.] Companies should retain these data in a manner that is accessible to and searchable by researchers, consistent with users’ privacy interests.

4. Protect official accounts and websites

Social media platforms should provide special protection for election officials’ accounts against hacking and spoofing. Facebook, for example, offers enhanced protection for official accounts through its “Facebook Protect” service. [Facebook, “What is Facebook Protect?”; Facebook also offers verified badges, see Facebook, “How do I request a verified badge on Facebook?” accessed August 21, 2020, https://www.facebook.com/help/1288173394636262; as of this writing, Twitter’s verified badge application process is on hold, see Twitter, “About Verified Accounts,” accessed August 21, 2020, https://help.twitter.com/en/managing-your-account/about-twitter-verified-accounts.] Social media companies should offer verified status for official accounts, like Twitter’s “blue check,” through a process that accommodates the realities of local election officials’ operations. For instance, local election officials are less likely to have large social media audiences, so a high follower count should not be a prerequisite for obtaining verified status. Instead, social media platforms should allow state officials to certify accounts that belong to local officials.

Search engine companies should proactively watch their platforms for attempts to direct users to spoofed websites.

5. Push corrective information to specific users affected by disinformation

When social media companies find false information about voting on their platforms, they should identify which users received the disinformation and notify those individuals with messages that include accurate information and directions for how to contact election officials. In 2017, Facebook said it would be “challenging” to notify users who saw deceptive political ads from a Russian company with ties to Putin. [Tony Romm and Kurt Wagner, “Here’s How to Check if You Interacted with Russian Propaganda on Facebook During the 2016 Election,” Vox, December 22, 2017, https://www.vox.com/2017/12/22/16811558/facebook-russia-trolls-how-to-find-propaganda-2016-election-trump-clinton.] But the industry has had years since then to solve any technical challenges. In fact, in 2020, Facebook contacted users who interacted with false information about Covid-19, proving the feasibility of the practice. [“New alert messages will appear in the Newsfeeds of people who have liked, reacted to or commented on known [Covid-19] misinformation, connecting people to WHO myth-busting.” Chloe Colliver and Jennie King, The First 100 Days: Coronavirus and Crisis Management on Social Media Platforms, Institute for Strategic Dialogue, 2020, 35, https://www.isdglobal.org/isd-publications/the-first-100-days/; and Billy Perrigo, “Facebook Is Notifying Users Who Have Shared Coronavirus Misinformation. Could It Do the Same for Politics?” Time, April 16, 2020, https://time.com/5822372/facebook-coronavirus-misinformation/.] Social media companies should not operate a service that feeds disinformation to voters without ensuring they can provide those users with a corrective remedy.


Recommendations for Federal Action

Congress

Congress should clarify and strengthen prohibitions against voter suppression through disinformation. [Wendy Weiser and Alicia Bannon, eds., Democracy: An Election Agenda for Candidates, Activists, and Legislators, Brennan Center for Justice, 2018, 12–13, https://www.brennancenter.org/our-work/policy-solutions/democracy-election-agenda-candidates-activists-and-legislators.] It can do this by passing the Deceptive Practices and Voter Intimidation Prevention Act. [Deceptive Practices and Voter Intimidation Prevention Act of 2019, H.R. 3281, 116th Cong. (2019), included in the For the People Act of 2019, H.R. 1, 116th Cong. (2019).] Although federal law already bans intentional efforts to deprive others of their right to vote, existing laws have not been strong enough or specific enough to deter misconduct. And existing law does not give any federal authority the mandate to investigate deceptive practices and provide voters with corrected information. [Wendy Weiser et al., The Case for H.R. 1, Brennan Center for Justice, 2020, 6, https://www.brennancenter.org/our-work/policy-solutions/case-hr1.]

The Deceptive Practices and Voter Intimidation Prevention Act would clearly prohibit attempts to block people from voting or registering to vote, including by making false or misleading statements. It would impose criminal penalties on offenders and enable citizens to go to court to stop voter deception. And it would require the U.S. attorney general to affirmatively correct disinformation if election officials fail to. These improvements would give federal law enforcement agencies and the public more tools to stop bad actors from attacking the right to vote. The House of Representatives passed a version of this legislation in 2019 as part of H.R. 1, also known as the For the People Act, an omnibus democracy reform bill. [For the People Act, H.R. 1, 116th Cong. (2019).]

Federal agencies

Federal agencies like the Department of Homeland Security (DHS) and the FBI are sharing information with local government officials about disinformation threats. The Department of Justice, whose Voting Section enforces federal laws around voting practices, should participate in sharing information and, as in past elections, prosecute individuals and entities that violate the law by spreading disinformation about how to vote. [For example, prosecutors working with Special Counsel Robert Mueller charged companies and individuals involved in the Russian Internet Research Agency’s disinformation-based interference in the 2016 election with conspiracy to defraud the United States. Mueller, Report on the Investigation into Russian Interference, 9.] Generally, federal agencies that discover disinformation that might block people from voting should take the following steps:

  • alert local election officials, media, and community groups
  • take measures to ensure that correct information is publicized in a manner that reaches the disinformation targets
  • refer the matter to the appropriate agency to investigate deceptive practices for prosecution after the election


Conclusion

There can be no doubt that disinformation designed to suppress the vote will continue to be a threat in the 2020 elections and beyond. State and local election officials are our most important defense. They must plan and prepare now by protecting their ability to communicate corrective information and defending their infrastructure against attacks. Internet companies, too, must make their services as safe as possible for voters and keep them from being easy tools for bad actors to hoodwink the electorate. And state and local officials need federal support to protect access to voting in the face of digital deception.

At the same time, all internet users are responsible for helping to stop the spread of digital disinformation about voting. Users who see a message that seems suspicious or outlandish — or fits too neatly into a partisan political narrative — should investigate before sharing. It is useful to consider the source: find out whether the speaker has a political agenda and what other content they have shared, such as satire or jokes. Headlines and tweets are short and often misleading, so it is necessary to read further, looking for specific facts and checking dates to tell whether a story is accurate and current. Finally, one of the most important tools for checking disinformation is “lateral reading”: the reader sets the message aside and searches elsewhere for the same information, or for other sources that have already determined whether it is a hoax.

Users who see something suspicious, incredible, or shocking online about voting should check with state and local election officials for the authoritative answer. NASS provides contact information for voting officials on its “Can I Vote” resource page. [National Association of Secretaries of State, “Can I Vote.”] If the information is false, officials will want to know about it. Members of the public can also share alerts about disinformation in their communities with a call to Election Protection at 1-866-OUR-VOTE. Election Protection volunteers will record incidents, look for patterns, and help pass information to internet companies and officials. Once social media users have confirmed correct information, they can help spread it by sharing it in their own networks without repeating the falsehood.

Amid a global pandemic, foreign interference, partisan dirty tricks by domestic political operatives, and digital tools giving bad actors the power to widely disseminate messages instantly, the threat from disinformation is greater than ever before. But with awareness of the threat and time to prepare, the government, internet companies, and the public can work together to protect everyone’s ability to vote in 2020.
