Introduction: A Case Study from Arizona
This guide is a joint project of Institute for the Future, The Elections Group, and the Brennan Center for Justice.
As potential artificial intelligence threats to elections have grown increasingly dire, many election officials worry that they lack both awareness of the risks and practical guidance for preparing for this new technology and the threats it poses to election security. In particular, they feel limited in their ability to communicate with voters about possible AI-driven disruptions to the 2024 elections. In recent weeks, the Cybersecurity and Infrastructure Security Agency (CISA) provided election officials with a risk analysis that details ways in which AI might be used maliciously against election processes, offices, officials, and vendors, and that offers invaluable suggestions for mitigating these risks. CISA also released an assessment of foreign malign influence operations targeting elections, including how foreign adversaries might leverage AI, which similarly included suggestions for countering such threats.
Mere awareness of the ways that AI might threaten elections is no substitute for actually seeing how it could do so. On December 15–16, 2023, the office of Arizona Secretary of State Adrian Fontes, in collaboration with the Brennan Center, the Elections Group, and the Institute for the Future, conducted a first-of-its-kind tabletop exercise on how AI could disrupt election operations in 2024. The goal of this exercise was to prepare officials at all levels of government for AI-generated or supported attacks against election offices and infrastructure.
This two-day tabletop exercise was a crisis scenario planning exercise in which participants practiced responding to simulated emergency situations. Among the scenarios were an attempt to harvest county office login credentials using AI-generated emails and text messages that appeared to be from the state’s election security office; an audio deepfake from a state official directing offices to keep polling locations open because of a nonexistent court order; and AI-generated photos that purported to show an election official involved in criminal activity circulating on social media. In all cases, the AI tools used were available on the web for free or at low cost and did not require special technical skills to operate. AI tools were used in other ways during the tabletop exercise as well, including to create deepfake videos using material from the secretary of state’s X (formerly Twitter) account.
The tabletop exercise drew participants from 14 of Arizona's 15 counties: county election officials as well as representatives from county information technology (IT) offices, law enforcement, emergency management services, federal and state agencies (such as CISA and the National Guard), and other members of the elections community. Participants were broken into 10 teams of 8 to 10 people each, and most approached the exercises from the vantage point of their current employment (e.g., county recorder, emergency manager, information officer, or board of supervisors member). Each team was given a fake county name and its own table. There were individual teams for secretary of state employees, law enforcement, and vendors.
At the beginning of the first day, participants were given a budget to purchase security and resiliency items, ranging from anti-phishing training to backup communications capability (for phone and internet) to multifactor authentication systems. Their budgets were intentionally not enough to pay for all the security and resiliency measures that were offered. The scenarios each team faced were recalibrated based on the risks they used their budgets to mitigate. For example, if a team purchased anti-phishing training, then the phishing scenario and consequences were removed from that table’s experience. At the end of each day, participants debriefed on lessons learned and additional steps they might have taken to prepare for and address the emergency scenarios.
For the most part, participants did not learn that deepfakes and other AI-generated material were inauthentic until the end of the second day. But skepticism about what could be trusted grew over the course of the two days, and by the second day, many participants were asking for credentials when contacted and using group chats they had created with the secretary of state's office and law enforcement to confirm the accuracy of information they were receiving.
This scenario planner will delve deeper into the lessons that participants learned during the tabletop exercise to help their counterparts around the country understand, identify, prepare for, and respond to various AI-related threats that may arise during the upcoming elections. One major insight was the importance of reinforcing fundamental security measures, such as implementing multifactor authentication, securing essential communication channels, conducting regular impersonation checks, and creating rapid-response communications plans. As Harvard University professor Bruce Schneier has noted, artificial intelligence will increase the “speed, scale, scope, and sophistication” of threats to our democracy. Put another way, many of the threats are not new, but they could become more dangerous in the rapidly evolving AI environment.
The scenarios participants faced during the tabletop exercise were frightening for their realism but also reassuring, making clear that election officials and others already have many tools at their disposal to combat AI threats. As one participant put it, “I appreciated learning about AI threats and experiencing the mock [scenarios] that may happen. The chaos was disorienting at first but easier to deal with by the second day.” Michael Moore, the secretary of state’s chief information security officer, emphasized that these kinds of exercises are exactly what election officials and those supporting them need in the coming months, noting that “AI is going to increase the quality, quantity, and urgency of [mis-, dis-, and malinformation]. Training events like this are one of the best ways we can ensure we are ready for what’s coming.”