Election officials are role-playing AI threats to protect democracy
Arizona Secretary of State Adrian Fontes led a tabletop exercise for journalists to role-play as election officials to understand the speed and scale of AI threats they face. | Photo by Ash Ponders for The Verge

The job has never been harder, and the threats have never been stranger.

It’s the morning of Election Day in Arizona, and a message has just come in from the secretary of state’s office telling you that a new court order requires polling locations to stay open until 9PM. As a county election official, you find the time extension strange, but the familiar voice on the phone feels reassuring — you’ve talked to this official before.

Just hours later, you receive an email telling you that the message was fake. In fact, polls must now close immediately, even though it’s only the early afternoon. The email tells you to submit your election results as soon as possible — strange, since the law requires you to wait until an hour after polls close, or until all of the day’s results have been tabulated, before submitting.

This is the sort of whiplash and confusion election officials expect to face in 2024. The upcoming presidential election is taking place under heightened public scrutiny, as a dwindling public workforce navigates an onslaught of deceptive (and sometimes AI-generated) communications, as well as physical and digital threats.

The confusion played out in an Arizona conference room in early May as part of an exercise for journalists who were invited to play election officials for the day. The subject matter — AI threats in elections — was novel, and the invitation itself was unusual. The entire event was unusual. Why is the Arizona secretary of state reaching out to journalists months in advance of the election?

During the 2020 election, Arizona swung blue, tipping the election to Joe Biden. Fox News forecast the win well ahead of other news outlets, angering the Trump campaign. Trump and his supporters pointed to unsubstantiated incidents of voter fraud and later filed (then dropped) a suit against the state demanding that ballots be reviewed. Later, Republicans commissioned an audit of the votes, which ultimately upheld the accuracy of the original tabulation. And only last Friday, Rudy Giuliani was served with an indictment in which he is charged with pressuring Arizona officials to change the outcome of the 2020 election in favor of Trump.

Election officials have been on the receiving end of unprecedented harassment. As recently as February, a California man was arrested for a threatening message he allegedly left on the personal cell phone of an election official in Maricopa County, Arizona, in November 2022.

The aftershocks of 2020 have not yet faded for election officials, and yet the next presidential election is already on the horizon. Arizona officials are proactively seeking to restore confidence in the process. There’s a lot on the line for them. Unsubstantiated accusations of voter fraud or election interference are dangers to democratic stability. But for the officials who end up in the crosshairs of conspiracy theories, their personal safety is also at risk.

Journalists were invited to the role-playing event as part of an effort to educate the public not just about the threats that election officials are preparing for but also about the scale and seriousness of the preparation itself.

“We want to make sure in this that we have done everything that we can to make 2024 the best election that [it] possibly can be,” Arizona Secretary of State Adrian Fontes said at the start of the day’s events. “And we’re facing the kinds of threats that no one has ever seen before.” The proliferation of generative AI tools presents the latest set of challenges for election workers because of how easily and quickly these tools can pump out convincing fodder for sophisticated social engineering schemes.

The exercise being conducted was a version of a program created for actual Arizona election officials, who participated in the training back in December. Law enforcement is also expected to undergo the training soon. The Arizona secretary of state’s office spearheaded the initiative to expose election officials to the kinds of threats — particularly related to AI — that they might see in the lead-up to the elections.

CISA Elections Security Advisor Susan Lapsley | Photo by Ash Ponders for The Verge
Susan Lapsley is the elections security advisor for the Cybersecurity and Infrastructure Security Agency region that includes Arizona.

“It is unnerving to be where we’re at,” Fontes said, referencing an AI-generated deepfake of himself that played for attendees, showing the secretary of state seamlessly speaking both German and French — two languages he doesn’t speak fluently.

Fontes said he hopes to inoculate election officials against some of the known AI threats, giving them a baseline wariness like most people nowadays would have for an email from a “Nigerian prince” seeking some extra cash. The goal, according to Angie Cloutier, security operations manager at the secretary of state’s office, “is to desensitize election officials to the newness and the weirdness” of AI technology.


Throughout the day, reporters viewed presentations from AI experts demonstrating how easy it is to use free online tools to create disinformation at scale.

One presentation fed the LinkedIn profile of a reporter in the room into an AI text generator to write that reporter a personalized email. The email included a phishing link in the signature masquerading as a LinkedIn profile URL. Later, the presenter used an image generator to put the reporter in a prison jumpsuit and attached that image to a fake article full of false allegations, on a webpage designed to look like The New York Times. They also used a podcast recording to clone his voice and make it say whatever the presenter typed.

Reporters were also presented with timed exercises. One condensed the months before Election Day into less than an hour and had reporters (role-playing election officials) choose how to spend a $30,000 budget on a list of fortifications, ranging from installing a firewall for the elections website to providing active shooter training or mental health resources to election workers. As time ticked by, organizers unveiled one new crisis after another: an influx of public information requests, a disinformation campaign, complaints of some voters failing to receive their mailed ballots, and sketchy messages asking for login credentials. Some of the obstacles could be avoided by picking the right fortifications, though the budget constrained how many each group could buy.

Election Day itself was simulated in a similar — but shorter — timed exercise. The speed was overwhelming, with problems popping up before we’d solved the last one. Actual election workers, Deputy Assistant Secretary of State C. Murphy Hebert said, were given even less time in the simulation.

Organizers wanted to simulate the stress and time crunch election officials feel as they handle a wide range of threats while administering an election. “We prepare for the unexpected. And the way that we do that is by training ourselves to think in crisis mode,” said Hebert.

The work, for election officials, is very much like the myth of Sisyphus, Fontes told The Verge in an interview after the event. (In ancient Greek lore, Sisyphus was condemned to spend eternity in the afterlife rolling a boulder up a hill only for it to roll back down again.) “It’s just like, every year, there’s another set of folks who just want to dismantle our democracy because they’re upset about political outcomes,” he said.

Even in the roughly five-month gap between the election officials’ training and the media exercise I was invited to, new AI tools and capabilities have become readily available. In an environment where the threats are so rapidly evolving, officials need to quickly develop skill sets and heuristics that will aid them in evaluating threats that may not even exist yet.

Fontes said that even though the technology evolves, the training prepares election workers to understand its overall trajectory. “When people look at it for the first time now, they’re like, ‘Wow, this is really scary.’ The folks that saw it in December are like, ‘Okay, this is a logical progression from what there was,’ so they can be a little more thoughtful about this,” he said. “Is it challenging to keep up with the changes in technology? Absolutely. But that’s part of the job.”

Although they are preparing for AI to be used against them, Fontes and his colleagues are also open to using the same tools to make their work more efficient as they balance constrained resources. Fontes sees AI as just another tool that could be used for good or bad. When asked about the role of AI companies in ensuring their products are used responsibly, he said he’s “not in the business of telling people how to utilize their tools or how to develop their tools.”

“I think there’s enough good uses in AI, not just for people, but for the economy, that that needs to be developed,” said Fontes. He’s open to what automation can do effectively. It’s understandable — election officials have never been more pressed for time or resources.


As the threats to the electoral process widen in range and complexity, the job of an election official only gets harder, even as their ranks dwindle.

AI is just the latest challenge to the work of administering free and fair elections in the US. Both tech experts and election officials emphasized at the event that AI isn’t all good or bad and doesn’t necessarily outweigh the importance of all the other threats they must prepare for. The office chose to focus on AI threats in particular this year because they’re so new.

Michael Moore, the chief information security officer for the Arizona secretary of state, said his role is more expansive than it used to be. “It used to be that a CISO was just focused on cybersecurity. But when I started [in] elections, that was not the case,” said Moore, who’s been working in the field since 2019. AI and online disinformation can fuel physical threats, meaning security teams need to think holistically about how to protect elections.

Arizona Secretary of State Chief Information Security Officer Michael Moore | Photo by Ash Ponders for The Verge
Michael Moore, the chief information security officer for the Arizona secretary of state, said that title encompasses a greater range of threats than it used to.

Meanwhile, election officials are doing more with less. This isn’t by choice. Unprecedented scrutiny and outright harassment of election officials during the 2020 election have contributed to significant turnover in election workers. Giuliani was most recently indicted for his alleged activities in Arizona, but the problem extends far beyond the state. Two Georgia election workers, for instance, were the victims of such extreme harassment that a jury awarded them $148 million in damages in a defamation suit against Giuliani after he admitted to falsely accusing them of ballot fraud.

Last year, the nonpartisan group Issue One found that 40 percent of chief local election officials in the western states would change between 2020 and 2024. The trend was even more pronounced in battleground states, including Arizona, where President Joe Biden defeated then-President Donald Trump in 2020 by a slim margin. As of September 2023, Issue One reported that 12 out of 15 Arizona counties had new election officials since November 2020, covering 98 percent of the state’s population. Such turnover means a loss of institutional knowledge, which is especially important in a time-crunched field like elections.

Even as their job gets harder, election officials are trying to bolster trust in the system. Educating the press about the checks and safeguards in their processes is a part of this effort.

Election officials are trying to get people not to believe everything they see and hear. They also don’t want to scare voters and election workers into believing nothing they see or hear. They’re walking a fine line. “Part of that sweet spot is getting people to be vigilant but not mistrustful,” says Fontes. “Vigilant in that they’re going to look out for the stuff that isn’t real, but not mistrustful so that they don’t lose confidence in everything, which is kind of counterproductive to what our mission is in the first place.”

Officials want to avoid a scenario where voters throw their hands up in the air and just don’t vote. “It used to be, ‘They’re all corrupt,’” said Susan Lapsley, elections security advisor for the region covering Arizona at the Cybersecurity and Infrastructure Security Agency (CISA).

These days, she says, that kind of low-grade nihilism comes mostly in the form of “I don’t know what’s real.”

How much of a role will AI play in the 2024 elections? Will 2024 be as rocky as 2020? Will Arizona become a battleground of misinformation and distrust again? Arizona is trying to prepare for all scenarios. “What exactly is going to happen? We’re not sure,” Fontes said. “What are we best preparing for? Everything. Except Godzilla.”