How ready are we for AI-powered election deception? Arizona just found out 

It was Election Day in the crucial voter battleground of Maricopa County, Arizona, and by 11:15 a.m., inside a windowless convention center ballroom bustling with Arizona election officials, federal agents, and a smattering of reporters, I was already beginning to panic.

Moments before, election offices across the state had received an AI-generated phone call from the secretary of state’s office. In an uncanny monotone, a voice clone imitating the assistant secretary said the state had received a court order requiring polling places to stay open two hours later than they’re supposed to. Now, all around me, the people inside the convention center were scrambling to figure out whether the court order and the call were real.

Well, not real real. The phone call—and a deluge of other crises being thrown our way—were all part of an elaborate Election Day simulation orchestrated by Arizona Secretary of State Adrian Fontes in preparation for the main event. Since last year, Fontes and his team have staged a series of tabletop exercises like this one with election officials across the state in hopes of preparing them for a new generation of AI-fueled threats they may face this year. The simulation I attended in early May was the first where Fontes invited reporters to participate, giving us a taste of what those threats might look like and the pressure election officials will be under as they try to respond. 

Fontes was the Maricopa County recorder during the 2020 election, putting him and his staff at the white-hot center of conspiracies about everything from Sharpies ruining ballots to malfunctioning tabulators. Later, Fontes testified before Congress about the death threats and menacing protests visited upon him and his staff. Now, as secretary of state, he is in an equally fraught position: Arizona remains one of the most tightly contested states in the country. The big difference now is that the proliferation of generative AI could make spreading convincing election lies that much easier.

“Artificial intelligence is an amplifier for misinformation, disinformation, malinformation—pick your bad stuff,” Fontes told the crowd inside the convention center. “We want to inoculate our election officials from the newness of it, from the uncertainty of it.” 

To prepare election workers for what may be in store this fall, Fontes’s office designed a simulation that runs participants through the six months leading up to Election Day, complete with a series of obstacles along the way, including the phony phone call about the court order. As part of the event, Fontes even worked with the Institute for the Future to create a convincing—if somewhat dead-eyed—AI-generated video of himself introducing the exercise in English, German, and French. Participants in the exercise must respond to each new problem by using a $30,000 budget to purchase certain safeguards or make up their own responses on the fly. 

The risk of bad actors using new, readily available AI tools to sow chaos this election cycle isn’t exactly far-fetched. Already, in New Hampshire, robocalls featuring an AI voice clone of President Joe Biden attempted to mislead voters into believing that voting in the primaries would disqualify them from voting in November. In Slovakia, another AI-generated audio recording purported to show the country’s liberal party leader attempting to rig the election.

“This is already happening. It’s not happening at a massive scale right now, and it is a question mark of whether it’s going to happen in this election. But we’ve already seen it in a couple different places,” Siddharth Hiregowdara, cofounder of the tech firm CivAI, told the crowd, as he demonstrated how quickly off-the-shelf AI tools can be used to create everything from voice clones to fake New York Times articles. “We need to update our societal immune response to all this by understanding what [AI] can do.” 

That, Fontes told Fast Company, is exactly what he’s hoping to accomplish with the tabletop exercises. During my afternoon in the hot seat, my team, representing the fictional Copper County, Arizona, faced a barrage of seemingly automated public records requests, hundreds of AI-generated voice mails that crashed our phone system, phishing emails attempting to hack the state’s voter registration software, and a WhatsApp chatbot targeting Latino people with maps of fake drop box locations. At one point, we were told a reporter for the local Fox affiliate wanted to run a story about the phony maps.

“Do you really have a drop box at a strip club?” the fake reporter asked in an urgent email. We faced fed-up poll workers walking off the job to join social media-fueled protests, and a group of armed protesters using a dump truck to block the exit to one polling place, preventing the poll workers and the votes—which had to be counted at another location—from getting out.

My team beat back some of the obstacles just as the game’s designers hoped we would. We thwarted a denial-of-service attack with a website firewall. And we had a crisis communications plan in place to ward off a persistent reporter (in reality, a cybersecurity consultant posing as a reporter) who wouldn’t stop badgering us. We held an impromptu press conference to correct the record about ballot drop boxes and invested in physical security services for the office. Over the course of six months, I think we remembered to break for lunch once.

When the simulation was over, and we were told that we’d survived Election Day, the crowd cheered and applauded. But as I sat back in my chair, I couldn’t help thinking about how flimsy all the defenses we had at our disposal felt, even during a dress rehearsal. Again and again, as my team and I tried to counter a deluge of misinformation, we responded by urging our imaginary voters to visit the county website or check the county’s official social media channels. But what good was all that, I wondered, at a moment when authentic-looking false information is easier than ever to reproduce, and when trust in election officials is so very low? Wasn’t it just our word against theirs? And would election officials really be so capable of separating fact from very believable fiction when they’re not sitting in a convention center ballroom primed and waiting for it?

Even before the proliferation of generative AI tools, the plight of election workers in America was bleak. According to one report last year, in Arizona alone, all but two of the state’s counties have seen top election officials step down from their posts since 2020. Other key swing states have seen similar turnover. That has not only led to a dearth of institutional knowledge within election offices, but it’s also made it challenging to bring new election workers in. One Brennan Center poll shared during the tabletop exercise revealed that one-fifth of election officials are unlikely to serve again in the 2026 midterms.

“We’re having trouble recruiting,” Patty Hansen, county recorder for Coconino County, Arizona, said during the event. “We see poll workers becoming national news stories. It scares people.”

At the same time, voters are now less likely to look to election officials for election-related information than they were just two years ago, according to the Bipartisan Policy Center. Meanwhile, legal battles have disrupted communication between tech platforms looking to thwart election threats and key federal agencies that are identifying and monitoring them. And, according to Fontes, his office gets little input from tech companies about emerging threats. “Our communication with the tech companies I don’t think is as robust as a lot of people would imagine,” Fontes said.

Given all of these challenges, it’s little wonder that, at the end of the game, I felt so defeated. We were warned at the start that it was designed to be “impossible to beat.” To me, that felt like the most realistic part of the whole exercise. In a game, hiring a public relations firm might be enough to effectively counter a media frenzy about people dropping ballots in unofficial locations. In reality, it’s far from that simple. 

Fontes acknowledges as much. As he put it to me later, “We cannot convince every single person in the world that we’re doing our job well, even if that is an absolute fact.”

And yet, he says he doesn’t believe that convincing election deniers is his job. Instead, his job is to ensure those deniers’ doubts are unfounded by leaving no room for error in the administration of the election itself. That means preparing election workers for the range of threats they may face—AI-generated or otherwise.

“What I can control is getting the best training out there for these counties, and really believing in them to execute the fundamentals as best as they can,” Fontes said. “If we can accomplish the fundamentals, then we’ve accomplished the mission. We’ve done the election.”
