The Federal Communications Commission (FCC) on Thursday made it illegal to place AI-generated robocalls to voters in the U.S.
The issue came into sharp focus when New Hampshire voters received AI-generated calls in the voice of Joe Biden implying that if they cast a ballot in the state’s primary they couldn’t vote in the general election.
The FCC commissioners voted unanimously to make explicit that the 1991 Telephone Consumer Protection Act, which already outlaws artificial or prerecorded messages, also covers AI-generated calls. Starting now, the FCC can fine companies that place AI robocalls up to $23,000 per call and can block service providers that carry them. Robocall recipients can also now sue for up to $1,500 in damages per call.
But according to the consumer rights advocacy group Public Citizen, the FCC’s move still doesn’t cast a wide enough net.
“The Telephone Consumer Protection Act applies only in limited measure to election-related calls,” says Public Citizen president Robert Weissman. “The Act’s prohibition on use of ‘an artificial or prerecorded voice’ generally does not apply to noncommercial calls and nonprofits.”
Still, the FCC's ruling would likely apply to Walter Monk, the Texas man who New Hampshire authorities believe was behind the Biden robocalls. Monk is the proprietor of Life Corporation, and authorities believe the calls were distributed by the Texas carrier Lingo Telecom.
The FCC’s action may make robocallers even more careful about covering their tracks. Generating an AI robocall is relatively simple with available tools (New Hampshire authorities believe Monk may have used ElevenLabs’ voice-cloning tool), and techniques for masking the origin of a call are readily available.
But law enforcement’s investigative powers are becoming more high-tech as well. The New Hampshire authorities used traceback technology to follow the robocalls back up through the communications network and finally to the originator of the calls.
The FCC has been helping state authorities hunt down robocallers with both federal resources and investigative tactics. That federal-state partnership can help build an airtight case against suspected violators when it's time to prosecute. Now, the FCC has given prosecutors more tools for making robocallers pay.
Still, experts say that all these efforts won't be a silver bullet for AI robocalls and other misinformation this election season. In the end, the FCC's action, and the attention it gets, may help more than the threatened penalties. It may put voters on notice that those election-year dinnertime calls may not be what they seem.