The FCC wants to make AI robocalls like this creepy one illegal

Robocalls are annoying enough, but the growth of generative artificial intelligence could make it much easier for phone-based scams to fool people. Now, the Federal Communications Commission (FCC) is hoping to head off this potential threat before it becomes an even bigger problem.

On Wednesday, FCC Chairwoman Jessica Rosenworcel proposed new rules that would make robocalls using AI-generated voices illegal.

“AI-generated voice cloning and images are already sowing confusion by tricking consumers into thinking scams and frauds are legitimate,” said Rosenworcel in a statement. “No matter what celebrity or politician you favor, or what your relationship is with your kin when they call for help, it is possible we could all be a target of these faked calls.”

The calls can be convincingly human-sounding, but some appear to suffer from the same shortcomings as other chatbots. A video on TikTok, posted by user lothalrebels, appears to show a telemarketing cold call from an AI calling itself “Andrew” that claims to be from real estate firm Keller Williams.

After asking questions about whether the person is interested in buying or selling a home, Andrew asks, “Is there anything else I can assist you with or any other questions you may have?” That’s when things get creepy.

The cold call recipient asks Andrew for a 10th-grade-level essay about the U.S. military’s naval base in Bermuda, and Andrew obliges. (Unsurprisingly, Andrew gets some important facts wrong.) Next, Andrew immediately rattles off a recipe for homemade hot fudge as requested.

The voice on the call had a familiar, lilting British accent. Commenters noted it sounded very similar to professor Brian Cox, an English physicist who has given sold-out lectures around the world.

Keller Williams spokesperson Darryl Frost, when asked about the video, told Fast Company, “Keller Williams Realty, Inc. has not enabled or encouraged franchisees or agents to use AI to make telemarketing calls to consumers. While there may be exciting opportunities to use AI to make real estate a more efficient marketplace for all, we emphasize and train our Keller Williams franchisees and their affiliated agents that they must comply with all federal, state, and local telemarketing laws in all their communications with consumers.”

Last year saw roughly 55 billion robocalls in the U.S., slightly below the 2019 peak of 58.5 billion, as estimated by YouMail, a robocall-blocking service. Just last week, a robocall imitating President Joe Biden began spreading in New Hampshire, urging recipients not to vote in the state’s primary. That call was likely the catalyst behind Rosenworcel’s proposal.

The proposal would classify AI-generated voices as artificial under the Telephone Consumer Protection Act (TCPA), making them illegal under the existing law and giving state attorneys general the ability to pursue legal action against the companies behind them. (The act was passed in 1991 to quell the volume of robocalls.)

Consumers would still have the ability to give permission for some robocalls, including those using AI-generated voices that “do not include an advertisement or constitute telemarketing.” So the law would not impact the sort of automated calls routinely made by pharmacies letting you know your medication is ready.

Last year, the FCC used the TCPA to impose a $5 million penalty against conservative activists who used robocalls to tell Black voters that any votes cast by mail would result in their personal information being put into “a public database that will be used by police departments to track down old warrants and be used by credit card companies to collect outstanding debts.”

Americans lost an estimated $87 billion to robocalls and phone scams in 2022, according to RoboKiller, an app that strives to eliminate spam calls. That’s a 116% increase over 2021. Globally, phone fraud is soaring as well: Hiya, a voice security company, reported flagging more than 21.5 billion suspected spam calls in just the first nine months of last year.

https://www.fastcompany.com/91021571/generative-ai-robocalls-fcc-illegal-new-rules?partner=rss&utm_source=rss&utm_medium=feed&utm_campaign=rss+fastcompany&utm_content=rss

Created 1y | 01.02.2024, 22:40:05


