AI brought a road rage victim ‘back to life’ in court. Experts say it went too far

When Christopher Pelkey was killed in a road rage incident in Arizona, his family was left not only to grieve but also to navigate how to represent him in court. As they prepared to confront his killer, Gabriel Horcasitas, during sentencing, they made an unusual and deeply controversial choice: to have Pelkey appear to speak from beyond the grave.

To do so, they turned to technology: An AI-generated video featuring a re-created voice and likeness of Pelkey was presented as a victim impact statement ahead of sentencing. The video showed a digitally resurrected Pelkey appearing to speak directly to the judge.

Of course, the statement wasn't truly Pelkey's. He couldn't possibly have said those words; he died the day Horcasitas shot him. Yet the judge accepted the AI-generated message, even acknowledging its effect. "You allowed Chris to speak from his heart as you saw it," the judge said. Horcasitas was sentenced to 10 and a half years in prison.

The extraordinary courtroom moment has sparked widespread discussion, not just for its emotional power but for the precedent it may set. Arizona’s victims’ rights laws allow the families of deceased victims to determine how their impact statements are delivered. But legal and AI experts warn that this precedent is far from harmless.

“I have sympathy for the family members who constructed the video,” says Eerke Boiten, a professor at De Montfort University in the U.K. specializing in AI. “Knowing Pelkey, they likely had a preconceived view of how he might have felt, and AI gave them a way of putting that across that many found attractive and convincing.”

Still, Boiten is uneasy about how the video has been interpreted by both the public and possibly the court. “The video should be read as a statement of opinion from the family members, with AI providing a convincing presentation of that,” he explains. Yet public reaction suggests it was taken as something more. “The reactions show that it was taken as an almost factual contribution from Pelkey instead,” Boiten says.

The victims’ rights attorney who represented Pelkey’s family told 404 Media that “at no point did anyone try to pass it off as Chris’s own words.” Yet the emotionally charged format of presenting a deepfaked version of the deceased gives those words far more weight than if they had simply been read aloud. And it’s worth emphasizing: Pelkey could never have written them himself.

“It’s an inappropriate use of AI which has no relevance and should have no role at sentencing,” says Julian Roberts, emeritus professor of criminology at the University of Oxford and executive director of the Sentencing Academy. Data protection specialist Jon Baines of the firm Mishcon de Reya adds that the incident is “profoundly troubling from an ethical standpoint.”

Roberts argues that using an AI-generated likeness of a victim oversteps the purpose of a victim impact statement. “The victim statement should inform the court about the impact of the crime on the victim and advise of any possible impact of the imposition of a community order, et cetera,” he says. “It is not an exercise in memorializing the victim.” In his view, that’s exactly what the Pelkey video did.

Roberts also criticized the content of the statement itself: “The statement should contain information, not opinion or invention—human or AI-derived.”

Still, a precedent has now been set—at least in Arizona. One that blurs the line between mourning and manipulation. One that allows people to “speak” from beyond the grave—and could, in the future, influence the length of prison sentences in ways that justice systems may not yet be prepared to handle.


https://www.fastcompany.com/91331139/ai-brought-a-road-rage-victim-back-to-life-in-court-experts-say-it-went-too-far
