AI brought a road rage victim ‘back to life’ in court. Experts say it went too far

When Christopher Pelkey was killed in a road rage incident in Arizona, his family was left not only to grieve but also to navigate how to represent him in court. As they prepared to confront his killer, Gabriel Horcasitas, during sentencing, they made an unusual and deeply controversial choice: to have Pelkey appear to speak from beyond the grave.

To do so, they turned to technology: an AI-generated video featuring a re-created voice and likeness of Pelkey was presented as a victim impact statement ahead of sentencing. The video showed a digitally resurrected Pelkey appearing to speak directly to the judge.

Of course, the statement wasn’t truly Pelkey’s. He couldn’t possibly have said those words—he died the day Horcasitas shot him. Yet the judge accepted the AI-generated message, even acknowledging its effect. “You allowed Chris to speak from his heart as you saw it,” the judge said. Horcasitas was sentenced to 10 and a half years in prison.

The extraordinary courtroom moment has sparked widespread discussion, not just for its emotional power but for the precedent it may set. Arizona’s victims’ rights laws allow the families of deceased victims to determine how their impact statements are delivered. But legal and AI experts warn that this precedent is far from harmless.

“I have sympathy for the family members who constructed the video,” says Eerke Boiten, a professor at De Montfort University in the U.K. specializing in AI. “Knowing Pelkey, they likely had a preconceived view of how he might have felt, and AI gave them a way of putting that across that many found attractive and convincing.”

Still, Boiten is uneasy about how the video has been interpreted by both the public and possibly the court. “The video should be read as a statement of opinion from the family members, with AI providing a convincing presentation of that,” he explains. Yet public reaction suggests it was taken as something more. “The reactions show that it was taken as an almost factual contribution from Pelkey instead,” Boiten says.

The victims’ rights attorney who represented Pelkey’s family told 404 Media that “at no point did anyone try to pass it off as Chris’s own words.” Yet the emotionally charged format of presenting a deepfaked version of the deceased gives those words far more weight than if they had simply been read aloud. And it’s worth emphasizing: Pelkey could never have written them himself.

“It’s an inappropriate use of AI which has no relevance and should have no role at sentencing,” says Julian Roberts, emeritus professor of criminology at the University of Oxford and executive director of the Sentencing Academy. Data protection specialist Jon Baines of the firm Mishcon de Reya adds that the incident is “profoundly troubling from an ethical standpoint.”

Roberts argues that using an AI-generated likeness of a victim oversteps the purpose of a victim impact statement. “The victim statement should inform the court about the impact of the crime on the victim and advise of any possible impact of the imposition of a community order, et cetera,” he says. “It is not an exercise in memorializing the victim.” In his view, that’s exactly what the Pelkey video did.

Roberts also criticized the content of the statement itself: “The statement should contain information, not opinion or invention—human or AI-derived.”

Still, a precedent has now been set—at least in Arizona. One that blurs the line between mourning and manipulation. One that allows people to “speak” from beyond the grave—and could, in the future, influence the length of prison sentences in ways that justice systems may not yet be prepared to handle.


https://www.fastcompany.com/91331139/ai-brought-a-road-rage-victim-back-to-life-in-court-experts-say-it-went-too-far

Published 3mo | May 9, 2025, 11:10:05

