
Why a US court allowed a dead man to deliver his own victim impact statement—via an AI avatar

Source: Phys.org | Read time: 2 minutes

The shooting occurred on a busy street during rush-hour traffic, when a heated argument between the two men escalated into violence. Horcasitas fired multiple shots at Pelkey, killing him, then fled the scene; he was later arrested and charged with murder. The tragedy shocked the local community and heightened concerns about road rage incidents.


June 18, 2025, by James D Metzger and Tyrone Kirchengast, The Conversation. Edited by Gaby Clark; reviewed by Andrew Zinin, lead scientific editor.

The First AI-Generated Victim Impact Statement in the US

This article discusses a case arising from a November 2021 shooting in Chandler, Arizona, which led to the first AI-generated victim impact statement presented in a US court.

Chris Pelkey was shot and killed by Gabriel Horcasitas in a road rage incident, and Horcasitas was convicted of reckless manslaughter. During the sentencing phase, Pelkey's family decided to let him speak for himself through an AI-generated avatar, something no court had seen before.

The AI avatar was created by Pelkey's sister Stacey Wales and her husband Tim, who used samples of Pelkey's voice and photos of him to make him 'speak' directly to the judge. The avatar conveyed messages of forgiveness and understanding, and its impact took the courtroom by surprise.

Reception in the Courtroom

When the video was played in court, Judge Todd Lang expressed his admiration for the AI-generated statement, noting the sense of forgiveness it conveyed. The judge ultimately sentenced Horcasitas to the maximum penalty of ten-and-a-half years, aligning with the family's wishes.

Possibility in Australia

While the use of AI avatars in courtrooms may be permissible in the US, the article argues that such a practice is unlikely to be accepted in Australian courts. Current Australian rules limit victim impact statements to written or verbal form, delivered by victims or their families, with strict guidelines on what can be presented in court.

Unlike the US, where emotional presentations and visual aids are more common, Australian courts have stricter regulations regarding victim impact statements. The use of AI-generated avatars to deliver statements from deceased victims would require significant changes to existing laws and practices.

Overall, while the Arizona case showcases a new approach to victim impact statements, it highlights the contrasting legal frameworks between the US and Australia regarding courtroom presentations.


Provided by The Conversation. This article is republished from The Conversation under a Creative Commons license. Read the original article.

