By Stuart Kerr, Technology Correspondent
🗓️ Published: 16 July 2025 | 🔄 Last updated: 16 July 2025
📩 Contact: liveaiwire@gmail.com | 📣 Follow @LiveAIWire
🔗 Author Bio: https://www.liveaiwire.com/p/to-liveaiwire-where-artificial.html
Whispers from the Digital Beyond
The line between memory and simulation is blurring. Across the world, people are using artificial intelligence to reconnect with lost loved ones. From voice-cloned chatbots to hyperrealistic avatars, a new frontier is emerging—one that raises profound questions about grief, consent, and what it means to live on after death.
This phenomenon, often dubbed “digital necromancy,” is no longer confined to science fiction. In countries like South Korea, virtual reality reunions between parents and deceased children have been aired on television. In the U.S., startups offer services to recreate deceased relatives as AI companions. Some families find comfort. Others feel unsettled.
A New Kind of Immortality
According to a recent feature by Neuroscience News, public attitudes toward posthumous AI simulations are sharply divided. While some view digital resurrection as therapeutic, others warn it may inhibit healthy grieving by offering a synthetic substitute for closure.
Ramhee explored the ethical contours of these technologies, particularly around consent. Can someone truly approve their own digital replication from beyond the grave? What happens when an AI version outlives the memory it was based on—morphing into something the original person never intended?
Deepfakes and Deadbots
The mechanics of digital resurrection vary. Some services rely on large language models trained on a person’s texts, emails, or social media history. Others use archived audio to create voice clones or construct 3D avatars based on photos and video. These deadbots can simulate conversations, recall personal events, and even mimic mannerisms.
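The text side of this pipeline can be sketched in a few lines of Python. This is a minimal, hypothetical illustration of how archived messages might be condensed into a persona prompt for a chat model; all names and data below are invented, and real services layer voice cloning and avatar rendering on top of a similar text step.

```python
# A minimal sketch of the text pipeline behind a "deadbot": archived
# messages are condensed into a system prompt asking a chat model to
# imitate a person's tone. All names and data here are hypothetical.

def build_persona_prompt(name: str, archived_messages: list[str]) -> str:
    """Condense archived writing into a system prompt that asks a
    chat model to match the person's tone and mannerisms."""
    samples = "\n".join(f"- {m}" for m in archived_messages[:50])  # cap context size
    return (
        f"You are simulating the writing voice of {name}.\n"
        f"Match the tone, vocabulary, and mannerisms of these samples:\n"
        f"{samples}\n"
        f"Stay in character, but never claim to be the real person."
    )

# Hypothetical archive: a handful of saved text messages.
archive = [
    "Don't forget the garden needs watering on Sundays!",
    "Love you loads. See you at the lake house in June.",
]

prompt = build_persona_prompt("Margaret", archive)
print(prompt)
```

In practice the prompt would be sent to a large language model, and the quality of the imitation depends almost entirely on how much of the person's writing survives in the archive.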
Advertising Week reports a growing trend of celebrities and public figures having their likenesses digitally preserved. But it is not just for tribute: holographic performers, AI-narrated ads, and posthumous endorsements are increasingly blurring the line between homage and exploitation.
As discussed in Ghost Writers of the Courtroom, once algorithms begin to act as proxies for real people, the legal and moral boundaries start to dissolve.
The Grief Equation
Supporters argue that AI replicas offer a modern form of mourning—an interactive memorial that keeps memories alive. In some cases, bereaved relatives report feelings of healing after speaking with a chatbot designed to mimic a deceased loved one.
But grief experts urge caution. Reanimation, no matter how convincing, is not resurrection. Psychologists warn that prolonged reliance on AI stand-ins may complicate grief processing, especially for children and those already prone to emotional trauma.
As explored in The Algorithm Will See You Now, when AI enters emotionally vulnerable spaces, designers must consider not just what a model can do, but what it should do.
Digital Consent and Legacy
One of the thorniest ethical dilemmas is consent. Can an AI truly represent the wishes of someone no longer alive to speak for themselves? Laws governing digital afterlife are patchy and inconsistent. In some jurisdictions, a person’s voice, image, or data may not be protected posthumously at all.
In cases involving public figures, such as musicians or actors, families and estates may license likenesses for commercial use—often without the deceased's prior approval. This commodification of memory turns identity into intellectual property.
A 2024 academic paper titled "Digital Resurrection" explores these risks in detail, while another from Lund University, "NECROROBOTICS", calls for legal safeguards to prevent digital impersonation and emotional exploitation.
As we touched on in The AI Gender Gap, concerns around bias and representation are magnified when resurrecting those who can no longer consent or correct.
A Future in Memory’s Image
Digital necromancy is not inherently unethical. Like many AI technologies, its value depends on how it is designed, deployed, and contextualised. Used thoughtfully, it may preserve family heritage or support therapeutic mourning. Abused, it risks violating memory, distorting legacy, and disrupting the human need to let go.
As this technology evolves, it forces us to ask: Should we resurrect the dead—or remember them? In a world increasingly shaped by code, perhaps the most human decision we can make is to respect the silence that follows a life lived.
About the Author
Stuart Kerr is the Technology Correspondent at LiveAIWire. He writes about AI’s impact on infrastructure, governance, creativity, and power.