AI Ghosts: Shaping the future of grief tech


Leaving behind a ‘clone’ of yourself in the form of a digital avatar for loved ones to interact with after you’ve passed away used to be the stuff of science fiction, but it is fast becoming a reality. Cheap, accessible, and user-friendly tools like Replika, HereAfter, and ElevenLabs (for voice cloning) allow people to create avatars that mimic their likeness and personality traits and are armed with intimate knowledge of their lives, making for genuinely compelling ‘recreations’.

In a recent study, we looked at attitudes towards this rapidly evolving form of ‘grief tech’ – a growing industry that aims to lessen the pain of losing a loved one. It was immediately apparent that there is a discernible generational divide in attitudes towards the idea of ‘AI clones’.

Challenging taboos

A substantial proportion of younger generations find the idea of leaving behind a digital avatar appealing. Approximately 24% of Gen Zers (born 1997-2010) and 27% of Gen Yers (born 1982-1996) believe that access to these avatars could mitigate the pain of grieving by facilitating continued interaction with departed loved ones. However, support steadily declines with age, with just 2% of Older Adults (born pre-1946) having positive sentiments towards the idea.

Younger generations are increasingly challenging the taboos around mortality, openly talking about death – in stark contrast to older generations that have traditionally avoided the subject. The chasm between these generations’ perceptions of leaving behind ‘AI ghosts’ reflects their differing approaches to dealing with death – and how attitudes towards death are evolving.

Beyond generational divides, digital death defiance raises a number of ethical questions. Concerns about data privacy and manipulation loom large across all age groups: nearly 30% of respondents expressed apprehension about their data being misused or manipulated to create a version of themselves that they would not approve of. High public awareness of the perils of AI misuse is driving this concern and raising questions about how such representations are used after death.

For example, AI is already being used to bring long-departed Hollywood actors back to reprise roles. But what would this mean for the everyman? It could soon become a consideration for everyday roles: if you work in advertising, for example, you may not want your ‘AI clone’ promoting gambling or smoking brands when you’re gone.


Expressed wishes

Expressing wishes regarding one's digital self, for both personal and professional use, may become a routine aspect of will creation. For those who don’t want any form of digital likeness, there are digital do-not-reanimate (DDNR) orders, akin to physical do-not-resuscitate (DNR) orders. For others, more nuanced instructions may be necessary to specify exact use cases – for example, limiting the technology to an onscreen avatar that can answer questions about their life and share memories with loved ones only.

The study also uncovered apprehensions about grief tech disrupting the natural grieving process, with over a quarter of respondents expressing concern.

This sentiment is echoed by Ian Dibb, CEO and founder of Keylu, a leading life planning platform: “In the age of AI and ChatGPT, there is a boom in companies offering digital avatars that can interact with loved ones. In our continuous dialogues with people, we've identified a polarised response towards this groundbreaking technology. People either love it or they hate it. One of the primary concerns raised is the potential impact such technologies may have on the grieving process.”

This technology is so new that we’re only beginning to understand even its short-term impact on the grieving process, let alone its lasting effects.

Blurring the lines

As technology continues to blur the lines between the physical and digital realms, the debate surrounding digital avatars and 'grief tech' is likely to intensify. While these innovations offer unprecedented opportunities to preserve parts of ourselves, they also raise profound questions about identity, privacy, and the nature of human existence in the digital age.

Andy Headington, a speaker on all aspects of digital who shared his views at the Independent Funeral Directors Conference, believes legislative work is needed to get to grips with what is a revolutionary technology.

Andy says, “In creating digital versions of ourselves, we need to remember the importance of privacy and legacy. It's not just about who gets access to these technologies, but also how they're used and by whom. It's one thing to sound like us or look like us; it's another to truly represent who we are and only we as individuals know how best to do that. Without clear legal documentation and legislation, what’s to stop others from manipulating or altering our digital selves once we’re gone?”

Ultimately, navigating this uncharted territory will require a delicate balance between technological advancement and ethical considerations, ensuring that the legacy we leave behind in the digital realm reflects our true selves and honors our memory.


This article was produced as part of TechRadarPro's Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing, find out more here.

Alex Strang, Senior Insights Editor, Canvas8.