LCN Blogs

Ghostbots: who ya gonna call?

John MacKenzie

21/05/2024

Reading time: four minutes

At this point, I assume most of us are at least vaguely aware of Generative AI, most likely in the form of natural language models and image generators. However, a somewhat obscure, and rather more macabre application of GenAI has been developing for a few years now, in the form of ‘ghostbots’ − AI-powered digital recreations of the dead. As effectively posthumous deepfakes, ghostbots typically use AI to synthesise the voice, face and even personality of deceased individuals. They might be trained on the deceased’s digital footprints on social media or messaging services, personal photos/media and other sources of identifying data. Innovations like deepfakes and holographic projections have made it possible to see, hear and hold conversations with someone who’s passed away, in a manner unlike simply viewing a photograph or video of a loved one.

I’ve previously written on Generative AI fears and deepfakes specifically, if you wish to read these for a quick background to this blog.

We’ve seen quite a few examples of ghostbots over the past few years. In 2021, genealogy company MyHeritage released its DeepNostalgia service, allowing users to ‘reanimate’ photographs of relatives. Similarly, Project December, powered by GPT-3, made it possible for users to ‘chat’ with deceased loved ones − most famously when Joshua Barbeau used it to simulate his late girlfriend.

Ghostbots seem increasingly lucrative in China, where the fusion of traditional reverence for ancestors with modern technology has led to a burgeoning market for digital avatars of the deceased. Sun Kai, for instance, has weekly video calls with a digital avatar of his deceased mother. The replica, created using AI from Sun’s own company, can hold basic conversations and listen. While fairly rudimentary, it provides him with emotional comfort.

These posthumous programs, ranging from chatbots to holograms, have been tentatively suggested as technologies to assist the mourning process, providing a new short-term outlet to grieve for loved ones and come to terms with death.

However, these post-mortem avatars naturally bring with them a host of ethical, legal and psychological considerations. Legal criticisms focus on how these technologies might exploit the deceased’s likeness or infringe their legal personality for commercial gain, as well as disrupt the natural grieving process. The emotional impact on users − especially those potentially more vulnerable, such as children interacting with digital representations of deceased parents − is also a significant concern. Ethical concerns are particularly acute when the deceased’s persona is used without some form of explicit prior consent, risking their privacy and dignity after death.

The legal framework surrounding ghostbots is still very much in its infancy. A Computer Law & Security Review article by UK academics in 2023 suggests adding a ‘do not bot me’ clause to wills and contracts to prevent unauthorised digital reincarnations. The study, titled Governing Ghostbots, highlights various issues, including the lack of post-mortem privacy or dignity protections for the deceased. Dr Marisa McVey, a co-author of the article, notes the absence of UK laws extending privacy or data protection after death, raising concerns about potential emotional and economic harm to the deceased’s relatives and estate.

The European Union has made some progress towards comprehensive laws governing digital identities and ensuring transparency in the use of ghostbots. The recently enacted EU AI Act has various provisions potentially applicable to ghostbots, including transparency obligations and prohibitions on deploying exploitative AI models. 

The UK doesn’t have any formal personality or image rights, unlike some US states. Sometimes referred to as the right of publicity, this is an individual’s right to control the commercial use of their image and the appropriation of various aspects of their identity. It should be emphasised that, in most forms, personality rights only cover commercial usage.

In terms of general measures against the challenges of ghostbots, experts recommend establishing strict guidelines for the creation and use of digital avatars. These might include user age-restrictions, transparency (as in the EU AI Act) and instituting appropriate ‘retirement’ procedures for avatars, so as to respect the dignity of the deceased. Moreover, there’s a push for increased societal awareness and understanding of the implications of digital legacies, which could foster more informed decisions about the use of ghostbots and similar technologies.

While the digital recreation of real people could change how some of us deal with loss, it also impacts other industries. The presence of AI-cloned influencers in China, for example, shows how the same technology used for ghostbots can have broader commercial applications. Beyond ghostbots, deepfakes and other technologies attempting to replicate personalities may soon become increasingly commonplace in various facets of daily life. There’s thus a growing need for more stringent ethical controls and regulatory oversight − both for the living and the dead.

Ghostbots represent a significant technological advancement, arguably with potential to transform our relationship with memory and mortality. As with other uses of AI, it’s crucial we don’t pursue wanton innovation contradictory to ethics and the law. There’s a clear need to protect our digital afterlives and ensure the deceased and their personhoods are treated with the same respect and dignity we afford to the living.