Death has always been a profound and personal experience. But in the age of technology, the concept of loss is being redefined. In China, several tech companies have started to create AI avatars of the deceased, sparking both hope and controversy. Can digital clones of the dead truly offer solace, or do they blur the lines between life and death?
The Rise of AI ‘Deadbots’ in China
Several years ago, digitally cloning people was only possible in the realm of science fiction. But now, thanks to ever-advancing chatbots like Baidu's Ernie and OpenAI's ChatGPT, along with serious investment in computing power, private companies are able to create affordable digital "clones" of real people, dead or alive.
One of those companies is Silicon Intelligence. Based in Nanjing, China, the tech startup develops AI chatbots using a person’s likeness and voice. Silicon Intelligence offers a service called “Resurrection”, in which the company creates a “deadbot” or a digital likeness of someone who passed away.
Sun Kai, an executive at Silicon Intelligence, has a deadbot of his mother, who died in 2018. Through a tablet device, Sun is able to talk to his mom's "deadbot", which is rendered from the shoulders up by artificial intelligence to look and sound just like his flesh-and-blood mother.
“I do not treat [the avatar] as a kind of digital person. I truly regard it as a mother,” Sun, 47, told NPR.org during a remote interview. “I feel that this might be the most perfect person to confide in, without exception.”
The Challenges of Creating ‘Deadbots’
Silicon Intelligence’s most affordable digital avatar service costs about $30, while more interactive avatars that use AI technology to move on screen and talk with a client have a price tag of over $1000 each.
While there are a lot of Chinese tech firms offering digital cloning today, getting one for your departed loved ones isn’t always guaranteed, as these services require mountains of data.
“The crucial bit [in creating ‘deadbots’] is cloning a person’s thoughts, documenting what a person thought and experienced daily,” said Zhang Zewei, CEO of AI firm Super Brain, in an interview with NPR.org.
Zhang pointed out that for a digital clone to be truly life-like, it would need higher volumes of data, with clients preparing “at least 10 years” ahead of time by keeping a daily diary.
Where to Find ‘Deadbots’
While some people keep the digital avatars of their late loved ones on devices like phones and personal computers, others opt to install them at their loved ones' gravesites.
For instance, in Shanghai, Fu Shou Yuan International Group has built a digitized cemetery, where visitors can hold up a cell phone to scan a QR code placed on select headstones and access a multimedia record of the deceased’s life experiences and achievements. As of this writing, nearly 100 customers have opted to place digital avatars on their loved ones’ headstones.
“For the family members who have just lost a loved one, their first reaction will definitely be a sense of comfort, a desire to communicate with them again,” Jiang Xia, a funeral planner for the Fu Shou Yuan International Group, told NPR. “However, to say that every customer will accept this might be challenging, as there are ethical issues involved.”
Ethical Issues Surrounding ‘Deadbots’
China isn't the first country to attempt to create digital avatars of dead people. In 2017, Microsoft filed a patent application for simulating virtual conversations with someone who had passed, but an executive at the tech giant later called the idea "disturbing", and plans to launch it as a full commercial service never moved forward.
In the past, Project December, a platform first built on OpenAI's GPT-3 technology, also gave customers the ability to talk with a chatbot modeled on their loved ones. But soon after the service launched, OpenAI terminated the platform's access to its technology, fearing its potential for emotional harm.
While companies like Microsoft and OpenAI have dedicated internal committees to evaluate the behavior and ethics of their generative AI services, neither the United States nor China has a centralized regulatory body overseeing the impacts of these technologies or their use of a person's data. This lack of regulation partly contributes to the potential emotional harm that "deadbots" can cause family members.
“That is a very big question since the beginning of humanity: What is a good consolation? Can it be religion? Can it be forgetting? No one knows,” Michel Puech, a philosophy professor at the Sorbonne Université in Paris, told NPR.org. “There is the danger of addiction, and [of] replacing real life. So, if it works too well, that’s the danger. Having too much consoling, too much satisfying experience of a dead person will apparently annihilate the experience, and the grief, of death.”
But in the end, every person grieves differently. So, while "deadbots" may bring comfort to some, they won't necessarily have the same positive effect on every mourner.