Stephenie Lucas Oney is 75 years old, but she still turns to her father for advice. How did he deal with racism? she asks herself. How did he succeed when everything was against him?
The answers are rooted in the experience of William Lucas, a black man from New York’s Harlem neighborhood who made his living as a police officer, FBI agent and judge. But Ms. Oney does not receive his advice in person. Her father has been dead for more than a year.
Instead, she listens to the answers, delivered through her father’s voice, on her phone using HereAfter AI, an artificial intelligence (AI)-powered app that generates responses based on hours of interviews conducted with him before his death in May 2022.
His voice comforts her, but she says she created this profile more for her four children and eight grandchildren.
“I want the kids to hear all these things in his voice,” said Ms. Oney, an endocrinologist, from her home in Grosse Pointe, Michigan. “And not me trying to paraphrase, but to hear it from his point of view, in his time.”
Some people are turning to AI technology as a way to communicate with the dead, but its use as part of the grieving process has raised ethical questions while disturbing some of those who have used it.
HereAfter AI launched in 2019, two years after the creation of StoryFile, which produces interactive videos in which subjects appear to make eye contact, breathe and blink while answering questions. Both generate responses from what users say when prompted with questions like “Tell me about your childhood” or “What is the biggest challenge you faced?”
Their appeal comes as no surprise to Mark Sample, a professor of digital studies at Davidson College who teaches a course called Death in the Digital Age.
“Whenever a new form of technology comes along, there’s always this desire to use it to communicate with the dead,” Mr. Sample says. He points to Thomas Edison’s unsuccessful attempt to invent a “spirit telephone.”
“High fidelity” version
StoryFile offers a “high fidelity” version in which a person is interviewed in a studio by a historian, but there is also a version that requires only a laptop and webcam to get started. Co-founder Stephen Smith asked his mother, Holocaust educator Marina Smith, to try it. Her StoryFile avatar responded to questions asked at her funeral in July.
According to StoryFile, around 5,000 people have created a profile. Among them, actor Ed Asner was interviewed eight weeks before his death in 2021.
The company sent Asner’s StoryFile to his son Matt Asner, who was stunned to see his father looking back at him and appearing to answer his questions.
I was amazed. I found it incredible to be able to have a relevant, meaningful interaction with my father, and it really was his personality. This man I missed so much, my best friend, was there.
Matt Asner, son of actor Ed Asner
He played the file at his father’s memorial service. Some people were moved, but others felt uncomfortable.
“Some people found it morbid and were frightened,” Mr. Asner said. “I don’t share that view, but I can understand why they said that.”
“A little difficult to watch”
Lynne Nieto also understands. She and her husband, Augie, founder of the gym-equipment maker Life Fitness, created a StoryFile before he died in February of amyotrophic lateral sclerosis (ALS). They thought they could use it on the website of Augie’s Quest, the nonprofit they founded to raise money for ALS research. Maybe his grandchildren would want to watch it one day.
Ms. Nieto watched the file for the first time about six months after her husband’s death.
“I’m not going to lie, it was a little hard to watch,” she said, adding that it reminded her of their Saturday morning chats and it was a little too “raw.”
These feelings are not uncommon. These products force consumers to confront the thing they are programmed not to think about: mortality.
People are averse to death and loss. This might be a hard sell because people are forced to face a reality they’d rather ignore.
James Vlahos, co-founder of HereAfter AI
HereAfter AI grew out of a chatbot that Mr. Vlahos created based on his father’s personality before his father died of lung cancer in 2017. Mr. Vlahos, a conversational AI specialist and journalist who has contributed to The New York Times Magazine, wrote about the experience for Wired and quickly started hearing from people asking if he could make them a “mombot,” a “spousebot” and so on.
“I didn’t look at it from a business perspective,” Mr. Vlahos said. “And then it became obvious: this should be a business.”
A question of consent and perspective
Like other AI innovations, chatbots created in the image of a deceased person raise ethical questions.
Ultimately, it’s about consent, said Alex Connock, a senior lecturer at the University of Oxford’s Saïd Business School and author of The Media Business and Artificial Intelligence.
“As with all ethical questions related to AI, it is a question of consent,” Mr. Connock says. “If you did it knowingly and willingly, I think most ethical issues can be resolved quite easily.”
Dr. David Spiegel, associate chair of the department of psychiatry and behavioral sciences at Stanford Medical School, says programs like StoryFile and HereAfter AI could help people grieve, much like flipping through an old photo album.
“The main thing is to keep a realistic perspective on what you are seeing: it is not that this person is still alive and communicating with you, but a way to revisit what they left behind.”
This article was originally published in the New York Times.