Stephenie Lucas Oney is 75 years old, but she still looks to her father for advice. How did he deal with racism, she wonders. How did he succeed when everything was against him?
Rebecca Carballo, The New York Times
The answers are based on the experiences of William Lucas, a Black man from New York's Harlem neighborhood who made his living as a police officer, FBI agent and judge. But Ms. Oney does not get the advice in person. Her father has been dead for more than a year.
Instead, she listens to the answers delivered in her father's voice on her phone through HereAfter AI, an artificial intelligence (AI) app that generates answers based on hours of interviews conducted with him before his death in May 2022.
His voice comforts her, but she says she created this profile more for her four children and eight grandchildren.
“I want the kids to hear all of these things in his voice,” said Ms. Oney, an endocrinologist, from her home in Grosse Pointe, Michigan. “And I'm trying not to paraphrase it, but for them to hear it from his perspective, in his time.”
Some people are using AI technology as a way to communicate with the dead, but its use as part of the grieving process has raised ethical questions and worried some of those who have used it.
HereAfter AI was launched in 2019, two years after the founding of StoryFile, which produces interactive videos in which subjects appear to make eye contact, breathe and blink while answering questions. Both generate responses based on what users said when answering prompts like “Tell me about your childhood” or “What was the biggest challenge you faced?”
Their appeal is no surprise to Mark Sample, a professor of digital studies at Davidson College who teaches a course called “Death in the Digital Age.”
“Whenever a new form of technology comes onto the market, there is always a desire to use it to communicate with the dead,” Mr. Sample said. It is reminiscent of Thomas Edison's unsuccessful attempt to invent a “spirit phone.”
“High Fidelity” version
StoryFile offers a “high-fidelity” version in which a person is interviewed in a studio by a historian, but there is also a version that requires only a laptop and a webcam to get started. Co-founder Stephen Smith asked his mother, Holocaust educator Marina Smith, to try it out. Her StoryFile avatar answered questions at her funeral in July.
According to StoryFile, around 5,000 people have created a profile. Among them was actor Ed Asner, who was interviewed eight weeks before his death in 2021.
PHOTO WALLY FONG, ASSOCIATED PRESS ARCHIVES: Actor Ed Asner in a scene from the series Lou Grant, October 1980
The company sent Asner's StoryFile to his son Matt Asner, who was amazed to watch his father appear to look at him and answer his questions.
I was surprised. I thought it was incredible to be able to have a relevant and meaningful interaction with my father, and that was his personality. This man that I really missed, my best friend, was there.
Matt Asner, son of actor Ed Asner
He played the video at his father's memorial service. Some people were moved; others felt uncomfortable.
“Some people found it morbid and were afraid,” Mr. Asner said. “I don't share that view, but I can understand why they said that.”
“A little hard to watch”
Lynne Nieto understands that, too. She and her husband Augie, founder of the fitness equipment manufacturer Life Fitness, created a StoryFile before he died of amyotrophic lateral sclerosis (ALS) in February. They thought they could use it on the website of Augie's Quest, the nonprofit they founded to raise money for ALS research. Maybe his grandchildren will want to watch it one day.
About six months after his death, Ms. Nieto watched his StoryFile for the first time.
“I'm not going to lie, it was a little hard to watch,” she said, adding that it reminded her of their Saturday morning chats and it was a little too “raw.”
These feelings are not uncommon. These products force consumers to confront something they would rather not think about: mortality.
People are put off by death and loss. It can be a hard sell, because it forces people to confront a reality they would rather ignore.
James Vlahos, co-founder of HereAfter AI
HereAfter AI grew out of a chatbot that Mr. Vlahos created based on his father's personality before his father died of lung cancer in 2017. Mr. Vlahos, a conversational AI specialist and contributing writer for The New York Times Magazine, wrote about the experience for Wired and soon heard from people asking whether he could make them a mom bot, a spouse bot and so on.
PHOTO FROM JAMES VLAHOS' X ACCOUNT: HereAfter AI co-founder James Vlahos
“I didn’t look at it from a business perspective,” Mr. Vlahos said. “And then it became clear: this should be a business.”
A question of consent and perspective
Like other AI innovations, chatbots created in the image of a deceased person raise ethical questions.
Ultimately, it comes down to consent, said Alex Connock, a lecturer at the University of Oxford's Saïd Business School and author of “The Media Business and Artificial Intelligence.”
“As with all ethical issues surrounding AI, it comes down to consent,” Mr. Connock said. “If it is done knowingly and voluntarily, I think most of the ethical problems can be resolved quite easily.”
Dr. David Spiegel, vice chair of the department of psychiatry and behavioral sciences at Stanford Medical School, says programs like StoryFile and HereAfter AI can help people grieve, much like flipping through an old photo album.
“The most important thing is to keep a realistic perspective on what you see: it's not about whether this person is still alive and communicating with you, but about reflecting on what they left behind.”
This article was originally published in The New York Times.