
Greek American James Vlahos has turned recordings of his father into an AI-powered chatbot that answers questions about his life in his father's voice.
In 2016, the HereAfter founder built a chatbot, which he calls the Dadbot, that responds in his late father's voice, drawing on the recordings and stories his father left behind. Soon after its creation, people began contacting Vlahos to ask whether he could make similar bots for them.
The BBC writes that Vlahos' business idea was born of a desire to spend time with his dying father. In 2016, Vlahos, who lives in Oakland, California, received terrible news: his father had been diagnosed with terminal cancer, and he "was determined to make the most of the remaining time he had with his father."
"I loved my dad," he said. "I was losing my dad…I did an oral history project with him, where I just spent hours and hours and hours just audio recording his life story."
"I thought, gosh, what if I could make something interactive out of this…for a way to more richly keep his memories, and some sense of his personality, which was so wonderful, to keep that around," Vlahos said.
According to the BBC, “In 2019, James turned his chatbot into an app and business called HereafterAI—he is a cofounder—which allows users to do the same for their loved ones.”
Chatbot offers an "interactive compendium" of his late father
While the chatbot hasn't removed the pain of his dad's death, it does give him, he says, "more than I otherwise would have…It's not him retreating into this very fuzzy memory. I have this wonderful interactive compendium I can turn to."
The platform also enables users to upload photos of their loved ones, which appear on the screen of their smartphone or computer when they use the app.
The BBC notes that another company that turns people into AI chatbots goes much further. South Korea’s DeepBrain AI creates a video-based avatar of a person by shooting hours of video and audio to capture their face, voice, and mannerisms.
“We are cloning the person’s likeness to 96.5 percent of the similarity of the original person,” says Michael Jung, DeepBrain’s chief financial officer. “So mostly the family [doesn’t] feel uncomfortable talking with the deceased family member, even though it is an AI avatar.”
The company believes such technology can be an important part of developing a “well dying” culture whereby we prepare for our death in advance, leaving family histories, stories, and memories as a form of “living legacy.”
The process isn't cheap, though, and users cannot create the avatar themselves. Instead, they must pay the firm up to $50,000 for the filming process and the creation of their avatar.
Ethical considerations of chatbots of dead loved ones
The grief tech sector, also called “death tech,” is now valued at more than $125 billion globally, according to tech news website TechRound.
A recently released study, however, urges caution in the development of artificial intelligence (AI) chatbots designed to mimic deceased loved ones, known as "deadbots." Researchers have warned that these chatbots, while potentially comforting, could cause psychological distress if not designed with safety in mind.
While the idea of holding a digital conversation with a lost loved one may be appealing to those coping with grief and loss, the study highlighted potential risks.
Companies offering these services need to adopt safety standards to ensure that their technologies do not manipulate users or cause them psychological distress, noted the paper, published in the journal Philosophy & Technology.
Experts warn that chatbots are trained on massive amounts of data, which can reflect existing biases in the real world. This can lead to chatbots perpetuating stereotypes or even discriminating against users.
Sometimes it’s not clear to users that they’re interacting with a machine, which can lead to a false sense of trust or security. It’s important for chatbots to be transparent about their limitations and capabilities, they say.
Lastly, chatbots can be very good at mimicking human conversation, which could lead to users forming emotional attachments. This is especially concerning for vulnerable users, such as children or the elderly.


