After struggling for four years to cope with her grief following the death of her mother, actress Shirine Maras turned to an AI tool that claims to “simulate the dead.”
by Arthi Nachiappan, Technology Correspondent
Tuesday 12 March 2024 02:19, United Kingdom
After her mother’s death, Shirine Maras was desperately looking for an outlet to grieve.
“When you’re weak, you accept anything,” she says.
The actress was separated from her mother Nadja after immigrating to Germany from her native Syria in 2015.
In Berlin, Shirine gave birth to her first child, a daughter named Ishtar, and wanted more than anything for her mother to meet her. But before that could happen, tragedy struck.
Nadja died suddenly of kidney failure in 2018, at the age of 82.
“She was a guiding force in my life,” Shirine says of her mother. “She taught me how to love myself.
“Everything was brutal because it happened so suddenly.
“I really, really wanted her to meet her daughter and have one last reunion.”
Shirine says the grief was unbearable.
“I just wanted an outlet for all these emotions,” she adds. “If you leave them there, they start to kill you, they start to suffocate you.
“I wanted one last chance (to talk to her).”
After four years of struggling to process her loss, Shirine turned to Project December, an AI tool that claims to “simulate the dead.”
Users fill out a short online form with information about the deceased person, including age, relationship to the user, and a quote from the person.
The responses are then fed into an AI chatbot powered by OpenAI’s GPT-2, an early version of the large language model behind ChatGPT, which generates a persona based on the user’s memories of the deceased.
Such models are typically trained on vast quantities of books, articles, and text from the internet, and generate responses to questions in the manner of a word-prediction tool. Their answers are not grounded in factual accuracy.
For a cost of $10 (about £7.80), users can message the chatbot for around an hour.
For Shirine, the results of using the chatbot were “creepy.”
“There were moments that felt very real,” she says. “There were also moments when I thought anyone could have answered that.”
The chatbot’s messages imitated her mother, calling Shirine by her nickname (which she had entered in the online form), asking if she was eating well and telling her it was watching over her.
“I’m a bit of a spiritual person, so this felt like a vehicle,” Shirine says.
“My mom would know from one word to the next if it was really me or if someone was just pretending to be me, and I think there were moments like that.”
Project December has more than 3,000 users, the majority of whom use it to imitate a deceased loved one in conversation.
Jason Rohrer, the service’s founder, says users are typically people dealing with the sudden loss of a loved one.
“Most people who use Project December for this purpose have this final conversation with their deceased loved one in a simulated format and then move on,” he says.
“So there are very few customers who keep coming back and keep that person alive.”
He says there is little evidence that people become “hooked” on the tool and struggle to let go.
However, there are concerns that such tools may interfere with the natural grieving process.
Billy Dunleavy, an accredited therapist with the British Association for Counselling and Psychotherapy, says: “A big part of grief therapy is learning to come to terms with absence – learning to recognise a new reality, a new normal. So this could interrupt that.”
In the aftermath of grief, some people become withdrawn and isolated, the therapist says.
She added: “With that vulnerability comes the potential to create ghost versions of lost parents, lost children, or lost friends.
“And that can be very detrimental to people who are actually working through their grief and moving on and recovering.”
There are currently no specific regulations governing the use of AI technology to imitate the dead.
The world’s first comprehensive legal framework for AI, which has passed its final stages in the European Parliament, will see regulation based on the level of risk posed by different uses of AI.
The Project December chatbot gave Shirine part of the closure she needed, but she warns families to tread carefully.
“It’s so convenient and so revolutionary,” she says.
“I was careful not to get too involved.
“I can see people easily becoming addicted to using it, deluding themselves, and wanting to believe in it until things get worse.
“I don’t recommend getting too attached to something like that because it can be dangerous.”