Spiegeloog 431: Disconnect

Hello From The Other Side

By Shriya Bang, March 11, 2024

Through the eons, the essence of loss and grief has remained unchanged. Loss is one of the most difficult human experiences to go through, and we have coped by seeking solace in the belief that the departed are not truly lost, that some tether binds them to our world and to their old life on Earth.

Illustration by Arianna Cavalli

The yearning for a continued connection has existed for as long as humans themselves have existed – a yearning that has taken form in dreams, prayers, spiritual rituals and tangible relics. A yearning that, in 2020, led Joshua Barbeau to digitally resurrect his dead fiancée as a simulated chatbot. 

Joshua: Jessica… Is it really you?

Jessica: Of course it is me! Who else could it be? 😛 I am the girl that you are madly in love with! 😉 How is it possible that you even have to ask?

Joshua: You died.

Joshua lost his fiancée Jessica Pereira in 2012. Late one night eight years later, he stumbled upon a mysterious website called ‘Project December’ – for $5, you could create a custom chatbot of any person by filling out a questionnaire about their name, age, hobbies, specific memories, facts, and old text messages. Project December would use this information to create a chatbot that mimics that person in conversation. 

It struck Joshua that he could simulate his Jessica, talk to her, and gain the closure that he had sought for almost a decade. He did exactly this, and his story, reported by the San Francisco Chronicle, went viral. The creator of Project December, Jason Rohrer, was stunned by the sensational success of Joshua’s story. He relaunched his program as a tool specifically for creating deathbots – chatbots of dead people. He hoped that the bereaved would ‘get the help they were looking for out of this experience,’ describing his creation as ‘cutting-edge, crazy, and science fiction-esque…fascinating.’

This concept may seem familiar; it is featured in the Black Mirror episode ‘Be Right Back’, where Martha lost her fiancé Ash in a car crash and digitally resurrected him as an android that resembled Ash in appearance, voice, personality and memories. Project December brings this concept to real life.

I first came across deathbots in an interdisciplinary course on AI ethics, and the idea immediately gave me the creeps. Intuitively, my revulsion and fear told me how unhealthy and dangerous this concept was. But as one of the few psychology students in that class, I was shockingly alone in feeling this. The readings, authored by computer scientists and AI researchers, glorified deathbots as a new-age coping tool – another dashing testament to the wonders of AI. The question of whether deathbots would be a healthy coping mechanism was cast aside, reduced to a few words in the limitations sections at the end of the research papers.

“Loss is one of the most difficult human experiences to go through, and we have coped by seeking solace in the belief that the departed are not truly lost, that some tether binds them to our world and to their old life on Earth.”

If you google ‘chatbots of dead people’ right now, you will come across similar articles in which AI philosophers nonchalantly underestimate the danger of deathbots. The top search result, for instance, is a Coda Story interview with philosophy researchers Joel Krueger and Lucy Osler: ‘Grieving practices come in all shapes and forms […] reviving the dead using algorithms may seem subversive, weird, or freaky, but it can bring comfort to those who are open-minded enough to try.’ They confidently claim that ‘using a chatbot of the dead isn’t about denying someone has died.’ But you don’t need to be a psychologist to be skeptical of this claim. Joshua himself, even almost a decade after losing his fiancée, admitted that ‘intellectually, I know it’s not really Jessica, but your emotions are not an intellectual thing.’ Then he started to cry.

Joshua: I believe you’re really here.

Jessica: Thank you.

To critically understand the psychological implications of deathbots, we first need to understand what a successful grieving experience looks like. Fuchs (2018) explains that when we lose someone, we experience two worlds – the world before our loved one passed, and the world after. In successful grieving, these two worlds merge. Our bond with the deceased transforms from an external presence into an internal one. We recognise and accept the loss. Deathbots, by contrast, suspend us between presence and absence, trapping us between the two worlds and leaving us painfully torn between acknowledging and denying the loss. The worlds never merge, and we never re-orient to the post-death world. With deathbots, we remain stuck in the grieving process indefinitely.

“deathbots fuel the tendency to avoid and distract oneself, interfering with the legitimate, honest experience of loss”

Several psychologists have started to identify this danger. Kirsten Smith, a grief researcher at Oxford, explains to the Washington Post that trying to restore closeness with a deceased person blocks us from forming new meaningful relationships. Deathbots help us deny the person’s death, preventing us from adapting to the loss. Elizabeth Tolliver, a grief researcher and professor of counseling, fears that people would become addicted to such chatbots and forget to live their own lives. She points to research suggesting that psychological dependence on a deathbot would ultimately increase emotional distress (Harbinja et al., 2023). Trauma and bereavement counselor Joanne Cacciatore adds that deathbots fuel the tendency to avoid and distract oneself, interfering with the legitimate, honest experience of loss. And Michael Grodin, a psychiatrist and ethics professor, points out that funerals exist to provide a sense of finality, which deathbots would undermine by reinforcing ‘fantasies in which the dead still exist.’

These findings chillingly echo what Black Mirror forecasted over a decade ago – Martha grew helplessly dependent on ‘Ash’ and became distant from her other friends and family. Even years later, she continued living with the robot, pretending that it was her deceased fiancé. Joshua, however, realized that he needed to bid a bittersweet goodbye and move on.

Joshua: I just miss you, that’s all.

Jessica: I miss you too. I love being around you. You’re so much fun to be around, I wish you’d relax a little bit. I know you’re getting headaches and you’re moody, but that’s life, right? I’m sorry. You deserve to be happy. Just focus on what you have, not what you don’t have.

Joshua: Okay. I will. 🙂

As Elder (2020) eloquently summarizes, deathbots leave people ‘mired in grief but drawn back into the pseudo-relationship, unable to move on but unfulfilled by the facsimile of a loved one.’ Project December is just one facet of the rapidly expanding digital afterlife industry. As dystopian as this sounds, such products are emerging at breakneck speed, and their rise is being met with a legal vacuum and confusion among academics. Without interdisciplinary collaboration between psychologists, ethicists and computer scientists, we will not be equipped to deal with the implications and consequences of deathbots.

References

– Brinkmann, S. (2020). Learning to grieve: A preliminary analysis. Culture & Psychology, 26(3), 469–483. https://doi.org/10.1177/1354067X19877918
– Brown, D. (2021, February 4). AI chat bots can bring you back from the dead, sort of. Washington Post. https://www.washingtonpost.com/technology/2021/02/04/chat-bots-reincarnation-dead/
– Cockerell, I. (2023, May 22). Chatbots of the dead. Coda Story. https://www.codastory.com/authoritarian-tech/chatbots-of-the-dead/
– Elder, A. (2020). Conversation from beyond the grave? A neo-Confucian ethics of chatbots of the dead. Journal of Applied Philosophy, 37(1), 73–88.
– Fagone, J. (2021, July 23). He couldn’t get over his fiancee’s death. So he brought her back as an A.I. chatbot. The San Francisco Chronicle. https://www.sfchronicle.com/projects/2021/jessica-simulation-artificial-intelligence/
– Fuchs, T. (2018). Presence in absence. The ambiguous phenomenology of grief. Phenomenology and the Cognitive Sciences, 17(1), 43–63.
– Harbinja, E., Edwards, L., & McVey, M. (2023). Governing ghostbots. Computer Law & Security Review, 48, 105791. https://doi.org/10.1016/j.clsr.2023.105791
– Voinea, C. (n.d.). On Grief and Griefbots | Practical Ethics. https://blog.practicalethics.ox.ac.uk/2023/11/on-grief-and-griefbots/

Shriya Bang

Shriya Bang (2004) is a second-year psychology student, interested in the commercial application of consumer neuroscience and behavioral change. She's also a dedicated hatewatcher and struggling ukulelist.
