Ghosts in the machine

Hans Block and Moritz Riesewieck on exploring AI 'life after death' in Eternal You

by Amber Wilkinson

The modern world is increasingly paying attention to the implications that artificial intelligence tools like ChatGPT have for the world of work. But Eternal You, the latest documentary from Moritz Riesewieck and Hans Block, scrutinises the burgeoning industry growing up around death and grieving, which sees tech companies attempting to put the ghosts of deceased loved ones inside their machines. In its simplest form, the technology takes the shape of chatbots, known as “thanabots”, which use dead loved ones’ online history as a springboard for AI to conduct conversations after death. Eternal You also dives into more complex forms of AI, including a mother who was given the chance to “meet” her dead daughter in virtual reality for a TV show. We caught up with the directors after the film’s premiere at the Sundance Film Festival to talk about technology and morality.

How did you come across this industry?

Moritz Riesewieck: In 2018, when we had just premiered our previous film, The Cleaners, and were familiar with this industry in Silicon Valley, we discovered the website of an MIT fellow promising “become virtually immortal”. We were like, “What? How should that work? And what is this? And is there more behind it? Or is this just a cheap scam or something?” And we learned that this guy had already got something like 30,000 people on the waiting list, and a lot of people took it seriously. So we interviewed him and we understood he was working on it, but he was far from having the technology to fulfil these dreams.

So he really had to respond to all these people, “Actually, I can't do that. But if you can wait…” But, of course, these people couldn't wait because they were in an existential moment of their lives: they were about to die themselves, or they were grieving. And they were deeply hoping for it. So we were like, okay, these folks are actually playing with the emotions of people in very vulnerable states.

This is interesting, because obviously so many people hope for something like this, otherwise they wouldn't subscribe there. But at the same time, you are playing with people's very deep emotions. So we followed up on this and, over time, and it took some time, there were more and more startups popping up from all over the world, from Asia, from New Zealand, from Europe, from North America. We got the chance to meet them whenever they presented their first beta test products to their first clients. We met people who met their own avatars while still alive, and we met other people who met the avatars of the deceased. What we were most surprised about was that, even when the technology back then wasn't that far developed, the emotional reactions of the people trying it out were already very much there. And so we were more and more interested in the effects it has on the human psyche. And two years ago, with this massive push in the development of artificial intelligence, what was formerly maybe just an idea became all the more real. So now we are at a real turning point, where a rather abstract idea is becoming a reality.

Director Hans Block: 'We were super surprised when we saw the first users using the services and they are so emotionally connected to the bots' Photo: Courtesy of Sundance Institute
Ever since people have been dying, people have been wanting not to die. It’s a very old urge, connecting everything from ancient necromancy to Victorian and modern spiritualism. But even in the case of spiritualism, there is a medium there who is responding to another human being. If I'm telling you about dead relatives, then presumably I'm going to engage with you on an emotional level. But obviously, with this, you're talking about something that's purely computer coded. So were you surprised how emotionally engaged people were becoming with these technologies?

Hans Block: First of all, I think the idea of talking to dead people isn't new. It's a very old cultural thing. And of course, more and more people do not believe in the afterlife with God any more, especially in the Western world. So there's, so to say, a transcendental homelessness in dealing with death and grief. I think the tech industry uses this void to pile in and to build a new narrative for people, and many people were longing for a new narrative in how we deal with these kinds of things. That was surprising for us. That was really what we found out while researching: that these things are connected to each other. And, of course, we were super surprised when we saw the first users using the services and they are so emotionally connected to the bots.

Sometimes we sat behind them and couldn't find anything in the words coming back, but they were longing so much for something that they deeply felt connected to what the bot was saying. Even if the bot systems were weak, if they had really big failures, something in their minds animated these failures. We humans try to see something in it, that's our desire; we try to make an object human, we fill it with life. I think that's very human. And that was interesting for us, that they feel so much. And of course, as American sociologist Sherry Turkle said in our film, these machines can trick us into thinking they (our loved ones) are there.

Were you shocked when you saw some of the things that the bots were saying? I was astounded that nobody had thought to put safeguards on this AI, so it didn't say things like, “I’m in hell”, for example.

MR: The way it works is that the artificial intelligence analyses statistical patterns in a big dataset. It's trained on so much data that it has all the human expressions in there, and it has so many different situations it can use for conversations. And then, of course, it's also fine-tuned by the programmers, and by the people feeding this AI with the very personal data of the deceased. This combination means that a lot of what it spits out is connected to the life of the deceased. So it is not just pure coincidence that a lot of the stuff it says fits the person. And it's not just because these people are in specific states of vulnerability and grief that they are deceived or something; there is a lot there. But what's interesting is that even people who think absolutely clearly, who sometimes even work in tech, can be so deceived and so seduced that after a while they forget that on the other side there’s not a “person” there.

It seems like a human connection; sometimes the AI even seems to have empathy. These formerly specifically human qualities of creativity, of empathy, of warmth, of saying the right words in the right moment, maybe even of humour, can be expressed by the AI. And that doesn't mean that it's sentient, or that it has a consciousness or anything like that. It doesn't actually need that to be really good at pretending all that. And for people, that's sometimes enough; that’s sometimes better than being alone. As Sherry Turkle puts it, we have a loneliness crisis in a lot of Western societies.

It seems we've also got a tech-bros-wanting-to-play-God crisis by the looks of it. I noticed that these companies all seem to be run by men, some of whom seem to be having real personal issues as well. They seem to be playing God without any concern for the morality element of it. Were you surprised by how little they were bothered by what was happening with these bots?

HB: When we tried to find the right ensemble for our film, we looked so hard for female founders in this industry, and we almost couldn't find anybody. So we noticed, okay, they are all white men creating these kinds of services. And in most cases, they are also very personally connected to a loss. Some of the founders lost their mother recently, or their wife, so they had a deep connection to that kind of topic, and out of that suffering they create something. That was interesting for us. We also tried to find out what else it could be, why they are all men. Maybe it's that they cannot give birth, for example, and, as you mentioned, they're godlike, creating something new. In the end, we don't know.

The film is noticeably split down the sexes. All the white guys are creating, and most of the end users, who are sort of buying into it, seem to be female. Was that your experience?

MR: That was exactly our observation.

Do you have a theory as to why that might be?

HB: I have to mention this jealousy towards women, maybe not being able to give birth naturally and then compensating for that with virtual birthing. We don’t know if that’s the case. Also, of course, it has been the case that in tech in general men have all the resources and women are very often under-privileged, as in so many other fields. And there's this techno-libertarian idea of, “Oh, we are just the engineers, we are not responsible for anything the users do with it; we only provide it and then users do what they want. And it's their responsibility”.

Moritz Riesewieck: 'We were more and more interested in the effects it has on the human psyche' Photo: Courtesy of Sundance Institute
I mean, it's somehow a fair point, maybe, for people with this ideology to think like that. We believe it's deeply cynical. If you provide something like this, it's like every other product; as Carl Öhman says in our film, you can’t provide a self-driving car and then not make sure that this self-driving car isn't killing people. And if you provide a bot where, as he states himself, he likes the spooky aspects of it and has it fine-tuned that way, then you are responsible if it deceives people and if it pulls them into a conversation they can't control any more.

They all seem very happy to talk to you about this. Did you get any resistance?

MR: Of course we did. We contacted several companies and, of course, not everyone responded to us. We tried to get in contact with a former Googler who is also working on a similar service, and he didn't reply at all. So some of them were distant because they knew we are filmmakers, we did The Cleaners, we are tech-critical. But some of them, like Jason Rohrer, the Project December founder, really love to spread their belief in not being responsible. He was at the premiere, for example, which was so interesting for us. We offered to let him see the film beforehand, but he didn't want to, and he was really happy to argue with the audience afterwards, because he really is a tech believer and he's not the only one with that kind of mindset. There are many powerful people in Silicon Valley with the same mindset. And that's dangerous in a way, because especially if you have a service like this, where very vulnerable people are using it, it can cause a lot of harm.

How did you happen upon the South Korean story of the mother whose daughter was recreated in AI?

MR: With this mother, Jang Ji-Sung, what surprised us a lot was that the general reaction to this TV experiment, from all over the world, was the view that you can't run an experiment like this on a mother who is grieving her daughter. We were surprised, and that’s always our method: we really want to avoid judgement and meet the people who have had this experience and learn from them first-hand. And it was a moment when she could have opened up and admitted that it had caused her harm. But the opposite was true. She told us, “In general, this helped me a lot, to get this chance to overcome this.” Because the last moment she talked to her daughter, in the hospital, was actually a fight. She was arguing with her not to push the other beds away. So it was horrible. We, as two people who are very sceptical in this regard, suddenly understood: okay, yeah, there are reasons why people want these experiences. And it was very clear that, in this sense, it helped her, while at the same time her dreams about her daughter were overwritten by this virtual reality. And isn’t that a kind of symbol of the subtle danger in all of this, that reality and virtual reality are intertwined?
