Digital doppelgängers may extend lifespan

Tech Science · 17 December 2024 · 3 min · Written by Kristian Sjøgren

Researchers think that large language models such as ChatGPT may ‘extend’ part of a person’s lifespan. One researcher says this presents major legal and ethical challenges as creating high-fidelity digital doppelgängers of people becomes easy – especially when those people have not consented.


What does living forever mean? Many people might think that this means that even 100 years from now you could experience many things and be present with friends and family.

This is correct, but if a person’s thoughts, way of speaking and other personal traits continue, even though the person is dead, does that mean that their life has in some way been extended?

Ethicists have asked themselves this question and have concluded that large language models such as ChatGPT may achieve some of the goals behind the desire to extend one’s lifespan because they can continue a part of a person even if the person is dead in the physical sense.

This could mean that the children of a dead father could ask for advice on dating or sports or that dead authors could still “write” books.

However, this also places great demands on navigating this possibility ethically and legally, since creating digital doppelgängers that can continue after a person dies is not a future scenario but already exists.

“This is interesting and necessary to focus on now, because there is a gap within this field since our existing norms and laws have not been able to see that far into the future. The problem now is that these large language models can extend parts of people’s lives, and the personal data on which personalised language models must be built already exists in abundance on Instagram, Facebook and other social media,” explains Sebastian Porsdam Mann, Postdoctoral Fellow, Center for Advanced Studies in Bioscience Innovation Law, University of Copenhagen, Denmark.

Sebastian Porsdam Mann and colleagues have published their thoughts on digital doppelgängers and the ethical considerations in The American Journal of Bioethics.

We want to keep partying

The ideas that Sebastian Porsdam Mann and colleagues have been considering include defining the reasons why people want to live longer than normal.

They found three general reasons why continuing to live 120, 150 or perhaps 200 years instead of only 80 years would make sense for many people.

First, many people want to have more subjective experiences. We want to “keep partying”, as Sebastian Porsdam Mann puts it.

“That would probably be the main reason for most people, but some people may have other reasons for living longer,” he says.

Digital doppelgängers can extend authors’ “lifespans”

One of the other two reasons is what researchers call legacy or impact, which means that a person’s knowledge can be continued and their projects completed.

For example, consider an author who did not finish writing his last major novel. A personalised language model could finish writing the book by basing it on the author’s previous works. In that context, the author will “live on”.

“I have published many articles, and a large language model could learn from my articles and, for example, finish writing one of my books. The interesting thing is not that this is possible, because it is, but that we have difficulty saying why or how doing this with other people’s data is legally or ethically wrong,” explains Sebastian Porsdam Mann.

Like speaking with a dead father

The last reason why some people want to extend life is that they want to “be there” for their loved ones. For example, a father becomes terminally ill but wants to “be there” for his children after he dies.

A large language model could create a digital doppelgänger with whom the children can talk about both easy and difficult topics. The father does not even have to decide this himself: his widow or others could do it for him without his consent.

“There are, at least, these three reasons to extend lifespan. Even though the first reason is not currently possible – as far as we know – we should examine the other two,” says Sebastian Porsdam Mann.

Lack of legal and ethical guidelines

Sebastian Porsdam Mann explains that personalised language models that can “extend lifespan” can already be created. For example, many people have made their personal information available to the public on Facebook, Instagram or other social media.

For some people, years or even decades of personal information are accessible – enough to create, using a language model, a digital doppelgänger that will behave like the person in many ways.

Today, anyone can create a digital doppelgänger of another person without that person’s consent – neither law nor established ethical norms prevent it – and that is a problem.

“We lack a legal basis for how close you can get to people in making digital versions of them. Today, if you post a lot about yourself on the internet, it will be available, and people can legally make a digital doppelgänger. We have proposed that you should always have consent before making a personalised language model, but nothing legally currently stops people from doing this without consent, and there is no consensus on the ethics of doing so. We lack guidelines for what people must not or should not do in some major areas,” concludes Sebastian Porsdam Mann.

“Digital Doppelgängers and Lifespan Extension: What Matters?” has been published in The American Journal of Bioethics. The research was supported by the National Research Foundation, Singapore, the European Commission, UK Research and Innovation, the European Union and a Novo Nordisk Foundation grant for a scientifically independent International Collaborative Bioscience Innovation & Law Programme (Inter-CeBIL programme).

Sebastian Porsdam Mann is a postdoctoral researcher at the University of Copenhagen's Center for Advanced Studies in Bioscience Innovation Law.
