All Discussions > Steam Forums > Off Topic > Topic Details
Would you consider a "VR copy" of a deceased person to be them, back again?
I've heard of a case where a tech company in China is experimenting with AI clones of deceased people. For instance, they "digitally resurrected" a mother who died of cancer as a VR avatar, meaning an AI pretending to be her. Think of it like talking to a virtual assistant, except it's wearing the mask of someone you once knew, or someone close to you, who has died.

But the question is: would you consider them living (physically alive in front of your eyes), or just an AI cosplaying as a dead person virtually? And is it even respectful to those grieving? Someone has just passed, and the response is "let AI roleplay as them" as a form of digital resurrection or virtual copy, refusing to let that person rest in peace.
Showing 1-8 of 8 comments
I personally think it's insane and it shouldn't be done.


Life happens, the pain of loss hurts, you move on.
If I put your face on a scarecrow, does it become you?

The entire point of the drive for "digital resurrection" of this nature is to normalize mockingbird AI research. The peaks of the push to enact it always coincide with mockingbird field tests.
Last edited by rabapraba p; 19 hours ago
Just an AI, pretending to be someone.
Leto 19 hours ago 
Difficult to balance on both sides, customer/provider.

On the customer side, interaction can lead to a situation where you've generated a certain amount of context through your interactions and "grown" a new companion.

I'd guess this is an attempt to compensate for a loss, but context size is limited, so you either have to live with a companion who has dementia or pay for more context.


The provider side won't care much about it, but I would consider it their part to somehow care for what they create, the way a customer would expect it from an ancestor-replica producer.

And so on, and a dozen more buts...
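The "companion who got dementia" point above comes down to a fixed-size context window: once it fills, the oldest memories silently fall out, and paying for more context only raises the limit rather than removing it. A minimal sketch (the `Companion` class and its plan sizes are hypothetical, purely for illustration):

```python
from collections import deque

class Companion:
    """Toy chat companion whose entire memory is a bounded context window."""

    def __init__(self, context_size):
        # context_size = how many exchanges the provider's plan lets you keep
        self.memory = deque(maxlen=context_size)

    def tell(self, fact):
        # When the window is full, the oldest fact is dropped silently
        self.memory.append(fact)

    def remembers(self, fact):
        return fact in self.memory

# A cheap plan: only two exchanges of memory.
cheap = Companion(context_size=2)
cheap.tell("your name is Ann")
cheap.tell("your dog was called Rex")
cheap.tell("you liked hiking")  # evicts "your name is Ann"

print(cheap.remembers("your name is Ann"))  # False: the "dementia"

# Paying for a bigger plan raises the limit but never removes it.
paid = Companion(context_size=100)
```

The point of the sketch is that forgetting here is not a bug the provider can patch; it is a direct consequence of the bounded window, which is exactly the tension Leto describes between customer expectations and what the provider sells.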
Last edited by Leto; 19 hours ago
Simply having the service exist, combined with our broad data-harvesting practices, means that companies will gather all the data necessary to impersonate you before you die, so they can market-test demos, have the data on hand, etc.

Then, as per usual, wealthy interests can simply buy illegal access to this data to impersonate you while you're alive.

They can't currently do this (very well), as there is no mass-market drive to gather data specifically for impersonation.
Last edited by rabapraba p; 18 hours ago
Would I? It depends on how much input the person had in the AI's behavior and personality.

Unfortunately, they'd still be dead to themselves. You only get one life. I hope everyone is doing what they like within reason and not stepping on others' happiness.
Originally posted by Sit With Me:
Would I? It depends on how much input the person had in the AI's behavior and personality.

Unfortunately, they'd still be dead to themselves. You only get one life. I hope everyone is doing what they like within reason and not stepping on others' happiness.

Being able to talk to cat or dog versions of my friends in AI form would be fun,

but the fun comes at the cost of enabling microimpersonation and the death of individual identity.
