[-] DeathByBigSad@sh.itjust.works 38 points 2 months ago* (last edited 2 months ago)

I saw a video of a ~~woman in China~~ (Edit: actually she was Korean) talking to an AI recreation of her child (then-deceased) through VR.

I felt so creeped out by it. Like wtf, if I die, I want my mom to remember me, not talk to a fucking AI IMPOSTER.

Edit: Looked it up, actually it's a Korean woman, I mixed it up: https://www.reddit.com/r/MadeMeCry/comments/12zkqy8/mother_meets_deceased_daughter_through_vr/

[-] Datz@szmer.info 9 points 2 months ago

There's a whole company/LLM doing exactly that, whose CEO gave a TED talk about it.

https://m.youtube.com/watch?v=-w4JrIxFZRA

After that, I actually had a pretty wild idea about someone using it to replace dead/missing people in chats. Imagine the horror of finding out your friend died months ago, or got kidnapped. Horribly impractical, but it sounds like a good novel.

[-] Avicenna@programming.dev 12 points 2 months ago

"watch me talk about how I get rich off of exploiting people's emotional fragility and try to pass it off as providing closure and community service"

[-] explodicle@sh.itjust.works -2 points 2 months ago

I'd be cool with it IFF it was a strong AI that could be free, like the Alan Watts AI in the film Her.

[-] Windex007@lemmy.world 25 points 2 months ago* (last edited 2 months ago)

If someone wants an AI companion, fine. If it's a crazy good one, fine.

But it's strictly predatory for it to be designed to make someone feel like it's a real person who existed, ESPECIALLY when that someone is dealing with that kind of grief.

You had to boot the mom out of the painting. There was no ambiguity on that one.

this post was submitted on 16 Nov 2025
1188 points (99.3% liked)

Funny
