Stalking would be one thing, but this reads to me like the idea is training a language model on the ex's texts to create an AI partner that acts like them, so you can pretend nothing happened. While that's probably safer for the other person than being stalked (as long as the training texts are never leaked, which is a big if), it's just sad.
No way that would negatively impact the person using that app
A more optimistic view might be that the user could use the tool to work through their feelings, questions, and lack of closure in a way that doesn't involve the ex or other people.
Once they've had their fill of ranting, worked through the stages of grief and reached catharsis, or figured out their own feelings and views, the model and all its data could be destroyed, and no humans would have been harmed in the process.
A suitably smart program might work as a disposable therapist of sorts, but that's probably quite far beyond current models.
On the other hand, this is a product under capitalism. It's just gonna stop people from moving on.
Yeah. I see real-life stalking combined with emotional reliance on a fake person.