this post was submitted on 26 Mar 2024
403 points (100.0% liked)

you are viewing a single comment's thread
[–] [email protected] 64 points 8 months ago (1 children)

Stalking would be one thing, but this reads to me like the idea is training a language model on the ex's texts to create an AI partner that acts like them, so you can pretend nothing happened. That's probably safer for the other person than being stalked (as long as the training texts are never leaked, which is a big if), but it's just sad.

[–] [email protected] 35 points 8 months ago (1 children)

No way that would negatively impact the person using that app

[–] [email protected] 26 points 8 months ago (2 children)

A more optimistic view might be that the user could use the tool to work through their feelings, questions, and lack of closure in a way that doesn't involve the ex or other people.

Once they've had their fill of ranting, gone through the stages of grief and reached catharsis, or figured out their own feelings and views, the model and all its data could be destroyed, and no humans would have been harmed in the process.

A suitably smart program might work as a disposable therapist of a sort. But that's probably quite far away from current models.

[–] [email protected] 18 points 8 months ago (1 children)

On the other hand, this is a product under capitalism. It's just gonna stop people from moving on.

[–] [email protected] 4 points 8 months ago

Yeah. I see real life stalking combined with emotional reliance on a fake person.

[–] [email protected] 6 points 8 months ago (1 children)

I would wonder about the usefulness of it for relationships that weren’t primarily conducted via text for the whole duration of the relationship.

Almost all of my relationships have a pattern in texting. There’s usually a month of deep and emotional conversations, followed by a few months of sexually explicit chatting as we spend more time together and work out the emotional conversations in person, and then logistical conversations once we’re emotionally and physically comfortable with each other. And sure, we’re kind and sweet to each other between the logistics, but I don’t know if that’s really enough to train an AI on, or even enough to properly represent ourselves or the relationship.

I think that any AI would have to be very advanced to not respond to “Why did you leave me?” with “Because I had to take the dog to the vet/go get milk” or whatever - especially when the bulk of the texts are in that latter relationship stage.

[–] [email protected] 4 points 8 months ago

I can confirm this. I've been backing up and restoring my messages in their entirety to every new phone since my very first Android phone (so about 15 years of messages), so all my relationships have complete texting records, and I've seen almost that exact pattern.

Sidenote: the Google Messages app seems to handle an almost 6 GB mmssms.db file with grace lmao
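For anyone curious what's in one of those backups: mmssms.db is just a SQLite file, so you can poke at it with Python's standard library. A minimal sketch, assuming the usual Android telephony schema where an `sms` table has `address` and `body` columns (exact columns vary across Android versions, so check with `PRAGMA table_info(sms)` first):

```python
import sqlite3

def count_messages_per_contact(db_path):
    """Return (address, message_count) pairs, busiest contact first.

    Assumes an `sms` table with an `address` column, which is the
    common Android telephony schema but not guaranteed on every version.
    """
    con = sqlite3.connect(db_path)
    try:
        rows = con.execute(
            "SELECT address, COUNT(*) AS n "
            "FROM sms GROUP BY address ORDER BY n DESC"
        ).fetchall()
    finally:
        con.close()
    return rows
```

On a multi-gigabyte file the `GROUP BY` can take a while without an index on `address`, but SQLite itself copes fine with databases that size.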