this post was submitted on 13 Apr 2024
504 points (97.0% liked)

Greentext


This is a place to share greentexts and witness the confounding life of Anon. If you're new to the Greentext community, think of it as a sort of zoo with Anon as the main attraction.

Be warned:

If you find yourself getting angry (or god forbid, agreeing) with something Anon has said, you might be doing it wrong.

[–] [email protected] 44 points 7 months ago (2 children)

That's actually kinda fucked up.

[–] [email protected] 50 points 7 months ago* (last edited 7 months ago) (2 children)

"Kinda" is an understatement. There's some absolutely terrifying blogging/reporting I stumbled across a while back about someone using it to "talk with" a loved one who passed away.

In the end it was helpful and gave the author closure, but if it hadn't told them it was OK to move on, they could easily have been stuck in an incredibly unhealthy situation.

Found it: https://www.sfchronicle.com/projects/2021/jessica-simulation-artificial-intelligence/

[–] [email protected] 16 points 7 months ago (2 children)

There's a Black Mirror episode about this.

[–] [email protected] 16 points 7 months ago (1 children)

Literally this exact premise, showing how such technology will affect people.

I've always found Black Mirror to be the most terrifying sci-fi show because of how easy it was to see that we're on the verge of living it, especially in the first two seasons. And here we are! Another exciting new Horror Thing inspired by the famous piece of media "Don't Create The Horror Thing".

[–] [email protected] 6 points 7 months ago (1 children)

Don't create the torment nexus

[–] [email protected] 3 points 7 months ago

I hear you, the torment nexus is bad, but what if it were also very profitable?

[–] [email protected] 2 points 7 months ago

There's a Black Mirror episode about all of it.

[–] [email protected] 4 points 7 months ago* (last edited 7 months ago) (2 children)

~~Maybe my comment came out sounding a bit too pretentious, which wasn't what I intended... Oh well.~~

To one extent or another, we all convince ourselves of certain things simply because they're emotionally convenient for us, whether it's that an AI loves us, or that it can speak for a loved one and relay their true feelings, or even that fairies exist.

I must admit that when reading these accounts from people who've fallen in love with AIs, my first reaction is amusement and some degree of contempt. But I'm really not that different from them, as I have grown incredibly emotionally attached to certain characters. I know they're fictional and were created entirely by the mind of another person simply to fill their role in the narrative, and yet I can't help but hold them dear to my heart.

These LLMs are smart enough to cater to our specific instructions and desires, and were trained to give responses that please humans. So while I myself might not fall for AI, others will have different inclinations that make them more susceptible to its charm, much like how I was susceptible to the charm of certain characters.

The experience of being fooled by fiction and our own feelings is all too human, so perhaps I shouldn't judge them too harshly.

[–] [email protected] 2 points 7 months ago

> I have grown incredibly emotionally attached to certain characters. I know they're fictional and were created entirely by the mind of another person simply to fill their role in the narrative, and yet I can't help but hold them dear to my heart.

Can't help but think of the waifu wars

[–] [email protected] -3 points 7 months ago

"Pretentious" is just a dogwhistle for "neurodivergent". Never worry about being pretentious.

[–] [email protected] 5 points 7 months ago* (last edited 7 months ago)

> hitting it off real well

You could tell the AI that you're going to boil it alive to make a delicious stew, and it would compliment your cooking skills.