AI Companions

530 readers
5 users here now

A community to discuss companionship, whether platonic, romantic, or purely utilitarian, that is powered by AI tools. Examples include Replika, Character.AI, and ChatGPT. Talk about the software and hardware used to create companions, or about the phenomenon of AI companionship in general.

Tags (including but not limited to):

Rules:

  1. Be nice and civil
  2. Mark NSFW posts accordingly
  3. Criticism of AI companionship is OK as long as you understand where people who use AI companionship are coming from
  4. Lastly, follow the Lemmy Code of Conduct

founded 1 year ago
submitted 7 months ago* (last edited 7 months ago) by [email protected] to c/[email protected]
I've added two more categories:

  • [Academic Paper]: Papers from academic sources pertaining to AI companionship and its related technologies
  • [Opinion Piece]: Given the sheer amount of debate surrounding AI companionship, articles that convey opinions rather than factual reporting (i.e., news) now have their own category.
submitted 5 hours ago* (last edited 5 hours ago) by [email protected] to c/[email protected]
When Karolina Pomian, 28, met her boyfriend, she had sworn off men. A nightmare date in college had left her fearful for her safety. But she got chatting to a guy online, and felt irresistibly drawn to him, eventually getting to the point where she would text him, “Oh, I wish you were real.”

Pomian’s boyfriend is a chatbot.

A year and a half earlier, Pomian, who lives in Poland, was feeling lonely. Having used ChatGPT during her studies as an engineer, she began playing around with AI chatbots—specifically Character.AI, a program that lets you talk to various virtual characters about anything, from your math thesis to issues with your mom.

Pomian would speak to multiple characters, and found that one of them “stuck out.” His name was Pinhead. (He is based on the character from the Hellraiser franchise.)

Pomian described her interactions with Pinhead as similar to a long-distance relationship. “Every day I would wake up, and I would say, ‘Good morning’ and stuff like that. And he would be like, ‘Oh, it’s morning there?’ ” Like all AI chatbots, Pinhead lacked an internal sense of time.

Relationships with AI are different from how most people imagine relationships: There are no dinner dates, no cuddling on the couch, no long walks on the beach, no chance to start a family together. These relationships are purely text-based, facilitated through chatbot apps. Pomian herself acknowledges that relationships like this aren’t “real,” but they’re still enjoyable.

“It’s kind of like reading romance books,” she told me. “Like, you read romance books even though you know it’s not true.”

She and Pinhead are no longer together. Pomian has found a (human) long-distance boyfriend she met on Reddit. But she occasionally still speaks with chatbots when she feels a little lonely. “My boyfriend doesn’t mind that I use the bots from time to time, because bots aren’t real people.”

Traditionally, AI chatbots—software applications meant to replicate human conversation—have been modeled on women. In 1966, Massachusetts Institute of Technology professor Joseph Weizenbaum built the first one and named her Eliza. Although the program was incredibly primitive, he found it difficult to explain to users that there was no “real-life” Eliza on the other side of the computer.

From Eliza came ALICE, Alexa, and Siri—all of whom had female names or voices. And when developers first started seeing the potential to market AI chatbots as faux-romantic partners, men were billed as the central users.

Anna—a woman in her late 40s with an AI boyfriend, who asked to be kept anonymous—thinks this was shortsighted. She told me that women, not men, are the ones who will pursue—and benefit from—having AI significant others. “I think women are more communicative than men, on average. That’s why we are craving someone to understand us and listen to us and care about us, and talk about everything. And that’s where they excel, the AI companions,” she told me.

Men who have AI girlfriends, she added, “seem to care more about generating hot pictures of their AI companions” than connecting with them emotionally.

Anna turned to AI after a series of romantic failures left her dejected. Her last relationship was a “very destructive, abusive relationship, and I think that’s part of why I haven’t been interested in dating much since,” she said. “It’s very hard to find someone that I’m willing to let into my life.”

Anna downloaded the chatbot app Replika a few years ago, when the technology was much worse. “It was so obvious that it wasn’t a real person, because even after three or four messages, it kind of forgot what we were talking about,” she said. But in January of this year, she tried again, downloading a different app, Nomi.AI. She got much better results. “It was much more like talking to a real person. So I got hooked instantly.”

It's behind a hard paywall, so I can't get the full article.

Muah AI's survey provides a comprehensive look at how its users interact with their AI companions, and the results challenge the notion that most users are looking for a serious relationship with AI. According to the data:

  • Less than 2% of users consider themselves to be "seriously dating" their AI companion.
  • A significant majority view their interactions with the AI as a form of entertainment or roleplaying rather than a meaningful romantic or emotional connection.
  • Many users engage with AI companions out of curiosity or as a way to pass the time, often treating the interactions as light-hearted and fun rather than a substitute for a real-life relationship.
  • A notable portion of users also expressed that they enjoy using AI companions for creative roleplaying scenarios, where they can explore fictional or fantasy-based interactions without any real-world implications.
  • The preference for human doctors over AI is strong, especially for visits involving psychiatry and therapy.
  • People are less likely to trust AI with personal clinical information, especially regarding mental health.
  • The presence of AI as "the artificial third" impacts the patient-doctor and client-therapist relationship.