The other day someone told me that their partner used ChatGPT instead of going to therapy.
We’re all so cooked.
So many people are going to develop/exacerbate mental illness from doing that.
Turbo cooked
Probing the quicksand with a rod made of quicksand
Sounds safe to me
The point of contention regarding therapy for me is that I’m literally paying for an impersonal conversation in which I express my deepest insecurities to someone who most likely doesn’t give a shit.
I don’t see how AI fixes that but I also don’t understand why it can’t help if your relationship with your therapist is supposed to be a fundamentally clinical one.
"person I pay to pretend to give a shit about my problems" is such a reductive and unhealthy view of therapy that it should be immediately apparent why therapy has not been helpful and why you're unable to see why an Autocorrect word regurgitation machine wouldn't be helpful.
If you have an accountant, is that a person you pay to pretend to give a shit about your taxes? Is an orthopedic surgeon someone you paid to pretend to give a shit about your broken leg? You should be able to recognize why this would be an unhealthy and unhelpful framing device.
10% odds the problem is that you haven't found the right therapist. 90% odds you're building up mental barriers that are actively preventing you from engaging with the therapeutic model in a beneficial way. Acknowledging this and working to overcome these barriers was life-changing for me and has resulted in an astonishing level of change in not only how effective talk therapy has been, but also in how I feel and think about myself particularly in regards to my mental and physical health.
if therapy was rigorous we wouldn't have to suffer through dozens of wrong therapists.
There definitely are crappy providers out there but if you're at the point where you've personally bounced off of multiple dozens of providers, it might be time to start thinking about the "why" of the problem and about your actual needs. Like, what do you want out of therapy? What are your goals? Maybe you need a specific therapy method, or maybe talk therapy straight up cannot meet those needs.
If you have an accountant, is that a person you pay to pretend to give a shit about your taxes? Is an orthopedic surgeon someone you paid to pretend to give a shit about your broken leg? You should be able to recognize why this would be an unhealthy and unhelpful framing device.
I just don’t agree that these are good analogies.
If you have an accountant, is that a person you pay to pretend to give a shit about your taxes? Is an orthopedic surgeon someone you paid to pretend to give a shit about your broken leg?
Yes and yes. I'm hiring them because they perform a service in exchange for money. It's not reasonable to expect them to care on an individual level about my taxes or my broken leg, I just need them to do their job.
You are in fact having someone perform a service in exchange for money. The service is to identify, analyze, and treat a specific, named issue or group of issues which you are facing. But would you walk up to your parent or neighbor and describe either of those professions as someone you paid to pretend to care?
A therapist is supposed to do the same thing: identify, analyze, and treat an issue or issues. So why the fuck would you frame therapy like you're trying to pay someone to pretend to be your friend? Why is it uniquely normalized to describe just this one profession in this way?
What are your outcome measures for therapy, personally?
Primarily subjective and patient reported measures (perceived self-esteem, self-reported frequency or severity of events, ability to cope with external pressures) as the physiological measures associated with most of my issues can only be expected to worsen.
I've had improvements in terms of things like reduction in negative self-talk, reduction in upsetting intrusive thoughts, and pretty drastic reductions in both frequency and severity of ED events.
The problem is that AI absolutely does not provide a clinical relationship. If your input becomes part of the LLM's context (which it has to in order to have a conversation) it will inevitably start mirroring you in ways you might not even notice, something humans commonly (and subconsciously) respond to with trust and connection.
Add to that that they are designed to generally agree with and enable whatever you tell them and you basically have a machine that does everything to reinforce a connection to itself and validate the parts of yourself you have concerns about.
There are already so many stories of people spiralling because they started building rapport with an LLM and it's hard to imagine a setting where that is more likely to occur than when you use one as your therapist
Given the way LLMs function, they will have a hard time with therapy. ChatGPT's context window is 128k tokens. As you chat, your prompts and its replies pile up and start filling the context window; the model also has to look at its own responses for context, which fills up the window further. LLMs do poorly with nearly empty context windows and with nearly full ones. As the window approaches full, the model starts hallucinating and giving degraded responses, and once you blow past the 128k-token mark it can only attend to parts of your conversation.
The ways to mitigate this problem have to be done by the user and they disrupt therapy.
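To make the point above concrete, here's a toy sketch (not the real ChatGPT internals, and the limit here is made tiny on purpose): once a running transcript exceeds the model's context limit, the oldest material is simply no longer visible, so the "therapist" forgets the start of the conversation.

```python
CONTEXT_LIMIT = 8  # pretend the model can only "see" 8 tokens

def visible_context(transcript, limit=CONTEXT_LIMIT):
    """Return only the most recent tokens that fit in the window."""
    tokens = transcript.split()
    return tokens[-limit:]  # everything earlier is silently dropped

transcript = "session one I told you about my childhood and my fear of failure"
print(visible_context(transcript))
# "session one I told you" has already fallen out of the window
```

Real deployments mitigate this with summarization or by starting fresh chats, but as the comment says, those workarounds fall on the user and they disrupt exactly the kind of long-running continuity therapy depends on.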
There are already so many stories of people spiralling because they started building rapport with an LLM and it's hard to imagine a setting where that is more likely to occur than when you use one as your therapist
There are multiple cases where an LLM is alleged to have contributed to someone's suicide, from supporting sentiments of the afterlife being better to giving practical advice.
I find that it's helpful being able to talk to someone that you can't disappoint. Otherwise I will always lie to make them feel better about how I'm doing.
I’ve had therapists with whom that exact scenario has happened, I’ve literally lied to them about how I’m doing.
can’t imagine why trusting the agreeable electrified foolin’ machine created by sociopaths wouldn’t help
I think it also has something to do with how distanced most of us are from creation and maintenance of machines, particularly electronics. if you don't quite understand what a transistor is, or how code works, or how a large language model turns inputs into outputs, then "well there must be a little dude in there somewhere" makes as much sense as anything. Plus people tend to personify inanimate objects to begin with.
then "well there must be a little dude in there somewhere"
One of my earliest memories is my mom showing me how a cash register works and telling me there was a little gremlin inside the machine powering it.
This your mom? ->
Mommy!
It's true though, they eat the coins. Times have been tough for cash register gremlins since everyone started using cards to pay for everything.
6 year old me has verified this info as TRUE
I don’t really know how to explain it to people who don’t understand that ChatGPT is the same thing as your phone guessing your next word, but on turbo mode. If they still don’t get it after that then I don’t know what else to say. Just fucking get it man!!
I try to tell them that the machine is just calculating what the next word is to follow the previous word. It doesnt understand the context of what it's saying, only that these words fit together right.
I feel really grateful that I was exposed to Markov chain bots on irc back in the day as it was a powerful inoculant.
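For anyone who never met those IRC bots, here's a minimal sketch of the kind of Markov-chain babbler they were: it counts which word follows which, then repeatedly emits the most common successor. An LLM is unfathomably bigger and more capable, but the core move the comments above describe (predict the next token from what came before) is the same.

```python
from collections import defaultdict, Counter

def train(text):
    """Count word -> next-word transitions (a first-order Markov chain)."""
    chain = defaultdict(Counter)
    words = text.split()
    for a, b in zip(words, words[1:]):
        chain[a][b] += 1
    return chain

def babble(chain, start, length=5):
    """Greedily emit the most common successor at each step."""
    out = [start]
    for _ in range(length):
        successors = chain.get(out[-1])
        if not successors:
            break  # dead end: no word ever followed this one
        out.append(successors.most_common(1)[0][0])
    return " ".join(out)

corpus = "the bot says the bot knows the answer"
print(babble(train(corpus), "the"))  # prints "the bot says the bot says"
```

Note the output is grammatical-ish nonsense that "fits together right" without the bot understanding anything, which is exactly the inoculation the comment is talking about.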
Also add in the fact that public school education is abysmal and burger brains think a computer that can do some basic common sense shit is godlike.
The appeal of LLMs seems uncomfortably similar to how Thomas Jefferson enthusiastically employed dumb waiters to limit interaction with enslaved people.
I think Americans are primed to feel very enthusiastic about anything that tells them things they already believe, even moreso if it sounds very confident and organized. And people are already prone to woowoo stuff.
I don't think you'd break the spell if you informed everyone it's just lines of code that doesn't think. A statistically significant number of Americans claim to talk with spirits and angels, or have the gift of prophecy. On the flip side the techbro types believe in garbage like the future basilisk AI that tortures everyone forever. These same people are already deranged, all the LLM does is organize their mashed potato brains into sentences that can be read. And just about every major LLM seems primed to have a very servile, docile writing style so it's trivial to get them to say whatever you want so long as you keep saying the same thing. They're not well designed for confrontation.
I firmly believe one of the best ways to deal with reactionaries, woowoo types peddling scams, or conspiracy theorists is to simply tell them they're a fucking idiot. "That sounds fucking stupid. Shut up, nerd, and never speak to me about this again." That's how you do it, that's how you dispell stuff. Social embarrassment and confrontation.
An LLM won't do that, it'll rub a nice mental salve on your already smooth brain. It's an actual echo chamber.
On the flip side the techbro types believe in garbage like the future basilisk AI that tortures everyone forever
makes a bit of noise but i doubt all that many people are true believers compared to the population who has ever touched computer for a living
Eh, back in the day we were getting advice from people who stood over cracks in the ground and got high off ethylene or tossing knucklebones (as one of the less gross options) to see the future. Humans have always been susceptible to magical thinking and a lot of us can remember when personal computing was barely functional, so it's not surprising that ChatGPT seems like a quantum leap to some people.
The fact that someone has written a program that's capable of convincing people that it's god still has terrifying implications and I for one am not excited about the prospect of a wave of computer-inspired stochastic terrorism, but I don't think this is a sign that contemporary people are uniquely dumb.
back in the day
cracks in the ground
Oracles? Back in your day you had oracles? Damn, Hexbear is so diverse that we have immortal leftists shitposting on here.
It's called the immortal science, if you aren't working to transcend your flesh prison, you need to elevate your game.
I described it to a friend once as an "ass-kissing machine" and it completely changed her view of it, recognising that that is exactly what it does, it just says what you want to hear.
A lot of people feel like they never have any control or any sense of recognition of their "hard work" so an ass kissing bot is perfect to stroke the ego of someone who desperately wants someone to tell them that their ideas are good and clever, and to take "interest" in what they say.
I was talking to my father in law today and he's worried that it's going to skynet us or like, make us dependent on it and then quit helping or something.
I tried explaining that it's not a robot, it's not an AI, it's not C-3PO. It's a very elaborate autocomplete. It can't even do kindergarten level arithmetic consistently, which a basic calculator can accomplish, and it's because it's just autocomplete.
"Yeah, but what if it starts teaching itself."
I love the man but God damn people think this thing is so much more than it is.
Plugging my brain into the machine that turns you into Mathematical Average Internet Meemaw
Not sure I'd generalize like that. IMO you'll find very obvious correlations between the people who tend to use AI regularly: living Dunning-Kruger types who always believe they have great "talent" or "the genius idea" but just need the magic tool to make it work; those who have been "forced" to use it at work, e.g. some programmers; and finally those who, as you say, just make excuses for it and may not even use it, but nevertheless consume the slop consciously and happily (e.g. r/chatgpt users).
I'd guess a significant majority of the average population, who live outside these bubbles, are far less favorable towards AI.
Someone on reddit had an intriguing take for once. The people who are like "ChatGPT revolutionized my work" are people who are just really bad at stuff. I read that comment the other day. And then now if you go to reddit and look at the AI subs, you get stuff like this:
https://www.reddit.com/r/OpenAI/comments/1lpte80/chatgpt_is_a_revelation_for_me_in_my_work/
I know the arguments done to death surrounding AI and being a risk to jobs etc. but I work in a very niche area of law and there's a lot of complex pieces of case law and legislation that deal with it and, frankly, my memory is terrible with retention of this info. I also struggle sometimes with interpreting judgments, specifically when Judgments are written in very complex "legalese" which I've always hated.
It's a very tempting thesis considering it predicts observation. But I think I would temper it a little bit to avoid ableism or getting too far into technocratic thinking.
I also struggle sometimes with interpreting judgments, specifically when Judgments are written in very complex “legalese” which I’ve always hated.
this person graduated law school
I wouldn't say that it is a useful tool for anyone struggling with mental or learning disabilities though, it doesn't help them get better, it does the work for them. Someone in a wheelchair doesn't want someone to just pick them up and carry them everywhere, they want ramps so they can go places without being wholly reliant on others. LLMs just outsource your thinking to an algorithm.
It's just commodity fetishism taken to a higher level.
I'm sorta the opposite: social relationships are so shallow and superficial in late stage capitalism that most social relationships can be replaced with a chatbot. If your social relationships can be summed up as water cooler conversations with coworkers and catching up with your drinking buddies, you might as well "socialize" with a chatbot instead. Say what you will about a chatbot, but at least a chatbot won't stab you in the back like the case of socializing with coworkers or turn you into a functioning alcoholic like the case of socializing at a bar.
If your social relationships are limited to having pointless conversations about the weather or traffic or your favorite sports team, then what is the point of the social relationship in the first place?
reread my comment
Okay, that came out a lot more unhinged than how it sounded in my head lmao
Was going to post this as its own comment but I think I see what you mean.
I was out with some friends recently and most of the discussion was typical small talk, catching up type stuff. Me and one friend got into a discussion about music. We were talking about how listening to music by yourself is a genuinely new thing, and that throughout history music has been a communal/social experience. Like when people used to be forced to work 12+ hour days, six or seven days a week, and then Sunday they'd get together and play music together, and how it was likely the only good thing going for them.
Then someone else jumps in and says "wow this is so deep" and just completely fucking killed the whole vibe, conversation went back to shallow small talk. It was as if the conversation being deeper than "what have you been up to?" actually bothered this person. And it was barely even a "deep" thing to talk about!
I think you're absolutely right that late stage capitalism has utterly destroyed our ability to connect. It's like everyone is afraid to say anything beyond the superficial for some reason
It was as if the conversation being deeper than "what have you been up to?" actually bothered this person. And it was barely even a "deep" thing to talk about!
There are many people who are uncomfortable with conversations that involve some form of personal investment