The fact that this is considered a viable option, because we live in a country with a government that refuses to actually provide for its people, is painfully depressing. AI as your therapist… seriously, what the fuck is this timeline? I work in tech, and the people constantly blowing AI hot air are not the folks you want in charge of the tools for your therapy and wellbeing.
This assumes all human therapists are ethical and never make mistakes, and that all of their offices, notes, and data systems are secure too. All security is porous.
No, it doesn't.
You distrust AI therapists.
You distrust bad therapists.
You do trust good therapists.
See? Works just fine.
That assumes you can tell the difference, and that the best people and processes are flawless, which is not true by a wide margin.
> are flawless
I cannot help you. You are having a conversation in your head that no one else here is a part of. You gotta come back down to Earth, man.
I mean... Yeah. That's why they went to a therapist...
No one can afford $2000 bread. Well, maybe during the Great Depression.
By AI head specialist.
Depending on who you are, an AI chat might just be a less tedious journal, which can obviously be better than not journaling at all. I still find it sorta weird too, but the ridicule is unfounded imo.
From a privacy perspective it's likely terrible/terrifying, but given that most people are already transparent online anyway, they're at least getting some real-world value in exchange for their increased transparency.
I imagine at least some of that ridicule stems from this being kind of the exact wrong answer to the big, societal "why is everyone so lonely now?" question.
It's a bit like watching a pack-a-day smoker buy lozenges for their throat or something, as if you're not supposed to think about the cancer.
Not sure..
1. An AI therapist can already easily handle general good mental advice, such as reducing cognitive load, perspective shifts, alternative methodologies, education about standard mental needs and processes, and whatever other low-level stuff we can benefit from.
2. Hooman therapists are a coin-toss. Most are completely crap and build their business on archaic and/or wrong theories and personal ideology/feelings.
3. Whatever flaws AI has now are going away really, really fast.
Hooman therapists cost a lot of money, and a shitload of people won't get any help at all without AI.
So, I think it is fine. The potential damage is far less than no help at all. Just use a little common sense and don't take anything as gospel, just as when we see hooman therapists.
I think this is true, and until we have easily accessible and free mental health services, it is the next best option and far more likely to do good than harm.
Self-hosted AI is 1000% more confidential than a human therapist; but stuff like ChatGPT? Yeah, stay away from those.
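To make the self-hosting point concrete: here's a minimal sketch of chatting with a locally hosted model, assuming an Ollama server on its default port (`localhost:11434`) and an example model name (`llama3`) — both are assumptions about your setup, not requirements. The confidentiality claim rests on the fact that every request below goes to your own machine, so the conversation never touches a third-party server.

```python
import json
import urllib.request

# Assumed local endpoint: Ollama's chat API on its default port.
# Nothing here leaves your machine -- that's the whole privacy argument.
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_request(history, user_message, model="llama3"):
    """Append the user's turn and build the JSON payload Ollama's /api/chat expects."""
    messages = history + [{"role": "user", "content": user_message}]
    payload = {"model": model, "messages": messages, "stream": False}
    return messages, json.dumps(payload).encode("utf-8")

def chat_once(history, user_message):
    """Send one turn to the local model; return history extended with its reply."""
    messages, body = build_request(history, user_message)
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read())["message"]  # {"role": "assistant", ...}
    return messages + [reply]

if __name__ == "__main__":
    # Requires a running Ollama instance with the model pulled.
    history = chat_once([], "I've been feeling overwhelmed lately.")
    print(history[-1]["content"])
```

Whether a local model is any good as a "therapist" is a separate question entirely; this only shows that the data-custody part of the trade-off is real.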
It might be 1000% more confidential, but is it effective? Anecdotal evidence doesn't count. For all we know, AI therapy could be actively harmful for certain conditions. I'm not sure there are any published studies on this.
There was actually one published a few days ago that concluded that it can be effective:
> Participants with depression experienced a 51% reduction in symptoms, the best result in the study. Those with anxiety experienced a 31% reduction, and those at risk for eating disorders saw a 19% reduction in concerns about body image and weight.

However, the person who ran the study shares your concerns:

> I asked Heinz if he thinks the results validate the burgeoning industry of AI therapy sites. "Quite the opposite," he says, cautioning that most don't appear to train their models on evidence-based practices like cognitive behavioral therapy, and they likely don't employ a team of trained researchers to monitor interactions. "I have a lot of concerns about the industry and how fast we're moving without really kind of evaluating this," he adds.
They also did another article about the difficulties and pitfalls of building these things.
even that is questionable professionally
Thank you very much-o, Doctor Roboto
Shrink-ROM from Terry Gilliam's The Zero Theorem.