Would You Use AI as a Therapist?

When this was being discussed on a podcast, my first reaction was absolutely not.

After listening to some of the pros and cons: maybe, if I wanted a therapist.

One of them (a man) was in favour of using a live therapist, or at least one over Zoom; he has a therapist himself.

The woman was open to using the pro version of ChatGPT or something similar. As with asking for any personal advice, though, you'd have to remove any details that can be traced back to you (a rough sketch of what that scrubbing might look like follows at the end of this post).

Some of the reasoning in favour was the scarcity of therapists, the cost ($180+/hr), and access at any hour of the day. A.I. has scraped info from therapists all over the world, whereas your local therapist may be limited in resources.
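A rough sketch, in Python, of the kind of scrubbing that might mean before pasting anything personal into a chatbot. The patterns and placeholder labels here are just my illustration; a real anonymizer would need to cover far more than this.

import re

# Illustrative patterns only; real anonymization needs far more care.
REDACTIONS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[A-Za-z]{2,}"), "[EMAIL]"),
    (re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"), "[PHONE]"),
]

def scrub(text: str) -> str:
    """Replace obviously identifying details with generic placeholders."""
    for pattern, placeholder in REDACTIONS:
        text = pattern.sub(placeholder, text)
    return text

print(scrub("Reach me at 415-555-1234 or jules@example.com."))
# -> Reach me at [PHONE] or [EMAIL].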
 

I don't think it would hurt, and it might even be effective; I just don't know. I'd like to hear from people who have tried it.
 
I have used an AI to have a serious discussion about complicated personal stuff. A couple of times it worked pretty well: I felt better afterwards, and I learned something. Other times we didn't get too far. :)
 

I think it would be a good place to start; it could help provoke thought and offer the conventional wisdom. But I would balk at paying anything more than a nominal subscription fee of some sort.

For $180.00/hour I could probably learn to deal with almost any of the problems that I’m likely to experience in my life.

It’s sad that we live in a world where we pay professionals to do what friends used to do for free.

 
With AI therapy, I think it would be possible for the patient to control the direction the session goes. AI seems mostly to respond to user input rather than offer original, unprompted insight.

With that opinion in mind, I think it would be entirely possible for the patient to hear exactly what they wanted to hear from the AI. I also think it's possible that, along the way, the AI might offer some course of resolution or further resources ... in the direction the patient has proposed.

But not having actually tried it, I could be totally wrong of course.
 
Okay, I just asked ChatGPT ... Would you use AI as a therapist?

Response:

Using AI as a therapist is a complex and very personal decision, and whether someone should or would use it depends on their needs, expectations, and circumstances. Here are the pros and cons to help you think through it:




✅ Why Someone Might Use AI as a Therapist


  1. Accessibility:
    AI is available 24/7, often at a lower cost than a human therapist. This can be crucial for people who can't afford regular therapy or live in areas with few mental health resources.
  2. Anonymity & Reduced Stigma:
    Talking to AI can feel less intimidating for some, especially when discussing sensitive or stigmatized issues.
  3. Immediate Support:
    For people in crisis or emotional distress, AI tools can offer instant, if limited, support — like grounding exercises, mood tracking, or cognitive-behavioral techniques.
  4. Data Tracking & Reminders:
    AI can help track mood patterns, therapy progress, and remind users to practice self-care routines or mindfulness exercises.



⚠️ Limitations & Risks


  1. Lack of Deep Empathy & Understanding:
    AI can simulate empathy with words, but it doesn’t truly understand human emotion. That nuance matters a lot in therapy.
  2. Not Suited for Severe Mental Health Conditions:
    For conditions like PTSD, major depression, schizophrenia, or suicidal ideation, human professionals are necessary. AI should never replace emergency mental health care.
  3. Data Privacy Concerns:
    AI tools collect data — sometimes deeply personal. It's important to check who owns that data and how it's used.
  4. Overreliance:
    Some users might begin to lean on AI too heavily, avoiding human relationships or professional help they truly need.



🤖 Examples of AI Therapy Tools


  • Woebot: A chatbot based on CBT (Cognitive Behavioral Therapy) principles.
  • Wysa: AI-driven mental health support with the option to connect to a human therapist.
  • Replika: A conversational AI that some use for emotional support, though it's not therapy in the clinical sense.



🧠 Final Thought


AI can be a helpful tool in mental health — especially for basic support, journaling, or CBT-style guidance — but it’s not a full substitute for human therapy. If you’re dealing with deep emotional pain, trauma, or mental illness, a licensed therapist (or a combination of therapy + tech) is the safest and most effective route.




Would you like help evaluating a specific app or comparing AI therapy tools?


/ END OF ChatGPT RESPONSE /
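For anyone curious, the same question can also be put to ChatGPT programmatically rather than through the chat window. A minimal sketch using the official openai Python package, assuming an API key is set up; the model name is a placeholder of mine, so check whatever is current.

# Asks the model the thread's question via the OpenAI API.
# Assumes `pip install openai` and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads the API key from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name; substitute whatever is current
    messages=[{"role": "user", "content": "Would you use AI as a therapist?"}],
)

print(response.choices[0].message.content)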
 
A therapist, in my experience, is a clear thinking person who can see where your thinking might be off and is skilled at asking questions designed to steer your thinking in the right direction. Often the therapist fishes to find areas of your life that you avoid talking about. It is then the therapist's task to gently guide you into discussing problem areas in an attempt to help you to a better self awareness.
Just the sort of thing you'd expect from an advanced computerized search engine.
 
I would use AI doctors any day if they were cheap and readily available. Healthcare in the U.S. is prohibitively expensive and complicated. I don't care that some people say U.S. healthcare is better quality; such good-quality healthcare is no use to me if it isn't affordable or available.
 
@Naturally, that seemed like a balanced response that ChatGPT gave you.

If someone doesn't have a friend to talk to, or wants to talk in the middle of the night, AI might be an alternative. Before doing that, deciding which platform is safest to use would be wise.
 
Uh, if I were motivated to seek therapy (counseling), I would actually prefer a face-to-face appointment with a human of my choosing. That said, judging by the current performance of Google Search's AI Overview, I think AI could be a valuable tool for gathering relevant content on mental health solutions.
 
Factor in the price, though: if you get to see a therapist face-to-face for free (i.e., insurance covers it all), of course you'll visit the therapist. But if the therapist costs $250 per hour out of your pocket and AI costs $50 per month, are you still going to see the therapist?
 
Just a huge, common-sense NO! It makes too many mistakes on simple things; I wouldn't use the Internet for anything this personal and important.
 
I have a therapist I meet with over video and that works well for me. Certainly, my attendance in therapy is much better now that I don't have to drive there.

But I wouldn't be interested in AI therapy. Taking the human component out of it strikes me as potentially dangerous: the conversation could become circular, and worse, bad or even harmful advice seems almost inevitable.
 

