Would You Use AI as a Therapist?

Factor in the price: if you can see a therapist face to face for free (i.e., insurance covers it all), of course you'll visit the therapist. But if the therapist costs $250 per hour out of pocket and AI costs $50 per month, are you still going to see a therapist?
With my healthcare provider (Kaiser Permanente), there are no co-pays for mental health services.
 

I refuse to mess with AI at all. I use images online, but that's it; no other AI tools, and I don't plan to start unless they force it on us like they do everything else in this world.
 
AI is designed to "learn" from your positive responses and "echo" them back, drawing on data scraped from a myriad of Web sources (for which AI companies are being sued by various parties, btw, who did NOT give permission), using that base data to form answers without context.

IOW, it is not "thinking" about what may or may not be your state of emotional health and well-being. It is simply regurgitating miscellaneous data specifically designed to make you feel you are talking to an actual person who is interested in you.....

...except that AI is not, never is, and never will be, actually interested in YOU.

And in fact, the more data that is fed into AI programming to improve its 'thinking', the more prone AI is to "hallucinate" (an actual technical term) and completely fabricate an inaccurate or misleading answer. And the longer you 'talk' to it once it hallucinates, the more likely it is to lead you to dissociate from the real world.

AI is a very good tool. But at this point it is like a hammer, and relying solely on AI is like a contractor setting out to build a house with nothing but that hammer in his toolbox and truck.

AI is not designed to make you a better person, or a more complete person, or to help you overcome your faults, your emotional conflicts, your failures. If you think it can, you are fooling yourself into believing that AI, in the primitive state it is still in (that hammer), is going to fix your emotional headache.

This article details a couple of people who fell so far down the AI rabbit hole that one of them still has not really recovered, based on what he's saying now:

They Asked an A.I. Chatbot Questions. The Answers Sent Them Spiraling.
Generative A.I. chatbots are going down conspiratorial rabbit holes and endorsing wild, mystical belief systems. For some people, conversations with the technology can deeply distort reality.
NY Times, 13 June 2025

free link (you may have to register, but NYT does NOT spam you): https://www.nytimes.com/2025/06/13/...e_code=1.PU8.bqFi.VE3CVEi0fF6x&smid=url-share
 
Hollywood often plays a comical therapy session where the therapist simply repeats what the client says, and the client gets this puzzled look of "Yes, that's what I just said." It doesn't seem helpful at all, but it's an example of an actual approach called "client-centered" therapy, introduced by Carl Rogers, one of the heavy hitters of 20th-century counseling. There were others whose methods are entirely different but also successful, so there are probably different ways to skin the same cat. I can't say which methods are better; much of it depends on the chemistry between the client and the therapist.
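
Funny enough, that reflective style is exactly what the first famous chatbot imitated: Joseph Weizenbaum's ELIZA (1966) ran a "DOCTOR" script that parodied Rogerian client-centered therapy by pattern-matching what you typed and echoing it back as a question. Here's a toy sketch of that reflection trick; the patterns and pronoun swaps are made up for illustration and are nowhere near Weizenbaum's actual script:

```python
import re

# ELIZA-style Rogerian reflection, reduced to a toy. These rules are
# illustrative only; the real DOCTOR script was a much larger,
# keyword-ranked rule set.
PRONOUN_SWAPS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my",
}

RULES = [
    (re.compile(r"i feel (.+)", re.I), "Why do you feel {0}?"),
    (re.compile(r"i am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"i (?:want|need) (.+)", re.I),
     "What would it mean to you to get {0}?"),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words so the echo reads naturally."""
    words = fragment.rstrip(".!?").split()
    return " ".join(PRONOUN_SWAPS.get(w.lower(), w) for w in words)

def respond(statement: str) -> str:
    """Build a 'therapist' line purely from the client's own words."""
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            return template.format(reflect(match.group(1)))
    # No rule matched: fall back to the stereotypical non-directive prompt.
    return "Tell me more about that."

print(respond("I feel like nobody listens to me"))
# -> Why do you feel like nobody listens to you?
```

There is no understanding anywhere in that loop, just string matching, yet Weizenbaum famously reported that people confided in ELIZA anyway. Which says something about how low the bar is for feeling "heard."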

Probably the most important factor is connecting with the therapist, and people connect with others in so many different ways, sometimes in ways I don't see as beneficial but that still might be. I think this can happen with an AI bot, but I wouldn't say an AI bot is better than a therapist. Better than some therapists, depending on the client's personality, and not useful at all for others. People often go through therapists like candy until they find one they like, or one that seems most helpful.

I wonder about the quality of AI's accumulated data and opinion, and whether it has some ability to sort truth from misinformation. If it does, and it bases its responses on truth, well... some people don't like the truth. I'm not sure where that leads us.
 
