Can AI Provide Therapy?

There are now more than 20,000 apps available to help people manage their mental health. If that weren't shocking enough, many of them use AI chatbots to deliver a therapy-like experience, and a growing number of people say they prefer it.

Apps like Wysa and Woebot use AI to hold complex text conversations with users, including walking them through treatment options. A little too futuristic for you? You're not alone. Yet these treatment strategies are the same ones that real therapists are trained to use and deploy. There have also been reports of people relying on ChatGPT for therapy, which is not what it was designed for.

Still, many say they feel less judged communicating with AI, and some of the stigma seems to disappear when they're simply texting on their phones. Is this the future of difficult conversations and the need for privacy? I asked several people in the business.

One consensus was that people are drawn to the convenience AI can offer them. It can be hard to make that call, find the right person, and work within traditional business hours. Some people live in remote areas where healthcare providers of all kinds are scarce. Cost is another issue: many people have no insurance, and others have insurance with limited coverage.

According to Minette Bennett, LPC, the world of AI therapy can be a double-edged sword. “Generally speaking, AI is likely to pick up on word patterns that point to disorders, trauma, and unhealthy thinking patterns faster than humans do. They may be better at pointing or easing clients in the directions necessary to bring clients to helpful epiphanies sooner.”

Bennett and others also raised concerns about the true anonymity of this type of therapy. Who is behind the AI? Some would-be users of AI therapy are hesitant because of questions about how the information shared in AI sessions may ultimately be collected, stored, and used in the future. Bennett says, "While confidentiality can be a concern with a human counselor, there's also no guarantee that online AI can't be hacked, and the information you share can't be used against you at some point."

One in five people live with depression or anxiety, according to multiple sources, yet there are many barriers to getting therapy, let alone quality help. For those who struggle with mental health issues and managing daily life, navigating the system, insurance plans, provider waiting lists, and other red tape makes getting help an even more formidable challenge. Sometimes family members are not supportive of therapy for their loved ones. Hard to believe, I know, but many still fear what a therapist might accidentally trigger, such as someone wanting to leave a marriage or even a religion.

With her years of experience, Bennett emphasizes the importance of human connection, saying, “We heal in relationship! There is no substitute for connection with a trustworthy, loving person who’s cheering you on through thick and thin, and looks forward to seeing you each week.” She goes on, “I can’t imagine calling a robot if I’m suicidal. I wouldn’t be needing logic, I’d need compassion and love along with direction.”

Counselors and other mental health professionals deal with very serious, even life-and-death, decisions. Having instant access to help can make all the difference in some situations; just look at the popularity of suicide hotlines. When someone isn't in acute crisis, though, the type of help they seek may also depend on their personality or their comfort with automated systems.

Everyone is getting more comfortable with technology nowadays, it's true. People seem more detached from one another, and many of us are used to texting more than talking. But Bennett also brings up a good point about our need for care and attachment. "I recall an experiment in the 1960s with baby monkeys: Harlow's Monkey Experiment and Attachment Theory. Basically, the baby monkey preferred the soft-wrapped fake monkey to the one that was simply a wire frame but had food. I think the human counselor wins!"

So, is AI the answer?

Let us know your thoughts in the comments.
