By Melissa Fleur Afshar

PEOPLE ARE USING CHATGPT FOR THERAPY—BUT IS IT A GOOD IDEA?

Newsweek Exclusive Feature


As more people turn to ChatGPT for their mental health, professionals warn that they must tread with caution.


With fast-moving advances in artificial intelligence (AI) confronting most of us daily, even the most technology-shy have come to accept that AI now touches nearly every aspect of our lives. But while AI may be useful for retrieving data and making predictions, turning to it for the intimate and challenging endeavor that is therapy is one use you likely wouldn't have seen coming.


Yet increasing numbers of people are sharing how they use ChatGPT and other AI chatbots for "makeshift therapy"—a new practice that has left experts questioning how safe it is.


ChatGPT has 200 million monthly active users worldwide, including 77.2 million in the U.S. alone. Shannon McNamara, a podcaster and content creator known online as @fluentlyforward, is one of them, and she often uses the OpenAI tool therapeutically. McNamara has leveraged the power of social media to build a successful podcast. Still, like most people, she has bad days too, and she has often found herself turning to the AI bot in times of need.


"I use ChatGPT when I keep ruminating on a problem and can't seem to find a solution, or even just to understand my own feelings," McNamara told Newsweek. "I'm shocked by just how incredibly helpful it is.


"Myself and all of my friends have found using ChatGPT in this way, for makeshift therapy, to be really, really helpful."


While McNamara calls the responses ChatGPT gives her "long," she says they usually cover a variety of solutions and have made a significant impact on her life and health. And while she acknowledged the privacy concerns that come with sharing every little detail with a bot, she felt the benefits currently outweigh the risks.


"Who knows, maybe in five years when the robots take over I'll regret being so raw to ChatGPT!" she added.


McNamara demonstrated how she uses ChatGPT in a TikTok video from July 24, showing viewers how she interacts with the chatbot much as she would with a journal or a therapist.


The post, captioned "how I use ChatGPT for makeshift therapy or a way to understand my feelings," has gained substantial traction online and has prompted a larger conversation among viewers about the merits and pitfalls of using AI for one's mental health.


Several Gen Z creators have also shared their experiences with doubling their ChatGPTs up as therapists. One, @ashdonner, posted a lighthearted clip to TikTok in July detailing how she uses the AI tool for support when she needs it.


Can AI Curb the Mental Health Crisis?


The U.S. is currently grappling with a mental health crisis, marked by a significant rise in stress, anxiety and depression.


The Anxiety and Depression Association of America (ADAA) reported that Generalized Anxiety Disorder (GAD) affects 6.8 million adults, or 3.1 percent of the U.S. population, with major depression often co-occurring.


Data from the American Psychological Association's 2023 Stress in America survey revealed that many Americans, particularly those aged 35 to 44, cite money and the economy as major stressors.


This surge in mental health issues underscores the need for comprehensive care and increased accessibility, but the high cost of traditional therapy compared with the increasing accessibility of AI tools is driving more people to turn to platforms like ChatGPT for emotional support.


Last year, the average cost of a therapy session in the U.S. ranged from roughly $100 to $200, putting it out of reach for many, especially young adults and teenagers. AI tools, by contrast, are often free or low-cost and available 24/7, making them an attractive alternative. Despite this appeal, mental health professionals have voiced concerns about the burgeoning trend.


"Using artificial intelligence as a substitute for therapy is not comparable to real therapy," Rachel Goldberg, psychotherapist and founder of Rachel Goldberg Therapy in Studio City, California, told Newsweek. "While AI can prompt curiosity and offer new perspectives, especially for someone struggling alone and in need of a quick way to release emotions and cope, it has significant limitations.


"One of the most crucial aspects of successful therapy is the connection between therapist and client. Research shows that this human connection is the foundation for why therapy works in helping someone to grow."


Goldberg cautioned that connections are essential for clients to feel safe enough to explore themselves and achieve personal growth through their therapy sessions.


"This type of vulnerability and growth cannot be replicated by AI, as it lacks the ability to form genuine human connections," she added.


While AI may be helpful in providing quick access to coping strategies or prompting self-reflection, it can only go so far; without a human connection, most people would likely lose interest in continuing to engage with it.


Comparing AI to platforms like BetterHelp, or to the newer phenomenon of "griefbots," which have faced scrutiny for providing inconsistent care, Goldberg noted that the impact really depends on the client and the type of care they need.


"Inconsistent care from a therapist can be harmful, potentially leading to feelings of rejection or mistrust," Goldberg said. "In contrast, while AI hugely lacks in personal touch, it doesn't carry the additional risk of emotional harm from inconsistent human care.


"The difference between AI and real therapy is comparing apples to oranges. It would be more fair to compare a therapy workbook to AI because, at present, AI cannot match the ability to empathize and validate a person in a meaningful way."




COVER IMAGE CREDIT: NEWSWEEK



