Americans are increasingly turning to artificial intelligence (AI) chatbots like ChatGPT for emotional support, companionship and even self-guided “therapy.” With OpenAI reporting more than 700 million weekly users and over 10 million paying subscribers, digital conversations with machines have become a mainstream part of daily life. For many, these tools fill a gap left by an overburdened and expensive mental health system, but experts caution that unregulated AI use could do more harm than good.
The New Digital Shoulder to Lean On
In 2025, the NPR feature “AI Chatbots Are Becoming Mental Health Confidants — But Experts Are Concerned” documented the rising number of Americans using generative AI for mental health conversations. Users described these tools as judgment-free, instantly available and emotionally responsive — qualities many say are missing from traditional care. Some use AI to vent late at night; others to reframe anxious thoughts or journal through grief.
Cost and access are key drivers. According to the 2023 National Survey on Drug Use and Health (NSDUH), about 54% of U.S. adults with any mental illness received mental health treatment in the past year, meaning nearly half did not. Long waitlists, provider shortages and rising costs leave many without options. Independent analyses, such as those from SimplePractice, suggest that therapy in the United States typically costs between $100 and $250 per session, depending on insurance, provider type and location.
“The affordability and accessibility crisis in mental health care is driving people to look elsewhere,” says Dr. Kevin Wandler, Chief Medical Adviser for The Recovery Village. “AI chatbots may feel supportive, but they are not a substitute for evidence-based clinical treatment.”
Promise and Peril: AI’s Expanding Role in Mental Health
AI tools can mimic empathetic responses and provide structure for journaling or cognitive reframing, offering temporary relief for mild stress or loneliness. However, these systems are not clinicians, and their “empathy” is algorithmically generated through predictive text patterns, not genuine understanding.
Ethicists also raise concerns about false intimacy: users forming emotional attachments to AI companions that simulate empathy without accountability. Without clear disclaimers or guardrails, vulnerable users, especially teens, may overestimate what these systems can safely provide.
“AI can approximate warmth and understanding, but it can’t replace the nuance and clinical judgment of human care,” says Dr. Brian D. Barash, Chief Medical Officer for The Recovery Village.
Regulation and Oversight Lag Behind
While the U.S. Food and Drug Administration (FDA) regulates digital health tools that diagnose or treat medical conditions, most chatbots skirt oversight by avoiding explicit therapeutic claims. This regulatory gap means AI mental health companions can operate without the safety testing required of regulated digital therapeutics.
Privacy is another gray area. Chatbots often collect and store sensitive user data, potentially including disclosures about trauma, substance use or suicidal thoughts. Experts are calling for new federal guidelines to address AI safety, data transparency and emergency response protocols. Until then, public education remains the strongest line of defense.
“People need to know these tools are not therapy; they’re simulations,” says Wandler. “If you’re in crisis, AI cannot and should not be your lifeline.”
The Traditional Therapy Market Faces New Pressure
The rise of AI companions is already influencing the therapy marketplace. Some clinicians have begun integrating generative tools into practice for journaling, thought analysis or symptom tracking, while others worry that reliance on chatbots could erode trust in professional care.
The Recovery Village’s nationwide behavioral health network continues to expand physician-led, evidence-based programs that address both substance use and primary mental health needs — a response to what clinicians describe as a “fragmented, inaccessible” care landscape. Programs like The Recovery Village South Atlanta Primary Mental Health Unit were created to help individuals struggling with anxiety, depression or trauma receive affordable, structured therapy under medical supervision, bridging the gap that drives many toward unregulated alternatives.
Where AI Fits In: Supplement, Not Substitute
When used responsibly, AI tools can serve as adjuncts to therapy — helping clients track mood, rehearse coping skills or express feelings between sessions. But for those in crisis, the stakes are too high. Generative models lack the ethical duty of care that licensed therapists are bound to uphold.
For anyone experiencing suicidal thoughts or severe emotional distress, experts urge contacting a human professional immediately or calling the 988 Suicide & Crisis Lifeline.
“AI chatbots may offer temporary comfort,” Wandler says, “but recovery requires connection, accountability and empathy — things only human care can truly deliver.”
Takeaway: Technology Can Help, but It Can’t Heal Alone
AI’s growing role in mental health reflects both progress and a profound gap in America’s care system. As technology continues to evolve, its potential to support well-being is undeniable, but so is the need for oversight, safety and continued investment in human-led care.
At The Recovery Village, experts emphasize that accessible, evidence-based treatment, not unregulated technology, remains the safest, most effective path toward healing.
Interview an Expert
Do you need a subject matter expert to interview on this topic? Dr. Brian D. Barash, Chief Medical Officer at The Recovery Village, is available. Call us at 407-304-9824 to schedule an interview or get more information.