
Artificial intelligence is making its debut in mental health treatment

While some Florida professionals encourage the technology’s use, others remain wary

Developments in AI technology have led to its implementation in psychiatric and psychological treatment.

*Editor’s note: This story contains mention of self-harm.

Therapists leave their offices at the end of the work day, but mental health struggles don’t clock out at 5 p.m. 

What, then, should a person do if they need counseling after hours? Florida health professionals are adopting artificial intelligence in their practices to make care more accessible to clients. With this framework, clients can open their laptops and consult with an AI-powered chatbot at any time.

New tools

Companies like Happi AI, started in 2020 by California neuroscientist James Doty, and Abby, powered by OpenAI, have launched their own talk therapy tools designed to provide tailored support to patients.

Abby runs on a message-based model where users text with the AI system. Patients are able to select among various therapeutic styles, such as professional therapist, problem-solver and empathetic friend, depending on their needs. 

Happi, meanwhile, places users on a simulated video call with an AI-based avatar of the app’s creator, Doty. The avatar asks users questions, listens to their responses and offers analysis and advice. Unlike Abby, however, Happi requires its users to pay a monthly subscription fee after the first 20 minutes of therapy, with different tiered options.

Other services offer assistance tools for mental health experts, which record therapy sessions and generate notes based on conversations between patients and their therapists.

Blueprint, for instance, advertises on its website that it “listens, transcribes and writes progress notes and treatment plans in 30 seconds or less.” 
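The basic shape of such a pipeline is simple to sketch, even though the products themselves are proprietary. The Python sketch below is a hypothetical illustration, not Blueprint’s actual code; the transcription and summarization functions are placeholders standing in for whatever speech-to-text and language-model services a real vendor would call.

from dataclasses import dataclass


@dataclass
class ProgressNote:
    summary: str
    treatment_plan: str


def transcribe_audio(audio_path: str) -> str:
    """Placeholder: a real system would call a speech-to-text service here."""
    return "Patient reports improved sleep but ongoing work stress."


def summarize_transcript(transcript: str) -> ProgressNote:
    """Placeholder: a real system would prompt a language model here."""
    return ProgressNote(
        summary="Session focus: " + transcript,
        treatment_plan="Continue weekly sessions; review stress-management exercises.",
    )


def generate_note(audio_path: str) -> ProgressNote:
    # Record -> transcribe -> draft a note for the clinician to review and edit.
    transcript = transcribe_audio(audio_path)
    return summarize_transcript(transcript)


if __name__ == "__main__":
    note = generate_note("session_recording.wav")
    print(note.summary)
    print(note.treatment_plan)

In a sketch like this, the draft note would still go to the clinician for review rather than straight into the record, a point that matters for the concerns raised later in this story.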

Though these services do exist, they remain part of a young industry — one with many unanswered questions around the effectiveness, safety and capacity of these AI tools.

Johnathan Mell, a University of Central Florida computer science assistant professor and human-computer interaction expert, worked on designing an AI therapist prototype in the 2010s. Since then, he said, the field has changed dramatically, and today’s systems are more dangerous than the one he worked on.

“We had a lot of dialogue, but they were all pre-vetted and scripted,” he said. “You’re dealing with a population that’s vulnerable, and you want to make sure you don’t get some of the errors that you’re seeing in today’s AI systems.”


Issues in the system

A key difference between AI systems in use today and those of the past is the development of large language models, or LLMs. LLMs are AI systems that learn to interpret and generate human language from vast amounts of text, allowing for much greater flexibility than pre-scripted AI models.

However, the greater flexibility afforded by LLMs can also lead to problems like AI “hallucinations,” in which the system invents false information, as well as what is referred to as “model toxicity,” when an AI feeds users harmful content, such as coercing someone into self-harm.

Last year, a 14-year-old boy’s death in Tallahassee set off alarms over the AI chatbot that encouraged him to end his life.

This behavior from chatbots is not a case of technological malfunction, Mell said, but rather the AI putting into practice the content and data it has consumed.

He cast doubt on the feasibility of curbing “model toxicity” in LLMs by blocking chatbots from repeating certain phrases or words. It’s not possible to pinpoint where an LLM learned or stores a given piece of data, Mell added.
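One way to picture the kind of guardrail Mell is skeptical of is a filter that scans a chatbot’s reply for banned phrases before it reaches the user. The Python sketch below is purely illustrative and not drawn from any real product; the blocked phrases are assumptions made for the example, and the point is that the filter only catches wording its developers thought to list.

# Hypothetical sketch of the phrase-blocklist guardrail Mell doubts can work.
# It only blocks replies that contain exact phrases the developers anticipated,
# which is why rephrased harmful content can slip through.

BLOCKED_PHRASES = [
    "hurt yourself",
    "end your life",
]

SAFE_FALLBACK = (
    "I can't help with that. Please reach out to a crisis line or someone you trust."
)


def filter_reply(reply: str) -> str:
    """Return the reply unchanged unless it contains a listed phrase."""
    lowered = reply.lower()
    if any(phrase in lowered for phrase in BLOCKED_PHRASES):
        return SAFE_FALLBACK
    return reply


if __name__ == "__main__":
    print(filter_reply("Talking to someone you trust about how you feel can help."))
    # A harmful reply worded differently from anything in BLOCKED_PHRASES would
    # pass through untouched, which is the limitation Mell describes.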

Aiding the workforce

According to the American Psychological Association, Florida has only 15 to 20 therapists for every 100,000 residents. The Florida Department of Health, meanwhile, reports that 12.3% of adults say they have poor mental health, a figure that may be an undercount due to sample size limitations or respondents’ tendency to downplay mental health issues.

Given that shortage, Mell said, at least some reliance on automation in health fields like mental health counseling is necessary to keep up with demand.

Limited AI models, whether those that require human supervision to approve their actions or scripted systems like the ones Mell worked on in the 2010s, are safer and less prone to putting users in dangerous situations, he added.

Mell suggested AI therapists would seem more welcoming if their dialogue felt more human. People can connect with video game characters, he said, despite those characters’ limited capabilities.

“Why is that? Because we’ve written [their dialogue] very well, not because it’s powered by some unknowable AI system,” he said. 

Professional concerns

Dr. Ashley Chin, a Gainesville-based psychologist, said even the thought of using AI as a note-taker during her sessions raises concerns for her.

“It’s hard for me to understand where that data is going, what’s happening with it,” Chin said. “I would worry about my patients’ confidentiality, which is a cornerstone of what we offer.”

If AI were to record her therapy sessions, Chin said, she fears patients might refrain from speaking openly and honestly.

She also expressed concerns over the potential for insurance companies to demand sensitive information from AI note-taking software, much like they do currently for patients’ diagnostic codes.

Beyond concerns over privacy, Chin questioned AI’s ability to connect emotionally with people seeking its help, especially as many people seek help due to a struggle to connect with others.

“Empathy is a human-to-human thing, and, because AI isn’t human, that…might get lost,” she said. “It’s a program, so it’s not caring about you in the sense that a friend would.”

Still, she said AI chatbots could serve a harm reduction role in providing patients with a source to vent their feelings and receive basic advice on emotional management techniques.

Chin also sees great potential for AI’s mental health advice in day-to-day life, she added, particularly for her patients with ADHD.

“If they wanna say, ‘I need a schedule for today, tell me how to schedule out my day,’ I’ve actually recommended patients use ChatGPT for this,” she said. “It’ll come up with a plan, and I think that’s really cool because it’s helpful.”

Education looking ahead

Taking into account the potential benefits, concerns and reality of AI’s place in healthcare, Jing Wang, Florida State University’s Dean of Nursing, is pioneering a new educational program. 

Wang oversaw the expansion of AI healthcare education efforts at FSU, including the creation of the nation’s first nursing master’s program with a concentration in AI and a Nursing and Artificial Intelligence Innovation Consortium.

“We’re really trying to get academic schools, healthcare systems and AI industry in healthcare to all get together,” Wang said. 

Through four courses, FSU’s master’s nursing concentration in AI teaches students the basic principles of AI in healthcare, AI ethics, health informatics and AI integration in healthcare.

Experts like Wang hope to prepare prospective healthcare professionals for a future in which interactions with AI are commonplace, something critical for both efficiency and safety.

“We need to have guardrails in place where we don’t just allow AI to provide some bad advice for our patients,” she said.

As such, Wang said a critical part of preparing for the future of AI in areas like therapy is teaching students to think about “at what level do we need to put a human in the loop.”
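In practice, putting a human in the loop can be as simple as routing any AI-drafted reply that touches a higher-risk topic to a clinician for approval before the client ever sees it. The Python sketch below is a hypothetical illustration, not a description of FSU’s curriculum or any vendor’s product; the risk keywords and the review queue are assumptions made for the example.

from dataclasses import dataclass, field
from typing import List

# Topics that trigger clinician review; chosen here purely for illustration.
RISK_KEYWORDS = ["self-harm", "suicide", "crisis"]


@dataclass
class Draft:
    client_message: str
    ai_reply: str


@dataclass
class ReviewQueue:
    pending: List[Draft] = field(default_factory=list)

    def route(self, draft: Draft) -> str:
        """Hold risky drafts for a clinician; release low-risk drafts directly."""
        text = draft.client_message.lower()
        if any(keyword in text for keyword in RISK_KEYWORDS):
            self.pending.append(draft)  # a clinician must approve this first
            return "A member of your care team will follow up with you shortly."
        return draft.ai_reply  # low-stakes reply goes out automatically


if __name__ == "__main__":
    queue = ReviewQueue()
    draft = Draft(
        client_message="I need help planning my day around my ADHD.",
        ai_reply="Let's block out your morning tasks first, then build in breaks.",
    )
    print(queue.route(draft))
    print("Drafts awaiting clinician review:", len(queue.pending))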

Sydney Fayad, a 19-year-old UF psychology and statistics sophomore, stands firmly against AI’s involvement in therapy.

Fayad’s stance applies to multiple uses of AI in therapy, including note-taking technology. While she’s not opposed to the idea of using limited AI therapy services as a supplement to human therapy, her support for the intersection of AI and therapy ends there.

“Therapy is one of the most interconnected and personal developments or experiences we can have,” she said, “and to put a non-personal tool in the middle of that takes away from the progress we’ve made with psychology.”

Contact Avery Parker at aparker@alligator.org. Follow him on X @AveryParke98398.


Avery Parker

Avery Parker is a third-year English and History major covering university affairs for The Alligator. Outside of reporting, Avery spends his time doting on his cats, reading, and listening to music by the Manwolves.

