Can AI Be a Therapist?

Sep 19, 2025

A Conversation with Clinical Director Denise Ingenito

By Alissa Striano

Over the past few years, AI has made its way into almost every part of our lives—including mental health. With feelings of isolation, anxiety, and depression on the rise, especially among teens, many young people are turning to AI-powered apps like Wysa and Youper to talk things out. These apps offer everything from mood tracking to mindfulness tips to conversations modeled after cognitive behavioral therapy (CBT). They’re free, available 24/7, and sometimes feel more private than a traditional therapy session.

Let’s be real: teens are already glued to their phones, and talking to an app might feel less intimidating than opening up to a human. But can AI replace therapy?

As a volunteer with a personal interest in the future of AI, I sat down with Denise Ingenito, Clinical Director at North Shore Child & Family Guidance Center, to talk about it. Denise oversees mental health programs in the agency’s Roslyn Heights and Westbury locations, manages intern training programs, and plays a role in the Guidance Center’s services for kids and families dealing with co-occurring disorders.

Real People, Real Healing

At the Guidance Center, Denise and her team offer way more than just talk therapy.

“We have two girls’ groups for self-esteem, two boys’ groups, an LGBTQ+ group, an anti-bullying group, and parent groups,” she told me. “There’s also our Wilderness Respite Program where teens hike every other week, a Latina girls’ group, and our Community and Home-Based Services team that goes directly into homes to work with both kids and parents.” These programs are built on relationships, empathy, and human connection—everything AI can’t replicate.

Initial Reactions to AI Therapy

When the topic of AI therapy came up in their monthly clinical meeting, Denise said the room was split. “Some clinicians were really concerned about where AI is headed. Others saw it as a useful tool, but only in certain areas, like helping with notetaking or organizing treatment plans.”

Her personal take? “I think AI is potentially dangerous when used for therapy. It’s not really challenging clients or helping them examine distorted thinking. It’s just validating everything they say, and that’s not always helpful.” She added that while AI might be great for tasks like scheduling or documentation, therapists have to be careful not to let technology create cookie-cutter treatment plans.

What About Teens Using AI for Support?

Some teens do use AI tools late at night to vent or journal, Denise explained. And while it might be nice to get something off your chest, there’s no processing involved—just one-way communication.

“A robot doesn’t have empathy. It’s not going to gently challenge you or say, ‘Hey, that doesn’t sound quite right.’”

She also brought up a real example: A student at Bellmore High School texted a live counselor during class, saying she was thinking about hurting herself. Because a human was on the other end, they were able to intervene and get her help immediately. “That would never happen with AI,” Denise said. “There’s nothing better than a human response.”

Can AI Fill the Gaps in Underserved Communities?

When I asked Denise if AI could help kids in underserved areas throughout Long Island, she was skeptical. “A lot of them don’t even have access to technology, let alone mental health tools. And many kids can’t tell the difference between credible resources and harmful advice on TikTok.”

At Westbury High School, the Guidance Center actually has a clinician on-site to work with teens who are dealing with substance abuse and mental health issues—something Denise hopes to see more of in Nassau County. The New York State Office of Mental Health is pushing for mental health services to be implemented in schools, and the Guidance Center is fully on board with that.

Blending Tech with Therapy (The Right Way)

Still, the Guidance Center isn’t totally anti-tech. Denise mentioned a clinician who uses handheld devices to help kids express emotions through short games. “We’ll play a five-minute game and then have a conversation about how they felt. It opens the door.”

She also brought up storytelling in trauma work—an area where technology could one day play a bigger role. But she emphasized that AI should be a tool, not a replacement. “We need to educate both clinicians and the public about the pros and cons of AI. It’s already shaping our decision-making, even in the simplest ways. I mean, we’re asking apps what to have for breakfast.”

So… Where Does AI Really Belong?

At the end of the day, AI therapy isn’t black and white. It can offer comfort in the moment, help fill small gaps, and even assist therapists behind the scenes—but it can’t truly replace human connection, empathy, or judgment.

As we look toward the future of mental health care, especially for teens, the real question isn’t if we should use AI, but how we use it. Will we treat it as a quick fix, or will we stay focused on building real relationships, one conversation at a time?
