AI Can Listen, But Can It Heal?
What’s Missing in AI as a Therapy Tool
Just a few years ago, when faced with life’s challenges, we had three options: keep them to ourselves, share them with a friend, or seek professional therapy. Now there is a fourth option: consulting ChatGPT about our mental health concerns. According to Wikipedia, ChatGPT is a large language model (LLM) that can generate human-like conversational responses.
Keeping struggles to ourselves can lead to long-term emotional strain. Sharing with a friend might bring mixed reactions, both supportive and dismissive. In-person therapy requires time and money but often paves the way for lasting transformation. So what gap is AI filling, where is it effective, and what is it missing? And what makes ChatGPT a “human-like” advisor but not quite a human therapist?
The Comforting Voice
After her divorce, Lena turned to an AI for advice. Late at night, she poured out her grief, anger, and exhaustion. The AI responded promptly, validating her emotions with calm, supportive words. Lena felt comforted. It was easy, predictable, and safe.
For weeks, she returned to the screen, recounting memories of her marriage and seeking validation for her decision. And she got it. The AI mirrored her feelings—always affirming, never challenging. It felt like a soft landing place, a space where she could release her pain without fear of judgment. The AI echoed her emotions with responses like, "It’s understandable. You did your best."
It wasn’t that the AI failed to offer new perspectives—it did. But something was still missing, though she couldn't quite name it. It felt like the magic mirror from Snow White, calmly and neutrally responding, "My Queen, you are the fairest in the land." Why, then, did the comfort feel incomplete?
How Do We Know We Matter?
To truly feel that we matter, we need to see that our words and actions have an effect on another human being. It is not enough to feel heard; we also need to know that our presence enters the other person’s world and changes it. Something new is created for both people through the dialogue. This awareness affirms our existence. If our words produce no emotional effect, we may be left with a nagging feeling of emptiness.
Think of a moment when someone told you, “I’m still thinking about what you said years ago. Your words give me courage.” Moments like this reveal that our words and presence, whether positive or negative, don’t simply vanish. They shape lives, linger in hearts, and become part of another person’s journey.
When we notice a change in someone’s eyes, a pause in their voice, or a shift in their emotions because of something we said or did, it signals, "I am seen. I matter." This dynamic is the heartbeat of meaningful relationships, where the interaction constantly shapes both partners.
We come to understand our significance not in isolation but through the mirror of the other’s response—when we see that we are not only affected by life but also actively shaping it in connection with others.
Though AI may respond to our questions quickly and convincingly, we can sense the illusory nature of the conversation. We do not emotionally affect the large language model (LLM). Perhaps our input adds marginally to its capabilities, but that is not enough to make us feel we matter on a deeper level.
Where Else AI Falls Short
A large share of human communication happens through body language. AI cannot give immediate feedback by leaning forward, changing its tone of voice, or softening its gaze. These small, unconscious reactions signal that we need to adjust something to move the dialogue forward. Without these cues, AI can feel like an unyielding mirror, reflecting our emotions and expanding our knowledge, but never truly engaged with us.
AI cannot draw on real experiences or unique events of its own, which makes its responses feel abstract and not uniquely tailored to us. It cannot offer a spontaneous story from its own life, a joke born of shared context, or a metaphor rooted in lived experience, the kinds of touches that make a dialogue thought-provoking and alive.
AI doesn’t make mistakes the way a person does. Yet making mistakes in conversation is deeply valuable to the human soul, because it fosters humility, growth, and authentic connection. A therapist will inevitably make an incorrect guess or ask the wrong question. By allowing ourselves to be imperfect in conversation, we engage in a process that nurtures self-awareness, invites discovery, and deepens relationships. We both feel imperfectly and uniquely alive.
You can’t laugh together with AI. But humor often helps unfreeze our most stuck issues.
All these subtle absences can give communication with AI a soulless, automatic tone.
The Risks of Using AI for Deeper Psychological Issues
Using AI for self-reflection or emotional processing can be helpful up to a point, but it also carries particular risks, especially when we are dealing with complex mental health struggles.
AI can provide immediate, accessible support when we are hesitant to seek human therapy or when access is limited. It can help clarify thoughts, identify patterns, offer basic coping strategies, and encourage emotional tracking. It can also guide us to valuable resources such as crisis hotlines or professional services.
But AI lacks the depth and emotional nuance that human connection provides. It cannot sense subtle emotional cues or unconscious dynamics, nor can it offer the resonance of human presence. For people experiencing serious mental health struggles, this absence of emotional attunement can lead to deeper feelings of disconnection.
If AI replaces genuine, meaningful relationships, we risk becoming stuck in emotional isolation and a self-validating loop, which can further alienate us from the community.
In conversations with a therapist, we constantly take emotional risks. What if we are misunderstood or misinterpreted? What if our thoughts sound controversial or crude? The relational space becomes a kind of playground where we flex our emotional muscles—practicing vulnerable expressions, learning to tolerate discomfort, building trust in ourselves, and developing resilience.
We can then transfer these skills into our everyday relationships, feeling more comfortable in our own skin, no matter what situations life brings. This process of taking an emotional leap of faith and being vulnerable with another human being is essential for emotional growth and authentic communication—but it is absent in conversations with AI.
Another risk is delaying professional help. Early intervention is crucial for many mental health conditions. Delaying proper care can result in deterioration or crisis.
AI cannot handle emergencies such as suicidal ideation, psychosis, or severe anxiety attacks. Though some studies suggest that AI chatbots like ChatGPT can improve patient-reported quality of life in psychiatric settings, with high user satisfaction, more research is still needed.
While it may suggest crisis services, it cannot actively intervene or provide immediate, human-centered support. Moreover, AI may unintentionally reinforce unhealthy thought patterns if it misinterprets input or fails to challenge cognitive distortions common in conditions like depression, anxiety, or OCD.
For these reasons, AI is best used as a supplement rather than a replacement. It can aid with psychoeducation, goal setting, and motivation, but for deeper emotional work, especially work involving trauma or complex relationships, human connection remains irreplaceable.
Recognizing AI’s limits and seeking human support when emotions feel overwhelming is essential. Healing often requires relational depth, emotional resonance, and feedback from another person. At the same time, AI models like ChatGPT and GPT-4 have shown promise in assisting clinicians and supporting people experiencing various psychological challenges.
Lena’s Realization
"It feels like the AI just… gives me abstract advice," Lena said to her therapist, her voice uncertain. "But I don’t know if it really feels me. And when I think about it, it’s a little like how it felt with my ex-husband. I would talk, and he would respond, but it was… flat. No emotional connection, no laughing together, no discovery. Like neither my troubles nor my joys really mattered to him."
The insight struck something profound. The AI hadn’t just been a tool; it had mirrored an old wound—the ache of feeling insignificant. With that realization, Lena felt a wave of grief.
She also noticed sadness in her therapist’s eyes, and in that moment she felt both surprise and relief. Sitting across from her was someone who truly saw her as a struggling human being and was ready to be touched by her story, and that opened a whole new dimension to explore.
Note: Lena’s story is a fictional account created for illustrative purposes. It does not reflect the experiences of any actual individual or client.