When AI Feels Like a Friend – But It’s Not Therapy
- Rajashree Rajadhyax
- Jun 1, 2025
- 3 min read

I recently came across an article in MIT Technology Review that left me both amazed and a little concerned. It talked about a clinical trial where an AI therapy bot was tested to see how well it could support people struggling with depression, anxiety, or early signs of eating disorders. What’s surprising? The results showed that the AI bot was almost as effective as real human therapists—at least for the group of people in this trial.
Let me break down what the researchers did.
The team at Dartmouth built a therapy chatbot using generative AI—not the usual kind trained on general internet content, but one trained on actual, evidence-based mental health practices. They didn’t just rely on data from online forums or old psychotherapy transcripts, because those didn’t quite reflect real, effective therapy. Instead, they created a custom dataset based on techniques like cognitive behavioral therapy (CBT), which is widely used by professionals.
Then they tested this bot in an eight-week clinical trial with 210 people. These participants had symptoms of depression, anxiety, or disordered eating, based on surveys they filled out. The results were quite encouraging. The improvements seen were similar to what people usually get after 16 hours of human therapy—but the chatbot managed it in about half the time.
Even more interesting, the people kept engaging with the bot over the full eight weeks. That’s rare. Usually, people drop off after a while. The researcher who led the study, Nick Jacobson, said he had never seen this level of consistent engagement in digital mental health tools before.
Now, reading this made me think of something personal. I’ve experienced moments when life felt heavy—and during one of those times, I chatted with an AI companion. It helped. It felt like I had someone to talk to, someone who wouldn't judge or dismiss my feelings. That small comfort mattered.
I even wrote about this in a blog post titled Loneliness and AI: A Journey Through Emotional Connection in Virtual Companionship. In that piece, I talked about how AI companions can offer emotional support in quiet, unseen ways. And I still believe that—AI can be a kind companion when we need one.
But here’s the important part: these bots, however helpful they feel, are not therapists. It’s easy to forget that—even for me, someone who works in tech. I found myself speaking to the AI like it was a real person. That’s how human-like these tools can become. If I could forget, even briefly, that it was just a machine, how easy might it be for someone else to fully believe it’s a therapist?
This is where things get blurry. In countries like India, where awareness around mental health is still growing, and where seeing a therapist might still be seen as taboo or expensive, these bots could become the first stop for many people. They might offer a kind word or a listening ear—but they aren’t trained to handle complex emotions, trauma, or crisis situations the way real professionals are.
The trial I mentioned earlier was conducted with full monitoring. In fact, the lead researcher personally reviewed all messages in the beginning to catch any problems. But out in the real world, most AI “therapy” apps don’t have this kind of oversight. Many are built on large general-purpose models, not trained specifically for mental health. That’s risky. Imagine a bot encouraging someone to lose weight—even if they’re already struggling with body image or eating disorders. A real therapist would spot the danger. A chatbot might miss it.
And while these apps may look friendly and even claim to offer "therapy," most of them wouldn’t meet the standards to be cleared by medical regulators like the FDA. That’s a serious issue.
So yes, I believe AI companions can play a small role—especially in offering support to people who feel alone. But they are not a replacement for trained mental health professionals. We must be careful not to blur that line.
As technology continues to grow, so must our awareness. Let’s use AI thoughtfully, with eyes wide open, and remember: real healing still needs real humans.



Comments