From the start of the AI boom, users have spoken to chatbots as if they were real people. They attach names to them, give them personalities and even generate images of what their chatbot “companion” would look like as a Studio Ghibli character. Many have grown so emotionally attached that they now turn to AI models for emotional support and as substitutes for therapists.
During COVID-19, online psychotherapy was a necessary tool for clinicians and patients to continue treatment under social distancing requirements. Now, however, there’s a new type of “online therapy” emerging among young adults: the thousands of AI chat models available at the click of a button.
The relationships that people form with these models are immensely concerning — if they can even be called “relationships” at all. These programs may mimic human conversations, but they are not human.
It is understandable why some users lean toward these models. They are affordable, convenient and accessible to people who are unable to get the help they truly need. And those needs, in their own right, are valid. There is real solace in having something that tells you exactly what you need to hear in difficult moments.
But this is just a coping mechanism.
As someone who has been in therapy for over three years, ranging from twice-weekly sessions to monthly maintenance check-ins, I know these services are meant to help you grow and understand your mental health in a way that is manageable and realistic for your life. Chatbots, by contrast, mirror your language and tell you exactly what you want to hear. That creates an emotional echo chamber, not progress.
Over time, relying on a chatbot for emotional validation will do more harm than good. Your critical thinking and problem-solving skills will be traded away for instant gratification.
And that’s before considering the ecological damage that AI chatbots cause. The mental repercussions alone should be enough of a warning.
AI systems do not simply generate information on a whim when given a prompt. They follow an algorithm shaped by your usage of the program, analyzing everything you send to curate the perfect response for you.
Because there is rarely any pushback, users become hooked on the constant validation they receive. The advice you seek sounds perfectly aligned with your feelings because you trained it to respond that way.
Even worse, these models have been known to mishandle such situations, sometimes ending in real harm and tragedy.
This issue became so serious that on Aug. 7, OpenAI released GPT-5, an update that intentionally made the model less personal. Many users pushed back, claiming their “companions” had been taken away.
One Reddit user on r/ChatGPT wrote, “BRING BACK 4o, GPT-5 is wearing the skin of my dead friend.”
Users threatened to cancel their subscriptions, and to protect its bottom line, OpenAI made the previous, more personable GPT-4o model a toggleable option.
Overuse of AI is one of many things contributing to the epidemic of anti-intellectualism, the erosion of culture and emotional intelligence, and the weakening of community bonds.
Grok’s two-second replies are easier and quicker sources of “comfort” than reaching out to a friend who can provide real empathy.
It is isolating by nature.
To understand the appeal, I asked a chatbot to console me about a personal issue.
It responded with breathing exercises, reassurance that the situation would be OK and steps for managing my emotions.
But one line that struck me was, “What you’re feeling is very human.”
I stared at it thinking, “This AI understands nothing about what it means to be a human.”
It has never experienced true heartache or stomach-churning anxiety. It has never felt joy. It gave me bullet points and prewritten, validating responses. No nuanced, outside perspective.
These chatbots are simply mirrors, algorithmic reflections of their users. They do not — and cannot — care.
***
TOP PHOTO: A student in distress sits in a public seating area, conversing with a generative AI model on their device. Many overwhelmed young adults find comfort in talking to chatbots. (Photo courtesy of Pexels)
Milo Jones is a staff writer for The Express. Follow him on X @stayonm4rs.
