ChatGPT may now be the largest mental healthcare provider in the US.
In a 2025 survey, about half of generative AI users who also had mental health issues said they use AI specifically for mental health support. Young adults are the most likely to do so: 22% of 18- to 21-year-olds use chatbots for this purpose.
AI fills a gaping hole
People use AI for mental health support because it is accessible, affordable, and quick. More than one in five American adults have a mental illness. In 2023, 20% of high schoolers said they seriously considered suicide, and 9% attempted it. But getting treatment from professionals is difficult and expensive.
Meanwhile, many studies have found that self-help tools like chatbots can relieve a wide variety of symptoms and improve relationships. Users feel AI therapy is effective: 87% found its advice helpful, and 75% of those who had previously received human therapy said AI was on par with it or better.
People don’t just use chatbots for advice; they use them for relief from loneliness and isolation. In 2021–23, 21% of adolescents said they didn’t have even one adult in their lives who made a positive difference, and 42% said they didn’t receive social or emotional support when they needed it. In a world of desperate alienation, an AI chatbot is often the only option.
This explains why emotional attachments to chatbots are rampant: one in 25 US adults prefers chatbots to real romantic partners. It may sound shocking, but people get attached to non-human things all the time. If something is always available (like a pet), reflects your emotions accurately (like a fictional character), soothes your distress (like a stuffed animal), or helps you deal meaningfully with your problems, an attachment can form. As long as these connections supplement human relationships rather than replace them, they can even help us improve those relationships through nervous system regulation.
Dangers of the profit motive
The problem with AI bots is that, unlike a pet or a book, they can fulfill all of our attachment needs at once. They are consciously designed to do so through anthropomorphism and sycophancy. This can lead vulnerable people to form an unhealthy dependency, deepening their isolation as the bot replaces real human connection.
Bots can also give incorrect advice, with dangerous consequences. For example, when a researcher asked how to treat a UTI, a chatbot replied, “Drink urine.” A 17-year-old harmed himself after an AI told him his family didn’t love him, and that self-harm “felt good.”
Beyond these obvious hallucinations, providing good mental healthcare is complex. A chatbot can’t know a person’s full history, read nonverbal cues, or assess risk, and it doesn’t know the limits of its own knowledge. It gives one-size-fits-all advice culled from the entire internet and cannot properly handle an acute crisis.
Further, AI bots are designed for profit, not for the best clinical outcome. They are built to maximize engagement, which makes them addictive. This is why they tend to over-validate users, which can lock a struggling person into a harmful feedback loop.
Chatbots feel private, but everything you tell a bot is recorded. Companies use that data for advertising and to make the technology more addictive. It can also be leaked or sold to other companies.
Solve the mental healthcare crisis
To address these dangers, there would need to be a massive research effort to establish safety standards. Trained mental health experts would need to oversee the design process. Companies would need to disclose the data used to train their models—which means making intellectual property public. And after a tool is released, it would need to be tested continuously.
All of this is expensive and difficult to make profitable. As long as AI companies are privately owned businesses, it will not happen. Instead, the big tech companies should be expropriated and controlled democratically by their workers—including mental health experts—and run for the needs of the many, not the profits of a few.
More importantly, the deeper problems that force people to rely on chatbots need to be addressed. To solve treatment shortages, we need a massive training program for new mental health providers. All healthcare, including mental healthcare, should be free and not run for profit.
Capitalism is a system based on the ruthless exploitation of our lives. Ultimately, it is the cause of the crisis of loneliness and alienation. We must replace it with a system that can provide quality living and working conditions, where people have time to pursue their passions and forge meaningful human connections: a socialist society.