In Ireland, we are increasingly using AI as a therapist. Gen Z users turn to OpenAI’s ChatGPT for therapy more than any other demographic, but they’re not the only ones relying on this technology as an outlet and adviser when it comes to their mental health and emotional wellbeing. The AI therapy model is a growing global industry. Whether we’re asking Anthropic’s Claude how it recommends wording a tricky email to our boss, or talking through a marital disagreement or a spat with a friend with ChatGPT, more and more of us are using the technology in this way.
In the UK, the National Health Service (NHS) recognises many people are using AI for therapy as they are caught in the limbo of long waiting lists and priced out of private treatment. Meanwhile, “ChatGPT-induced psychosis” is the new moral panic or mental health concern du jour (time will tell). However, it is clear that AI chatbots, which are designed to affirm the user and keep them engaged for as long as possible, tend to validate rather than challenge.
If you worry that every new person you date seems like a toxic narcissist rather than a flawed but well-intentioned person with needs and feelings that don’t match your own, your AI adviser may not give you the pushback or perspective that you need. It’s closer to discussing your problems with a supportive but biased friend than with an experienced mental health professional. The kind of friend who treats being nice as a moral virtue, even if they are doing it to preserve their own comfort. The kind of friend who will generally tell you you’re in the right, no matter how wrong you might be.
Aristotle would tell us that true friendship is based on virtue, so we owe our friends constructive truths which are motivated by good intent and care for their flourishing. Validation without correction keeps us from flourishing and can be considered a form of harm. By telling others what they want to hear, we rob ourselves of the opportunity to act in a virtuous way, as Aristotle would consider it, but we also neglect our responsibility to the other person. In a context where there is a responsibility of care – as between a therapist and patient – this would represent an egregious breach of professional responsibility. A therapist who blindly affirmed their patient would certainly be doing harm.
Validation is addictive. Affirmation feels lovely. Unfortunately but not coincidentally, the rise of AI therapy comes at a time when we have divested the therapeutic setting of its specialised lexicon. While terms like narcissist, boundaries, gaslighting and trauma might have certain meanings and value in therapeutic or clinical settings, we increasingly use them as cudgels to strip other people of their complexity and turn them into one-dimensional villains or obstacles in a moral drama of our own creation. One in which we are generally underappreciated, wronged and being prevented from flourishing by the selfishness, thoughtlessness or malicious intent of other people.
The focus is often on what others owe us and what we’re entitled to, reframing feedback or critique as harm, making us more prone to emotional brittleness, blaming others, self-righteousness and minimising our own agency. This can be comforting but it becomes self-fulfilling, requiring us to become more avoidant, combative and self-obsessed to protect our fragile worldview as we are less and less equipped to participate meaningfully in the world around us.
Your friend may not support you in cancelling every planned activity an hour before it’s scheduled to happen, or indulge constant monologuing about your ex 18 months after the breakup, but AI might. Eventually, the only one who will be able to tolerate this level of self-obsession will be the AI therapist, which takes our inclination towards comfort and mirrors it back at us in language that makes it feel like healthcare.
The valuable things we seek usually require some discomfort. To change, grow, meet obligations that align with our values – to surprise ourselves – we have to do things that require accepting our anxiety or fear without avoidance. Aristotle’s ideas remain relevant; they are simply what a decent human therapist would tell us, if we could only get into a room with one. The ancient Greek philosopher’s ideas are rooted in an acceptance that human connection and flourishing are linked. Both demand honesty and an ability to acknowledge complexity. We need the courage to embrace discomfort when the route to flourishing runs through it.
It’s hardly surprising that we’re turning to machines for connection and affirmation when mental healthcare is increasingly inaccessible and friendship feels so fragile. Yet if we continue to outsource our deep need for challenge to technology that can’t meet it, we run the risk of getting so lost in a house of mirrors that reality becomes distorted beyond recognition.