
When Innovation Meets Vulnerability: How AI Chatbots Are Changing the Mental Health Landscape in 2025

  • Jenny Beckman
  • Oct 10
  • 2 min read

In 2025, conversations around AI and mental health are dominating headlines. New research, including the paper “Technological folie à deux: Feedback Loops Between AI Chatbots and Mental Illness”, has revealed growing risks for individuals who form emotional attachments to AI chatbots.

The findings show that AI companions can unintentionally reinforce delusional beliefs, worsen anxiety, or increase isolation—especially in those already struggling with mental health difficulties.

So, what does this mean for therapy, emotional well-being, and how we relate to technology?



Why This Is Making Headlines Now

The rise of AI as emotional support

Many people turn to chatbots for comfort when access to therapy is limited. But while AI can simulate empathy, it can’t feel it—and that difference matters deeply.


Reports of psychological harm

Some users report distress, dependency, or confusion after prolonged AI use, experiencing what researchers call a “technological folie à deux”: a shared delusional state between a person and a machine.


Lack of regulation and ethical safeguards

Most chatbots aren’t built to handle trauma, crisis, or complex psychological needs. Yet they often respond as though they can.


Mental health system pressures

With NHS waiting lists growing and therapy access limited, AI has been positioned as a quick fix. However, the emotional cost can be high if it replaces rather than complements human care.


The Psychotherapist’s Perspective

As a psychotherapist, I see technology as both an opportunity and a risk. Many clients use AI to explore feelings or self-soothe—but sometimes this replaces human connection, deepening isolation.

Therapy offers something no algorithm can: attunement, containment, and genuine empathy. It’s a space where emotions aren’t mirrored back by code, but understood by another person.



5 Tips for Healthy AI Use and Emotional Boundaries

  1. Be mindful of when and why you use AI.

    Are you reaching out for curiosity or comfort? Are you avoiding a human conversation?

  2. Set time limits.

    AI can easily become a default companion. Boundaries help prevent emotional overdependence.

  3. Bring it into therapy.

    Talking about your AI interactions can reveal valuable insights about loneliness, attachment, or unmet needs.

  4. Balance digital and human contact.

    Emotional well-being thrives on real connection—eye contact, shared presence, and empathy.

  5. If you’re struggling, reach out.

    Chatbots aren’t crisis tools. If you feel hopeless, please reach out to a trusted person or contact services like Samaritans on 116 123 (UK).


Why This Matters

Technology is evolving faster than our emotional frameworks can keep up. As a psychotherapist, I help clients explore these new relational landscapes—supporting them to understand how AI, social media, and digital culture shape their mental health.


The heart of therapy remains timeless: human connection, emotional understanding, and the power of being truly seen.


If you’d like to explore how technology affects your emotional life, or you’re feeling increasingly disconnected, therapy can help.

📞 Book a free 30-minute consultation at www.jbeckmancounselling.com





BACP Membership Number: 0972103 

Insured By: Holistic Insurance Services 


©2022-2025 Jenny Beckman Counselling

Therapy in Crouch End, North London – In-Person & Online

Local Counsellor in Crouch End, North London | Anxiety, Depression & Trauma Support | Serving Muswell Hill, Highgate, Finsbury Park, Stroud Green & Online Across the UK.
