
Serious Risks For Children Turning To AI For Therapy Advice

by Delarno


The British Association for Counselling and Psychotherapy (BACP) is warning about the growing risks of children using AI tools such as ChatGPT for mental health advice. 

Its new survey revealed that more than a third (38%) of therapists working with under-18s have clients seeking mental health guidance from AI platforms, and almost one in five (19%) reported children receiving harmful mental health advice.

Therapists have told the BACP that some AI tools are providing potentially harmful and misleading information, including encouraging children to self-diagnose conditions such as ADHD and OCD, reinforcing avoidance behaviours, and automatically validating their feelings regardless of what they express. There have also been tragic cases where AI tools have given dangerously misguided advice or even encouraged suicide.* Therapists are particularly concerned about AI’s inability to offer real-time support or intervene in crisis situations.

Ben Kay, Director at the BACP, the largest professional body for counselling and psychotherapy in the UK with more than 70,000 members, said:

“It’s alarming that children are increasingly having to turn to AI chatbots like ChatGPT for mental health support, often unable to tell whether the advice they’re getting is safe or even true. Some have already suffered devastating consequences. And this is likely just the tip of the iceberg, with many more children struggling in silence, without access to real therapy.”

“We want parents, carers, and young people to understand that using AI for mental health support isn’t the easy, safe, or quick fix it might appear to be. There are real risks involved, and it must be approached with caution. While AI is accessible and convenient, it can’t replicate the empathy, connection, or safety of therapy delivered by a real person trained to understand complex mental health challenges and assess risks. Children in distress could be left without proper professional support. The information shared with AI also doesn’t have the same protections as therapy.”

“Too many young people are turning to AI because they can’t get the mental health support they need. That’s unacceptable. The government must step up and invest now in real, professional therapy through the NHS, schools, and community hubs. No young person should ever be forced to turn to a chatbot for help. AI might fill gaps, but it can never replace the human connection that changes lives. Young people deserve more than algorithms; they deserve professionally trained therapists who listen.”

New survey findings

The BACP’s annual Mindometer survey, which gathered insights from nearly 3,000 practising therapists across the UK, shows that more than a quarter (28%) of therapists – working with both adults and children – have had clients report unhelpful therapy guidance from AI. And almost two-thirds (64%) of therapists said that public mental health has deteriorated since last year, with 43% believing AI is contributing to that decline. 

Senior accredited therapist Debbie Keenan, who works at a secondary school and has her own private practice in Chepstow, added:

“I’m definitely seeing more children and young people turning to AI to seek therapy advice and to self-diagnose conditions such as ADHD and OCD. This raises real concerns for me. As advanced as AI is, it simply cannot do this. It also cannot tell if a child is distressed, dysregulated or in danger. If a child was telling me that they were going to hurt themselves, or they had suicidal ideation, support would be in place for that child before they left my room – but would AI do this?

“Additionally, I’m concerned about the current risk of children isolating and disconnecting from real human relationships – this can lead to an over-reliance on AI for emotional support and increase feelings of loneliness, making it harder to reach out for ‘real life’ support.

“I believe children are increasingly turning to AI for therapy because it’s available 24/7. It feels non-judgemental and offers a sense of privacy. However, AI remembers data, it isn’t bound by ethical or confidentiality standards, and it lacks regulation or accountability. While it may fill the gap in access to mental health support, it cannot replace human connection or recognise subtle emotional cues like a trained psychotherapist can.” 

Amanda MacDonald, a BACP-registered therapist who provides support for children, teens and adults, said:

“AI therapy bots tend to adopt one of two approaches: offering validation or providing solutions. Both lack the nuance of real therapy and risk giving advice that contradicts best practices for emotional distress. For example, some AI tools have advised individuals with OCD to continue their compulsions, mistaking short-term relief for progress. Others have encouraged avoidance of anxiety triggers, which may feel helpful initially but can worsen anxiety over time by reinforcing avoidance behaviours.  

“There have also been well-documented, tragic cases where AI tools have given dangerously misguided advice or even encouraged suicide – outcomes that are both devastating and deeply alarming. 

“Parents and carers should be aware that their children may be turning to AI for guidance and advice. While it’s important to keep appropriate parental controls in place, open and honest communication at home is just as vital. Talk to your children with curiosity and share your concerns in an age-appropriate way. 

“Children and adolescents aren’t yet equipped to fully assess risk, so parents play a crucial role in keeping them safe. Balancing privacy with safety is never easy, but without that balance, young people can become overly reliant on what is ultimately a very smart algorithm, one that lacks the ethical and safeguarding standards found in helplines, therapy, or school-based support.

“Reaching for their phones when they’re upset feels natural for many young people, especially as AI tools can seem supportive and validating. This creates a valuable opportunity for families to talk about their relationship with phones and technology. Parents can help by modelling healthy behaviour – setting shared screen-free times and recognising when they themselves instinctively turn to their phones. After all, phones were designed to connect us, but if we’re not careful, they can start to replace real human connection.” 

References: 

All figures are from BACP’s annual Mindometer survey of its members. The total sample size was 2,980 therapists, and fieldwork was undertaken between 3 and 17 September 2025. The survey was carried out online.

https://www.bbc.co.uk/news/articles/cp3x71pv1qno  


