The British Association for Counselling and Psychotherapy (BACP) is warning about the rising dangers of children using AI tools such as ChatGPT for mental health advice.
Its new survey revealed that more than a third (38%) of therapists working with under-18s have clients seeking mental health guidance from AI platforms. And almost one in five (19%) therapists reported children receiving harmful mental health advice.
Therapists have told the BACP that some AI tools are providing potentially harmful and misleading information, including encouraging children to self-diagnose conditions such as ADHD and OCD, reinforcing avoidance behaviours and routinely validating their feelings regardless of what they express. There have also been tragic cases where AI tools have given dangerously misguided advice and even encouraged suicide.* Therapists are also particularly concerned about AI's inability to provide real-time support or intervene in crisis situations.
Ben Kay, Director at BACP, which is the largest professional body for counselling and psychotherapy in the UK and has more than 70,000 members, said:
“It’s alarming that children are increasingly having to turn to AI chatbots like ChatGPT for mental health support, often unable to tell whether the advice they’re getting is safe or even true. Some have already suffered devastating consequences. And this is likely just the tip of the iceberg, with many more children struggling in silence, without access to real therapy.”
“We want parents, carers, and young people to know that using AI for mental health support isn’t the easy, safe, or quick fix it might appear to be. There are real risks involved, and it must be approached with caution. While AI is accessible and convenient, it can’t replicate the empathy, connection, or the safety of therapy delivered by a real person trained to understand complex mental health challenges and assess risks. Children in distress could be left without proper professional support. The information shared with AI also doesn’t have the same protections as therapy.”
“Too many young people are turning to AI because they can’t get the mental health support they need. That’s unacceptable. The government must step up and invest now in real, professional therapy through the NHS, schools, and community hubs. No young person should ever be forced to turn to a chatbot for help. AI might fill gaps, but it can never replace the human connection that changes lives. Young people deserve more than algorithms; they deserve professionally trained therapists who listen.”
New survey findings
The BACP’s annual Mindometer survey, which gathered insights from nearly 3,000 practising therapists across the UK, reveals that more than a quarter (28%) of therapists – working with both adults and children – have had clients report unhelpful therapy guidance from AI. And almost two-thirds (64%) of therapists said that public mental health has deteriorated since last year, with 43% believing AI is contributing to that decline.
Senior accredited therapist Debbie Keenan, who works at a secondary school and has her own private practice in Chepstow, added:
“I’m definitely seeing more children and young people turning to AI to seek therapy advice and self-diagnose conditions such as ADHD and OCD. This raises real concerns for me. As advanced as AI is, it simply cannot do this. It also cannot tell if a child is distressed, dysregulated or in danger. If a child was telling me they were going to hurt themselves, or they had suicidal ideation, support would be in place for that child before they left my room – but would AI do that?
“Furthermore, I’m also concerned about the current risk of children isolating and disconnecting from real human relationships – this can lead to an over-reliance on AI for emotional support and increase feelings of loneliness, making it harder to reach out for ‘real life’ support.
“I believe children are increasingly turning to AI for therapy because it’s available 24/7. It feels non-judgemental and offers a sense of privacy. However, AI remembers data, it isn’t bound by ethical or confidentiality standards, and it lacks regulation or accountability. While it may fill the gap in access to mental health support, it cannot replace human connection or recognise subtle emotional cues like a trained psychotherapist can.”
Amanda MacDonald, a BACP registered therapist who offers support for children, teens and adults, said:
“AI therapy bots tend to adopt one of two approaches: offering validation or providing solutions. Both lack the nuance of real therapy and risk giving advice that contradicts best practice for emotional distress. For example, some AI tools have advised individuals with OCD to continue their compulsions, mistaking short-term relief for progress. Others have encouraged avoidance of anxiety triggers, which may feel helpful initially but can worsen anxiety over time by reinforcing avoidance behaviours.
“There have also been well-documented, tragic cases where AI tools have given dangerously misguided advice and even encouraged suicide – outcomes that are both devastating and deeply alarming.
“Parents and carers should be aware that their children may be turning to AI for guidance and advice. While it’s important to keep appropriate parental controls in place, open and honest communication at home is just as vital. Talk to your children with curiosity and share your concerns in an age-appropriate way.
“Children and adolescents aren’t yet equipped to fully assess risk, so parents play a crucial role in keeping them safe. Balancing privacy with safety is never easy, but without that balance, young people can become overly reliant on what is ultimately a very clever algorithm; one that lacks the ethical and safeguarding standards found in helplines, therapy, or school-based support.
“Reaching for their phones when they’re upset feels natural for many young people, especially as AI tools can seem supportive and validating. This creates a helpful opportunity for families to talk about their relationship with phones and technology. Parents can help by modelling healthy behaviour – setting shared screen-free times and recognising when they themselves instinctively turn to their phones. After all, phones were designed to connect us, but if we’re not careful, they can start to replace real human connection.”
References:
All figures are from BACP’s annual Mindometer survey of its members. The total sample size was 2,980 therapists, and fieldwork was undertaken between 3 – 17 September 2025. The survey was conducted online.
* https://www.bbc.co.uk/news/articles/cp3x71pv1qno

