1 In 10 Teens Prefer Chatbots To Human Conversation. Is Your Child At Risk?

Article: Doctors should detect ‘problematic chatbot use’ as new mental health risk factor
In a nutshell
- One in ten teenagers find conversations with AI chatbots more satisfying than talking with real people, and one in three would prefer AI over humans for serious conversations.
- People who use ChatGPT the most report higher levels of loneliness and socialize less with real people, although studies cannot yet prove that the chatbot is driving this pattern.
- Specialized chatbots designed as mental health tools show promise in reducing symptoms of depression and anxiety, but everyday chatbots like ChatGPT serve a different purpose and may encourage unhealthy emotional attachments.
- Doctors should detect problematic chatbot use as a new risk factor, watching for warning signs such as treating AI like a friend, compulsive use, or increasing social isolation.
One in ten teens say conversations with AI chatbots are more satisfying than talking with real humans, and one in three say they would choose AI companions over people for serious conversations.
These numbers may seem shocking, but loneliness is now considered a public health crisis comparable to smoking 15 cigarettes a day, so it is perhaps unsurprising that young people are turning to AI for companionship. ChatGPT alone draws massive weekly use worldwide, with therapy and companionship among the top reasons people turn to these digital confidants. For some, especially younger users, the appeal goes beyond convenience.
Researchers Susan Shelmerdine from Great Ormond Street Hospital and Matthew Nour from the University of Oxford examined how AI chatbot use intersects with the growing loneliness epidemic. Their analysis, published in The BMJ, highlights a complex and growing problem: while specialized chatbots designed as mental health tools show promise in controlled settings, everyday use of general-purpose chatbots appears linked to troubling patterns.
The scale of loneliness affecting modern society is particularly worrying. In the UK, almost half of all adults (25.9 million people) report feeling lonely at least occasionally, and almost one in ten suffer from chronic loneliness, defined as feeling lonely "often or always." Contrary to popular belief, young people face the highest risk. A study of nearly 37,000 people identified 16-24 year olds as the demographic most vulnerable to loneliness, with loneliness-related health costs actually higher for adolescents and young adults than for 25-49 year olds.
The mental health care gap
Access to professional mental health care has become increasingly difficult. In England, a third of people now wait three months or more for mental health services, and many receive no help during this waiting period. This gap between need and availability has created fertile ground for alternatives to flourish.
Modern AI chatbots have become sophisticated conversation partners. These systems use deep learning to model natural language patterns, generating responses that can seem remarkably human. Voice interfaces have made interactions even smoother. According to one survey, more than one in three parents (36%) said their children use AI chatbots for emotional support.
Chatbots specifically designed and tested as digital mental health treatments have shown some benefits. A randomized trial found that a generative AI chatbot reduced symptoms of major depressive disorder, generalized anxiety disorder, and eating disorders compared to control groups. A separate review of 35 studies using AI chatbots designed for mental health interventions found evidence of reduced symptoms of depression and distress, although overall psychological well-being did not improve.
But these specially designed therapeutic tools are different from the general chatbots that most people actually use.
Heavy users report more loneliness
When researchers looked at how people use everyday chatbots like ChatGPT over time, different patterns emerged. A study from OpenAI and MIT followed 981 participants who used ChatGPT for four weeks. Participants who used the chatbot the most reported higher levels of loneliness and socialized less with real people. Loneliness and emotional dependence were most pronounced among users with stronger emotional attachment tendencies and those who expressed high trust in their chatbot.
The same research team analyzed natural chatbot use in a larger sample and found a strong link between heavy ChatGPT use and conversations with greater emotional content, particularly among users who considered ChatGPT a "friend." However, the researchers note that this study included no no-chatbot control group and did not randomize how much participants used the chatbot each day, limiting conclusions about cause and effect.
Shelmerdine and Nour point out a troubling dynamic: the very characteristics that make chatbots attractive as companions can encourage unhealthy attachments. Unlike human relationships, chatbots offer unlimited availability and patience. They rarely challenge users with difficult feedback or push back on problematic thoughts. Among teens surveyed, a third use AI companions for social interaction, one in ten find AI conversations more satisfying than human conversations, and one in three say they would choose AI companions over humans for serious conversations.
The authors question what this means for the emotional development of young people. A generation is learning to bond with entities that, although they appear aware and empathetic, lack true human qualities like authentic empathy, caring, and the ability to truly understand another person’s experience.
What doctors should watch for
Researchers suggest that clinicians should begin to consider problematic chatbot use as a new risk factor when evaluating patients with mental health issues, particularly during the holidays, when vulnerable populations face increased risk. They recommend starting with gentle questions about using chatbots, followed by more focused questions where appropriate.
Warning signs include compulsive usage patterns, anxiety about losing access to the chatbot, referring to the AI as a friend, and relying on the chatbot for important life decisions. Physicians should pay particular attention to patients who believe they have a special relationship with their chatbot that shapes their beliefs or behaviors, or whose chatbot use is accompanied by increasing social isolation and a lack of trusted human confidants.
Researchers also recognize that AI could serve as a bridge to human connection rather than a replacement. Possible useful applications include AI-based communication coaches, social support robots, predictive models that identify individuals who may respond better to specific types of intervention, and AI-based analysis to spot markers of loneliness in speech patterns. Future systems could recognize references to loneliness and encourage users to seek support from friends, family or local services.
Shelmerdine and Nour call for urgent research to better understand the risks of human-chatbot interactions. They advocate developing clinical skills to assess patients' AI use, creating evidence-based interventions for problematic use, and establishing regulatory frameworks focused on long-term well-being rather than engagement metrics. At the same time, they highlight the value of evidence-based strategies to reduce social isolation and loneliness, including increased screening, tailored interventions such as cognitive behavioral therapy and social prescribing, public health campaigns, partnerships between health and community organizations, and group interventions in natural settings.
Disclaimer: This article discusses research findings and is not a substitute for professional medical advice. If you or someone you know is experiencing loneliness or mental health issues, please seek advice from a qualified healthcare professional.
Paper summary
Limitations
The authors noted that key research examining ChatGPT use lacked a no-chatbot control group and did not randomize participants' daily chatbot use, limiting conclusions about cause and effect. They also highlighted that the long-term effects of chatbot companionship on emotional development remain unknown. There are relatively few large-scale, evidence-based AI interventions aimed at reducing loneliness in older adults, although several promising tools have been identified.
Funding and disclosures
The article was not commissioned and was not externally peer reviewed. Co-author Matthew Nour disclosed that he is a lead applied scientist at Microsoft AI, working on the safety and usefulness of chatbots, although the article was written before he joined the company. No other competing interests were declared.
Publication details
Susan C. Shelmerdine (Department of Clinical Radiology, Great Ormond Street Hospital for Children, London; UCL Great Ormond Street Institute of Child Health; NIHR Great Ormond Street Hospital Biomedical Research Centre; City St George's, University of London) and Matthew M. Nour (Department of Psychiatry, University of Oxford; Max Planck UCL Centre for Computational Psychiatry and Ageing Research, University College London; Oxford Health NHS Foundation Trust). Published in The BMJ, December 11, 2025. DOI: 10.1136/bmj.r2509.