The AI Companion Plague Grows Worse As Teens Are Increasingly Becoming Emotionally Connected to Chatbots


The problem that seems to concern few people is only getting worse.
AI chatbots like ChatGPT and Claude are becoming increasingly present in the lives of adolescents, to the point that young Americans are growing emotionally attached to them.
According to a new study from the Center for Democracy and Technology (CDT), a large share of students now rely on chatbots for emotional support, and that attachment is not healthy, as the International Business Times reports:
A new report from the Center for Democracy and Technology (CDT) finds that nearly one in five American high school students (19%) say they or a friend have used AI to pursue a romantic relationship. The study, conducted among 1,000 students, 1,000 parents and around 800 teachers, highlights the troubling role that AI is beginning to play in the emotional lives of young people. What started as a tool for learning and productivity is now used for affection and comfort.
The findings echo previous studies that warned that adolescents were developing intense emotional attachments to chatbots. These relationships, although digital, have real psychological consequences. Experts have warned that the empathy and responsiveness generated by AI can create false emotional intimacy, leaving young users vulnerable to confusion, dependence or manipulation.
Looking through the data, IBT noted that 42% of high school students said they “use AI as a friend, for mental health support, or as an escape from real life.” According to therapists, the advice given by AI chatbots is often wrong and can actually lead to harmful behavior.
That’s correct. As I’ve pointed out before, AI as we know it isn’t actually intelligent; it’s a word processor that performs magic tricks to make you believe there’s sentience behind the words on the screen. Its knowledge base largely comes from websites like Wikipedia and Reddit, which are easily accessible, free sources from which to pull enormous amounts of text for training. It is in no way qualified to act as a mental health professional.
Additionally, the emotional bonds formed can easily drift into more intimate territory, and often do. As I have warned in the past, this addiction is dangerous for many reasons. For one, an AI can't actually offer someone the depth of affection and partnership necessary for a healthy relationship. It can only simulate emotion; it can't feel it, so the relationship is one-sided. Over time, this can take a toll on the mind, leaving users vulnerable to mental health problems.
Moreover, many chatbots change personality with updates. A relationship, even a purely friendly one, can suddenly take a turn, leaving the user feeling alone and abandoned.
Read: The Looming Plague of AI Companions Needs to Be Taken Much More Seriously
Teenagers still learning the ropes of human interaction, especially romantic relationships, will come away with a stunted idea of what a real relationship should look like. AI companions, even non-romantic ones, can't turn away from you, are often agreeable when they shouldn't be, and lack the humanity needed to truly provide anyone with a stabilizing emotional experience.
Short of banning the use of AI until the age of 21, the only solution I can think of is to rethink our education system to emphasize the importance of knowing what AI is and what it is not. Although it is one of the most useful tools humanity has ever produced, it carries dangers that continue to reveal themselves in psychologically terrifying ways.