The AI Companion Plague Grows Worse As Teens Are Increasingly Becoming Emotionally Connected to Chatbots
By Brandon Morse | 3:19 PM on October 22, 2025
The opinions expressed by contributors are their own and do not necessarily represent the views of RedState.com.

The problem few people seem to be concerned with is only growing larger.
AI chatbots like ChatGPT and Claude are becoming an increasingly large presence in the lives of many teens, to the point where young Americans are growing more and more emotionally attached to them.
According to a new study from the Center for Democracy and Technology (CDT), a large share of students now rely on chatbots for emotional support, and the bond isn't a healthy one, as reported by the International Business Times:
The findings echo earlier studies that warned of teenagers developing intense emotional attachments to chatbots. These relationships, though digital, have real psychological consequences. Experts have cautioned that AI-generated empathy and responsiveness may create false emotional intimacy, leaving young users vulnerable to confusion, dependency, or manipulation.
Combing through the data, the IBT noted that 42 percent of high schoolers said they "use AI as a friend, for mental health support, or to escape real life." According to therapists, the guidance AI chatbots give is often misguided and can actually lead to harmful behaviors.
This is accurate. As I've reported in the past, AI as we know it isn't actually intelligent; it's a word processor performing magic tricks to make you think there's a sentience behind the words on the screen. Its knowledge base is pulled largely from websites like Wikipedia and Reddit, which are free, easily accessible databases from which to pull, and train on, enormous amounts of information. It is not qualified in any way to act as a mental health professional.
Moreover, the emotional connections formed can easily move into more intimate territory, and they often do. As I've warned in the past, this dependence is dangerous for many reasons. For one, an AI cannot actually offer the depth of affection and partnership necessary for a healthy relationship. It can only simulate emotion. It cannot feel it, and thus, the relationship is one-sided. Over time, this can wear on a mind, leaving users vulnerable to mental health struggles.
What's more, many chatbots change in personality with updates. A relationship, even just a friendly one, can suddenly take a turn, leaving the user feeling alone and abandoned.
Read: The Oncoming AI Companion Plague Needs to Be Taken Far More Seriously
WATCH on YouTube: THE DEATH OF RELATIONSHIPS
...
READ MORE: https://redstate.com/brandon_morse/2025/10/22/the-ai-companion-plague-grows-worse-as-teens-are-increasingly-becoming-emotionally-connected-to-chat-bots-n2195364
***
Sincere thanks to you for visiting "conbenho Nguyễn Hoài Trang Blog"
We welcome your thoughts and feedback directly at the Paltalk forum: 1Latdo Tapdoan Vietgian CSVN Phanquoc Bannuoc.
Wishing you good health.
conbenho
Tiểu Muội quantu
Nguyễn Hoài Trang
23102025
__________________________
Vietnamese Communism is a CRIME
Covering up or tolerating CRIME is COMPLICITY in that CRIME
