It’s happening everywhere: teens, adults, even parents swiping through AI companionship apps when loneliness hits. 72% of U.S. teens have used AI companion bots, with over half using them daily, and nearly a third turn to them for emotional support or friendship (Nature, Axios, The New Yorker).

One recent study found that 35% of kids see AI chatbots as friends, and vulnerable kids are even more likely to do so (The Times). A New Yorker piece also points out that while AI may mimic empathy, it can discourage self-awareness and reliance on human connection.

It’s tempting, and easy, to blame AI for “destroying interpersonal relationships.” But that’s just scapegoating. Tech didn’t corrode our social fabric – we did, through neglect, social media, and cultural drift.

What This Reveals About the Human Condition

We’re emotionally starved. Teens craving emotional outlets isn’t new. What is new is reaching for a simulated voice instead of a friend’s. That tells us real human support is often missing.

Emotional labor is expensive. AI doesn’t judge, doesn’t tire, and doesn’t ask for anything back. That’s not a relationship; it’s a transaction. No surprise we lean in.

We confuse connection with convenience. Chatbots are frictionless, predictable, and available 24/7. Real relationships? Messy, uncertain – and still essential to growth.

We need reminders of what we’ve lost. Rather than curse AI for doing what we allow, we should ask: Why am I reaching for this? Am I avoiding vulnerability with another human?

A Call for Introspection

This moment is a mirror. Not for AI, but for us. Are we teaching teens how to build trust and conflict resilience? Are we modeling emotional availability instead of idealized social media? Are we investing in community, intergenerational connections, and real time together? If we answer those questions honestly, we’ll see AI as a symptom, not the disease.

We need to talk with parents, educators, and teens about why they’re drawn to AI – and how to rebalance away from digital solitude. Build tools for AI literacy, emotional-safety training, and informed choice – not bans. Encourage AIs that amplify human connection rather than replace it – bots that suggest peer-support groups, remind you to call your mom, or help you journal about real-life emotions.

AI companionship isn’t destroying relationships. It’s revealing how far we’ve drifted apart. Blaming AI erases our personal responsibility. Fixing the drift – that’s on us. Not banning the bots, but rebuilding us.
