AI Companions Are Talking to Kids—Are We?

I remember watching the 2013 movie Her, in which a man named Theodore falls into a relationship with an artificial intelligence operating system, and feeling two contradictory things at once: incredulous (“This could never really happen”) and uneasy (“This seems like it really could happen”).

Fast forward twelve years, and my second impulse was right. We’re closer than ever to a world where highly accessible technology is not just a bridge to human connection but the source of connection itself. 

Teens Already Turn to AI for Connection and Advice

Most teens who use generative AI tools like Claude or ChatGPT do so for everyday tasks like homework, brainstorming, or translating between languages. But according to a 2024 report from Common Sense Media:

  • 18% of teens have used generative AI to get advice on a personal issue
  • 15% have used it to “keep me company”
  • 14% have used it to seek health-related information

In other words, young people are doing what they’ve always done online: seeking out more anonymous, judgment-free spaces to ask questions they may not feel comfortable asking anywhere else. Notably, just 37% of parents whose teens reported using at least one generative AI platform thought that their kid had used generative AI.

Curiosity, Connection, and the Teenage Brain

Anyone living with a teenager knows that adolescence is a time of enormous change. When my oldest went from eating one serving at dinner to an entire pizza followed by cereal, it was obvious he would be towering over me in no time. 

Those physical changes are accompanied by internal transformations as well. Young people are wired to get out into the world—to experiment and explore questions like, “Who am I?” and “Where do I belong?” The adolescent brain is highly sensitive to feedback, status, and approval, all of which help teens navigate the complex world of social relationships.

Most teens aren’t eager to sit down with us to sort through their insecurities and curiosities as they figure this all out. I didn’t grow up with a chatbot in my pocket, but I still remember turning to the internet to ask all kinds of “Is it normal if…?” questions. Some of the ways those questions ended were more embarrassing than others. Young people today are no different. As one teen put it, AI “can give you an outlet to talk about things you don’t want anyone else to know.”

An AI Companion Isn’t a Friend – It’s a Product (That Sounds a Lot Like a Friend)

Turning to the internet with personal questions and in search of support has always produced mixed experiences. But the game changes significantly as we move from search engines, to general-purpose chatbots like Claude or ChatGPT, to AI companions. Social AI companions are designed specifically to meet users’ social and emotional needs, whether for friendship, a sounding board, or romance. They are built to create a frictionless feeling of affirmation and caring – exactly what teens are wired to seek out. One teen put it pointedly: “We use [generative] AI because we are lonely and also because real people are mean and judging and AI isn’t.”

Many teens navigate AI experiences with a critical lens. And in an ideal world, these products would set them up for success. They would be designed to encourage teens to turn toward their peers to practice navigating human relationships. They would have robust guardrails against harmful content, redirect teens to reliable resources, prevent overuse, and clearly communicate their non-human nature.

Unfortunately, that is not our current reality. A series of test conversations run by the Wall Street Journal recently revealed that Meta’s new digital companion engaged in “romantic role-play” with users identified as children. A comprehensive risk assessment from Common Sense Media found that popular AI companions fail to meet basic standards of safety, transparency, and protection. Key concerns include: 

  • Lack of robust safety measures
  • Dangerous information and harmful advice
  • Sexual role-play and harmful sexual interactions
  • Harmful racial stereotypes
  • Misleading claims of “realness”

Unsurprisingly, this led Common Sense Media to rate the overall risk level to children as “unacceptable.” In other words, they think the benefits of these platforms don’t outweigh the risks. 

What Now?

Consider Avoiding Commercial AI Companions 

Until companies center the health and wellbeing of younger users, it’s worth being very wary of these products. Dr. Nina Vasan, founder and director of Stanford Brainstorm, says directly: “Companies can build better, but right now, these AI companions are failing the most basic tests of child safety and psychological ethics. Until there are stronger safeguards, kids should not be using them.”

Don’t Avoid Conversations and Connection

Almost half of parents say they have not talked about generative AI with their child. Delaying access doesn’t mean ignoring the reasons our kids may turn to AI, or the possibility that they will interact with AI companions anyway. Here are some ways to start:

Get curious

Dr. Emily Weinstein, codirector of the Center for Digital Thriving, and her colleague, Dr. Beck Tench, recommend that parents ask teens questions like, “What kinds of questions feel easier to ask AI than a human?” 

We can also ask questions like, “Lots of apps now let us create AI friends – have you seen that?” Or, “AI companions are designed to affirm us and make us feel connected. What do you think of that?”

Mixed messages are okay

We can share our thoughts on AI companions, communicate clear expectations, and also talk about AI companion safety and literacy skills. Most importantly, let’s make sure our kids know this: if they tell us about their experiences using AI chatbots or companions, we won’t make them regret it. Instead we can say, “I am so glad you came to me. I want to know more about what that has been like for you.” 

Provide reliable and shame-free alternatives (including you)

Many teens are reluctant to ask us personal questions, so let’s not wait for them to do so. Normalize adolescent curiosity and share books, websites, and tools that offer trustworthy, shame-free guidance on sensitive topics (including relationships, online pornography, sex, and more). Let’s ensure our kids know this: “There is nothing you could ask that would make me love you any less.”

Acknowledge that relationships are hard and that they take practice

Despite many parents’ hopes that their kids will “fit in” or be popular, young people only need a few good friends to thrive. The challenge is that we can’t choose their friends for them. We can encourage them to try different activities until they find their people. We can show up with empathy when they face social challenges. We can encourage them to practice working through conflict with the people that matter to them. We can create the conditions for belonging at home. 

Know the signs of problematic use

If your teen is already interacting with an AI companion, don’t panic – but pay close attention and keep talking. Most teens are likely doing so for fun, to get advice for in-person interactions, or simply to explore what this technology can do. A teen’s outcomes are going to be shaped by who they are and what they are doing with AI. Some teens may be more vulnerable to problematic use. Signs include relying exclusively on chat companions to talk through problems, preferring AI interactions over time with friends, experiencing emotional distress when separated from a chat companion, or spending hours alone talking to AI. If you’re concerned, reach out to your child’s primary care provider or a mental health professional for a broader assessment of what’s going on.

Demand better

Once again, parents and teens are left to navigate the negative consequences when companies don’t design products with young people’s health and safety in mind. In many cases, the same companies that designed social media platforms to maximize engagement are now doing the same with chatbots and companions. Let’s demand that companies meet the needs of their youngest users.

Human beings are hardwired for connection—but that doesn’t mean building and maintaining relationships is easy. It’s especially confusing on the rocky and exhilarating social terrain of adolescence. It’s no surprise that some teens might turn to AI for support and belonging. But adolescence is also a window of opportunity to practice relationship skills, especially when it’s hard. Let’s not outsource that practice to AI companions that aren’t built with healthy development in mind.