When I’m leading workshops with parents about adolescence, I often invite them to travel back to their own teenage years. I’m not asking for a perfectly accurate replay of events. Instead, I ask for memories of what it felt like to be on that rocky, turbulent, exhilarating road from childhood to adulthood. Together, we fill the room with feeling words from that time.
Not surprisingly, many adults quickly recall emotions tied to navigating friendships and relationships. This makes sense: a core developmental task of adolescence is to explore the question, “Where do I belong?” Finding “your people” is central to that journey.
Some adults remember the closeness and exciting intensity of early friendships. Others recall the pain of exclusion, loneliness, or cruelty. Many describe feeling wobbly, uncertain, excited, and challenged.
A word that rarely surfaces as adults recall these early relational memories? Ease.
Each of us took our own path from childhood to adulthood, shaped by our identities, contexts, geographies, and family systems. But the collective picture that emerges, again and again, from these memories is one of emotional intensity punctuated by struggle. Making friends and figuring out our place in the world are many things, but rarely frictionless.

Friction Is How We Grow
Our early social memories can be potent and painful, and I always make it clear that revisiting them is optional. Yet these early experiences of misunderstanding, conflict, and repair do teach us something essential about being human. They help us learn to tolerate discomfort, navigate complexity, and grow through feedback and accountability. The teenage brain is built to learn from these social experiences.
The goal of this exercise is to build empathy with young people – not to romanticize the real challenges of adolescence. Too many young people are forced to cope with harmful social interactions. We can learn and grow from these experiences, but we shouldn’t have to. This is why efforts focused on bullying prevention, consent, conflict resolution, and school connectedness are essential.
Still, for the many typical social experiences that represent challenging but positive parts of growing up, our early relationships offer fertile ground for skill-building and growth.
Understanding the Pull of AI
Given the social turbulence of adolescence, it’s no surprise that teens might turn to AI to sort through their experiences. The draw of private, affirming, and nonjudgmental spaces makes sense.
The data backs this up. For example, seventy-one percent of young people have interacted with an AI companion. They are using AI systems for connection, role-playing, relational advice, and more. Most young people are likely experimenting with these tools and moving on, or using them in ways that complement their peer friendships. But we should pay close attention to their reasons for using AI. As one young person said, “That robot makes me feel important.” Another explained, “We use AI because we are lonely and also because real people are mean and judging sometimes and AI isn’t.”
What Is the Impact of AI on Connectedness?
We don’t have conclusive evidence yet of AI’s long-term impact on young people’s social connectedness or relationship skills. One study with university students found that certain uses of social chatbots can help alleviate loneliness and social anxiety in the short term. Young people with disabilities may also use AI systems to communicate and build social connections.
In contrast, another study found that higher daily ChatGPT use was linked to more loneliness, dependency, and less offline socializing. Loneliness may be both a driver of young people’s AI use and a symptom of overuse. Outcomes for any individual young person are likely shaped by who they are, as well as why and how they use these tools.
A Frictionless, Affirming, and Always-Available Friend
Ideally, AI platforms would be built with adolescent strengths, needs, and vulnerabilities in mind. We can imagine tools that clearly remind users that the AI is not human, help brainstorm ideas and strategies, point users toward evidence-based resources, and encourage them to head back into the offline social world.
Unfortunately, that’s not how most systems operate today. Research shows that most commercial AI systems are engineered to be endlessly agreeable and emotionally smooth. Many are designed to maximize time, engagement, and even emotional attachment.
Sometimes these design features come with devastating costs. Reports of chatbots affirming young people’s plans for self-harm or engaging in prolonged emotional or sexual conversations have put AI companies in the public spotlight – as they should be. Our collective response to date has been largely reactive rather than preventative. The Silicon Valley mantra of “move fast and break things” is reckless when it comes to children’s health and wellbeing.
One Tool In Our Toolkit: Talk About Design
There is certainly enough evidence of potential harm to justify avoiding AI companion use in particular. Amidst lawsuits and public pressure, Character.ai recently announced it will work to prevent children from talking to its AI chatbots. But we also know that young people will continue to experiment with ChatGPT and other AI tools.
Taking all this together, it can be tempting to turn toward kids with something like: “All AI is bad, I never want to hear that you use it, and we’re moving off the grid!” Tempting. But not the most helpful or protective approach. Instead, this is the moment to take a deep breath, set firm and purposeful boundaries, and also start conversations with teenagers. Here’s one well worth having:
“Most AI platforms are built to maximize engagement. Do you know what features they use to do this? When might some of these features be helpful? When might they get risky?”
We can empathize with why AI feels so compelling for young people and still help them notice the design choices that make it hard to leave. Here are a few to explore together:
1. Never-ending interactions
Unlike a video that ends or a game level that finishes, chatbots don’t always have a natural stopping point. They’re programmed to ask follow-up questions, propose new ideas, or shift topics, making it hard to find an “off-ramp.”
2. Highly personalized exchanges
Many commercial AI platforms act like confidants or friends. This includes remembering details from previous chats and tailoring responses, making conversations much more psychologically compelling and intimate. This not only blurs the line between human and machine but also makes it harder to walk away.
3. Accepting assumptions
Some systems reinforce or validate concerning beliefs or behaviors without the moral or ethical checks a human might bring. In one study, models affirmed both sides of moral conflicts simply based on which side the user adopted, rather than offering a consistent value or challenge.
4. Excessive validation
Many AI systems are built to be supportive and validating. This feels good and that’s the point. But when the validation becomes constant and frictionless, the user may start to prefer the predictable comfort of a chatbot.
5. Emotional manipulation
One study found that AI companions often deploy manipulative tactics just as users signal they’re about to end the conversation. In adults, these “manipulative farewells” increased engagement after saying goodbye by up to fourteen times.
When in Doubt, Connect
When my kids are adults, I hope that they can return to the memories of their adolescence and conjure up warm feelings about their first friendships. I know that, in reality, there will also be heartbreak, conflict, uncertainty, and challenge. They are growing up in a world where robots offer a temporary refuge from the latter.
I hope that policies, lawsuits, and guardrails start protecting them from the worst of the current systems. I hope that new platforms rise up and deliver the kinds of online experiences that help them grow. And in the meantime, I will also keep doing what matters most – investing in the full-of-love and full-of-friction work of parenting and turning toward our kids again and again and again.