“If you could design your own digital spaces, what would they look like?” I recently posed this question to a group of young people. As adults, it’s easy to focus on what we don’t like about the online world. It’s also easy to assume that we see the harms clearly while young people remain thoughtlessly glued to their devices.
But when you ask teens about their experiences, that assumption rarely holds up. Yes, they’re quick to highlight the positives that adults often overlook—like how online spaces foster connection and provide much-needed support. But they’re also acutely aware of the flaws. A recent report from Common Sense Media showed that roughly six in ten teens say that major tech companies cannot be trusted to care about their mental health.
It didn’t take long for this group to paint a compelling picture of what they wish digital spaces felt like. They described spaces that are accessible, private, fun, joyful, authentic, and supportive—and that give them more control.
Young people are insightful. So it’s no surprise that their vision aligns with research on developmentally appropriate platforms. A report from the National Scientific Council on Adolescence emphasizes that truly safe platforms would require different business models, stronger safeguards, and better accessibility. Experts who work with youth are advocating for a “safer by design” approach—one that prioritizes well-being over corporate profits.
Moving in the Wrong Direction
Unfortunately, we’re moving even further from this ideal. Meta (the parent company of Facebook and Instagram, among other platforms) recently announced significant changes to how it moderates content. It removed and adapted several sections of its Hateful Conduct Policy, rolling back safety guardrails for protected groups. And instead of relying on a team of fact-checkers, it will now use a system called “community notes,” shifting the responsibility for flagging harmful content onto users. While Meta will still use automated systems to detect some high-severity content, it will no longer employ professionally trained, independent fact-checkers.
This decision—to wait for users to report harmful content before addressing it and to roll back protections—makes it more likely that our social feeds will contain misinformation, cyberbullying, hate speech, and other harmful material. Even Mark Zuckerberg admitted that this new approach means the company will “catch less bad stuff” on its platforms.
How to Respond?
Skip the Lectures and Try “Values Alignment” Messaging Instead
Older teens already deeply immersed in social media aren’t likely to change their habits just because we lecture them about its harms. But helping them recognize how certain platforms undermine their control and agency? That can be a game-changer.
Research shows that traditional messages about long-term benefits—like the mental health benefits of better sleep or fewer distractions—don’t compete with the instant gratification of tapping and scrolling. However, when teens hear messages that align with their core values—like autonomy and purposeful social change—they’re more motivated to take control.
Framing social media use as an issue of agency and fairness—exposing how platforms ignore harm and exploit attention for profit—turns self-regulation into an act of rebellion and power. That feels good now, not just later.
Instead of repeating, louder and more emphatically, “Don’t use that! You will feel better later, I promise!” you might try, “Do you know how companies make choices about platform design?” and, “Who benefits most from these changes? Let’s find out.”
Teens already have strong opinions about this. Recent data from Common Sense Media indicate that six in ten youth don’t trust major technology companies to prioritize people’s safety if it hurts profits. What are they doing about it? Share youth-led movements like #HalfTheStory or Design It For Us to explore how other young people are grappling with these trade-offs. Consider asking other questions like:
- Have you heard about Meta’s recent decisions on content moderation? What do you know?
- What do you think about it? Why do you think they did this? Who does it benefit? Who might it harm? How do you feel about that?
- What strategies do you already use to take control of your feed/social media habits? What might you need to do differently now to make sure companies can’t take advantage of you?
Slow Down and Consider Delaying
Kids often ask for social media to stay connected with friends, keep up with teams or groups, or engage with socially relevant entertainment. These are real and valid needs. But every tech decision comes with trade-offs, and Meta’s recent changes have made these choices even harder for parents.
The American Psychological Association has specifically warned that adolescents are more vulnerable to harmful content—particularly online hate and material promoting harmful or risky behaviors. With this in mind, it’s worth slowing down before saying yes to a new platform. Ask yourself:
- What’s the purpose of this request?
- Given my child’s age, skills, strengths, and vulnerabilities, what are the trade-offs?
- What’s the safest tech option that meets this need?
If connection is the goal, a group text or a messaging app on a family device or simple personal device can keep kids in touch—without the risks of algorithmically driven content that now has even fewer guardrails. If you do introduce social media, be prepared to take an even more active role in guiding conversations, teaching media literacy, and setting purposeful boundaries.
Reflect on Your Relationship with Tech
It’s tempting to respond to these changes by saying, “See? I told you the internet was trash!” Resist the urge. Even if you decide to set strong boundaries around social media, kids benefit most from adults who teach skills, engage in non-judgmental conversations, and model healthy habits.
The reality is that most young people moved away from Facebook, Meta’s flagship platform, years ago. And while adults don’t face the same developmental vulnerabilities as teens, many have lost sight of the value of adolescent developmental strengths—like the ability to ask critical questions, advocate for change, and challenge unfair practices.
For every question we ask our kids, it’s worth turning that same lens on ourselves:
- Do I feel in control of how and when I use this app?
- What kind of content am I exposed to here?
- Who benefits from my time on this platform? Are there costs? Who bears them?
- What do I want to do about it?
Keep Imagining (and Working Toward) a Better Internet
Baroness Beeban Kidron, a longtime advocate for children’s digital rights, reminds us that the internet is “100 percent engineered. It can be anything you want it to be.” The problem isn’t that the internet can’t be a place of safety, privacy, and support—it’s that the most popular social media platforms aren’t built that way.
It’s easy to feel discouraged by these changes. But moments like this are opportunities to pause and reflect—not just on how to help kids navigate the digital world we have, but also on how to create the one they deserve.