“My son is on Discord,” a parent recently wrote in an email immediately following the racist killing spree in Buffalo, NY. “Should I be concerned?”
I took a deep breath because I knew these conversations were complicated. In the wake of the racist killings in Buffalo, NY last week, each of us has been left searching for healing, action, answers, and prevention in our own circles of influence. We know all too well that the grief, rage, and fear ripple out well beyond the loved ones of the victims themselves.
The reality is, of course, that there are no simple answers. There is no single experience, site, or reason that leads an eighteen-year-old to the most horrible acts of racist violence. Instead, there is a toxic stew of forces that are rooted in our history and growing in our present, accessible online and offline, activated by individual vulnerabilities and fueled by collective influences.
In other words, when it comes to parenting in the digital age, there is no single platform for racist, antisemitic, or other forms of far-right extremism. There are absolutely certain applications that should raise red flags. Specifically, any highly encrypted apps designed to host content or conversations that would be banned on more mainstream sites should be taken very seriously. But even mainstream apps can be exploited by extremists and host hate speech, meaning that casual exposure is likely for young people online. Common Sense Media found that 64 percent of teen social media users say they come across hateful content on those platforms. That’s why I tend to pause before responding to questions about specific games or tools in the wake of high-profile violence. Is our takeaway from last week that all young people should be banned from Discord? Not necessarily.
But should we be concerned about and take seriously far-right extremism online? Absolutely.
Extremist recruitment online
Being aware of and working to buffer children against predatory behaviors online is not a new parenting task. While these risks are relatively rare, the stakes are high enough that prevention is important. And it’s clear that we need to expand our understanding of online exploitation beyond bullying or sexual exploitation. Extremist recruitment shares common patterns with other forms of online predation. Namely, extremists work to separate young people from their circle of trusted adults. According to Common Sense Media, extremists often introduce extremist content gradually, starting with more mainstream topics like video games or fashion. Far-right extremists provide frameworks that often include:
- Simple, false solutions to complex problems.
- Black and white thinking that encourages directing anger toward people different from themselves.
- A false sense of security and certainty.
It’s important to note that violent extremism isn’t defined by challenging a system or even advocating for radical change from the status quo. Instead, extremist ideologies are rooted in the belief that violence or domination is the only answer to identity-based conflicts. According to American University’s Polarization & Extremism Research & Innovation Lab, online radicalization occurs when
“someone’s online activities – reading, watching, socializing – help lead them to adopt politically or religiously extreme views. Extremist beliefs say that one group is in dire conflict with other groups who don’t share the same racial or ethnic, gender or sexual, religious or political identity. Extremists believe that this imagined conflict can only be resolved through separation, domination, or violence between groups.” It becomes “us versus them,” and “them” is defined as the enemy.
To be clear, most children can encounter extremist views and not go on to adopt these beliefs. The factors that converge to drive a young person toward extremist thinking or groups are always complicated and mediated by offline forces. According to the Southern Poverty Law Center, factors that make children more vulnerable to online predation include loneliness and isolation; a desire for love and friendship; feelings of anger, betrayal, confusion, or uncertainty; stress and trauma; or strong relationships with people who hold or align with extremist beliefs.
Our best understanding is that social media and gaming environments can facilitate radicalization, but are likely not the primary driver. In general, the people consuming extreme content have extreme views already. This underscores the importance of nurturing resilience, cultivating critical thinking and media literacy, talking about racism and antisemitism, and creating a healthy and inclusive sense of belonging and purpose for all children. In other words, kids need more from adults than just deleting apps from their phones.
What can we do?
While news headlines sometimes portray perpetrators of far-right extremist violence as “lone wolves,” the reality is that young people are influenced by ideas online and often share them with online audiences.
The good news is that communities can also help counteract these messages (just as they can reinforce them). A broad network of healthy relationships not only provides more “eyes and ears” to spot warning signs; it also gives children and youth solid roots as they navigate misinformation and risk.
Talk early and talk often.
When we talk with our kids about their online experiences, we can start including explicit conversations about racism, antisemitism, and other forms of hate speech. Young people should know that they can encounter extremist messages on any platform and in any medium. According to Common Sense Media, extremist groups have targeted youth (especially young white men) through “music, video, images, coordinated meme campaigns, invitations to chat rooms, and through influencers.” Talk about hate speech and how it can legitimize and provide an on-ramp to extremist rhetoric.
Listen.
Listening to our kids about what they are seeing, hearing, and doing online provides a valuable window into their online worlds. Listen for extremist themes, explanations, or vocabulary in their language, and don’t hesitate to address them head-on when we hear them. Viewing media and gaming together also offers important insight into the language and culture of their online communities.
According to the Polarization and Extremism Research and Innovation Lab at American University, we should listen more specifically for things like fear of a “white genocide” or a “great replacement,” antisemitic conspiracy theories, belief in the need for a violent uprising, misogynist language or ideas, blaming non-white and immigrant communities for problems, or glorifying societal collapse and violence.
Talk about values and belonging.
We will not counter extremism only by banning our kids from certain sites. As we have learned in the past, extremists “pop up” in new places online as soon as they are shut down. To be clear, we should absolutely use monitoring and limits as important tools in our toolkit. But those tools are rarely effective on their own unless we make an equal commitment to talking about inclusive values, teaching critical thinking, and staying connected. Talk together about things like, “Why do you go where you go online? How do you benefit? Who belongs here? Who might not experience belonging here? Where do you experience belonging? Why?”
Don’t avoid talking about racism, antisemitism, and xenophobia.
The reality is that far-right extremism directs anger and betrayal toward others based on identity. The killer in Buffalo, NY stated very directly the role that white supremacy played in his attack. Research demonstrates that when parents talk openly, frequently, and in age-relevant ways about race and racism, their children are less likely to adopt racist views later in life and/or are more buffered from the toxic impacts of racism on their lives.
Practice media literacy.
Commit to basic Internet safety practices and growing media literacy skills. Ask questions like, “How can we find out if that is credible?” and introduce your child to sites, resources, and programs that can help them spot misinformation. Remind young people that most extremist groups gradually introduce more harmful content over time. Learning how to spot memes, images, and messages that reinforce extremist messages is essential.
Read more tips and follow-up resources in American University’s and Southern Poverty Law Center’s full guide Building Resilience and Confronting Risk: A Parents and Caregivers Guide to Online Radicalization.
Let’s model our commitment to our shared humanity.
We all have an interest in protecting young people from online extremism. The reality is that the seeds of extremist views are planted in mainstream channels and reinforced in everyday interactions. The more time people spend in like-minded groups with no ground rules for discussion, the more likely they are to move toward extremes. We can model an alternative for kids based on authentic relationships and shared humanity. We can work to challenge harmful stereotypes and build and enjoy meaningful relationships with people whose identities are different from our own. Modeling that there is room for everyone in our schools and in our neighborhoods helps buffer kids from online explanations that blame certain groups of people for our challenges.
Countering far-right extremism isn’t just about watching out for it online. It is also about creating healthy spaces of belonging and modeling our commitment to our shared humanity wherever we go.