At Discord, understanding teens’ specific needs is central to how we design our products, policies, and enforcement actions. It’s known as a “teens’ rights” approach, which takes its cues from an unlikely source.
A teens’ rights approach honors young people’s experience, including areas where they could use a little help—say, when they push boundaries just a little too far or succumb to peer pressure and engage in inappropriate behavior online.
The idea comes from the United Nations’ guidance on the impact of digital technologies on young people’s lives. For the report, committee members consulted with governments, child experts, and other human rights organizations to formulate a set of principles designed to “protect and fulfill children’s rights” in online spaces.
The committee also engaged those at the center of the report: young people themselves. The UN’s guidance was not only rooted in research; it also treated teens with respect. It listened to them.
It’s this kind of holistic, human rights-based approach that makes this body of work so influential, especially for child rights advocates. But it’s also become a guiding force for Discord.
“Teens are going to try things out, figure out who they are, and make mistakes online. Discord is focused on guiding them through that process without being overly punitive or cutting them off from their support networks and their friends,” said Laney Cloyd, a senior product policy specialist.
Cloyd and her colleagues set out to construct a new warning system for when a teen breaks Discord’s rules. They started with the idea that young people are going to do regrettable things, like sending a friend a link labeled “Fortnite cheats here” or “Cute kitten pics” that’s actually an IP grabber. They may think it’s funny and harmless, but it’s still against the rules.
“We can't be super punitive on the first violation, because they don't know. We have to teach them,” said Cloyd.
Of course, when someone violates the rules, there should be consequences. But a teens’ rights approach considers a person’s potential to change—call it a teachable moment.
Teens need extra opportunities to figure out where the boundaries are. That’s true when they’re at home with their parents, when they’re at school with their teachers, and when they’re on a platform like Discord, which has its own set of community guidelines. Parents, teachers, and coaches are there to tell young people what’s what. Discord saw an opportunity to play that role online: we built interactive educational moments that explain to a person what they did wrong.
“As opposed to just saying, ‘you're kicked off, go read our guidelines,’ we offer more guidance so they can do better,” said Liz Hegarty, Discord’s global teen policy manager. Hegarty pointed to research that shows teens really aren't given great advice about how to handle themselves online. Platforms have a role to play in helping teens build their digital citizenship skills.
All of this is part of Discord’s policy-design philosophy, which rests on the belief that people are capable of change if you give them a chance, especially when those people are young and still learning. They deserve a chance at rehabilitation. (An important caveat is that some violations are more serious than others, and Discord takes appropriate action depending on the severity of the violation. For example, we have and will continue to have a zero-tolerance policy towards violent extremism and content that sexualizes children.)
It might sound odd that a tech company would borrow ideas from the preeminent human rights organization. But consider the most foundational tenet of human rights: the right to live freely and safely. Platforms like Discord are most successful when they enable people to be themselves, have fun, and make meaningful connections—all without fear or threats to their safety or well-being.
To illustrate how teens’ rights are woven into our warning system, consider Discord’s updated approach to enforcing our Bullying and Harassment policy.
First, it’s helpful to understand that bullying is broken down into a number of different dimensions. One of those dimensions is imminent threats of physical harm, such as extortion or a death threat. For violations of that severity, there are no second chances. Users don’t get to do that on Discord.
But for non-imminent or non-physical threats, Discord has built in more opportunities for teens to get the message that bullying isn’t allowed. These take the form of a series of escalating time-outs.
Say a teen has been flagged for harassing a user—maybe they’re degrading or mocking someone. For their first violation, they would get a notice and Discord would remove the content.
The second time, they’ll get a notice, the content will be removed, and the teen will be placed in “read only” mode for a set number of hours. If the behavior continues, Discord applies longer and longer time-outs.
Ultimately, if the teen doesn’t change their behavior and continues to engage in harassment or sustained bullying, they can receive a year-long suspension.
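Read as pseudocode, that ladder might look something like the sketch below. The thresholds, time-out lengths, and names here (Enforcement, enforce_harassment_violation, TIMEOUT_HOURS) are illustrative assumptions for the sake of the example, not Discord’s actual implementation.

```python
from dataclasses import dataclass

# Illustrative escalating "read only" time-outs (in hours) for repeat violations.
TIMEOUT_HOURS = [12, 24, 72, 168]


@dataclass
class Enforcement:
    remove_content: bool
    send_notice: bool
    read_only_hours: int = 0
    suspended: bool = False


def enforce_harassment_violation(prior_violations: int, imminent_threat: bool) -> Enforcement:
    """Pick an action for a bullying/harassment violation (hypothetical sketch)."""
    if imminent_threat:
        # Imminent threats of physical harm get no second chances.
        return Enforcement(remove_content=True, send_notice=True, suspended=True)
    if prior_violations == 0:
        # First violation: a notice plus content removal.
        return Enforcement(remove_content=True, send_notice=True)
    if prior_violations <= len(TIMEOUT_HOURS):
        # Repeat violations: progressively longer "read only" time-outs.
        return Enforcement(
            remove_content=True,
            send_notice=True,
            read_only_hours=TIMEOUT_HOURS[prior_violations - 1],
        )
    # Sustained bullying despite repeated warnings: a long suspension.
    return Enforcement(remove_content=True, send_notice=True, suspended=True)


# Example: a teen's second offense earns a notice, removal, and a short time-out.
print(enforce_harassment_violation(prior_violations=1, imminent_threat=False))
```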
This approach stands in contrast to a traditional “three strikes and you’re out” rule, where violations, regardless of severity, count against you, and after three dings you’re banned for good. Discord’s more rehabilitative approach takes into account the very real likelihood that teens are going to make mistakes.
“Imagine a 14-year-old who falls in with a bad crowd and engages in a bunch of bullying and gets suspended for a year,” said Cloyd. “If they return, a 15-year-old is going to be a different person in a lot of fundamental ways.”
The new warning system is also geared toward addressing the specific thing a teen did wrong. For example, let’s say a teen posts an image that violates Discord’s rules. An appropriate measure would be to turn off their ability to post images for a while, while still letting them talk to their friends, join voice chats, and play games.
“We're taking very targeted interventions and not just slamming a big ban hammer down,” said Cloyd. “Because you posted one wrong image you didn't know was wrong, we’d block you out of the app forever? That's not fair. That's not a learning experience.”
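To make the idea of a targeted intervention concrete, here is a minimal sketch under the same caveat: the feature names and the violation-to-restriction mapping (RESTRICTION_FOR_VIOLATION, ALL_FEATURES) are hypothetical, and only illustrate pausing the misused capability while leaving the rest of the account usable.

```python
# Hypothetical mapping from a violation type to the single capability that gets paused.
RESTRICTION_FOR_VIOLATION = {
    "image_policy": "post_images",
    "link_spam": "post_links",
    "voice_abuse": "join_voice",
}

# Hypothetical set of account capabilities.
ALL_FEATURES = {"send_messages", "post_images", "post_links", "join_voice", "play_games"}


def remaining_features(violation_type: str) -> set[str]:
    """Return what the user can still do after a targeted restriction."""
    restricted = RESTRICTION_FOR_VIOLATION.get(violation_type)
    return ALL_FEATURES - ({restricted} if restricted else set())


# Example: an image-policy violation pauses image posting, but the teen can
# still message friends, join voice chats, and play games.
print(sorted(remaining_features("image_policy")))
```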
“We intuitively recognize that as children get a little bit older, they are autonomous humans. They have agency,” said Hegarty. “They have the right to access online spaces safely.”
This goes back to the UN document, which Hegarty first encountered before joining Discord, when she worked for a children’s digital advocacy organization.
“The whole world is fairly new to thinking about how we handle teens in online spaces, especially when more and more of their lives are taking place online. It has an impact across everything from telemedicine to education,” said Hegarty.
Teens are at a transformative and exciting stage of life. They get to explore the world in new ways. They use platforms like Discord to find themselves and make meaningful connections, but that can't be separated from the potential that they'll make mistakes along the way.
Recognizing that—and helping them grow from it—helps keep everyone safer.