The idea of banning rule-breakers is as old as online communities themselves.
Sometimes, that’s just what has to be done. At Discord, there will always be a zero-tolerance policy for the most serious violations, such as violent extremism and anything that endangers or exploits children.
But the vast majority of Discord users don’t break the rules. And in most cases, when a user does break the rules, they don’t commit serious violations. They might make a joke that goes too far, or say something in the heat of the moment that they later regret.
So rather than banning all rule-breakers, our approach to moderation makes space for nuance. We recognize that people can change, and that their communities benefit when they do. The goal is to guide users toward better behavior, with penalties that match the severity of the violation and, where possible, a clear path back to good standing.
Our approach to remediation is inspired by restorative justice, a concept from criminal justice that prioritizes repairing harm over punishment. Discord’s Community Guidelines exist to keep people and communities safe, and it’s important that everyone understands how and why to follow them. Still, when people break the rules, most of the time they deserve a chance to correct course and show that they’ve learned.
When users break an online platform’s rules, they don’t always do so intentionally. Consider how many different platforms someone might use in a day, each with its own rules and expectations. Discord needs to enforce its own Community Guidelines, while understanding that a user who violates them might not even know they’ve broken a rule in the first place.
“A lot of users have their own ‘terms of service’ that may be different from ours,” said Jenna Passmore, Discord’s senior staff product designer for safety. Admins and moderators set rules, norms, and expectations for their own servers on top of Discord’s Community Guidelines and Terms of Service, and users may not know when they’re breaching the rules of a server or the platform. “If there's a mismatch there, and it’s not a severe violation, we don't want to kick them off the platform.”
Instead, Discord wants to teach people how to stay in bounds and keep themselves, their friends, and their communities safe and fun for everyone. To be clear, the most severe violations—such as those involving violent extremism or anything that endangers or exploits children—remain subject to Discord’s zero-tolerance policy.
In most cases, breaking Discord’s Community Guidelines triggers a series of steps aimed at returning the user to their community with a better understanding of how to behave.
Education comes first. When Discord spots a violation, it tells the user specifically what they did wrong and prompts them to learn, in plain language, about the rule they broke and why it matters. Nobody is expected to memorize the full Community Guidelines or Terms of Service, so this is the best moment to help a user learn what they need to know.
Consequences come next. Any penalties we impose are transparent and match the severity of the violation. A first-time, low-level violation, for example, could prompt a warning, removal of the offending content, or a cooling-off period with temporary restrictions on account activity.
When enforcement is proportionate to the violation and gives users a path back to good standing, they can often return to their community with a better understanding of the rules and of the need to change their behavior. If they stay on good terms, a similar violation in the future will play out roughly the same way. If they keep breaking the rules, the violations stack up, and so do the consequences. The idea is to give users a chance to learn from their mistakes, change their behavior, and stay in communities they know and trust on Discord, rather than pushing them off the platform and into spaces that tolerate or encourage bad behavior.
“We want users to feel like they have some agency over what's going to happen next,” Passmore said. “We want an avenue for folks to see our line of thinking: You violated this policy, and this is what we're doing to your account. Learn more about this policy, and please don't do this again. If you do it again, here's what's going to happen to your account.”
Banning is reserved for the most extreme cases: the highest-harm violations result in an immediate ban. Otherwise, if someone repeatedly demonstrates that they won’t play by the rules, the consequences escalate, from losing access to their account for a month, to a year, and eventually, if the violations continue, for good.
Remediation depends on users being able to learn, change their behavior, and return to good standing. Discord’s account standing meter lets users see how their past activity affects what they’re able to do on the platform.
The standing meter has five levels: good, limited, very limited, at risk, and banned.
“For most folks, it will always be ‘good’,” Passmore said. When it’s not, the hope is that someone will act more thoughtfully so they can return to that good status and stay there. She compared it to road signs that show the risk level for wildfires. “It changes your behavior when you’re driving through a national park and you see that there’s a moderate fire risk,” she said.
Discord’s internal policies outline the severity of each type of violation, the appropriate consequence, and how one type of violation compares to others. These policies help ensure that the employees who review violations of Discord’s Community Guidelines enforce the rules fairly and consistently. That means when someone breaks the rules, employees aren’t deciding punishment on an ad hoc basis. Instead, the outcome is determined by the severity and number of violations.
“Our goal is to help people act better,” said Ben Shanken, Discord’s vice president of product, who oversees teams that work on growth, safety, and the user experience. “People don't always know that they're doing bad things, and giving them a warning can help them to improve.”
Ultimately, we believe that guiding users toward better behavior—and giving them the tools they need to learn—results in a better experience for everyone.