Editor's Note: On July 1, 2024, the Supreme Court reaffirmed that moderation decisions are protected by the First Amendment of the U.S. Constitution. You can read our response to this ruling here.
Today, Discord filed an amicus brief with the Supreme Court in support of our ongoing efforts to create online communities of shared interests that are safe, welcoming, and inclusive. The cases at issue (Moody v. NetChoice, et al. and NetChoice et al. v. Paxton) challenge the constitutionality of two new laws in Texas and Florida. You may have heard of them: these laws would significantly restrict the ability of online services to remove content that is objectionable to their users and force them to keep a wide range of “lawful but awful” content on their sites. Discord is not a traditional social media service, but whether intended or not, the laws are broad enough to likely impact services like ours.
It’s hard to overstate the potential impact if these laws are allowed to go into effect: they would fundamentally change how services like Discord operate, and with that, the experiences we deliver to you. There’s no doubt that the outcome here is hugely important to us as a company, but that alone is not why we are taking a stand. Ultimately, we filed this brief because important context was missing from the official record: you all and the vibrant communities you’ve built on Discord.
Taking a step back, Discord has invested heavily in creating safe and fun online spaces. Our Community Guidelines set the rules for how we all engage on Discord, and we put a lot of effort into enforcing them: around 15% of the company works on safety, including content moderation. This includes Trust and Safety agents who enforce Discord’s rules, engineers who develop moderation tools for both Discord and its users, and many more dedicated individuals across our company.
We do this work so that you can create online spaces where you can find and foster genuine connection. But to bring that vision to life, Discord needs to be able to prevent, detect, and remove harmful speech and conduct. Barring companies like Discord from moderating content is a little like barring garbage collection in a city: every individual block would have to devote its own resources to hauling its own garbage, diverting those limited resources from more productive activities and the individualized curation that makes each neighborhood unique. So, while our rules and content moderation efforts certainly reflect our own values and speech about the kind of services we want to offer, we really do it for you all—we work to take care of the hard stuff so you all can focus on creating the communities you want.
That’s why we decided to take such a strong position in these cases: we want to make sure the Supreme Court understands how these laws would impact your speech and your communities. Fortunately, the First Amendment has a lot to say about this: it protects your association rights, which make it possible for you to come together around your shared interests and see content that is relevant and helpful, not harmful.
Keeping your communities fun and safe by not allowing harmful content is a top priority for us. We wouldn’t be doing what we do without all of you, and we will keep working to make sure you can create the online spaces you want.