Editor's Note: This article was last updated April 2, 2025.
If you encounter a violation of our Terms of Service or Community Guidelines, we ask that you report this behavior to us.
If the violation happened in a server, you can also reach out to the server’s moderators, who may be able to respond immediately and help resolve your concerns. In addition, please remember that you always have the ability to block any users that you don’t want to interact with anymore.
Do not mislead Discord’s support teams. Do not make false or malicious reports to our Trust & Safety or other customer support teams, send multiple reports about the same issue, or ask a group of users to report the same content or issue. Repeated violations of this guideline may result in loss of access to our reporting functions.
If a credible threat of violence has been made and you or someone else is in immediate danger, or if someone is considering self-harm and is in immediate danger, please contact your local law enforcement agency.
Additionally, if you are in the United States, you can contact Crisis Text Line to speak with a volunteer crisis counselor, who can help you or a friend through any mental health crisis, by texting DISCORD to 741741. You can learn more about Discord’s partnership with Crisis Text Line here.
You can find more resources about mental health here.
EU users can report illegal content under the EU Digital Services Act by clicking here. EU government entities reporting illegal content should follow the process outlined here.
EU users will be required to go through a verification process and follow the prompts to provide details and descriptions for their report.
When reporting a message under the EU Digital Services Act, a message URL is required. Here’s how to find the message URL for the desktop and mobile apps.
When you submit a report, we’ll send a system DM or email to let you know that we have received it. Reports are processed based on priority as described below, and we use a combination of automated and human-powered review methods to process reports. We may also let you know if there are safety features available that could help you address the potential harm more quickly.
The time it takes for us to review a report depends on the type of content and the severity of the violation.
Report review times may fluctuate due to process changes, volume surges, technical issues, and backlogs.
While not every report identifies a violation of Discord’s policies, your reports help us improve Discord’s safety practices. We will let you know if we identify a violation of our policies based on your report.
When you submit a report, your identity is not shared with the account you reported. This is true even if the report is appealed. We never intentionally disclose the identity of a reporter unless we are legally required to do so. For example, this may occur if Discord receives valid legal process requiring us to provide the identity of the reporter, or in response to requests to remove content under the Digital Millennium Copyright Act (“DMCA”). The DMCA allows the reported user to request further details about the complaint, including the name and email address of the claimant, a description of the allegedly infringed work, and the reported content. In such cases, Discord provides this information in accordance with copyright law requirements.
You can read more about the reports we receive and the actions we take on violations of our Community Guidelines or Terms of Service in our Transparency Report.