February 16, 2024

Reporting Abusive Behavior to Discord

Editor's Note: This article was last updated April 2, 2025.


If you encounter a violation of our Terms of Service or Community Guidelines, we ask that you report this behavior to us.

Reporting a Message

  1. Select the message you wish to report. On mobile, press and hold the message; on desktop, right-click it.
  2. Select “Report Message.”
  3. Select the type of abuse you’re seeing.
  4. The next screen will allow you to specify the exact abuse that’s occurring. You can always click back and change your first answer so you can select the most relevant category.

If the violation happened in a server, you can also reach out to the server’s moderators, who may be able to respond immediately and help resolve your concerns. In addition, please remember that you always have the ability to block any users that you don’t want to interact with anymore.

Do not mislead Discord’s support teams. Do not make false or malicious reports to our Trust & Safety or other customer support teams, send multiple reports about the same issue, or ask a group of users to report the same content or issue. Repeated violations of this guideline may result in loss of access to our reporting functions.

What to Do if You Receive a Violent Threat or Someone Is at Risk of Self-Harm

If a credible threat of violence has been made and you or someone else are in immediate danger, or if someone is considering self-harm and is in immediate danger, please contact your local law enforcement agency.

Additionally, if you are in the United States, you can contact Crisis Text Line to speak with a volunteer crisis counselor who can help you or a friend through any mental health crisis by texting DISCORD to 741741. You can learn more about Discord’s partnership with Crisis Text Line here.

You can find more resources about mental health here.

Reporting a User Profile

  1. Open the user profile you wish to report and click the three-dot menu.
  2. Select “Report User Profile.”
  3. Select the specific elements of the profile you are reporting; you can report multiple aspects of a profile at once.
  4. Select the type of abuse you’re seeing.
  5. The next screen will allow you to specify the exact abuse that’s occurring. You can always click back and change your first answer so you can select the most relevant category.

Reports Under the EU Digital Services Act

EU users can report illegal content under the EU Digital Services Act by clicking here. EU government entities reporting illegal content should follow the process outlined here.

EU users will be required to go through a verification process and follow the prompts to provide details and descriptions for their report.

When reporting a message under the EU Digital Services Act, a message URL is required. Here’s how to find the message URL for the desktop and mobile apps.

Desktop App

  1. Navigate to the message that you would like to report.
  2. Right-click on the message, or press the ellipses icon when hovering over the message.
  3. Select Copy Message Link.
  4. The message URL will be copied to your device’s clipboard.

Mobile App

  1. Navigate to the message that you would like to report.
  2. Hold down on the message to reveal a pull-up menu.
  3. Select Copy Message Link.
  4. The message URL will be copied to your device’s clipboard.

What Happens After I Submit a Report?

When you submit a report, we’ll send a system DM or email to let you know that we have received it. Reports are processed based on priority as described below, and we use a combination of automated and human-powered review methods to process reports. We may also let you know if there are safety features available that could help you address the potential harm more quickly.

The time it takes for us to review a report depends on the type of content and the severity of the violation:

  • High harm reports (e.g., child safety, extremism, imminent harm): These reports are prioritized and Discord aims to review them within 24 to 48 hours.
  • Lower harm reports (e.g., harassment, hate speech, cybercrime, misinformation): Discord aims to review reports in this category within a week.
  • Complex Investigations (e.g., coordinated abuse, multiple violations, or appeals): Reports of any type might result in additional review or investigation by Discord. These cases may require days to weeks to review, depending on their complexity and the need for thorough investigation.

Report review times may fluctuate due to process changes, volume surges, technical issues, and backlogs.

While not every report identifies a violation of Discord’s policies, your reports help us improve Discord’s safety practices. We will let you know if we identified a violation of our policies based on your report.

When you submit a report, your identity is not shared with the account you reported, even if the report is appealed. We never intentionally disclose the identity of a reporter unless we are legally required to do so, for example, when Discord receives valid legal process requiring us to provide the reporter’s identity, or for requests to remove content under the Digital Millennium Copyright Act (“DMCA”). The DMCA allows the reported user to request further details about the complaint, including the name and email address of the claimant, a description of the allegedly infringed work, and the reported content. In such cases, Discord provides this information in accordance with copyright law requirements.

You can read more about the reports we receive and the actions we take on violations of our Community Guidelines or Terms of Service in our Transparency Report.

Tags:
Reporting
User Safety
