Toxic Mod Filter
Written by Disqus
Updated over a week ago

Toxic comments disrupt communities, drive users away, and strain moderation efforts. The Toxicity Mod Filter empowers moderators to prioritize toxic content for moderation, reducing its negative impact on the community and decreasing the reliance on users flagging comments.

How does this work?

The toxicity filter utilizes natural language processing and machine learning to analyze and identify comments likely to be toxic. We’ve integrated our moderation system with Google’s Perspective API to deliver this capability.

More information and insights on this technology can be found in this blog post.
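To give a rough sense of what Perspective API scoring looks like, here is a minimal sketch of how a comment can be sent to Google's public Perspective commentanalyzer endpoint and scored for the TOXICITY attribute. This is an illustration only, not Disqus's actual integration: the API key, the toxicity_score function, and the 0.8 threshold are placeholder assumptions (Disqus does not expose a numeric score, as noted in the FAQ below).

```python
import requests

# Hypothetical placeholders -- not Disqus values.
PERSPECTIVE_API_KEY = "YOUR_API_KEY"
ANALYZE_URL = "https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze"


def toxicity_score(comment_text: str) -> float:
    """Return Perspective's TOXICITY summary score (0.0-1.0) for a comment."""
    payload = {
        "comment": {"text": comment_text},
        "languages": ["en"],  # the filter currently supports English only
        "requestedAttributes": {"TOXICITY": {}},
    }
    response = requests.post(
        ANALYZE_URL,
        params={"key": PERSPECTIVE_API_KEY},
        json=payload,
        timeout=10,
    )
    response.raise_for_status()
    data = response.json()
    return data["attributeScores"]["TOXICITY"]["summaryScore"]["value"]


# Example: tag a comment as "Toxic" above an assumed threshold.
if toxicity_score("You are an idiot and nobody wants you here.") > 0.8:
    print("Label comment as Toxic in the moderation panel")
```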

What are toxic comments?

Toxic comments are defined as having at least two of the following properties:

  • Abuse: The main goal of the comment is to abuse or offend an individual or group of individuals.

  • Trolling: The main goal of the comment is to garner a negative response.

  • Lack of contribution: The comment does not actually contribute to the conversation.

  • Reasonable reader property: Reading the comment would likely cause a reasonable person to leave a discussion thread.

This two-property guideline should help prevent comments like “haha” that don’t add to the conversation, as well as comments that merely present opposing viewpoints, from being flagged as toxic.
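To make the “at least two properties” rule concrete, here is a minimal, hypothetical sketch. The function name and the boolean inputs are assumptions for illustration; in practice these signals come from the machine-learned model rather than manual flags.

```python
def is_toxic(abuse: bool, trolling: bool, lacks_contribution: bool,
             drives_readers_away: bool) -> bool:
    """Two-property guideline: a comment is toxic only if at least two properties hold."""
    properties = [abuse, trolling, lacks_contribution, drives_readers_away]
    return sum(properties) >= 2


# "haha" contributes nothing, but that is only one property -- not toxic.
print(is_toxic(abuse=False, trolling=False,
               lacks_contribution=True, drives_readers_away=False))  # False

# Abusive and likely to drive readers away -- two properties, so it is tagged toxic.
print(is_toxic(abuse=True, trolling=False,
               lacks_contribution=False, drives_readers_away=True))  # True
```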


Filter for toxic comments using your Moderation Panel and decide on the moderation action to perform.

Frequently Asked Questions:

Does this auto-moderate comments on my site?
No. Toxicity is a tag/label that Disqus provides for publishers in the moderation panel. From their moderation panels, publishers can see which comments are labelled as toxic, and they can sort by “toxic” to view all toxic comments at once. Comments that are labelled as “toxic” are not moderated or pre-moderated. At this time, the toxicity label is simply another piece of information to help publishers moderate more effectively.

If a comment is not Toxic, can I remove the “Toxic” tag?
No. This ability isn’t currently available, but we are looking into ways to improve the filter in the future.

Will commenters see this?
No. Only site moderators can see this within the Moderation Panel.

Will you show a numeric score for toxic comments?
No. This only tags a comment with the label “Toxic” within your Moderation Panel.

What happens when a user edits their comments?
The filter re-checks the edited comment.

What languages does the filter support?
Currently, only English. We hope to expand support for non-English languages in the near future.
