Content Moderation at Mozilla

Last Updated February 22, 2024

Content rules

Mozilla offers several products that allow users to share and exchange content, including original text and images, third-party web pages or articles, and software applications. Content shared through those products, or anywhere in the Mozilla Community, must comply with the applicable content policies.

All products must comply with Mozilla’s Acceptable Use Policies. In addition, content shared on Mozilla.Social is subject to the Mozilla Social Content Policies, and content shared in the Mozilla Add-On Marketplace (AMO) is subject to the Firefox Add-on Policies. New and experimental products that host user-generated content may have their own content policies: when using one of these products, please check the product’s home page or review its Terms of Service to learn which content policies apply.

How we moderate

Mozilla seeks to create a vibrant online community that welcomes contributors and visitors from all communities, nationalities, and backgrounds. Our content moderation efforts reflect that goal.

Each applicable Mozilla product allows users to report content that is illegal or that violates the governing policies. A human reviews each content-related report that we receive. Content policy violations on AMO and Mozilla Social are assessed by a dedicated team of human moderators, while content policy and Acceptable Use Policy violations for other products are reviewed by members of the applicable product teams. In certain cases, reports of illegal content are reviewed by a member of our Legal team. Add-ons that are reported for technical or AMO Policy violations, as well as feedback that we receive about add-ons hosted on AMO, are assessed by a human on the AMO product team. This assessment generally involves reviewing the listing, as well as the add-on’s codebase and features.

Where we receive multiple reports about the same piece of content, we will conduct a single review and notify later reporters that the content has already been assessed, rather than reviewing and responding to each report separately.

Moderation decisions

When a moderator determines that content does not violate our policies, we will notify the reporter of that determination and will take no action against the content. In most cases, we will provide the reporter the opportunity to appeal.

When a moderator finds that one of our policies has been violated, they will take the action dictated by that policy, which may include:

  • adding a content warning,
  • removing the content, or
  • suspending the account.

We will notify both the reporter and the user that a violation was found, and we will provide the user the opportunity to appeal.

Where a violation is punishable by a warning or content removal, additional violations may lead to suspension. In such cases, we will also notify the user that their account is being suspended based on a history of multiple violations. We will provide opportunities to separately appeal both the latest moderation decision and the resulting suspension. In some products, these appeals can be combined; in others, they must be filed separately.

Appeals

We accept two kinds of moderation-related appeals:

  • from reporters whose reports are not actioned; and
  • from users whose content is removed or whose accounts are penalized for violating our content policies.

In both cases, we provide a form where the party can submit their appeal and explain why they believe the prior decision was incorrect and did not align with our policies or with applicable legal requirements. Appeals are handled in the same case management tool that we use in the first stage of our moderation process, which helps us receive and track appeals and review the policy, the initial report, and any other information that was used in making our moderation decision.

Upon receipt, appeals are routed to a dedicated queue. Depending on the type of violation involved, items in that queue will be reviewed by other members of the moderation team, the applicable product team, or our Mozilla-wide Trust & Safety or Legal Teams. Appellate reviewers will consider the applicable policy or policies and any relevant internal guidelines, as well as the information submitted by the appellant, to determine whether the original decision was appropriate and in line with Mozilla’s rules.

When a user prevails in their appeal, we will inform the user, and the prior action against their content or their account will be reversed. When a reporter prevails in their appeal, we will inform both the user and the reporter, and will take action against the content that violated our rules, according to the applicable policy.

If you disagree with the outcome of our internal appeals process and are based in the EU, you can bring your concerns to a certified out-of-court dispute settlement body.

Abusive reporters

Where a reporter repeatedly submits reports that have no basis, we will stop processing their reports. Those reports will be routed to a separate queue and will not be reviewed.