Transparency Report
Reporting Period: January 1, 2024 to June 30, 2024

Government Demands for User Data

In the Reporting Period, Mozilla received the following government demands for user data.

Legal Processes               Received    User Data Produced
Search Warrants               1           1
Subpoenas                     3           2
Court Orders                  0           0
Wiretap Orders                0           0
Pen Register Orders           0           0
Emergency Requests            0           0
National Security Requests    0-249       0-249

Government Demands for Content Removal

In the Reporting Period, Mozilla received the following government requests for content removal from our services.

Requesting Country    Requests Received    Content Removed
Russia                5                    0

Legal Removal Requests By Companies or Individuals

In the Reporting Period, we received 0 private legal removal requests invoking laws other than copyright or trademark (note that a single takedown notice can target more than one item). Where takedown requests are made pursuant to national laws, removal may be limited to the relevant jurisdiction.

Personal Data Requests

In the Reporting Period, we received 7,323 personal data requests.

Service     Received
Mozilla     6,440
Pocket      817
Fakespot    66

Targeted Advertising Disclosures

In the Reporting Period, and since the time we began targeted advertising, we have placed the following targeted advertisements.

Mozilla

EU Monthly Active Users

Service               Average MAU in the European Union
Addons.mozilla.org    2,755,114
MDN                   2,379,021

Policy Supplement

During the course of this reporting period, Mozilla continued its efforts to protect the security and privacy of our users through public policy.

In Europe, we published our policy recommendations for the new EU mandate, outlining our position on ensuring greater openness, privacy, fair competition, and meaningful choice online. We also contributed to the EU’s evaluation of its consumer protection rules, emphasizing that such rules should unequivocally incentivize the use and uptake of privacy-enhancing technologies to balance personalization with respect for privacy online. At the Computers, Privacy and Data Protection (CPDP) conference, we drew on the conference’s strong focus on privacy to organize a panel on accountability in AI, highlighting the importance of transparency and responsibility in AI development.

Building on our work in 2023, we continued to engage in dialogue with the European Commission on the eIDAS legislation published in the Official Journal of the European Union. We continued to monitor the implementation of Article 45 to ensure that encryption and web security are not weakened by restricting the independence of browser root store programs to apply strict cybersecurity requirements for website authentication.

In the United States, we engaged in discussions on comprehensive federal privacy proposals, including the American Privacy Rights Act (APRA). We testified before Congress on the need for a comprehensive federal privacy law to ensure the responsible development of AI. We also supported state privacy proposals modeled after the American Data Privacy and Protection Act (ADPPA) to ensure that individuals’ security and privacy on the Internet are treated as fundamental rather than optional. Additionally, we co-signed an amicus brief in Nevada v. Meta, defending end-to-end encryption and affirming our collective commitment to safeguarding encryption, digital privacy, and security as fundamental rights. As part of our efforts to protect net neutrality, we highlighted the privacy implications of weakening it in our January reply comments to the FCC.

Globally, through the Global Encryption Coalition Steering Committee (GEC SC), we continued to advocate collectively for the protection of encryption against regulatory efforts that aim to weaken or undermine it. As part of our collaborative efforts with the GEC SC, we contributed to the following activities:

  • We joined allies in signing a joint statement urging Australia to protect end-to-end encryption in the Online Safety Act review.
  • We signed onto a letter outlining the GEC’s concerns about the Belgian Presidency’s advocacy for the use of scanning technologies in encrypted messaging services to prevent CSAM without addressing the security and fundamental rights concerns raised by experts. This statement was a follow-up to earlier GEC SC comments on the Belgian Presidency’s compromise proposal on the EU CSAM regulation.
  • We supported a statement delivered as part of the open consultations on the zero draft of the Global Digital Compact, articulating expert insights on protecting and extending the use of strong encryption to protect human rights.
  • We joined the GEC SC in supporting the Chilean government’s inclusion of encryption as a right for every person in Chile through its Cybersecurity Law.
  • In the UK, the GEC SC recommended that Ofcom remove E2EE as a risk factor within risk assessments and that the Ofcom guidance be revised.
  • The GEC SC also outlined concerns about proposed revisions to the UK’s Investigatory Powers Act, where the Amendment Bill proposed changes that would interfere with providers’ ability to offer strong encryption in their products, harming the security of users not just in the UK but around the world.

The policy team also played a strong role in supporting product teams to ensure that privacy remains ingrained in our product offerings. We supported our product teams in testing Privacy-Preserving Attribution (PPA), which is designed to measure digital ad effectiveness without compromising user privacy. By using advanced cryptographic methods, PPA aggregates data to assess ad impact without tracking individual users, aligning with privacy regulations such as the GDPR.
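To illustrate the general idea of measuring aggregate ad impact without exposing individual users, the sketch below shows a toy additive secret-sharing scheme in Python. It is a conceptual illustration only, not the PPA protocol itself (which relies on a dedicated multi-party aggregation service); the function names, parameters, and two-aggregator setup here are hypothetical simplifications.

```python
# Illustrative sketch only: toy additive secret sharing for aggregate
# conversion counts. NOT Mozilla's actual PPA protocol; all names and
# parameters here are hypothetical.
import secrets

MODULUS = 2**61 - 1  # prime modulus for the additive shares


def split_report(value: int) -> tuple[int, int]:
    """Split one 0/1 conversion report into two random shares.
    Either share alone reveals nothing about the individual value."""
    share_a = secrets.randbelow(MODULUS)
    share_b = (value - share_a) % MODULUS
    return share_a, share_b


def aggregate(shares: list[int]) -> int:
    """Each aggregator sums only the shares it holds."""
    return sum(shares) % MODULUS


# Browsers report whether an ad impression led to a conversion (0 or 1).
individual_conversions = [1, 0, 0, 1, 1, 0, 1]

shares_a, shares_b = zip(*(split_report(v) for v in individual_conversions))

# Two non-colluding aggregators each see only random-looking shares...
total_a = aggregate(list(shares_a))
total_b = aggregate(list(shares_b))

# ...but combining the two partial sums yields the aggregate count.
total_conversions = (total_a + total_b) % MODULUS
assert total_conversions == sum(individual_conversions)
print(f"Aggregate conversions: {total_conversions} (individual reports never revealed)")
```

The design point this sketch tries to capture is that only the combined total is ever reconstructed, so advertisers can learn how many conversions an ad produced without any party observing which individual users converted.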

Voluntary Threat Indicators & Data Disclosures

  • 0 Cybersecurity Threat Indicator Disclosures
  • 0 Other Specific User Data Disclosures