Our Continued Investment in System Transparency

Announcements · June 29, 2020

Last week, we announced some important changes to prepare for the 2020 US elections and fight against racial injustice — a direct result of feedback from the civil rights community collected through our civil rights audit.

As we make these changes, today we are sharing updates and next steps on how we are continuing to make our systems more transparent.

Content Monetization, Brand Safety Tools and Practices

We plan an evaluation of our partner and content monetization policies and the brand safety controls we make available to advertisers. This audit, run by the Media Rating Council (MRC), will include (but not be limited to):

  • An evaluation of the development and enforcement of our Partner Monetization Policies
  • An evaluation of the development and enforcement of our Content Monetization Policies and how these policies enforce the 4A’s/GARM Brand Suitability Framework, and comply with MRC’s Standards for Brand Safety
  • An assessment of our ability to apply brand safety controls to ads shown within publisher content such as in-stream, Instant Articles or Audience Network
  • A determination of the accuracy of our available reporting in these areas

We will share an update on the scope and timing of this audit once finalized with the MRC.

Enforcement of our Community Standards

In May, we announced that our Community Standards Enforcement Report (CSER) is moving to a quarterly cadence. The next report will be released in August.

This report shows, in more detail than any other major internet service provides, how we are doing at removing content that violates our Community Standards. As part of our ongoing commitment to transparency, we intend to include the prevalence of hate speech in our CSER reports over the coming year, barring further complications from COVID-19.

As Mark said in February, we’re also looking at opening up our content moderation systems for external audit. We’re reaching out to key stakeholders spanning government regulators, civil society, and the advertising industry as we design this audit. This separate third-party audit will be completed by a reputable firm and will include the incidence of violating content. We are in the early days of this process and will share more details as we are able.

Collaborating With the Industry

Brand safety is a challenge that affects the entire advertiser community, which is why we collaborate with industry partners who are also working to make online platforms safer for businesses and people. Our work with partners includes:

  • Participating in the World Federation of Advertisers’ Global Alliance for Responsible Media (GARM) to align on brand safety standards and definitions, scale education, common tools and systems, and independent oversight for the industry.
  • Holding sessions with industry bodies to provide further insight into how our teams work to review content and enforce our Community Standards.
  • Earning certification from independent groups, like the Digital Trading Standards Group, which specifically examines our advertising processes against JICWEBS’ Good Practice Principles and is a requirement for achieving the Interactive Advertising Bureau's Gold Standard. We will continue to work with advertising industry bodies such as the Global Alliance for Responsible Media and the Media Rating Council to audit our brand safety tools and practices.

We recognize our responsibility to facilitate a safe environment for everyone using our platforms.

To do this, we have to give our partners transparency into how well we are keeping the platform safe by removing violating content. We have work underway to address the major concerns that have been raised, but acknowledge there is much more work to do.
