Updates to Our Continued Investment in System Transparency

Facebook Business

Last month we announced some important changes to prepare for the 2020 US elections — a direct result of feedback from the civil rights community collected through our civil rights audit.

As we make these changes, we are sharing updates and next steps on how we are continuing to make our systems more transparent.

Content Monetization, Brand Safety Tools and Practices

We plan to have our partner and content monetization policies, and the brand safety controls we make available to advertisers, evaluated by the Media Rating Council (MRC). This audit will include (but not be limited to):

  • An evaluation of the development and enforcement of our Partner Monetization Policies
  • An evaluation of the development and enforcement of our Content Monetization Policies and how these policies enforce the 4A’s/Global Alliance for Responsible Media (GARM) Brand Suitability Framework, and comply with MRC’s Standards for Brand Safety
  • An assessment of our ability to apply brand safety controls to ads shown within publisher content such as in-stream, Instant Articles or Audience Network
  • A determination of the accuracy of our available reporting in these areas

We plan to report progress on the overall process by mid-August, and will communicate a plan for further marketplace updates thereafter.

Enforcement of Our Community Standards

In May we announced that our Community Standards Enforcement Report (CSER) is moving to a quarterly cadence. The next report will be released in August.

This report shows how we are doing at removing content that violates our Community Standards, in more detail than any other major internet service provides. As part of our ongoing commitment to transparency, we expect to share prevalence data for hate speech in the November CSER, barring additional COVID-19 challenges.

As Mark said in February, we’re also looking at opening up our content moderation systems to external audit to validate the numbers we publish in the CSER. We’re reaching out to key stakeholders spanning government regulators, civil society, and the advertising industry as this audit is designed. This separate third-party audit will be completed by a reputable firm and will include the incidence of violating content. The request for proposal (RFP) will be issued in August 2020, and the audit is expected to be conducted in 2021.

Collaborating With the Industry

Brand safety is a challenge that affects the entire advertiser community, which is why we collaborate with industry partners who are also working to make online platforms safer for businesses and people.

“Hate speech has no place on our platform and while we have invested in technologies and teams over the years to combat hateful content, we clearly have more work to do,” says Carolyn Everson, VP Global Marketing Solutions, Facebook. “We are grateful for the candid feedback from our advertising partners and the Global Alliance for Responsible Media and are committed to taking swift action across four key focus areas identified by GARM.”

Our work with partners includes:

  • Participating in the World Federation of Advertisers’ Global Alliance for Responsible Media to align on brand safety standards and definitions; scale education, common tools, and systems; and establish independent oversight for the industry.
  • Holding sessions with industry bodies to provide further insight into how our teams work to review content and enforce our Community Standards.
  • Obtaining certification from independent groups, like the Digital Trading Standards Group, which specifically examines our advertising processes against JICWEBS’ Good Practice Principles and is a requirement for achieving the Interactive Advertising Bureau's Gold Standard. We will continue to work with advertising industry bodies such as the Global Alliance for Responsible Media and the Media Rating Council to audit our brand safety tools and practices.

We recognize our responsibility to facilitate a safe environment for everyone using our platforms.

To do this, we have to give our partners transparency into how well we are keeping the platform safe by removing violating content. We have work underway to address the major concerns expressed, but we acknowledge there is much more to do.


Announcements · July 17, 2020


