Posted by Cyrus Shepard

Note: This post was co-authored by Cyrus Shepard and Rida Abidi.

Everyone wants to win Google featured snippets. Right?

At least, it used to be that way. Winning the featured snippet typically meant extra traffic, in part because Google showed your URL twice: once in the featured snippet and again in regular search results. For publishers, this was known as "double-dipping."

All that changed in January when Google announced they would de-duplicate search results to show the featured snippet URL only once on the first page of results. No more double-dips.

Publishers worried because older studies suggested that winning featured snippets drove less actual traffic than the "natural" top-ranking result. With the new change, winning the featured snippet might now lead to less traffic, not more.

This led many SEOs to speculate: should you opt out of featured snippets altogether? Do featured snippets cause publishers to lose more traffic than they gain?

Here's how we found the answer.

The experiment

Working with the team at SearchPilot, we devised an A/B split test experiment to remove Moz Blog posts from Google featured snippets, and measure the impact on traffic.

We identified blog pages with winning featured snippets and applied Google's data-nosnippet attribute to the main content of each page.
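
To illustrate, here's a minimal sketch of what that markup might look like, assuming a simplified blog template (the structure and headline below are hypothetical, not Moz's actual HTML):

  <article>
    <h1>What Is a Featured Snippet?</h1>

    <!-- data-nosnippet asks Google not to use anything inside this element
         as a search result snippet, including featured snippets. The page
         itself stays indexed and can still rank normally. -->
    <div data-nosnippet>
      <p>The main post content that previously powered the featured snippet...</p>
    </div>
  </article>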

Our working hypothesis was that these pages would lose their featured snippets and return to the "regular" search results below. A majority of us also expected to see a negative impact on traffic, but we wanted to measure exactly how much, and to identify whether the featured snippets would return after we removed the attribute.

In this example, Moz lost the featured snippet almost immediately. The snippet was instead awarded to Content King and Moz returned to the top "natural" position.

Featured Snippets Experiment

Here is another example of what happened in search results. After launching the test, the featured snippet was awarded to Backlinko and we returned to the top of the natural results.

Featured Snippets Experiment Examples

One important thing to keep in mind is that, while these keywords triggered a featured snippet, pages can rank for hundreds or thousands of different keywords in different positions. So the impact of losing a single featured snippet can be somewhat softened when your URL ranks for many different keywords, some of which earn featured snippets and some of which don't.

The results

After adding the data-nosnippet attribute, our variant URLs quickly lost their featured snippets.

How did this impact traffic? Instead of gaining traffic by opting out of featured snippets, we found we actually lost a significant amount of traffic quite quickly.

Overall, we measured an estimated 12% drop in traffic for all affected pages after losing featured snippets (95% confidence level).

Featured Snippets Experiment Results

This chart represents the cumulative impact of the test on organic traffic. The central blue line is the best estimate of how the variant pages, with the change applied, performed compared to how we would have expected them to perform with no changes applied. The blue shaded region represents our 95% confidence interval: there is a 95% probability that the actual outcome falls somewhere in this region. If this region is wholly above or below the horizontal axis, the result is statistically significant.

What did we learn?

With the addition of the "data-nosnippet" attribute, the test had a significantly negative impact on organic traffic. In this experiment, owning the featured snippet (and, because of de-duplication, not also appearing in the regular top results) delivered more clicks to these pages than ranking in the regular top results without the featured snippet.

By adding the "data-nosnippet" attribute, we were not only able to stop Google from pulling content from that section of the HTML page to use as a snippet, but also able to confirm that we would still rank in the SERP, whether in position one or lower.
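
It's also worth noting that the attribute doesn't have to wrap an entire article: Google supports data-nosnippet on span, div, and section elements, so it can be scoped to just the passage that powers a snippet while the rest of the page stays eligible. Here's a rough sketch, again with hypothetical markup:

  <article>
    <p>Introductory copy that Google may still use as the text snippet
       under the regular blue-link result.</p>

    <!-- Only this section is withheld from snippets; indexing and
         rankings for the page are unaffected. -->
    <section data-nosnippet>
      <p>The concise answer that previously won the featured snippet.</p>
    </section>
  </article>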

As an additional tool, we also tracked keywords using STAT Search Analytics, which let us monitor ranking changes for pages that had featured snippets. We noticed that it took about seven days or more from the time of launching the test for Google to cache the changes we made and for the featured snippets to be overtaken by another ranking page, if another page was awarded a featured snippet spot at all. The turnaround was quicker after we ended the test, though: some of these featured snippets returned as soon as the next day.

However, one downside of running this test was that, although some pages were crawled and indexed with the most recent changes, their featured snippets did not return: in some cases the snippet has since been awarded to a competing page, and in others it simply never came back.

To summarize the significant findings of this test:

  1. Google's data-nosnippet attribute can effectively opt publishers out of featured snippets.
  2. In this test, we measured an estimated 12% drop in traffic for all affected pages after losing featured snippets.
  3. After ending the test, we failed to win back a portion of the featured snippets we previously ranked for.

For the vast majority of publishers, winning the featured snippet likely remains the smart strategy. There are undoubtedly exceptions, but as a general best practice, if a keyword triggers a featured snippet, it's typically in your best interest to rank for it.

What are your experiences with winning featured snippets? Let us know in the comments below.


Join Moz SEO Scientist, Dr. Pete Meyers, Wednesdays in April at 1:30 p.m. PT on Twitter and ask your most pressing questions about how to navigate SEO changes and challenges in a COVID-19 world. Tweet your questions all week long to @Moz using the hashtag #AskMoz. 


