Before you start: what do you know about SEO split-testing? If you’re unfamiliar with the principles of statistical SEO split-testing and how SplitSignal works, we suggest you start here or request a demo of SplitSignal.
First, we asked our Twitter followers to vote:
71% of our followers guessed it right! The result was positive.
Read the full case study to find out why.
The Case Study
SplitSignal partnered with a software supply chain application security provider that helps automate and secure the building and deployment of open-source software. SplitSignal set out to better understand how text banner ads may affect organic traffic to resource centre articles.
Much like fans of streaming platforms will often pay more to watch or listen without ads, users across the web suffer from ad fatigue. With the average person seeing as many as 10,000 ads per day, it’s reasonable to assume that people prefer their content ad-free, even if those ads are relevant to them. In this case, we predicted that removing the text ads on these pages would help improve traffic. If users prefer pages without ads, maybe Google does too!
The test for this experiment consisted of removing text banner ads from 27 URLs and comparing them against a control group of 36 URLs. For page and traffic recommendations for tests, visit here.
The cohort of pages had nearly 750K clicks, a large enough data set for us to feel comfortable with the results. By simply identifying the CSS selector where the text ads sat and setting the tool to “remove” that section, we were able to create a version of the pages without the text ad.
After the test had been live for 21 days, we saw a 2.2% increase in clicks to the test group pages. This amounted to an additional 1,399 clicks over the 21 days. Annualized, that would be as many as 24,315 clicks added to this site’s traffic if these text ads were removed. Looking at that traffic number, the marketing team could make an educated decision about whether the conversions they got from the ads or the conversions they’d get from 24K extra clicks would better serve their overall goal of platform usage.
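As a back-of-the-envelope check, the annualized figure can be reproduced by extrapolating the 21-day lift to a full year. This sketch uses the click counts from the case study; the variable names are our own, and the extrapolation assumes the lift holds steady year-round, which a 21-day test can’t guarantee.

```python
# Extra clicks observed in the test group over the 21-day test window
# (figures from the case study above).
extra_clicks = 1_399
test_days = 21

# Simple linear extrapolation to a full year.
daily_lift = extra_clicks / test_days
annualized = int(daily_lift * 365)

print(annualized)  # 24,315 extra clicks per year
```

A linear extrapolation like this is only a rough planning number; seasonality and content decay would change the real figure.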
While it seems logical that removing ads would make a difference, the ads were not overwhelming to users and pointed back into the brand’s own website, not to external sites through a display network like GDN. And although Google has made statements about distracting ads and mentions them in its Search Quality Rater Guidelines, we do not think this site was in poor standing relative to those statements.
Instead, there are other factors that may have led to the growth here. There could also be a false positive, given that the lift was such a small percentage. Rankings for key pages could have increased while others slightly decreased. And even though the content this site produces is fairly evergreen, there is a chance that something in the world of Python happened during the course of the test that directed more traffic to the test group.
Test Launched August 26th
Why This Test Was a Winner
Our data points to this test being a winner, but not to the degree that we’re 100% certain. While it is a fair assumption that users prefer pages without ads, in some cases ads can lend themselves to searcher task completion, and in the case of this site, we do think that clicking the text ads may be a logical next step for some users.
Since we do not feel that this site was egregious in its use of ads, the impact here could be any number of things, including happenstance or better “stickiness” of the pages. While a relatively controversial concept in SEO, Google may look at something like user engagement signals to determine how well a page is satisfying a user’s query. If for some reason the ads caused shorter sessions, or somehow otherwise signalled to Google that users did not get what they wanted, there’s a chance that Google ranked those pages lower than it would have without the ads. We feel that the data here is not conclusive enough to say that this is the case with any degree of certainty, so more testing is needed to better understand the impact.
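To illustrate why a lift of around 2% may not be conclusive on its own, here is a generic two-proportion z-test on click-through rates. This is not SplitSignal’s statistical model, and the impression counts below are entirely made up for illustration; only the shape of the argument (small relative lift, limited sample, wide uncertainty) carries over.

```python
import math

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Two-sided two-proportion z-test using the normal approximation."""
    p_a = clicks_a / n_a
    p_b = clicks_b / n_b
    # Pooled proportion under the null hypothesis (no difference).
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: a ~2.2% relative lift in CTR
# (3,250 vs 3,180 clicks on 100,000 impressions each).
z, p = two_proportion_z(3_250, 100_000, 3_180, 100_000)
print(round(z, 2), round(p, 3))  # p is well above 0.05: not significant
```

With these hypothetical sample sizes the lift is indistinguishable from noise, which is why running the test longer, or on more pages, is the right next step before acting on the result.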