A/B tests are a vital part of WordStream’s marketing efforts. Seriously, we split test everything: banner ads, registration forms, landing pages – you name it. My colleagues keep telling me that buying two different burritos for lunch then tweeting about which one I like best doesn’t count as “real” A/B testing, but I beg to differ.
One of the tools we use for A/B tests is Optimizely, a comprehensive solution that allows us to conduct tests quickly and easily. It’s awesome when we find tools that help us understand our audience more effectively, but it’s even more awesome when the people behind those tools recognize us for how we use their software. That’s precisely what happened recently when Optimizely told us that WordStream is a finalist in the inaugural Opties 2014 awards! The Opties honors Optimizely customers “who create delightful online experiences for their own customers and empower their organizations to turn data into action,” and we’re both thrilled and honored to be recognized.
WordStream is one of three finalists in the “Testing Hero of the Year” category. Although we’re confident that the quality and rigor of our A/B tests will help us emerge triumphant, we’ll have to wait a little longer for the official results – the winners are due to be announced at OptiCon on April 17 in San Francisco. While we wait with bated breath for the big reveal, let’s take a look at how WordStream uses Optimizely and A/B tests to target our audience more effectively.
Landing Page A/B Tests
We used Optimizely in a series of A/B tests we conducted last year during our Grade and Get Paid Marketing Makeover competition. We wanted to learn which landing page was more effective in terms of driving conversions.
- In the control version of the landing page (A), we included a button that, when clicked, brought visitors to a separate page where they could enter their email address for a chance to win.
- In the variation (B), we switched out the button with an actual submission box. We also removed the “Enter Here” navigational element from the header to reduce the number of entry placements in one view.
- By making the change from the control to the variation, we saw a 128% increase in contest entry submissions and a 58% increase in the number of users who completed all three stages of the process.
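For readers curious about the arithmetic behind figures like that 128%, relative lift is just the variation’s improvement expressed as a percentage of the control’s baseline. A minimal sketch, using hypothetical visitor and entry counts (not our actual contest data – Optimizely reports these numbers for you):

```python
def percent_lift(control_rate, variation_rate):
    """Relative improvement of the variation over the control, as a percentage."""
    return (variation_rate - control_rate) / control_rate * 100

# Hypothetical numbers, for illustration only
control = 50 / 1000      # 5% of visitors entered via the control page (A)
variation = 114 / 1000   # 11.4% entered via the variation (B)
print(f"{percent_lift(control, variation):.0f}% lift")  # → 128% lift
```

The same formula applies whether you’re measuring contest entries, clicks, or completed sign-ups – only the numerator and denominator change.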
Banner A/B Tests
Like many sites, we use a variety of banners to highlight our various offers, products and services. We were keen to see which of our banners had the highest click-through rates (CTRs), but we ultimately wanted to determine whether contextual relevance – placing relevant banners in blog posts focusing on a similar topic – had a noticeable impact on CTR.
In keeping with A/B testing best practices, we began our A/B tests with a specific hypothesis in mind, namely that contextual relevance would have an effect on the CTR of our banners.
- In our first round of A/B tests, we set out to see which designs overall had the highest CTRs.
- After the preliminary round of tests, we split-tested the winning ads against each other to create a shortlist of the most effective banners.
- We then experimented with the placement of the banners to see whether contextual relevancy improved the CTR of the strongest banners.
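Each round above comes down to comparing two banners’ CTRs and asking whether the gap is real or just noise – a judgment Optimizely makes for you, but the underlying statistics look roughly like a two-proportion z-test. A minimal sketch, with hypothetical impression and click counts:

```python
import math

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Z-score for the difference between two banners' click-through rates."""
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    # Pool the two samples to estimate the standard error under the null
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    return (p_b - p_a) / se

def p_value(z):
    """Two-tailed p-value from a z-score, via the error function."""
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Hypothetical counts for two banner designs, for illustration only
z = two_proportion_z(clicks_a=40, views_a=5000, clicks_b=65, views_b=5000)
print(f"z = {z:.2f}, p = {p_value(z):.3f}")
```

With these made-up numbers the p-value lands below the conventional 0.05 threshold, so you’d call banner B the winner; with smaller samples the same CTR gap might not be significant – which is why we let each round run rather than calling winners early.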
These A/B tests demonstrated that our original hypothesis was incorrect.
- We initially thought that contextual relevancy would play a role in the banners’ CTR, but the opposite was the case.
- Ultimately, it didn’t matter where we placed the banners – the designs themselves had the greatest impact.
Interestingly, we discovered that in this series of A/B tests, users responded negatively to banners featuring images of people – in every single instance, banners featuring people performed poorly compared to the graphical banners.
Form A/B Tests
As a vital lead generation tool, forms are one of the most frequently split-tested elements on a site. Ask for too much information and you risk losing your visitor; ask for too little and it’s hard to qualify your prospects as viable leads. With this in mind, our Fearless A/B Testing Masterminds™ set out to see which of the following two forms was the most effective.
- We designed two versions of a thank-you page to be presented after a webinar to help boost the number of users completing a PPC audit using the AdWords Performance Grader Plus.
- We found that using a form embedded directly into the webinar thank-you page (A) was much more effective than a button labeled “Get Your Free Report” that redirected visitors back to the AdWords Performance Grader Plus homepage.
- Greater visual separation also led to increased conversions: after implementing color and headline changes in the variation, we saw more users complete stage one of the Grader.
- In addition, including an image to visualize the grading process and presenting the Grader as a “bonus” offer helped increase the number of full Grader completions.
What Do A/B Tests Have to Do with PPC?
The examples above deal primarily with visual elements, but A/B tests are invaluable to advertisers hoping to maximize their return on investment from search engine marketing. This can (and does) apply to everything from keyword research to ad copy.
In a recent eBook, “How to Increase ROI from SEM with A/B Testing,” WordStream’s founder and CTO Larry Kim spoke with Optimizely about how testing elements of PPC, such as bid amount, long-tail keywords and negative keywords, can significantly improve the performance of an AdWords account. Larry urges advertisers to test their ad copy ruthlessly, noting that the top 1% of AdWords accounts focus on high-commercial-intent keywords. To see what else Larry suggests, get your copy of the eBook here.
Although we won’t know the winners of the Opties 2014 for a while yet, we’d like to wish the other finalists the very best of luck. In the meantime, why not tell us how you use A/B tests in the comments?