A/B tests are a vital part of WordStream’s marketing efforts. Seriously, we split test everything: banner ads, registration forms, landing pages – you name it. My colleagues keep telling me that buying two different burritos for lunch then tweeting about which one I like best doesn’t count as “real” A/B testing, but I beg to differ.
One of the tools we use for A/B tests is Optimizely, a comprehensive solution that allows us to conduct tests quickly and easily. It’s awesome when we find tools that help us understand our audience more effectively, but it’s even more awesome when the people behind those tools recognize us for how we use their software. That’s precisely what happened recently when Optimizely told us that WordStream is a finalist in the inaugural Opties 2014 awards! The Opties honors Optimizely customers “who create delightful online experiences for their own customers and empower their organizations to turn data into action,” and we’re both thrilled and honored to be recognized.
WordStream is one of three finalists in the “Testing Hero of the Year” category. Although we’re confident that the quality and rigor of our A/B tests will help us emerge triumphant, we’ll have to wait a little longer for the official results – the winners are due to be announced at OptiCon on April 17 in San Francisco. While we wait with bated breath for the big reveal, let’s take a look at how WordStream uses Optimizely and A/B tests to target our audience more effectively.
We used Optimizely in a series of A/B tests we conducted last year during our Grade and Get Paid Marketing Makeover competition. We wanted to learn which landing page drove more conversions.
Like many sites, we use a variety of banners to highlight our offers, products, and services. However, although we were keen to see which of our banners had the highest click-through rates, what we ultimately wanted to determine was whether contextual relevance – placing a banner in blog posts on a closely related topic – had a noticeable impact on CTR.
In keeping with A/B testing best practices, we began with a specific hypothesis in mind: that contextual relevance would affect the CTR of our banners.
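Optimizely handles the statistics behind a test like this, but it’s worth seeing what the evaluation looks like under the hood. The sketch below runs a standard two-proportion z-test on two banners’ click-through rates; all of the click and impression counts are hypothetical, purely for illustration:

```python
import math

def two_proportion_z_test(clicks_a, views_a, clicks_b, views_b):
    """Compare two click-through rates with a two-proportion z-test."""
    ctr_a = clicks_a / views_a
    ctr_b = clicks_b / views_b
    # Pooled CTR under the null hypothesis that both banners perform the same
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (ctr_a - ctr_b) / std_err
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return ctr_a, ctr_b, z, p_value

# Hypothetical counts -- not WordStream's actual data
ctr_a, ctr_b, z, p = two_proportion_z_test(clicks_a=150, views_a=10000,
                                           clicks_b=100, views_b=10000)
print(f"Banner A CTR: {ctr_a:.2%}  Banner B CTR: {ctr_b:.2%}")
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p-value suggests the gap isn't just noise
```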
Despite our expectations, these A/B tests proved our original hypothesis wrong – contextual relevance had no noticeable effect on our banners’ CTR.
Interestingly, this series of A/B tests revealed that users responded negatively to banners featuring images of people – in every single instance, those banners underperformed the purely graphical ones.
As a vital lead generation tool, forms are one of the most frequently split-tested elements on a site. Ask for too much information and you risk losing your visitor; ask for too little and it’s hard to qualify your prospects as viable leads. With this in mind, our Fearless A/B Testing Masterminds™ set out to see which of two competing forms was more effective.
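To see why that tradeoff is worth testing rather than guessing, here’s a back-of-the-envelope sketch. Every number in it is hypothetical – the point is only that a shorter form can win on raw submissions while a longer form wins on qualified leads:

```python
def expected_qualified_leads(visitors, conversion_rate, qualification_rate):
    """Qualified leads = visitors who submit the form * share that qualify."""
    return visitors * conversion_rate * qualification_rate

visitors = 10000  # hypothetical traffic to the page

# Short form: asks for less, so more people submit it, but each submission
# carries less information and fewer prospects can be qualified
short_form = expected_qualified_leads(visitors, conversion_rate=0.08,
                                      qualification_rate=0.30)

# Long form: asks for more, so fewer people submit it, but the extra fields
# make each submission easier to qualify as a viable lead
long_form = expected_qualified_leads(visitors, conversion_rate=0.05,
                                     qualification_rate=0.55)

print(f"Short form: {short_form:.0f} expected qualified leads")  # 240
print(f"Long form:  {long_form:.0f} expected qualified leads")   # 275
```

With these made-up rates the longer form comes out ahead, but nudge the numbers slightly and the short form wins – which is exactly why we run the test instead of guessing.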
The examples above deal primarily with visual elements, but A/B tests are invaluable to advertisers hoping to maximize their return on investment from search engine marketing. This can (and does) apply to everything from keyword research to ad copy.
In a recent eBook, “How to Increase ROI from SEM with A/B Testing,” WordStream’s founder and CTO Larry Kim spoke with Optimizely about how testing elements of PPC, such as bid amounts, long-tail keywords, and negative keywords, can significantly improve the performance of an AdWords account. Larry urges advertisers to test their ad copy ruthlessly, and notes that the top 1% of all AdWords accounts focus on high commercial intent keywords. To see what else Larry suggests, get your copy of the eBook here.
Although we won’t know the winners of the Opties 2014 for a while yet, we’d like to wish the other finalists the very best of luck. In the meantime, why not tell us how you use A/B tests in the comments?