The 6 A/B Testing Mistakes That Stifle Conversions
Everyone wants to talk about A/B testing. It’s been around for years, but the average business is just now stumbling on tools and technologies that make it easy to execute. That being said, there’s a fair amount of misinformation circulating, and it’s leading to poor results. Avoid these six mistakes when you are A/B testing across your digital marketing programs.
1. Testing Without Purpose
The biggest mistake marketers make is running an A/B test without a purpose. Just as a good experiment needs a hypothesis, an A/B test needs a clearly defined goal in order to produce actionable results.
An A/B test can be very basic. You may be trying to figure out whether one color scheme is better or if a certain headline produces more conversions. But if there’s no objective, you’re simply wasting your time.
2. Failing to Make Big Changes
Another common mistake is not getting creative enough with A/B tests. Marketers will simply change a word here or a color there, which doesn’t always do the trick. Sometimes you need to think outside of the box and make a big change.
For example, did you know that studies have shown landing pages with people on them perform better than those without? That’s because people connect with people – not products, graphs, charts, and symbols. In one A/B test for Highrise, the landing page version with a person on it increased conversions by 102.5 percent.
If you’re consistently getting poor results, consider the fact that you may need to radically switch things up, not make minor tweaks.
Remember, the purpose of A/B testing is to find what works. If you’re too conservative, you won’t get the results you want.
3. Not Differentiating Between New and Repeat Visitors
There’s a huge difference between new and repeat visitors. People who have already visited your site in the past–and especially those who are frequent visitors–are used to your flaws. They’ve become conditioned to little nuances and aren’t good indicators of what is working. New visitors, on the other hand, will respond pretty dramatically to even the slightest variance from what they want or expect, especially if they’re coming to your website from social.
When A/B testing a given webpage, you need to make sure you’re tracking and differentiating between the results you get from new and repeat visitors. This will lead to data that’s much more relevant to your objectives.
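Segmenting results this way is straightforward once each visit is tagged with a visitor type. Here is a minimal sketch in Python; the field names (`variant`, `visitor_type`, `converted`) are illustrative assumptions, not the schema of any specific analytics tool.

```python
from collections import defaultdict

def conversion_by_segment(events):
    """Return the conversion rate per (variant, visitor_type) pair."""
    totals = defaultdict(lambda: [0, 0])  # segment -> [conversions, visits]
    for e in events:
        key = (e["variant"], e["visitor_type"])
        totals[key][0] += e["converted"]
        totals[key][1] += 1
    return {k: conv / visits for k, (conv, visits) in totals.items()}

events = [
    {"variant": "B", "visitor_type": "new",    "converted": 1},
    {"variant": "B", "visitor_type": "new",    "converted": 0},
    {"variant": "B", "visitor_type": "repeat", "converted": 0},
    {"variant": "B", "visitor_type": "repeat", "converted": 0},
]
rates = conversion_by_segment(events)
# New visitors convert at 50% in this toy data, repeat visitors at 0%.
# A blended 25% rate would hide exactly the difference you care about.
```

The point is simply that one blended conversion rate can mask opposite reactions from the two audiences.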
4. Using the Wrong Tool
The rise in popularity of A/B testing has fostered a subsequent increase in the number of testing tools available online. Some are paid tools and others are free, but they all claim to offer the same benefits and results. The problem is they aren’t created equal.
Using the wrong A/B testing tool is a major mistake that could cost you time, money, and, in extreme cases, the integrity of your brand. Some tools slow down your website (which has a negative impact on your SEO and conversion rates), while others provide faulty data that negatively impacts decision making.
In order to avoid issues and make sure you’re using the right tools, start by running an A/A test. This tests two identical versions of the same page against each other to make sure you’re getting statistically accurate results. If you are, then you can rest easy knowing your tool works. If you aren’t, then you should probably find a new tool.
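The statistical check behind an A/A test can be sketched with a standard two-proportion z-test. This is a simplified illustration with made-up numbers, not the internals of any particular testing tool: if two identical pages show a "significant" difference, something is wrong with the tool or the traffic split.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-statistic for comparing conversion counts."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (conv_a / n_a - conv_b / n_b) / se

# A/A test: two identical pages, so we EXPECT no significant difference.
# 250/5000 (5.0%) vs 255/5000 (5.1%) conversions -- hypothetical counts.
z = two_proportion_z(250, 5000, 255, 5000)
significant = abs(z) > 1.96  # 95% confidence threshold
```

Here `significant` comes out `False`, which is what a healthy tool should report; if an A/A test repeatedly flags a winner at 95% confidence, the tool’s data collection or randomization is suspect.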
5. Not Testing for a Long Enough Period of Time
Another common mistake is ending a test before enough time has passed. Impatient marketers will try to run a test for 24 or 48 hours and then call it quits.
The problem is that any number of things – the day of the week, a holiday, a one-off traffic spike – can temporarily skew your data. Experts suggest running tests for at least seven days. The larger the sample size, the more reliable your conclusions will be. Not sure whether you have enough data to reach a statistically significant conclusion? A sample-size calculator will help.
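You can also estimate the required sample size yourself. The sketch below uses the common normal-approximation formula with the usual defaults (95% confidence, 80% power); the baseline rate and lift are hypothetical inputs.

```python
import math

def sample_size_per_variant(baseline, relative_lift):
    """Visitors needed per variant to detect a relative lift.

    Normal-approximation formula with z-values hardcoded for the
    common case: two-sided alpha = 0.05, power = 0.80.
    """
    z_alpha = 1.96   # two-sided, alpha = 0.05
    z_beta = 0.8416  # power = 0.80
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2)

# Example: 5% baseline conversion rate, hoping to detect a 20% relative
# lift (5% -> 6%). Needs roughly 8,000+ visitors PER VARIANT.
n = sample_size_per_variant(0.05, 0.20)
```

Numbers like this make it obvious why a 24-hour test on modest traffic can’t reach a trustworthy conclusion.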
6. Assuming You Know What People Want
The danger of being involved in marketing for a long period of time is that you begin to assume you know what customers want. You take certain things for granted, and this causes problems when you approach tasks like A/B testing. One of the more common mistakes people make is failing to test certain elements because they assume they already know the truth. This is a recipe for disaster.
For an example of why this is so dangerous, consider an A/B test EA ran a number of years ago when releasing SimCity 5, one of the company’s highest-grossing games. During the pre-order phase, EA launched a splash page that displayed a big banner across the top and an option to purchase the physical or digital copy below. The banner read, “Pre-Order and Get $20 Off Your Next Purchase.*”
When EA discovered that the page wasn’t producing nearly the number of sales it had hoped for, it decided to test an unlikely element: the banner. With the banner removed – and no offer messaging displayed at all – conversions increased by 43.4 percent. It turned out that customers just wanted to buy the game and found the extra offer obtrusive.
The point of this case study is that it’s a mistake to assume you know what your customers want. Test everything; you’ll probably be surprised by the results.
Put A/B Testing to Work for You
With the right A/B testing strategy, you can take your marketing efforts from underperforming to highly effective in a matter of days. These diagnostic techniques are incredibly powerful and insightful, but you have to understand how they work.
Avoid the aforementioned mistakes; you’ll like the results you see.