
A/B Testing: The Staggering Success of Presidential Optimization and How to Do It Yourself

In the early morning hours of September 24th, 2015, Kyle Rush sent a simple tweet.


Odds are you haven’t heard of Kyle Rush. He’s not exactly a public figure. What’s more, his tweet wasn’t greeted by much fanfare.

So, why does an obscure tweet about a single “a/b test” matter?

Because the truth is that Kyle and conversion rate optimization (CRO) specialists like him are at the forefront of an industry that’s not only transforming business, but also politics.

As the former Head of Optimization at Optimizely, which recently raised $58 million in its latest round of funding, Kyle was the man behind over 500 A/B and multivariate tests at Obama for America. Now he’s the Director of Front-end Engineering and Optimization at Hillary for America.

Simply put, the science and staggering success of CRO may well determine the winner of 2016’s presidential race.

That’s a bold claim, but the proof is in the numbers.

A Brief History of Presidential A/B Testing

According to Marketizator, A/B testing is “implemented through the creation of a variation page (page B) whose results are compared with the ones of the original version (page A). The one that performs better in terms of conversions wins the challenge.” It’s a data-driven way to improve a host of online actions like click-through rates, email signups, social sharing, and (of course) purchases.
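Under the hood, the split itself is simple. Here’s a minimal sketch of how a tool might assign each visitor to variant A or B — deterministic hashing is one common approach, though this is an illustration, not how any particular platform (or campaign) actually implements it:

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "cta-button") -> str:
    """Deterministically bucket a visitor into A or B (50/50 split).

    Hashing the visitor ID together with the experiment name means
    the same visitor always sees the same variant, with no
    server-side state to store.
    """
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

print(assign_variant("visitor-42"))  # same visitor, same variant, every visit
```

The key property is consistency: a visitor who refreshes the page shouldn’t flip between variants, or your conversion data gets polluted.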


In the wake of Barack Obama’s historic 2008 presidential campaign, Optimizely pulled back the curtain on what was, at that time, a budding industry. Dan Siroker, Obama’s Director of Analytics, put it like this:

My job was to use data to help the campaign make better decisions.

We started with just one simple experiment back in December of 2007. This experiment taught us that every visitor to our website was an opportunity and that taking advantage of that opportunity through website optimization and A/B testing could help us raise tens of millions of dollars.

The results of Siroker’s work were staggering. By testing insanely simple online elements — like various two-to-three-word combinations on the site’s call-to-action buttons — the Obama campaign improved their email sign-up rate by 40.6%, which brought in an additional 2,880,000 email subscribers, 288,000 volunteers, and roughly $60 million in donations.

Obama’s 2012 campaign tells a similar story. Recounting the over 500 A/B tests his team ran over the race’s twenty-month period, Kyle explained:

We optimized just about everything from web pages to emails…which increased donation conversions by 49% and sign-up conversions by 161%.

Perhaps the team’s greatest contribution was the development of Quick Donate, which allowed users to “donate with a single click through email or on the web, and even through SMS.” In the end, 1.5 million people contributed $115 million using Quick Donate. Even more impressive, Quick Donate users donated “four times as often and gave three times as much money” as their traditional counterparts.

Again, those are impressive numbers.

But don’t be misled. At its core, the science of A/B testing is surprisingly simple. To illustrate its simplicity — and how you can apply its success to your own business — iSpionage helped me create a behind-the-scenes look at some of the tests 2016’s candidates were running over the past few months.

The Current State of Presidential A/B Testing

Pop-Up Background Color

Presidential Comparison
Hillary’s popup background color test. (Image credit: iSpionage)

This test comes from the Clinton campaign, and it’s a two-for-one.

First, it illustrates just how incredibly simple your tests can be, e.g., background color. Second, it illustrates one of the most overlooked but effective online tactics: popups.

As bad a reputation as popups might have, they work. For example, GetResponse — which specializes in A/B testing emails — reported how a family-owned lingerie business implemented a “20% off” exit popup and “acquired 3,017 new email subscribers—more than tripling the company’s total number of subscribers.”

For Clinton, email subscribers are so vital that this popup appeared immediately when new visitors entered the site.

CTA Button Color

Ben Carson Button
Carson’s CTA button color test. (Image credit: iSpionage)

Here’s another shockingly simple test.

Button color seems like one of those on-page elements that couldn’t possibly affect performance, right? Wrong. In fact, Unbounce chronicled a test — just like Carson’s — that pitted (A) a green button with white text versus (B) a yellow button with black text.

The winner: “Version B’s yellow and black button may be ugly (and I mean ugly), but it is clear and led to a 14.5% increase in conversions.”


Headline Copy

Ben Carson Form
Carson’s headline test. (Image credit: iSpionage)

Naturally, the headline is the most prominent and most-read element of any page.

This means that testing simple variations on your headline can lead to big results. For example, Visual Website Optimizer reported an equally easy test that compared (A) Natural Joint Relief to (B) Natural Joint Relief Supplement. Adding one extra word produced an 89.97% increase in sales.

In Carson’s case, the test is between impersonal language (“Receive a Free Bumper Sticker”) and personal language (“I’ll Send You a Bumper Sticker”).

Image vs. Video

Rubio Video Test
Rubio’s image-versus-video test. (Image credit: iSpionage)

One of the tried — but not always true — rules of online optimization is that video beats images.

However, while video has a proven track record of outperforming its static counterpart, the bedrock principle of A/B testing is to assume nothing. That’s why Rubio’s test is key to developing a culture of testing in your organization.

How to Run A/B Tests for Yourself

After considering both the simplicity and staggering success of presidential A/B testing, the next question is, “How can I run A/B tests for myself?”

Smashing Magazine’s A Roadmap To Becoming An A/B Testing Expert (2014) and The Ultimate Guide To A/B Testing (2010) are phenomenal resources. But for our purposes, here are six principles to get you started:

1. Size Matters

Before jumping into your first test, be sure you have a large enough data pool to generate verifiable and statistically significant results. Naturally, this will be determined in part by your site’s average traffic volume and current conversion rates.

However, beware of basing your “winners” on sample sizes that are too small. A good rule of thumb is to only declare a clear winner after each variant has 1,000 visitors and at least 150 conversions.

Also, beware of focusing too narrowly on the test alone. As my friend Johnathan Dane, Founder of KlientBoost, explains, “[While] many A/B tests show you new conversion winners, they can also come at a cost, which could equal lower revenue or profits for you.”
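The “1,000 visitors” rule of thumb above can be sanity-checked with a standard power calculation. This sketch (Python stdlib only) estimates the visitors needed per variant; it assumes a two-sided test at 95% confidence and 80% power, which are the conventional defaults, not figures from any particular tool:

```python
from statistics import NormalDist

def sample_size_per_variant(baseline: float, lift: float,
                            alpha: float = 0.05, power: float = 0.80) -> int:
    """Rough visitors needed per variant to detect a relative lift.

    baseline: current conversion rate (e.g. 0.05 for 5%)
    lift: smallest relative improvement you care about (e.g. 0.20 for +20%)
    """
    p1 = baseline
    p2 = baseline * (1 + lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for 95% confidence
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    p_bar = (p1 + p2) / 2
    n = ((z_alpha + z_beta) ** 2 * 2 * p_bar * (1 - p_bar)) / (p2 - p1) ** 2
    return int(n) + 1

# Detecting a 20% relative lift on a 5% baseline conversion rate:
print(sample_size_per_variant(0.05, 0.20))
```

Notice how sensitive the number is: the smaller the lift you want to detect (or the lower your baseline conversion rate), the more traffic you need — which is why low-traffic sites should test big, bold changes first.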

2. Hypothesize First

Another common mistake newcomers to A/B testing make is failing to hypothesize first. This means clearly formulating a theory behind your tests before just throwing them out there.

Your hypothesis formula doesn’t need to be complicated. In fact, you can follow this simple pattern: “I think that changing [Element A] to [Element B] will produce [Qualitative Result] and therefore [Quantitative Result].”

For example, if we take Ben Carson’s headline test, a solid hypothesis would be: “I think that changing the headline from (A) impersonal language to (B) personal language will produce a far more direct message (qualitative), and therefore increase form completion (quantitative).”

Having a clear hypothesis before testing will enable you to expose and scrutinize your assumptions from the jump, as well as create objective criteria for judging the results.

3. Start Small

All the tests listed above are incredibly small, and for good reason. The worst thing you can do – especially when you’re starting out – is test multiple elements at once.

Pick one portion of your page – e.g., the headline, background color, button color, button copy, or image-versus-video – and stick with that test (that test only) until you have enough data to make a statistically significant decision.

If you aren’t sure where to start, Sujan Patel provides 50 A/B tests to help optimize your site to perfection. The simplest jumping-off points include testing:

  • The wording on your call-to-action buttons.
  • The color of your call-to-action buttons.
  • The number of your call-to-action buttons.
  • The copy of your page’s headline.
  • The “hero image” on your landing page or sales page.

Never underestimate the power of small changes. Remember that simply changing (A) Natural Joint Relief to (B) Natural Joint Relief Supplement produced an astounding 89.97% increase in sales.

4. Separate Mobile from Desktop

Always test mobile and desktop traffic separately to reduce the risk of sample pollution. It might seem easier to test all of your traffic at once, but they’re two different beasts. Shanelle Mullin from ConversionXL explained to me:

Mobile and desktop traffic volume is typically very different (probably not an even 50/50 split), there are segments within “mobile” (Android, iPhone, etc.) that will go unrecognized, and what works on desktop might not work on mobile (and vice-versa).

You’re going to end up with a lot of ‘on average’ data if you test them together, which is often useless to optimizers. By separating them, you can also create more tests faster and optimize for different outcomes (e.g. transactions on desktop, copy engagement on mobile).

Thankfully, all the tools listed below under A/B Testing Websites allow you to easily separate mobile from desktop traffic, both in the tests themselves and in the results.

5. Measure and Wait

Measuring “statistical significance” is hard, heady stuff.

Thankfully, Jen Havice’s article I Spent All Summer Running A/B Tests, and What I Learned Made Me Question the Whole Idea contains a wealth of wisdom. For instance, consider this dense nugget: “Remember that 80% statistical significance is no better than 50% in terms of calling a test definitively. If it’s not greater than 95% with a barely-there margin of error, the test isn’t done yet.”

If that’s Greek to you, the good news is each of the tools listed below will guide you through the set-up process, math included. The golden rule is simple: never act until a test is complete. In other words, wait.
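If you’re curious what the tools are doing behind the scenes, a common approach to that 95% threshold is the two-proportion z-test. This is an illustrative sketch, not the internals of any specific platform:

```python
from math import sqrt
from statistics import NormalDist

def significance(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for a two-proportion z-test.

    Returns the probability of seeing a conversion gap this large
    by pure chance. Act only when it drops below 0.05 — that's the
    95% confidence bar from the quote above.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # blended rate under "no difference"
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# 150 vs. 190 conversions out of 1,000 visitors per variant:
p = significance(150, 1000, 190, 1000)
print(f"p-value: {p:.4f}")  # below 0.05, so B is a legitimate winner here
```

A gap that looks dramatic on day one (say, 15 vs. 19 conversions out of 100) yields a p-value well above 0.05 — which is exactly why the golden rule is to wait.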

A/B test size calculator. (Image credit: Copyhackers)

6. Transfer Wins

Once you’ve determined a clear winner (e.g., personal language in the headline increases form completions by X%), apply (or transfer) that win into other areas of your site.

If you know that certain colors, specific calls-to-action, or particular tones of voice work in one area, chances are they’ll also work across the board.

Transferring wins is the best way to apply what you’ve learned and make the most of your tests. However, always track your changes and the results they produce.

A/B Testing Tools

The last thing to cover, especially when it comes to testing for yourself, is the tools. Here’s a quick list, broken up into two parts, to help guide you. (Note: The tools are presented in no particular order.)

A/B Testing Websites

Marketizator A/B testing for websites. (Image credit: Marketizator)
  1. Marketizator

Marketizator’s visual editor lets you easily connect any existing webpage and edit it to create your own A/B tests. This includes copy, graphics, buttons, and navigation.

Marketizator also allows you to schedule your tests to include or exclude a specific day along with a statistical relevance calculator to help with the behind-the-scenes math. Their free program gives you access to up to 10,000 tested views as well as more advanced personalization and segmentation features.

  2. Unbounce

At its core, Unbounce is a landing page optimizer. This means Unbounce specializes in helping you get the most out of the pages that truly matter. If you’re in the process of developing a landing page – or if you don’t already have one you want to test – Unbounce includes a wide variety of proven templates, as well as a deep repository of articles and recommended partners to guide you along the way.

  3. Optimizely

I’ve already mentioned some of the amazing names behind Optimizely. Similar to Marketizator, Optimizely lets you connect your already-existing pages to perform both A/B tests, as well as multivariate tests. If you’re just getting your feet wet and want to start out small, the “Starter” offering will give you free and instant access to their basic features. Advanced options like high-volume tests and “multi-page testing” (i.e., funnels) will require you to enroll in the full “Pay as you go” program.

A/B Testing Emails

Email A/B
GetResponse A/B testing for email. (Image credit: GetResponse)
  1. GetResponse

To start testing your emails, GetResponse offers an incredibly easy-to-implement interface. Simply create your A email (either from scratch or using an existing template) and then begin testing B versions of anything from your subject line, body copy, and images, to even personalized elements (like names, entire content blocks, and calls-to-action). Their dropdown report (shown above) is by far the easiest results page to interpret and take action on. GetResponse also includes a landing page generator — much like Unbounce — as well as responsive email designs for mobile readers.

  2. MailChimp

MailChimp offers a suite of testing tools similar to that of GetResponse: subject line, body copy, images, and calls-to-action. MailChimp also lets you “pre-test” your tests by sending out A/B variants to small segments of your email list. This added step is helpful in essentially determining a “winner” before you communicate en masse. Of course, the statistical significance of that initial winner will be determined by the overall size of your list itself.

  3. Marketo

While both GetResponse and MailChimp are scalable, Marketo specializes in enterprise-level B2B testing. This means their “Standard” package starts at $1,795 per month. Why? Because Marketo is a full-scale marketing automation platform. So, if you’re looking for big-picture, one-on-one guidance (and can afford it), Marketo excels.

Simple…But Wildly Successful

While we may have to wait another year to find out what Kyle Rush’s “biggest A/B test winner” was for the Hillary campaign, one thing is certain: the science behind presidential testing is insanely simple, but its success is staggering.

Don’t be intimidated by terms like optimization. Just be sure to keep these six guiding principles in mind:

  1. Size Matters
  2. Hypothesize First
  3. Start Small
  4. Separate Mobile from Desktop
  5. Measure and Wait
  6. Transfer Wins

The tools listed above make it easy to get started. And who knows? You might just discover your own massive winner. If you do, or even if you already have, be sure to share it in the comments.

Aaron Orendorff

Aaron Orendorff is a regular contributor at Entrepreneur, Fast Company, Business Insider, Content Marketing Institute, Unbounce, Copyblogger, and more. Grab his Ultimate Content Creation Checklist or follow him on Twitter.
