By now you probably know that I’m a fan of A/B testing (also called split testing).
It is a “scientific” way of taking the guesswork out of your online marketing efforts.
But what would happen if I told you that A/B testing can actually lead to a dramatically lower conversion rate, even when the numbers are showing you that it’s a raging success?
Yeah, it would frighten me too!
In this article I’m going to share a few things that happened to me this week to make me re-think how I go about some of my split tests.
*Dramatic, ominous music*
What is A/B testing again?
In case you’re not sure, A/B testing is where you run an experiment on your sales page, website or blog whereby you change one element and see if it outperforms the original.
For example, you might have an individual page on your blog (like I do) where you ask people to subscribe to your mailing list. On that page you might run a test to see whether a red submit button outperforms a green one. Or you might change the input text (that’s the text on the button) from “Submit” to “Get started!” and see which works best.
The amazing thing about A/B testing is that you often get results that really surprise you. I remember hearing once about some Internet Marketers who removed their social proof from their sidebar and saw conversions skyrocket (if anyone remembers that article please let me know). That is a really surprising result because marketers are always telling us that social proof increases conversions.
If you’d like a really cool introduction to split testing, WIRED has an amazing article about how Barack Obama used it with great success in his first election campaign.
Two A/B tests that I ran last week that went badly wrong
Okay so you’re probably pretty keen to find out about my horror split testing story. Well, it’s not all that bad but it did kind of make me re-think the way I go about my testing a little bit.
1. Split testing my pop-up opt-in form
The first A/B test that I started last week was in conjunction with my post about AWeber. In the post I mentioned that I was going to start a split test on my pop-up forms to show everyone how much valuable information can be gathered (if you’d like to learn how to do that here’s a video).
Well, that was where the problem started. After a day or so I checked the split test and saw this:
Quick explanation if this is new to you.
As you can see I’m running two different pop ups. My usual one is the bottom one and the new version I’m testing has a square photo and is much narrower.
Probability indicates the percentage split between these two versions. So half of people will see one version, the other half see the other.
Displays shows how many people have actually seen it. (Problem)
Subscribers shows how many people have signed up with that form. (Hmmm)
S/D is subscribers divided by displays. This is your conversion rate.
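The maths behind those columns is simple enough to sketch. Here’s a minimal example (not AWeber’s actual calculation — the display and subscriber numbers are invented to roughly mirror what I was seeing) that computes S/D for two variants and runs a standard two-proportion z-test to check whether the gap between them is likely to be real or just noise:

```python
from math import sqrt

def conversion_rate(subscribers, displays):
    """S/D: subscribers divided by displays."""
    return subscribers / displays

def z_score(subs_a, disp_a, subs_b, disp_b):
    """Two-proportion z-test comparing the conversion
    rates of variant A and variant B."""
    p_a = subs_a / disp_a
    p_b = subs_b / disp_b
    p_pool = (subs_a + subs_b) / (disp_a + disp_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / disp_a + 1 / disp_b))
    return (p_a - p_b) / se

# Invented numbers: original at 3.1%, new variant at 1.8%
rate_original = conversion_rate(31, 1000)
rate_new = conversion_rate(18, 1000)
z = z_score(31, 1000, 18, 1000)
# z comes out around 1.88, just short of the usual 1.96 cutoff
# for 95% confidence -- in other words, not enough displays yet
# to crown a winner, which is exactly the problem above.
```

The point of the z-test here is that a higher S/D on its own proves nothing until the display counts are both trustworthy and large enough.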
What does this mean?
As you can tell, the displays here are way off, which means that the rest of the numbers are kind of useless. The conversion rate of my original pop-up at 3.1% is really quite good, but it’s seeing far too few displays to choose it as a winner over the new one.
More to the point, I can’t tell if the displays are incorrect or just the numbers that I’m seeing. So there really is no way to tell what is what.
We’ll come back to more conclusions once the next part is discussed.
NOTE: I contacted AWeber about this and they said that there is a temporary problem with the split testing measurements which the techs are trying to resolve quickly.
2. Split testing my HelloBar up the top
The little bar that you see at the top of my blog is called a HelloBar. It’s a rather expensive ($30 a month for my plan) but freaking awesome little thing developed by Neil Patel that lets you do a lot of cool things like measure the click through rate, run an A/B test between two bars to see which works better, and so on.
I’m currently doing a little split test between two bars, one blue and one orange to see which gets the better click through rate to my post about starting a WordPress blog.
I always run these HelloBars the same way; two variations of the same message, see which gets the most clicks, choose it as a winner.
But when I noticed that the latest successful ad coincided with a drop in real conversions I started re-assessing the way I did things. And that is the crux of this article.
How A/B testing decreased my conversions
Now, any seasoned internet marketer will laugh at the obviousness of this. But it’s something I still wanted to share because, even though I’ve been testing for a while, I still made the mistake.
The thing about A/B testing is that it is all pointless unless you are tracking conversions.
Click through rate? Useless.
Traffic numbers? Useless.
Think about it for a second.
I could do a split test between two pop up opt-in forms. One of them could be my regular “Join my mailing list” thing and the other could say “Enter your email and get $500”.
Obviously, the $500 version is going to get more subscribers.
But it will also have a massive rate of attrition. People will grab their money (assuming I actually paid it out) and then unsubscribe. They will be of absolutely no value to me whatsoever.
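To put rough numbers on that (all invented for illustration, not from my real stats), compare the value each variant actually leaves on your list once attrition and the cost of the bribe are counted:

```python
def net_list_value(signups, retention_rate, value_per_subscriber,
                   cost_per_signup=0.0):
    """Value of a campaign after attrition and incentive costs:
    retained subscribers times their value, minus what each
    sign-up cost you to acquire."""
    retained = signups * retention_rate
    return retained * value_per_subscriber - signups * cost_per_signup

# Plain "Join my mailing list" form: fewer sign-ups, most stay.
plain = net_list_value(signups=100, retention_rate=0.9,
                       value_per_subscriber=5)

# "$500 bribe" form: five times the sign-ups, almost nobody
# stays, and every sign-up costs you the $500.
bribe = net_list_value(signups=500, retention_rate=0.05,
                       value_per_subscriber=5, cost_per_signup=500)
# plain is worth $450; bribe is deeply negative
```

The “winning” variant in the split test report is the one that loses money, which is the whole trap.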
And so it was with my various split tests around my blog. I was getting so focussed on seeing bigger numbers and more clicks that I forgot to measure conversions. And that is a really big problem because it wastes time and money and can cause you to lose a lot of earning opportunities.
Goals and authentic tracking are key
What this really comes back to is setting clear goals/outcomes for your split testing and ensuring that you have authentic tracking in place for those goals.
This is very difficult when the product that you’re selling is not your own. You need to be able to do things like add a tracking pixel to the sales page to see which traffic is converting into customers – something you often can’t do with affiliate products.
So what’s the answer?
Actually, I think the starting point is to begin thinking about what you really want from the various activities that you do on your blog.
For example, if you are trying to get more email subscribers then you want to think about what you really want them for. Say you have 100,000 people on your list, but:
- Do you have good open rates?
How many of those 100,000 subscribers are opening your emails?
- Are they engaging?
Of the people who do open those emails, how many people are engaging with your content or whatever it is that you are putting out there?
- Are they unsubscribing or flagging you?
Even worse than not opening, are they just unsubscribing from your list or flagging you as spam?
- Are they buying?
Lastly, how do those email subscribers go when it comes to purchasing your products or whatever it is that you are promoting?
Once you have figured out what you really want to achieve (goals) you need to figure out a real way to track that stuff.
In your email marketing software you might do it by segmenting your lists based on which sign up form they used. Of course, that is also problematic because other factors like your mail out, time of day, offer, etc. will also affect the results.
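That segmenting idea is easy to prototype yourself if your email software lets you export subscriber data. Here’s a toy sketch (the field names are made up, not any particular provider’s export format) that groups subscribers by the sign-up form they used and reports open and unsubscribe rates per form — judging each form by subscriber quality rather than raw sign-up count:

```python
from collections import defaultdict

# Toy subscriber records -- field names are invented.
subscribers = [
    {"form": "popup_a", "opened": True,  "unsubscribed": False},
    {"form": "popup_a", "opened": False, "unsubscribed": True},
    {"form": "popup_b", "opened": True,  "unsubscribed": False},
    {"form": "popup_b", "opened": True,  "unsubscribed": False},
]

def segment_report(subs):
    """Per sign-up form: open rate and unsubscribe rate."""
    segments = defaultdict(
        lambda: {"total": 0, "opened": 0, "unsubscribed": 0})
    for s in subs:
        seg = segments[s["form"]]
        seg["total"] += 1
        seg["opened"] += s["opened"]
        seg["unsubscribed"] += s["unsubscribed"]
    return {
        form: {
            "open_rate": seg["opened"] / seg["total"],
            "unsub_rate": seg["unsubscribed"] / seg["total"],
        }
        for form, seg in segments.items()
    }

report = segment_report(subscribers)
```

Even a crude report like this would have flagged my problem: a form can win on sign-ups while its segment quietly under-performs on every metric that matters.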
If it is a landing page then you can use Visual Website Optimizer to create two versions of the same page and see which one performs better. Again, this is only really useful if you can track people to the final sign up page.
Do you think about numbers or conversions?
I would be really interested to know how many Tyrant Troops think only about the raw numbers they are getting, versus how many conversions those numbers are leading to. How many of you are tracking and measuring conversions on your blog? If you have any insights or tools please leave me a comment.