Personalization Glossary

A/A Testing

When it comes to A/B tests, there are three common testing setups you'll frequently hear growth marketers refer to: the A/B test, the A/A test, and the A/B/N test. Technically, all three are considered A/B tests, but there are some key differences that are important for you to understand.

What is A/A testing?

At its simplest level, an A/A test is a type of A/B test in which you test a webpage or experience against an identical webpage or experience.

Rather than split testing version A of a page (your control) against a distinctly different version B (your variation), you send all of your traffic to two identical versions of the same page (more simply, two page As).
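
To make the mechanics concrete, here's a minimal Python sketch of how a testing tool might split traffic in an A/A test. The hashing scheme and names are illustrative assumptions, not how any particular tool works under the hood:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str = "aa-test") -> str:
    """Deterministically bucket a visitor into one of two identical pages.

    Testing tools typically use a stable hash so that a returning
    visitor always sees the same version; this is a simplified sketch.
    """
    digest = hashlib.md5(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 2
    return "A (original)" if bucket == 0 else "A (identical copy)"

# The same visitor always lands in the same bucket across visits
print(assign_variant("visitor-123"))
print(assign_variant("visitor-123"))  # same result as above
```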

Why is A/A testing important?

Now, you’re probably scratching your head. Why the heck would you test two versions of the same page against each other? Wouldn’t that give you useless results?

You’re completely right! Running an A/A test should give you no winner. In short, there should be no statistically significant winner.

And that’s what A/A testing is all about! Rather than assuming your testing tool (such as Proof Experiences, Optimizely, or Google Optimize) is working perfectly, you can use A/A testing to gauge the accuracy of your testing engine. Many marketers take this first step before launching any test with the mentality: “if I can’t trust my testing environment, can I trust my results?”
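
One way to see why this sanity check works: if you simulate many A/A tests in which both pages share exactly the same true conversion rate, a healthy analysis should declare a "winner" only about 5% of the time at a 95% confidence threshold. Here's a rough Python simulation of that idea; the conversion rate and sample sizes are made up for illustration:

```python
import random
from statistics import NormalDist

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test p-value for a difference in conversion rates."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    if se == 0:
        return 1.0
    z = (conv_a / n_a - conv_b / n_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

random.seed(42)
trials, visitors_per_page, true_rate = 1_000, 2_000, 0.10
false_positives = 0
for _ in range(trials):
    # Both "pages" are identical: the same true 10% conversion rate
    conv_a = sum(random.random() < true_rate for _ in range(visitors_per_page))
    conv_b = sum(random.random() < true_rate for _ in range(visitors_per_page))
    if two_proportion_p_value(conv_a, visitors_per_page,
                              conv_b, visitors_per_page) < 0.05:
        false_positives += 1

# A well-behaved setup should land near the expected 5% false positive rate
print(f"Significant 'winners' found: {false_positives / trials:.1%}")
```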

The A/A test is also an important method for establishing a baseline conversion rate and for determining the minimum sample size required for future tests.
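
For the sample size piece, the baseline conversion rate you observe during an A/A test can be plugged into a standard power calculation. Here's a sketch using the common two-proportion z-test approximation; the input numbers are hypothetical:

```python
from statistics import NormalDist

def min_sample_size(baseline_rate, min_detectable_lift, alpha=0.05, power=0.8):
    """Approximate visitors needed per variant for a future A/B test.

    baseline_rate: conversion rate observed in your A/A test (e.g. 0.10)
    min_detectable_lift: relative lift you want to detect (e.g. 0.10 = +10%)
    Uses the standard two-proportion z-test approximation.
    """
    p1 = baseline_rate
    p2 = baseline_rate * (1 + min_detectable_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)           # desired statistical power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return int((z_alpha + z_beta) ** 2 * variance / (p1 - p2) ** 2) + 1

# e.g. a 10% baseline from your A/A test, detecting a 10% relative lift:
print(min_sample_size(0.10, 0.10))  # about 14,700 visitors per variant
```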

How do I interpret A/A testing?

Normally, when running a traditional A/B test, you’re looking for one variation to outperform the other with statistical significance (typically a 95% or greater confidence level, which corresponds to a p-value of 0.05 or less). With an A/A test, you don’t want to see statistical significance, as that would indicate something is off with your testing platform. An identical page shouldn’t perform any differently than its twin.

In a properly executed A/A test, your results should be consistent. Version A and Version B should have a similar conversion rate, and your reported confidence should stay below 95%. If you do reach statistical significance, something may be wrong with your setup. Keep in mind, though, that even a perfectly healthy platform will produce a false positive in roughly 5% of A/A tests at a 95% threshold, so a single significant result is a prompt to investigate rather than proof of a broken tool.
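
As a concrete (made-up) example, here's how you might check an A/A result yourself with a two-proportion z-test:

```python
from statistics import NormalDist

# Hypothetical results from an A/A test: identical pages, similar outcomes
visitors_a, conversions_a = 10_000, 1_020
visitors_b, conversions_b = 10_000, 985

p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)
se = (p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b)) ** 0.5
z = (conversions_a / visitors_a - conversions_b / visitors_b) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))

# In a healthy A/A test, the p-value should usually stay above 0.05
# (i.e. the reported confidence stays below 95%).
print(f"p-value: {p_value:.2f}")  # about 0.41 here: no significant winner
```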

Should I A/A test my site?

There are a lot of opinions on the subject of A/A testing, and for good reason! Proponents of the method cite its ability to help marketers identify potentially egregious errors on their site.

Other marketers don’t buy it. They claim that since A/A testing is a form of A/B testing, it is subject to the same randomness and can give inconclusive results. Plus, they argue the method is time-intensive, slows down a rapid testing cycle, and can call past results into question.

So who’s right? The jury’s still out on this one...

But if you’re using Proof Experiences for website personalization and you’re interested in setting up an A/A test, here’s how you can do it.


Other personalization resources tailored just for you

Tyson Quick on moving Instapage upmarket & growing enterprise revenue to 20% ARR (watch the episode ➜)
How to combine A/B testing and personalization to get the best results (read the article ➜)