A/B Testing

This is page three of a handbook on Startup Growth. Begin here.

What's the story behind this guide?

My goal with Julian.com is to write handbooks that are more exhaustive and insightful than anything else online. That's bold and slightly obnoxious, right? 

Yup, but that's what excites me 😂 I don't get out of bed to write derivative content.

So it made sense that I finally got around to writing about growth. Because it's how I've been earning a living for the past three years.

I started writing this handbook while I was on a startup consulting spree. I had a blast bouncing between startups so I could peek inside their operations, show them what to do, then move on to the next roller coaster.

But when I decided to formalize my consulting into a proper agency, I put this handbook on hold for a couple months. Here's why: Now my goal was to sign clients whose businesses would help me learn what I still hadn't mastered.

So the remainder of this handbook began filling up with my calculated, on-the-job experiences.

I wish I could write more handbooks with this method. Unifying my work (growth) with my passion (handbook writing) made this particularly enjoyable. Unfortunately, this probably won't happen again, as future handbook topics include playing piano and speaking Chinese 😂

By the way, as I finished writing this, it struck me that this doubles as a reference to train my agency hires at Bell Curve. With it as ammunition, I was confident in hiring the best people for the job — regardless of their resumés. 

The handbook could teach them everything.

As my team continues to ask me questions, I incorporate my answers into this handbook. (So subscribe below for continued updates on my agency learnings.)

I really hope all this material helps you just as much too. Kick ass with it.

And if you've read this far, come say hello on Twitter!

Interested in building muscle? Check out my already-released Muscle Guide.

Landing page A/B testing

This page teaches you to rigorously improve the conversion rate of all your pages. 

A/B's are page experiments that test the change in conversion rate (e.g. signups, purchases) between page variations. For example, you could try changing the benefits you pitch. Or you could replace all your images. Or you could cut your page in half. 

A/B testing is critical

I don’t care how much work you've put into your page, I guarantee its first iteration is nowhere near as performant as one that's undergone multiple well-designed A/B tests.

A/B's aren't a nicety; they’re the only way to methodically improve a page over time. 

Frankly, they're magical. If you have a proper A/B testing regimen in place, it works partially on autopilot to significantly improve your conversion while you sleep. This is the lowest-friction, lowest-cost way to increase your bottom line.

This page shows you how to design, assess, and iterate on A/B tests.

How A/B testing works

Here's the A/B testing cycle: source test ideas, prioritize them, set up and run an experiment, assess the results, then feed what you've learned into the next round of tests.

There are three popular A/B testing tools: Optimizely, VWO, and Google Optimize. The last is free, full-featured, and integrated into Google Analytics. I recommend it.

Sourcing A/B ideas

You source A/B testing ideas from several places:

If you're running surveys to better understand users before running A/B tests, survey them at the point they're most invested in your company. This is when they are most likely to respond.

For example, after they've purchased from you, present three concise questions on the post-checkout page that can be quickly answered via dropdown menus.

If you don't yet have this data to pull from, start by asking: "What do I think our ideal customers would most want to see on our page?" Then test every major variation.

A/B testing and the growth funnel

Before I get into what to test, we have to first understand what we're testing for.

Consider this: If you discover an A/B variation motivates people to click a button 10x more, but this behavior doesn’t lead to greater signups or more of any other meaningful conversion event, then your variation isn’t actually better than the original.

All it's done is distract users.

So remember to consider the totality of the growth funnel when assessing the results of an A/B test. The more an A/B variation affects Revenue or Referrals versus, say, Engagement, the better it ultimately is. 

That said, while the goal of A/B testing is to increase end-of-funnel conversion, what you’re most often testing will actually be early steps in the funnel.

There are two reasons for this: your earliest funnel steps (e.g. your landing pages) receive the most traffic, so tests on them conclude the fastest, and a conversion improvement at the top of the funnel cascades down to every step beneath it.

That's why this page focuses on A/B testing landing pages. Plus, A/B testing your product would entail a deep discussion on product development, UI, and UX. That's outside the scope of this handbook.

What to A/B test on your landing page

For any page, you’re testing what I call either a micro or a macro variation.

Micro variations are adjustments to copy, creative, and page structure. Macro variations are significant restructurings or rewrites of your page. 

Micro variations

Here are micro variation ideas to get you started: change which benefits you pitch, reword your call-to-action, replace your images, resize or recolor your buttons, and reorder your page's elements.

Micro variations sometimes significantly affect conversion. But typically they don’t. 

Changing a button’s color, or making it twice as big, usually only gets you so far. 

However, there are two notable exceptions — when micros can have a big impact:

Macro variations

Macro variations, meanwhile, more significantly affect conversion. However, they require considerable thought and effort: You’re forcing yourself to return to the drawing board to create an all-new page — new design, new value props, new copy.

It’s hard to summon the focus and team collaboration needed to do this in earnest. 

Which is why they're rarely done.

But macro variations are a necessity. You must see the forest for the trees.

Since the biggest obstacle to designing macro changes is simply committing to them, I implore you to create an A/B testing schedule and rigorously adhere to it: Create a recurring calendar event for — at most — every 3 months. On that day, spend a couple hours brainstorming a macro for a pivotal page or product flow. 

Here are the two most significant sources of macro ideas:

How many A/B tests should you run?

Each A/B test, or A/B experiment, has a primary objective: increasing landing page to signup page views, increasing signup form completion, etc.

To avoid confounding test results, I recommend running one experiment at a time. 

However, within that experiment you can have several variations. Each variation receives an equal share of traffic and tests a different approach to achieving the experiment's objective.

For example, one variation may test switching the order of a page's elements. Another may test making the page half as long.
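
Your testing tool handles this traffic split for you, but here's a minimal sketch of the underlying mechanic, assuming deterministic hash-based bucketing: every variation gets an equal share of traffic, and a returning visitor always sees the same variation. (The function and variation names below are hypothetical.)

```python
import hashlib

def assign_variation(user_id: str, experiment: str, variations: list[str]) -> str:
    """Deterministically bucket a visitor into one variation.

    Hashing the experiment name together with the user ID spreads
    traffic evenly across variations, and the same visitor lands in
    the same bucket on every repeat visit.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variations[int(digest, 16) % len(variations)]

# One experiment: the original page plus two variations.
variations = ["original", "reordered_elements", "half_length"]
print(assign_variation("visitor-42", "homepage-signup-test", variations))
```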

How to prioritize A/B tests

Each A/B test has opportunity cost; you only have so many visitors you can test against in a given time period. So prioritize tests sensibly — don't run-and-gun them.

To methodically prioritize tests, consider five factors:

Setting up A/B tests

When creating A/B tests in your tool of choice (again, I recommend Google Optimize), you want to keep the following two implementation details in mind.

Setup: Parallel vs. sequential

Always run A/B tests in parallel — meaning, your original page and its variant are running at the same time. (A/B tools will randomly assign visitors to one or the other.)

If you run variants sequentially, visitors' traffic sources and time or day of the week won’t be controlled for. This renders your results meaningless. 

Consider how traffic sources vary wildly in the quality of visitors they send. And how people sign up for B2B services in lower volumes on weekends.

Setup: Referrer restrictions

If a visitor began reading your blog before visiting your homepage then signing up, they may know more about your market or product than someone who came to the homepage straight from Google.

As a result, they may respond very differently to the copy on your homepage.

Therefore, if you have common navigational paths on your site that you suspect significantly influence conversion, set up A/B's that only run if the user either came from a specific referrer (e.g. a product page) or from no referrer at all (i.e. came directly to your site).

When you know where they're coming from, you can tailor your copy.

Similarly, if you have recurring third-party traffic sources, such as an industry blog that frequently covers you, consider setting up A/B tests that trigger exclusively for these sources, provided you know how their visitors differ from your typical ones.

Your A/B testing tool will allow you to easily set up these targeting restrictions.
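
You'd configure these restrictions in your tool's targeting UI rather than write them yourself, but the underlying logic amounts to something like this sketch (the domain and the rule are hypothetical):

```python
from urllib.parse import urlparse

# Hypothetical rule: only enroll visitors who arrived directly
# (no referrer) or from one of our own product pages.
ALLOWED_REFERRER_HOSTS = {"", "www.example.com"}

def should_enroll(referrer: str) -> bool:
    """Return True if this visitor qualifies for the experiment."""
    return urlparse(referrer).netloc.lower() in ALLOWED_REFERRER_HOSTS

print(should_enroll(""))                                   # direct visit: True
print(should_enroll("https://www.example.com/product"))    # internal: True
print(should_enroll("https://blog.industry-news.com/us"))  # third party: False
```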

Assessing A/B test results

You now know what and how to test, but how do you assess your test results?

You need to be on the lookout for three things: sample size, end-of-funnel performance, and the reasons behind success or failure.

Sample size

The principles of statistics dictate that we need a sufficiently large sample to confidently measure a boost in conversion.

Therefore, if you don’t have a lot of traffic, you can only afford to run macro variations — because they have the potential to make 10-20%+ improvements. 

Otherwise, you’ll be waiting forever for micro tests to complete! 

Conversely, if you have a ton of traffic, congrats, you marketing wizard. You can afford to run a bunch of copy and creative micro-optimizations to fully optimize your pages.
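
To get a feel for the numbers, here's a rough sample size sketch using the standard two-proportion normal approximation. The 3% baseline conversion rate is a hypothetical; your own rates will change the totals, but not the moral: small lifts demand enormous samples.

```python
from scipy.stats import norm

def visitors_per_variation(base_rate: float, relative_lift: float,
                           alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate visitors needed per variation to detect a given
    relative lift in conversion rate (two-sided z-test)."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return round(variance * z**2 / (p2 - p1) ** 2)

print(visitors_per_variation(0.03, 0.20))  # 20% lift: roughly 14,000 visitors
print(visitors_per_variation(0.03, 0.02))  # 2% lift: over a million visitors
```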

In the example below, we ran an experiment for a client (using Google Optimize):

As you can see, our page had 1,724 views throughout the testing period. There was a roughly 30% improvement in our test variation over our baseline (29 conversions versus 22).

Had the experiment revealed merely a 2% increase in conversions, we would have concluded that the sample size was too small to consider it viable, and that it'd take far too long to reach the 10,000 visitors needed to justify such a small increase. We would have pulled the plug.

Fortunately, that wasn’t the case with this experiment. Looking at the chart above, we only needed 100 visitors to validate a 20% increase in conversions.

A sign of success

One more note about the above Google Optimize screenshot: When assessing your experiment, pay attention to the column labeled Probability to be Best.

If your variant's probability exceeds 70% and it has accrued a sufficient number of sessions (as outlined above), consider implementing the variant on your site.
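
Google Optimize arrives at this number via Bayesian statistics. As a rough sketch of the idea, you can approximate "probability to be best" by sampling each variation's conversion rate from a Beta posterior and counting how often each one wins. The counts below are hypothetical, splitting the example's 1,724 views evenly between baseline and variant:

```python
import numpy as np

def probability_to_be_best(conversions, visitors, draws=100_000, seed=0):
    """Estimate each variation's chance of having the highest true
    conversion rate via Beta(1 + hits, 1 + misses) posterior samples
    (i.e. a uniform prior on each rate)."""
    rng = np.random.default_rng(seed)
    samples = np.column_stack([
        rng.beta(1 + c, 1 + v - c, size=draws)
        for c, v in zip(conversions, visitors)
    ])
    wins = np.bincount(samples.argmax(axis=1), minlength=len(conversions))
    return wins / draws

# Baseline: 22 conversions; variant: 29 conversions; 862 visitors each.
print(probability_to_be_best([22, 29], [862, 862]))  # roughly [0.16, 0.84]
```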

End-of-funnel performance

Increasing signup conversion is nice, but increasing the total number of paying users is what matters most.

In other words, your landing page may be really good at enticing people to sign up — but bad at enticing actual customers to sign up.

Each of your test variations can incept a different set of expectations into the user that affects their behavior later in the funnel. 

So assess which landing page variation actually resulted in end-of-funnel conversion. Your analytics tool, such as Google Analytics, automatically tracks users' initial landing pages. (Seamlessly now that Google Optimize is embedded into it.) So it's easy to associate full-funnel conversion events with their respective A/B tests. 
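
Conceptually, that association is just a join between each user's assigned landing page variation and their later conversion events. A toy sketch with made-up data:

```python
from collections import Counter

# Hypothetical event log: (user_id, landing_variation, became_paying_customer)
events = [
    ("u1", "original",    False),
    ("u2", "half_length", True),
    ("u3", "half_length", False),
    ("u4", "original",    True),
    ("u5", "half_length", True),
]

visitors = Counter(variation for _, variation, _ in events)
payers = Counter(variation for _, variation, paid in events if paid)

for variation, total in visitors.items():
    print(f"{variation}: {payers[variation] / total:.0%} became paying customers")
```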

Reasons for success or failure

Once you find your revenue-boosting page variation, try to identify why it worked. "Try" is the key word: There's no foolproof rubric to understanding human psychology.

Consider how increasing revenue is actually just one benefit of a successful A/B test. The other is increasing your efficiency at coming up with better future tests. 

This is critical because there are only so many tests you can run in a given period. You're at the mercy of how much traffic you have.

So, at the end of every A/B test, brainstorm with your teammates to make a best guess as to what went right and wrong. 

Then consider running future tests specifically designed to verify these guesstimates.

How to share results with your team

I like to use a task management tool, like Trello, to keep track of the A/B tests I'm running and considering running in the future.

Each time I run a test, I make note of the following on Trello:

Once the test has finished running, I additionally make note of:

Next page: User onboarding

The next page explains how to onboard users so they become addicted to your app.

Next →

Updates are coming to this handbook

So far, I've spent 400 hours writing this. As my agency learns more from running growth experiments for our clients, I update this guide with the results.

To read drafts of these sections before they're published, subscribe below.

You'll also get my upcoming guides on how to play piano, write fiction, and speak Chinese a couple months before they appear on my site 👊
