Usability Testing vs. A/B Testing

There are many testing methodologies for answering business questions, but not all of them are equal. Some teams prefer one method over another and neglect to use multiple methods. This video compares and contrasts usability testing and A/B testing.

Having worked extensively with an organization that was deeply entrenched in usability testing, I have found there are good and bad things about each approach. There is also an optimal way to do both to derive the most value from any of your research or optimization efforts.

In this video, you will learn several things to improve your testing efforts including:

  • The pros and cons of each method
  • The difference in how each method determines success
  • How sample size impacts each method
  • Segmentation considerations with each method
  • How a natural vs. unnatural environment influences visitors
  • How each method has a different view of risk

Testing Theory is where professional testers turn to do better A/B testing and get more conversions.

Rapid Iteration The Right Way Using A/B Testing

Rapid iteration sounds great in theory, but done wrong it can be devastating for an organization and its customers. Rapidly iterating without the causal data you get from A/B testing can also be detrimental.

Iterating within a strong optimization program is the surest way to iterate rapidly in the right direction.

In this video, you will learn several things to improve your testing efforts including:

  • The limitations of rapid design iteration
  • How sample size, moderator bias, and going too quickly can be detrimental
  • How A/B testing solves for the limitations of small group design iteration sessions
  • Four things to do that will increase your ability to iterate rapidly in your optimization program

Quantify the Qualitative Data and Use the Right Quantitative Data

Too often organizations make decisions based on just a few pieces of qualitative feedback or a limited sampling of the quantitative data available. In this video, you will learn how to quantify qualitative feedback so that it can be measured and understood for what it truly is. You will also learn about using the full set of quantitative data available to you.

By quantifying the qualitative and using more of the quantitative data with your results, you will be able to home in on what matters to your visitors and your business without being swayed by a few strong opinions. You will also be able to answer the internal naysayers.
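
As a rough sketch of what quantifying qualitative feedback can look like in practice (the feedback snippets and theme names below are hypothetical, not taken from the video), you can tag each comment with the themes it mentions and then count how often each theme appears:

    from collections import Counter

    # Hypothetical feedback comments, each tagged with the themes it touches.
    # In practice the tagging comes from a reviewer or a simple keyword pass.
    tagged_feedback = [
        ("Shipping cost surprised me at checkout", ["pricing", "checkout"]),
        ("I couldn't find the size chart", ["navigation"]),
        ("Checkout felt slow on my phone", ["performance", "checkout"]),
        ("Loved the product photos", ["content"]),
        ("Why do I have to create an account?", ["checkout"]),
    ]

    theme_counts = Counter(theme for _, themes in tagged_feedback for theme in themes)
    total_comments = len(tagged_feedback)

    # Report each theme as a share of all comments, so one loud opinion is
    # weighed against how often the theme actually shows up.
    for theme, count in theme_counts.most_common():
        print(f"{theme}: {count} of {total_comments} comments ({count / total_comments:.0%})")

Counted this way, a single strongly worded complaint becomes one data point next to the themes that actually recur.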

In this video, you will learn several things to improve your testing efforts including:

  • What it means to quantify qualitative feedback
  • How using the right quantitative data can help you avoid being trapped
  • Why telling the full story with the data is essential to building a testing program

3 Types of A/B Testing Metrics: Use the Right Ones or Fail

The three types of metrics are your business metrics (those that impact your bottom line), your test-specific metrics (those that are unique to each test), and your analytics metrics (those that appear in all of your analytics tools).

We will use a real-life example of what these metrics look like for a business.
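
As a rough illustration of how the three types might be grouped (this is a made-up e-commerce checkout test, not the real-life example covered in the video), the split could look something like this:

    # Hypothetical metric groupings for an e-commerce checkout test.
    # All metric names here are illustrative assumptions.
    metrics = {
        "business": [           # impact the bottom line
            "revenue_per_visitor",
            "completed_orders",
        ],
        "test_specific": [      # unique to this particular test
            "clicks_on_new_trust_badge",
            "shipping_option_selected",
        ],
        "analytics": [          # available in any analytics tool
            "bounce_rate",
            "pages_per_session",
            "time_on_page",
        ],
    }

    # One metric should be chosen up front to decide the test;
    # the others provide supporting context.
    primary_metric = "revenue_per_visitor"

The point is simply to name each metric's role before the test runs, so results are read against the right yardstick.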

In this video, you will learn several things to improve your testing efforts including:

  • The differences between each type of metric
  • When you should use each type of metric
  • How primary metrics should guide test results interpretation
  • Which types of metrics can be primary metrics, secondary metrics, or both
  • A massive caution about optimizing for certain metrics
  • And much more

Heatmaps And A/B Testing

Like other correlative data sets, heatmaps are often used incorrectly to make decisions about a visitor experience. This video will review how to strategically use heatmaps with your split testing to get the maximum value for your visitors. We will also cover the 2 things to avoid doing as you use your site’s heatmaps.

By using your heatmaps the right way, you will be able to optimize your site faster and get that valuable and intuitive visual analysis with your testing.

In this video, you will learn several things to improve your testing efforts including:

  • How correlative heatmaps can hurt your optimization efforts
  • The danger of using click-only heatmaps
  • The benefit of analyzing mouse movement heatmaps
  • The 4 steps to follow to get the most value out of your heatmap and testing data

Zoom-Zoom A/B Testing Strategy

The Zoom-Zoom A/B testing strategy will help you get better testing results by helping you focus on the things that matter most. Too many split testers test things that are so granular that they don’t even matter. This results in tests that have low lifts or no lift at all.

By following the Zoom-Zoom testing strategy you will be able to identify what matters to your visitors and get bigger gains faster because you will know where the value is.

In this video, you will learn several things to improve your testing efforts including:

  • How to avoid testing things that are too granular
  • What it means to zoom out with your business question
  • What a Zoom-Zoom sandwich is
  • The 4 steps for following the Zoom-Zoom Testing Strategy

Creating a Culture of Optimization

Good A/B testing isn’t about just running split tests. Conversion Rate Optimization is about creating a data-driven culture inside the company. This video will cover 6 ways to build a data-driven culture of optimization. As you do each of these things you will create a stronger optimization practice.

Here are the six steps we will cover:

1. Get Test Results

2. Evangelize Results Broadly

3. Ideation From The Company

4. Create Infrastructure

5. Integrate Testing Widely

6. Cross-Functional Team

WebsiteBox Existence Testing

This A/B testing example shows a case study from WebsiteBox.com. The very first two tests we did with them helped us learn how to prioritize our testing to make future tests even more valuable.

These tests show how to do a back-to-back existence-testing knockout punch that would be great as the first two tests on any site.

This video will help you learn a ton about foundational tests to jumpstart your split testing.

In this video, you will learn several things to improve your testing efforts including:

  • The efficiency of existence testing your current pages and content
  • The learning-over-lift approach
  • Learning regardless of a loss in lift
  • How the relative impact of existence testing helps prioritize efforts (see the sketch after this list)
  • How to do a reverse existence test
  • The combined effect of elements
  • Redesigning based on the discovered value of elements relative to each other
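
To make the relative-impact idea concrete, here is a small sketch with invented numbers (none of them come from the WebsiteBox tests): compare how conversion moves when each element is removed, and let the biggest drops show where the real value sits.

    # Hypothetical existence-test results: conversion rate with each element
    # removed versus the original page. All numbers are invented for illustration.
    baseline_cr = 0.040  # original page with every element present

    conversion_with_element_removed = {
        "testimonials_block": 0.039,  # barely moves the needle
        "pricing_table":      0.031,  # big drop, so this element carries real value
        "hero_video":         0.041,  # page does slightly better without it
    }

    # Rank elements by how far conversion falls when they are removed; the
    # largest drops show which elements deserve follow-up testing first.
    for element, cr in sorted(conversion_with_element_removed.items(), key=lambda kv: kv[1]):
        relative_impact = (cr - baseline_cr) / baseline_cr
        print(f"{element}: {relative_impact:+.1%} change vs. baseline when removed")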

3 Reasons to A/B Test the “Impossible”

As you do conversion rate optimization, you will get pushback from stakeholders, technical people, and even your own team members. They will all tell you the same thing: an A/B test can’t or shouldn’t be done. “We can’t build that test,” or “So-and-so doesn’t like that split test,” or “That A/B test is too complex.”

This video will address 3 reasons why you should A/B test the impossible.

In this video, you will learn several things to improve your testing efforts including:

  • The three benefits of testing things people say you shouldn’t test
  • How to test something that is “impossible”
  • The “fake it before you make it” strategy
  • Cautions when testing the impossible

What is the Optimal Number of Variations Per A/B Test?

Did you know there is an optimal number of variations to run per A/B test? Some simple math can help you identify what that sweet spot is so you get more gains with your split testing. More gains per test amount to better visitor experiences and more conversions for your company.

The Math of More Variations
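
As a rough sketch of that math (the 15% win rate per variation and the 100,000-visitor traffic figure below are illustrative assumptions, not numbers from the video), assume each variation has some independent chance of beating the control while your traffic gets split across every arm:

    # Illustrative assumptions: each variation independently has a 15% chance of
    # beating the control, and 100,000 visitors are split evenly across all arms
    # (control plus variations). Neither number comes from the video.
    P_WIN_PER_VARIATION = 0.15
    TOTAL_TRAFFIC = 100_000

    for variations in range(1, 9):
        # Chance that at least one variation beats the control.
        p_any_winner = 1 - (1 - P_WIN_PER_VARIATION) ** variations
        # More arms means fewer visitors per arm, so each arm needs longer
        # to reach a trustworthy sample size.
        visitors_per_arm = TOTAL_TRAFFIC // (variations + 1)
        print(f"{variations} variation(s): P(at least one winner) = {p_any_winner:.0%}, "
              f"{visitors_per_arm:,} visitors per arm")

Each added variation raises the chance of finding a winner by less than the one before it, while the visitors available to each arm keep shrinking; the sweet spot is where those two curves balance for your traffic.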

In this video, you will learn several things to improve your testing efforts including:

  • How adding more variations increases your success rate
  • The math behind choosing the optimal number of variations
  • The diminishing returns of adding more variations
  • The constraints to consider before building multiple variations
