SpareFoot is a fast-moving technology company, so software development is at the core of what we do. Our engineers constantly release new features and optimize existing ones to make our storage marketplace even better for both our facility clients and storage-seeking customers.

But as you know, the Internet is tricky, and consumer behavior is even more unpredictable. When we make a change to the website, we need to be confident that it will yield positive results. For example, what would happen if we made a change that decreased reservations? That would hurt both SpareFoot and you, our clients.

To take the guesswork away and ensure we’re approaching every decision with a data-driven mindset, our engineers use a method called “A/B testing.”

What Is A/B Testing?

A/B testing is a common method websites use to test changes and determine which ones produce the desired results. The basic philosophy is simple: Compare two versions of a web page to see which one performs better, then roll out the winning version.

With A/B testing, 50 percent of visitors to our site see the standard, unchanged version (Version A), and 50 percent of visitors see the “test” version (Version B). We track how each version performs against our core business metrics:

  • What percentage of customers who visit the site reserve a unit online?
  • What percentage call us?
  • Of those groups, what percentage of customers move in?
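To make the split concrete, here's a minimal sketch of how a 50/50 assignment is often implemented. This is an illustration, not SpareFoot's actual code: the function name and test name are hypothetical. Hashing the visitor ID (rather than flipping a coin on every page view) keeps each visitor in the same version for the whole test.

```python
import hashlib

def assign_version(visitor_id: str, test_name: str) -> str:
    """Deterministically place a visitor in Version A or B (hypothetical sketch)."""
    digest = hashlib.md5(f"{test_name}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100          # a stable number from 0 to 99
    return "A" if bucket < 50 else "B"      # 50% standard, 50% test

# The same visitor always lands in the same version:
print(assign_version("visitor-12345", "amenities-categories"))
```

Because the assignment is a pure function of the visitor ID, the split stays consistent across page views while still dividing traffic roughly in half.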

We use this methodology for most changes we make—from very small changes (like font size and button colors) to big changes like adjusting our ranking algorithm.

Tests normally run for two to six weeks before we have enough data to make the final decision to either keep the standard version or roll out the test version to all visitors.
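"Enough data" usually means the difference between the two versions is statistically significant. As a hedged illustration (the visitor and reservation counts below are made-up numbers, and this is a generic two-proportion z-test, not necessarily the exact check SpareFoot runs), the decision might look like this:

```python
import math

def z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z-test comparing reservation rates of Version A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # combined rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se                          # z-score of the difference

# Made-up example: 480 reservations from 10,000 Version A visitors,
# 540 from 10,000 Version B visitors.
z = z_test(conv_a=480, n_a=10000, conv_b=540, n_b=10000)
print(f"z = {z:.2f}")  # |z| > 1.96 would clear the usual 95% confidence bar
```

Until the z-score clears that threshold, the honest answer is "keep the test running," which is why tests take weeks rather than days.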

A/B Testing in Action

Let’s take an example of a test that’s running right now.

Listings currently display all of a facility’s amenities in a single long, unsorted list.

Our hypothesis: This could be clunky for customers and add friction that discourages them from reserving a unit.

Our test: Organize amenities into categories like “Security,” “Moving” and “Access.”

Here’s what the two versions look like:

[Image: A/B testing at SpareFoot]

Note that we typically run three or four tests on our site at any given time, so the version you see on your work computer might differ from what you see at home.

Believe it or not, many test versions—even the ones we’re sure are going to be winners—fail. This is totally normal, and it’s part of the reason we avoid massive overhauls and redesigns and instead focus on making as many small changes as we can. Our goal, ultimately, is to do everything we can to get you more customers.

Think you know which version of the amenities list test (pictured above) will win? Leave a comment below to make your pick! Then check your listing in a few weeks, where you’ll see the winning version.