Several months ago, a client came to us with a problem.
They’d recently launched a completely new interface for their website, but they weren’t fully confident that it made sense to sunset their original web design. When they approached us, both versions of the site were still up and running: visitors landed on the new interface by default, but a large arrow at the top of the page let them switch back to the original if they preferred.
They asked us to evaluate a big question for them: “Is our new design working?”
In a situation like this, there are a number of things that need to be measured. We have a model for this work at Medullan, and it includes establishing goals, translating those goals into metrics, and creating measures based on those metrics. The measures should tell the story of the metrics themselves.
As we dove into work with this client in particular, our first step after establishing goals was to gather some initial data. And this data seemed very straightforward:
Only 3% of users were choosing to revert to the original web design. And of those 3%, more than a third returned to the new design after spending a bit of time on the old one.
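For readers who want to see what metrics like these look like in practice, here is a minimal sketch of how reversion and return rates might be computed from per-user event logs. The event names and the data are invented for illustration; they are not the client's actual analytics schema.

```python
# Hypothetical per-user event streams (names and numbers are
# illustrative only, not real client data).
events = {
    "u1": ["land_new"],
    "u2": ["land_new", "revert_old"],
    "u3": ["land_new", "revert_old", "return_new"],
    "u4": ["land_new"],
}

# Users who clicked through to the original design at least once.
reverters = [u for u, ev in events.items() if "revert_old" in ev]

# Of those, users who later came back to the new design.
returners = [u for u in reverters if "return_new" in events[u]]

revert_rate = len(reverters) / len(events)    # share of all users who reverted
return_rate = len(returners) / len(reverters) # share of reverters who came back

print(f"revert rate: {revert_rate:.0%}, of whom returned: {return_rate:.0%}")
```

With real traffic, the same two ratios are what produce figures like the 3% and one-third above; the toy data here is just to make the calculation concrete.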
I know what you’re thinking: “Duh, you solved it! Obviously, the new website works better.”

And you’re right: the initial finding showed that most folks chose the new design.
But to that I’d say, “Not so fast.” Most analytics companies would stop here. But at Medullan, we believe that you need to tell the whole story. Otherwise, you risk missing hugely important pieces of information.
In this case, the small number of users electing to switch back to the original design was actually a red flag. Why? Usually, when users have the option to revert to a previous design in a situation like this, a larger percentage of them takes it. That isn’t because the new design is inferior. Rather, it’s because users are familiar with the older version and can more easily find what they’re looking for there from memory. We call this the “familiarity effect,” and its absence in this particular case stuck out as unusual. It warranted further investigation.
There was also another question to answer: What if those 3% of users – the 3% that preferred the old design – were also from an incredibly valuable customer base that the client couldn’t afford to lose?
To answer both questions, we used analytics to track the number and behavior of users who were reverting. These analytics allowed us to tell the client more than just that first percentage: we could also show that the user experience design of the new website was far superior to the old, placing the most popular and most-needed information at the top of the screen and the least popular below the fold. This behavioral insight confirmed that it was the new design itself that resulted in such a small portion of the population reverting to the original.
Additionally, we found that the 3% who chose the old design were from similar demographics and had similar, if not identical, site usage patterns as the 97% who chose the new design. These findings suggested that the users electing to revert to the original design did not represent a unique portion of the population, so they wouldn’t be isolated by sunsetting the original design.
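As a rough illustration of that kind of segment comparison, the toy sketch below checks whether the users who reverted look different from everyone else on a simple usage metric. All figures are invented, and a real analysis would compare many more dimensions (demographics, visit frequency, conversion) with proper statistical tests.

```python
# Invented pages-per-session samples for two segments; in practice
# these would come from the analytics pipeline, not hard-coded lists.
pages_per_session = {
    "stayed_on_new": [5, 6, 4, 7, 5, 6],
    "reverters":     [5, 7, 4, 6],
}

def mean(xs):
    """Arithmetic mean of a non-empty sequence."""
    return sum(xs) / len(xs)

# A gap near zero suggests reverters behave like everyone else,
# i.e. they are not a distinct, isolated slice of the audience.
gap = abs(mean(pages_per_session["stayed_on_new"])
          - mean(pages_per_session["reverters"]))

print(f"mean pages/session gap between segments: {gap:.2f}")
```

In the actual engagement, it was this kind of side-by-side comparison, across demographics as well as usage, that let us conclude the 3% were not a unique population.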
In the end, we felt comfortable making a recommendation to the client on this project: What you’ve done here is working. Sunset the old version, and move to the new.
But there are some learning points here as well.
First, it can be a challenge to do this work well.
We’ve learned that success in this arena often comes from creating a varied team of analytics specialists, business strategists, and user experience designers. Without the right team, it can be a struggle to tell a complete story with the data you gather. As you can see above, sometimes the data points that you choose initially can seem to tell you the full story. But widening the scope is important – and you need the right folks around you to remind you to take user experience, demographics, and more into account.
Second, telling a human story with numbers will always be a challenge, no matter what.
So it’s important to have many perspectives on board, working with folks who take particular care to turn data points into assessments that explain what you need to find, not just what you expect to find.
Despite the challenge, the benefits of this work are huge as well, and they’re especially important in healthcare.
I believe that using data to explain the efficiencies (and inefficiencies) in medical care is of the utmost importance, especially when that data can be translated into human terms. The benefits to patients, payers, and providers can be life altering. Plus, this kind of data allows us to look at a problem both at a macro level, in terms of trends, and at a micro level, as in the example above. This combination of a long-distance view and up-close work is needed in healthcare.
The bottom line:
Using data analytics to assess feature success is becoming more and more popular, both in the healthcare field and beyond. Everyone is doing it, and those who aren’t want to be. The key that many companies are missing, though, is that to get the most out of your data, you need the right combination of expertise: data analytics, user experience, and business strategy.
And finally, I’ll leave you with this:
Analytics work is tough, as it can sometimes feel removed from the humanity of a problem. However, when designed correctly, data-based user experience assessments can yield solutions that are both compelling as narratives and statistically sound. Taking the extra steps to tell a full story, rather than just solving pieces of the puzzle, is worth every extra minute spent.