Research in Brief

The Problem

As a SaaS business, we regularly improve our software product because we care about our customers, and we want our customer data to give them a competitive advantage so they can make better business decisions. One feature, Live, was not getting the engagement our product team expected. We decided to iterate on it to increase engagement, and to determine whether the feature should be killed entirely if it turned out not to be useful for our customers.

The Approach

We took an iterative approach by talking to customers directly and pulling existing data from customer support tickets. From there, we came up with sketches and wireframes. We showed the designs to customers and iterated on the wireframes before moving on to visual design. After testing the visual design with users again, we brought it to the product and engineering teams to build.

The Impact

The redesign resulted in a 155% increase in engagement with the feature. Our first launch drew some negative feedback from users with smaller screens, which taught us that we should have tested across different screen sizes. We iterated very quickly to meet those customers' needs, and we now always account for the range of screen sizes our customers have, designing experiences that stay useful at any size.

KISSmetrics Live lets you monitor top activity trends, filter for specific activity you’re looking for with a new launch or campaign, and see information for individual customers — all in real time.

Our Process

When we set out to improve our Live tab (which provides people with a real-time data stream of customer activity), we first looked to our secret sauce — customer feedback.

Thanks to our awesome customers, we were able to define a list of requirements and use cases our Live tool needed to support so customers could get their jobs done better and faster.

Goals

  • Reliability — Flash was causing all sorts of trouble
  • A way to drill down on specific people, events, or properties
  • A better way to view your own activity AND monitor the live stream
  • A more obvious way to get into individual customer profiles

Flash was a big offender: it caused loading problems, different versions threw different errors, it crashed, some customers couldn't see it at all because of their device, and others lost their whole session.

Methods

  • Screener targeting customers who had used the feature within the last 30 days
  • Screener targeting customers who had used the feature at least once but NOT within the last 30 days (the selection logic is sketched after this list)
  • Remote user interviews with select customers from the screener responses above
  • Feedback and support ticket review of existing problems and customer feedback
  • Usability testing of sketches, wireframes, and prototypes
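
For the two screener groups above, the split is purely date-based. Here's a minimal sketch of that grouping rule, assuming a hypothetical Customer record with a lastUsedLive timestamp; a screener is normally run as a survey, so treat this purely as an illustration of how the two groups divide:

```typescript
// Hypothetical customer record for illustration; not our real schema.
interface Customer {
  email: string;
  lastUsedLive: Date | null; // null = never used the Live feature
}

const THIRTY_DAYS_MS = 30 * 24 * 60 * 60 * 1000;
const cutoff = new Date(Date.now() - THIRTY_DAYS_MS);

// Group 1: used Live within the last 30 days (current users).
const recentUsers = (customers: Customer[]): Customer[] =>
  customers.filter(
    (c) => c.lastUsedLive !== null && c.lastUsedLive.getTime() >= cutoff.getTime()
  );

// Group 2: used Live at least once, but NOT within the last 30 days (lapsed users).
const lapsedUsers = (customers: Customer[]): Customer[] =>
  customers.filter(
    (c) => c.lastUsedLive !== null && c.lastUsedLive.getTime() < cutoff.getTime()
  );
```

Talking to both groups matters: current users tell you what works, and lapsed users tell you why they left.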

Respondents in the study fell into the following demographics:

  • Age: 21-45
  • Country: US only
  • Roles represented: Product Marketing Manager, VP of Marketing, Marketing Analyst, Product Manager, and CEO/Founder
  • Industries represented: Software, Agency, E-Commerce, Publishing, Education

Product Development Process

The team started with customer interviews and existing support tickets to aggregate problems with the current feature. We annotated all of the issues onto a single screen.

The old version of Live, complete with annotations, after customer feedback was summarized by Jason, our product manager. Oof.

But what do customers actually need from this list of problems? To help answer this, we started with sketches, mockups, and wireframes. Our customer success team made a crude sketch that we showed to some users to see what they thought.

Since Live was actually used a lot for debugging internally, we asked our own staff what they thought could be better to make their jobs easier. This is one of the earliest sketches in our design phase from Eric, our QA engineer.

We wanted to work with something low fidelity that showed customers rough user experiences, so we could see if our ideas were actually helping them solve their problems. This allowed us to focus on the jobs customers were trying to get done without having colors and major layouts get in the way of the feedback. It allowed us to differentiate what they needed (ways to filter, search, etc.) from what they wanted (button colors, perfect alignment, etc.). After we were able to validate some functionality, the design team created low-fidelity wireframes:

Another idea from one of our designers, Jason, with UI notes.

As a company that helps other businesses get to know their people, it was obvious to us that we needed to keep in close contact with our customers. And we did just that. After several cycles of interviews and testing with users, we were able to get to a point where customers agreed that they would be able to do the jobs they wanted to accomplish with our new improved tool. So we started building it out with engineers.

The Result

The final design, made by our lead designer, Ian, and implemented. Huzzah!

A job well done, everyone! Let’s move on to the next thing, we thought. Not so fast. When we launched the feature, our feedback box started filling up with messages. I kept an ear to the feedback, collected it, and showed it to the product, engineering, and design teams:

Some feedback messages all about stream activity being too big. Yowza. Thanks to everyone who sent in feedback!

A lot of the initial negative feedback focused on how the stream items were so large that it was impossible to scan for customer activity. Some people even wanted to switch back because it was not valuable without the ability to scan easily! Luckily, we received positive feedback with regard to reliability, search filters, and trend monitoring. All the jobs were accomplished (yeah!), but we had a design issue to solve.

Here’s How We Got A Crazy Boost In Engagement

Yes, this is the actual metric chart with the huge jump. Since our data is tracked on a per-person basis, we knew something great happened.

The small bump in the middle of the graph was when we originally launched the new version of Live. After an email asking our existing customers to check it out, we saw about a 50% increase in that week alone, and usage leveled off over the course of the week.

After the feedback was delivered to the teams, they wasted no time and hustled. One key message from the feedback, which I recommended we act on, stood out: customers wanted to see more activity and data in the stream.

What we rolled out a week later as Version 2. We increased how much activity you could see, while maintaining some Version 1 design elements on hover.

In the old design, you could see two customer activities, maybe two and a half, in the stream. In the new design the team came up with, you can see ten activities at the same resolution.

Here’s how some of our customers responded:

Valuable feedback turned into a fix. Thanks, Pejman!

Happy customer with the new fixes and improvements. Thanks Evan!

Best part? We didn’t do any additional marketing or launch emails when we implemented the new version. Our customers organically started using the feature.

So What Did We Do?

We started by tracking customer data, making sure we had established tracking for our Live tab with KISSmetrics. Tracking on a per-person basis steered us away from dangerous vanity metrics and made us start analyzing the behavior of real people. One event each from a million people is very different from a million events from just one person.
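
To give a flavor of what that instrumentation looks like, here’s a minimal sketch using KISSmetrics’ JavaScript event queue; the event and property names are made up for the example, not our actual tracking plan:

```typescript
// The KISSmetrics snippet defines a global command queue, _kmq.
// Declared here so the sketch is self-contained.
declare const _kmq: unknown[];

// Tie every subsequent event to a real person instead of an anonymous visitor.
_kmq.push(['identify', 'jane@example.com']);

// Record a per-person engagement event, with a property to segment on later.
// (Hypothetical event/property names.)
_kmq.push(['record', 'Viewed Live Tab', { 'Live Version': '2' }]);
```

Because every event carries an identity, counting "people who did X" is straightforward later, which is exactly what kept us honest about engagement.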

We also gathered customer feedback data and made sure our solutions actually solved their problems. Research played a big part, not only in reviewing existing feedback, but also in knowing what to track so we could accurately measure behaviors later.

With customer data, we were able to look into the whole lifecycle of our engagement patterns, all the way back to the first occurrence, as well as drill down into specific customers if necessary. We were able to benchmark our performance and measure dips, or in this case, gains, against it.
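
As a toy illustration of the difference (KISSmetrics did this analysis for us; this is not our pipeline), counting people instead of events, plus the first-occurrence lookup that lets you trace engagement back to the start, looks like this:

```typescript
// Hypothetical event shape for illustration.
interface LiveEvent {
  personId: string;
  occurredAt: Date;
}

// Vanity metric: raw event count. One hyperactive user can inflate it.
const totalEvents = (events: LiveEvent[]): number => events.length;

// People-based metric: distinct people who engaged at least once.
const engagedPeople = (events: LiveEvent[]): number =>
  new Set(events.map((e) => e.personId)).size;

// First occurrence per person, so engagement can be traced back to the start.
const firstSeen = (events: LiveEvent[]): Map<string, Date> => {
  const first = new Map<string, Date>();
  for (const e of events) {
    const prev = first.get(e.personId);
    if (!prev || e.occurredAt.getTime() < prev.getTime()) {
      first.set(e.personId, e.occurredAt);
    }
  }
  return first;
};
```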

Finally, we made small design changes according to the most common customer requests. And we did it fast by making small iterative changes, going back to users to interview them, and then back to design. It’s hard for any business to get everything right the first time. Or the second time. And so on. If you iterate quicker, you’ll learn faster about what worked, what didn’t work, and how to prioritize what’s next.