Ongoing Listening Methods

Continuous User Feedback Surveys

Once you’ve launched a new experience, you may feel as though you can kick back, put up your feet, and wait for the positive feedback to roll in. However, your work is far from over. In fact, in many ways, it’s just begun.

Post-launch is a critical time to gain user feedback. That’s because your product/design is now in the wild, and real users are going to be clicking buttons, navigating to new pages, and generally interacting with everything you’ve created. This is yet another key time to listen to your users—where are they having wins? Where are the stumbling blocks? What works? What doesn’t?

Ongoing surveys are a great way to collect user feedback after a product launch. Here's how to get started from scratch, or, more likely, build on systems you already have running within your organization.

Decide which survey methods you’ll use

While there are countless survey tools and methods to choose from, the good news is that you don’t need a lot to set up a system that helps you gain user feedback in a pretty automated way. Of course, you need to make sure you’re choosing a survey system that provides you with feedback that you can act on.

Here are some things to do as you assess survey methods:

1. Align your efforts with business goals

When evaluating survey options, start by knowing your goal. Consider the overall goals of your product, business, and users (goals that have probably been identified in prior research). Do you want to improve customer satisfaction, increase conversions, or reduce customer churn? Ongoing customer surveys can help you track metrics and qualitative feedback related to how your users feel about the experience. As for metrics like churn, LTV, and conversion rate, we'll dive into those in our analytics chapter.

2. Decide which ongoing assessment methods are right for you

Many organizations like Net Promoter Score (NPS), which divides your users into promoters (those who would recommend the experience), passives (those who are neutral), and detractors (those who have substantial problems). Other options are Customer Satisfaction Score (CSAT) and Customer Effort Score (CES).

Here’s an overview of some common assessments that user researchers use:

Net Promoter Score (NPS) 

Net Promoter Score segments users based on how likely they are to recommend your product to a friend. It's a simple way of assessing whether your experience is likely to spread by word of mouth, which counts for a lot. Users will often also see an option to explain why they scored the way they did. A little quant, a little qual. Nice.

This is the NPS survey we send via email when someone completes a project at User Interviews
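
For reference, the scoring itself is simple arithmetic: respondents who answer 9 or 10 on the 0–10 "how likely are you to recommend us?" question count as promoters, 7–8 as passives, 0–6 as detractors, and NPS is the percentage of promoters minus the percentage of detractors. Here's a minimal sketch of that calculation (the function name and sample scores are purely illustrative):

```typescript
// Minimal sketch: computing NPS from 0–10 "how likely are you to recommend" scores.
// Conventional cut-offs: 9–10 = promoter, 7–8 = passive, 0–6 = detractor.
function netPromoterScore(scores: number[]): number {
  if (scores.length === 0) return 0;
  const promoters = scores.filter((s) => s >= 9).length;
  const detractors = scores.filter((s) => s <= 6).length;
  // NPS = % promoters minus % detractors, expressed as a number from -100 to 100.
  return Math.round(((promoters - detractors) / scores.length) * 100);
}

// Example: three promoters, one passive, one detractor => (3 - 1) / 5 = 40.
console.log(netPromoterScore([10, 9, 9, 7, 3])); // 40
```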

Customer Satisfaction Score (CSAT) 

The Customer Satisfaction Score (CSAT) asks users to rate their experience on a predetermined scale. CSAT is simple and easy to use, but because the question is so broad, the reasons behind the responses can be hard to decode. Still, it's a more direct question than NPS and can gauge overall satisfaction in a more straightforward way.

A sample CSAT survey from Hubspot
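
If you end up tallying CSAT yourself rather than reading it off a dashboard, one common convention is to report the percentage of respondents who chose one of the top ratings (for example, 4 or 5 on a 1–5 scale). A rough sketch, assuming that convention:

```typescript
// Minimal sketch: CSAT as the share of "satisfied" responses.
// A common convention on a 1–5 scale is to count 4s and 5s as satisfied.
function csat(ratings: number[]): number {
  if (ratings.length === 0) return 0;
  const satisfied = ratings.filter((r) => r >= 4).length;
  return Math.round((satisfied / ratings.length) * 100); // percentage, 0–100
}

console.log(csat([5, 4, 4, 2, 5])); // 80 — four of five respondents were satisfied
```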

Customer Effort Score (CES) 

CES measures how much effort it takes for users to complete certain tasks, such as contacting support to resolve an issue.

Nicereply shows what a CES survey looks like through its platform

Website Intercept Surveys 

Website intercept surveys are essentially modals (or similar prompts) that appear at key points in the user journey to assess sentiment. They may seem like an annoying addition to your site, but when implemented properly, they can be relatively frictionless for your users and provide key ongoing feedback for you.
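
The "relatively frictionless" part usually boils down to two rules: fire only at a key moment, and don't fire again for a while. This sketch is purely hypothetical; the `checkout_completed` event, the storage key, and `showSurveyModal()` are stand-ins for whatever your site and survey tool actually provide:

```typescript
// Hypothetical sketch of a lightweight intercept trigger: show a one-question
// survey modal after a key journey event, but never more than once per 30 days.
const INTERCEPT_COOLDOWN_MS = 30 * 24 * 60 * 60 * 1000;

function maybeIntercept(eventName: string, showSurveyModal: (question: string) => void): void {
  if (eventName !== "checkout_completed") return; // only intercept at this key moment
  const lastShown = Number(localStorage.getItem("lastSurveyShownAt") ?? 0);
  if (Date.now() - lastShown < INTERCEPT_COOLDOWN_MS) return; // keep it frictionless
  localStorage.setItem("lastSurveyShownAt", String(Date.now()));
  showSurveyModal("How easy was it to complete your purchase today?");
}
```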

Some common tools businesses use to gather these types of surveys include Wootric, Promoter.io, GetFeedback, SurveyMonkey, SurveyGizmo, Nicereply, and Zendesk.

It's very possible that someone, or some team, in your organization has already established a tool or process for gathering one or more of the types of feedback you're interested in reviewing. You may be able to use what they're already doing, or work with them to get the functionality you'd ideally want out of those tools.

Make it easy for users to give feedback

As with many things in user experience, when it comes to deploying an ongoing user feedback survey, friction is your enemy. Here are some guidelines to consider:

1. Keep your surveys short and focused

Ask only a small handful of questions for your best response rate, and to keep your analytical attention focused on the key goals you’ve set. NPS, CSAT, and CES typically include 1-2 questions for a reason. If you’re building your own custom survey, make sure you’re tracking a consistent quantitative metric, while allowing for an open response that gets at the “why” behind the what.  
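
If you do build your own, the structure can stay this small: one consistent rating plus one open-ended follow-up. A minimal sketch of such a two-question survey; the field names and prompts are purely illustrative:

```typescript
// Illustrative shape for a short custom survey: one consistent quantitative
// rating plus one open-ended "why" follow-up. Field names are hypothetical.
interface SurveyQuestion {
  id: string;
  type: "rating" | "open_text";
  prompt: string;
  scale?: { min: number; max: number }; // only used for rating questions
}

const postLaunchSurvey: SurveyQuestion[] = [
  {
    id: "satisfaction",
    type: "rating",
    prompt: "How satisfied are you with the new dashboard?",
    scale: { min: 1, max: 5 },
  },
  {
    id: "satisfaction_why",
    type: "open_text",
    prompt: "What's the main reason for your score?",
  },
];
```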

2. Pick a good time to solicit feedback

Is there an ideal moment in your user journey where it makes sense to ask for feedback?

For example, it makes sense for Google Maps to ask for a restaurant rating shortly after you’ve added the restaurant as a destination in the app or for Uber to prompt a driver rating right after a ride. Ask for feedback while the experience is still fresh in the user’s mind.

Depending on your goals, you might ask at different times. If you're focused on driving adoption of a new feature, you might ask people who are using it how they discovered it and what their initial impressions are. You might also proactively survey users to find out why they haven't adopted it yet.

Go back to your goals and your status against any targets you've set for launching a new product, then identify the users and events that will give you the most insight toward reaching those goals. Finally, make sure your surveys deploy at the right time to do that.

3. Consider where you make your ask

Next comes where to deploy your survey. Again, your options may be limited by the services you're using, but here are a couple of things to consider when you decide where to deploy surveys.

Keep it contextual

If there's a key moment to get feedback when someone is actively using your web or mobile app, ask them directly in the app through a modal or similar experience. If they miss the prompt, you could follow up with an email, judiciously.

Email-based support conversations are natural candidates for email feedback follow-ups, and in the same way, a chat or messenger conversation can easily end with a chat or messenger feedback request.

Consider user preference

If you're sending proactive surveys, rather than surveys triggered by user behavior, try to send them through the channels your users prefer. Email is a classic choice here! But for users who aren't subscribed to email, or those who are more responsive via chat, in-app messaging, push, or other channels, those may be more appropriate. Keep in mind that push is likely your most aggressive option, and you should watch your negative KPIs (like opt-outs) closely when using it to request feedback.

4. Ask for feedback early and often

The best way to do this is to take the key moments and channels you’ve identified, then automate the delivery of the survey based on those rules. Depending on the platform(s) you’re using for your survey(s), your ideal state may be seamless, or may take a little clever Zapier-ing, but you should be able to get these kinds of ongoing surveys to a largely autopilot state, freeing you to spend more time uncovering insight and making better product and business decisions.
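
Whatever tooling you end up with, the rules layer usually reduces to a mapping from key moments to a survey and a channel, plus a guard against over-surveying. Here's a hypothetical sketch; the event names, survey IDs, channels, and `sendSurvey()` function are placeholders for whatever your platform exposes:

```typescript
// Hypothetical sketch of survey automation rules: map key moments to a survey
// and a channel, and skip anyone who was surveyed too recently.
type Channel = "email" | "in_app" | "chat";

const surveyRules: Record<string, { surveyId: string; channel: Channel }> = {
  project_completed: { surveyId: "nps_post_project", channel: "email" },
  support_ticket_closed: { surveyId: "ces_support", channel: "chat" },
  feature_first_use: { surveyId: "csat_new_feature", channel: "in_app" },
};

const MIN_DAYS_BETWEEN_SURVEYS = 30;

function handleEvent(
  event: { name: string; userId: string },
  lastSurveyedDaysAgo: number,
  sendSurvey: (userId: string, surveyId: string, channel: Channel) => void
): void {
  const rule = surveyRules[event.name];
  if (!rule) return; // not a key moment we survey on
  if (lastSurveyedDaysAgo < MIN_DAYS_BETWEEN_SURVEYS) return; // frequency cap
  sendSurvey(event.userId, rule.surveyId, rule.channel);
}
```

In practice this logic often lives inside your survey or messaging platform's own trigger settings; the point is simply that the rules are explicit and automated.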

And please make sure you're not badgering your users. Look at your ongoing survey program holistically, and take advantage of any frequency capping or other options available through the platforms you're using to make sure you're not overwhelming anyone.

This seems simple enough, but in many organizations these surveys may be owned by a variety of teams, like support or marketing. You'll need to work with them to make sure your surveys are implemented in the best way possible to get you valuable feedback, which is what everyone wants.

Build a system for implementing results

There’s no point asking users to take a survey you can’t analyze and take action on. Ongoing survey data can help you find issues you didn’t know existed, separate signal from noise, understand the why behind user behaviors, and more. Make sure you have a system in place to continually analyze data and implement improvements. Here are our top recommendations.

1. Get the right people on board

Whether your research team acts as a service arm of your organization, or works hand-in-hand with product to make decisions, it's essential that you get the right people on board from the beginning. Work together to assess the biggest priorities and determine how you will address ongoing survey feedback in general.

2. Decide on a schedule for analysis

The thing about ongoing listening methods is that they're running all the time, so when should you stop and analyze what's happening? If you've just rolled out a brand-new experience, you should be checking in very frequently, whatever that means for you. If you haven't released anything huge recently, you might review on a less regular cadence. Many services now integrate with email or Slack, so you can stay on top of the day-to-day somewhat passively, while doing a deeper dive on a set, less frequent cadence.
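
As one concrete option for that passive day-to-day layer, Slack's incoming webhooks accept a simple JSON payload, so a scheduled job can post a digest of recent survey results to a channel. A minimal sketch, where the webhook URL and the digest numbers are placeholders:

```typescript
// Minimal sketch: posting a periodic survey digest to Slack via an incoming
// webhook (Slack incoming webhooks accept a JSON body with a "text" field).
async function postSurveyDigest(webhookUrl: string): Promise<void> {
  const digest = {
    text: "Weekly survey digest: NPS 42 (up 3), CSAT 87%, 14 new open-text responses to review.",
  };
  const res = await fetch(webhookUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(digest),
  });
  if (!res.ok) throw new Error(`Slack webhook failed: ${res.status}`);
}
```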

3. Implement changes based on what you've learned

Make sure you have a way to bubble up the strong signals you're getting from ongoing feedback. You may also identify quick wins, in the form of bugs or small usability issues, that can make a big difference to the user experience. In either case, building processes and relationships to turn insight into action is critical to the success of your ongoing user surveying initiatives. Make sure to document how the changes you've made based on ongoing survey feedback have improved the key metrics you're tracking, both for customer satisfaction and for broader business goals (like retention or revenue).

NEXT CHAPTER

User Analytics

Monitoring key user analytics, web analytics, and product analytics is an important part of post-launch UX research.
