Continuous User Feedback Surveys

Note to the reader:

This part of the field guide comes from our 2019 version of the UX Research Field Guide. Updated content for this chapter is coming soon!


Collecting user feedback continuously can help inform future UX research studies, improve product designs, and speed up iteration cycles. 

And one of the best ways to collect this feedback is with surveys.

To help you not only collect survey data, but also act on it, we’ve rounded up the best tips, tools, and examples from experts around the web. 

In this chapter:

  • What are continuous user feedback surveys?
  • Advantages and disadvantages of feedback surveys
  • How do you run feedback surveys continuously?
  • Mixing other methods with continuous feedback surveys 
  • Examples of great ongoing feedback questions
  • Tools, software, and integrations for continuous feedback surveys

What are continuous user feedback surveys?

Ongoing user feedback surveys (such as NPS, CSAT, or CES—more on these in the next section) are a method for collecting information directly from users or customers about their product experiences, preferences, and usage. These surveys are a continuous, always-on process, allowing teams to keep a pulse on customer satisfaction and respond to user needs as they arise. 

The value of regularly collecting customer feedback should not be understated—it’s what drives every decision in your company. As Noah Shrader, Product Manager and UX at Lightstream, says:

"User feedback in customer-centric companies is the fuel that drives every internal working part. Every process and every business decision is powered by a deep understanding of who the user is, what it is they're needing to do and ensuring they can do it successfully." 

Collecting this feedback continuously can also help speed up and improve iteration cycles in continuous delivery models. In the research study, How do Practitioners Capture and Utilize User Feedback during Continuous Software Engineering?, the authors concluded:

“We conclude that a continuous user understanding activity can improve requirements engineering by contributing to both the completeness and correctness of requirements.”

📚 Types of feedback surveys

Net Promoter Score (NPS)

Net Promoter Score segments users based on how likely they are to recommend your product to a friend. It’s a simple way of assessing whether your product experience is likely to spread through word of mouth, which counts for a lot. Often users will also see an option to explain why they scored the way they did. A little quant, a little qual. Nice.

Screenshot of an NPS survey: "Hi there! Quick question - how likely are you to recommend User Interviews to others?" with buttons from 0-10
This is the NPS survey we send via email when someone completes a project at User Interviews
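To make the scoring concrete: the NPS score itself is the percentage of promoters (9–10) minus the percentage of detractors (0–6), giving a number from -100 to 100. Here’s a minimal Python sketch; the thresholds follow the standard NPS convention:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

# 5 promoters, 3 passives, 2 detractors out of 10 responses
print(nps([10, 9, 9, 10, 9, 8, 7, 8, 5, 3]))  # → 30
```

A score of 30, as in the example, means promoters outnumber detractors by 30 percentage points.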

Customer Satisfaction Score (CSAT)

The Customer Satisfaction Score measures customer satisfaction by asking users to rate their experience based on a predetermined scale. CSAT is simple and easy to use, but since the question is so broad, the reason behind the responses can be hard to decode. Still, in some ways it is a more direct question than NPS and can help gauge overall satisfaction in a more straightforward way.

Screenshot of a HubSpot CSAT survey: "how satisfied were you with your experience today?" from worst (1) to best (7)
A sample CSAT survey from HubSpot
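CSAT is usually reported as the percentage of respondents who pick one of the top ratings (the “top box”). A small illustrative Python sketch; the scale size and top-box cutoff here are common conventions, not a fixed rule:

```python
def csat(ratings, scale_max=5, top_n=2):
    """CSAT as the share of responses in the top `top_n` points of the scale."""
    satisfied = sum(1 for r in ratings if r > scale_max - top_n)
    return round(100 * satisfied / len(ratings))

# 4 of 6 responses land in the top-2 box of a 5-point scale
print(csat([5, 4, 3, 5, 2, 4]))  # → 67
```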

Customer Effort Score (CES)

Customer Effort Score measures how much effort it takes for users to complete certain tasks, such as contacting support to resolve an issue.

Screenshot of a nicereply CES survey: "To what extent do you agree or disagree with the following statement: Nicereply made it easy for me to handle my issue." From strongly disagree to strongly agree.
Nicereply shows what a CES survey looks like through its platform

Website Intercept Surveys

Website intercept surveys are essentially modals or similar pop-ups that appear at key points in the user journey to assess sentiment. They may seem like an annoying addition to your site, but when implemented properly, they can be relatively frictionless for your users and provide key ongoing feedback for you.

Email surveys

Email surveys are surveys that you send, well, via email. You can deliver NPS, CSAT, or CES surveys using email, or you can send more in-depth qualitative surveys. 

Ideally, email surveys allow respondents to answer questions embedded directly in the email; this offers the cleanest, easiest user experience. However, you can also collect responses by asking respondents to reply to the email or by including a link to a survey on an external site or platform. 

What are the advantages and disadvantages of customer feedback surveys? 

Like any research method, user feedback surveys come with their own pros and cons. It’s up to you to determine the situations in which they’re most appropriate for your team. 

✅ The pros of user feedback surveys

“Through continuous user feedback loops, you are ensuring at every point that you are working on something useful and valuable to customers. This reduces risk and cost throughout the development cycle massively.” – Kerstin Exner on The Product Manager 

The main benefits of user feedback surveys for UX research include:

  • Surveys help you reveal questions that need answering with UX research.
  • Surveys help you pinpoint flaws or shortcomings in the product.
  • Surveys prevent you from wasting time on developments that customers don’t want or need.
  • Surveys help you create benchmarks to track future performance. 
  • Surveys increase your efficiency and effectiveness in designing great products. 
  • Surveys boost customer trust and loyalty by reminding them you care.

⛔️ The cons of user feedback surveys

“It is too easy to run a survey. That is why surveys are so dangerous. They are so easy to create and so easy to distribute, and the results are so easy to tally. And our poor human brains are such that information that is easier for us to process and comprehend feels more true. This is our cognitive bias. This ease makes survey results feel true and valid, no matter how false and misleading. And that ease is hard to argue with.” – Erika Hall on Mule Design

The main risks of user feedback surveys include:

  • Surveys are sometimes implemented for the sake of it, rather than with specific questions or learning goals in mind.
  • The data you collect from surveys do not necessarily constitute insights—you need to be able to properly analyze this data in context to understand what to do with it. 
  • Because surveys are so easy to implement, it’s easy to ask vanity questions or questions that are better addressed through other methods.

🎙 Hear more from Erika Hall on why surveys (almost) always suck on the Awkward Silences podcast.

How do you run user feedback surveys continuously? 

Ongoing surveys are a great way to collect user feedback post-product launch. Here are some best practices for user feedback surveys to help you get started from scratch (or more likely build on systems you may already have running within your organization):

  1. Understand your business goals.
  2. Get the right people on board.
  3. Decide which ongoing survey methods are right for you.
  4. Create short, focused sets of survey questions.
  5. Use a balanced set of answer choices.
  6. To anonymize or not to anonymize?
  7. Pick a time and cadence for soliciting feedback.
  8. Consider where you place your survey.
  9. Test and QA your survey with your team.
  10. Ask for feedback early and often.
  11. Decide on a schedule for analysis.
  12. Act on what you’ve learned. 

1. Understand your business goals.

As always, the best place to start is with your high-level goals. 

Consider the overall goals of your product, business, and users—goals that have probably been identified in prior research. 

Do you want to: 

  • Improve customer satisfaction?
  • Increase conversions?
  • Reduce customer churn? 

Ongoing customer surveys can help you track important metrics and collect qualitative feedback related to how your users feel about the experience. Knowing your goals ahead of time will help you align your efforts with those goals and ask the right questions—to the right people, at the right time, with the right survey methods.

2. Get the right people on board.

Whether your research team acts as a service arm at your organization or works hand-in-hand with the product team to make decisions, it’s essential that you get the right people on board from the beginning. 

Gaining the support and participation of the right people makes it easier to execute your ideas and make the right decisions about product changes. Collaborate with key stakeholders and internal team members to assess the biggest priorities and determine how to address ongoing survey feedback in general.

👉 Learn more in the UX Research Field Guide: Stakeholder Interviews

3. Decide which ongoing survey methods are right for you.

5 types of user feedback surveys: net promoter score (NPS), website intercept surveys, customer satisfaction score (CSAT), email surveys, customer effort score (ces) by User Interviews
5 types of user feedback surveys

As we mentioned earlier, there are many different types of survey methods you can choose from, including:

  • Net Promoter Score (NPS)
  • Customer Satisfaction (CSAT)
  • Customer Effort Score (CES)
  • Website Intercept Surveys
  • Email Surveys

Many organizations prefer Net Promoter Score (NPS), which divides your users into promoters (those who would recommend the experience), passives (those who are neutral), and detractors (those who have substantial problems). 

However, NPS can easily fall into the camp of “vanity” questions, ones that tell you little about users’ needs or their actual product experiences. So don’t just choose NPS because it’s the most popular. Make sure the survey assessment you choose is intentional, based on your goals, and allows you to ask meaningful questions.

4. Create short, focused sets of survey questions.

As with many things in user experience, when it comes to deploying an ongoing user feedback survey, friction is your enemy. You have to make it easy for users to give feedback.

Be sure to:

  • Keep your surveys as short as possible. Ask only a small handful of questions for your best response rate, and to keep your analytical attention focused on the key goals you’ve set. NPS, CSAT, and CES typically include 1-2 questions for this reason. 
  • Focus on closed-ended questions, or those with a consistent quantitative metric.
  • Limit open-ended questions to 2 per survey. Pairing quantitative metrics with qualitative responses allows you to understand the “why” behind the “what” — but too many of these could hurt your response rates. 

If you’re looking for more qualitative feedback than 2 open-ended questions will allow, consider running continuous user interviews instead. 

✍️ Other tips for writing great survey questions include:

  • Consider your Most Crucial Question (MCQ). In Surveys That Work, Caroline Jarrett defines the MCQ as “the one that makes a difference. It’s the one that will provide essential data for decision-making.” Revise your MCQ until it’s incredibly clear and doesn’t leave room for interpretation. 
  • Use the first person to encourage people to focus on their experience rather than their opinions.
  • Replace UX jargon with simple, straightforward syntax and familiar words—language that a child could understand. 
  • Keep the questions neutral to avoid biasing answers. For example, instead of asking “How much do you love User Interviews?”, ask something like “on a scale of 1-10, how would you rate your experience with User Interviews?”
  • Ask one question at a time. For example, instead of asking “how would you rate your experience with User Interviews, and what made you choose that rating?”, break it down into two separate questions to avoid confusion and encourage clear, unambiguous feedback. 

5. Use a balanced set of answer choices.

Writing great answer choices is just as important as asking great survey questions.

There are several different types of response options for surveys, including:

  • Long-form answers are open-ended survey responses, typically consisting of multiple (5+) sentences. Long-form answers provide in-depth, qualitative information and are great for getting nuanced feedback. However, participants need more time and effort to answer them, so they might lower your response rates.
  • Short-form answers are open-ended survey responses that are typically limited to 3–5 sentences maximum. Like long-form responses, they’re great for collecting qualitative feedback. Because they require less time to answer, they’re usually a good choice for asking open-ended questions without significantly impacting your response rate.
  • Radio buttons are single-answer responses to multiple choice questions. Multiple options are presented in a list, and participants can choose only one response by clicking the “button” next to their choice. Radio buttons work well for yes/no questions and rating scales. However, participants are limited to the pre-set options in the list, which can present an issue if none of the options are relevant to them. 
  • Check boxes allow respondents to choose more than one answer to multiple-choice questions. They allow for more nuanced responses than radio buttons, but like radio buttons, are limited by the options presented. One way to mitigate this limitation is by providing a short-answer “other” option that allows participants to write in other responses. 
  • Likert scales provide respondents with a range of options (for example, from “not at all likely” to “extremely likely” or “strongly disagree” to “strongly agree”). Likert scales are an extremely effective and popular survey method to gauge opinions, attitudes, or behaviors. 

Choosing the right survey answer option depends on the type of question you’re asking and the type of data you’re hoping to collect in return. If you want to explore more survey answer types and when to use them, check out this great article from SurveyMonkey.

6. To anonymize or not to anonymize?

As you’re creating your survey, you may have the option of making survey responses anonymous. There are pros and cons to anonymized surveys, and the best option (as always) depends on your survey goals. 

As a rule of thumb:

  • Allow anonymous responses when you want to encourage more honest feedback or improve response rates for surveys on sensitive topics. 
  • Require names when you want the ability to follow up with respondents. 

✅ Survey best practice: If you’re planning on following up with respondents, be sure to ask for consent to do so. In most cases, consent for research participation is required by law, but this is also just a good practice that builds trust with participants. 
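If you need a middle ground between fully anonymous and named responses, one common approach is pseudonymization: store a salted hash instead of the raw identifier, so repeat responses from the same person can be linked without keeping their email in the survey data. A rough Python sketch; the salt value and field names are placeholders, and this is pseudonymization, not full anonymization:

```python
import hashlib

def pseudonymize(email, salt="per-project-salt"):
    """Replace an email with a stable token so repeat responses can be
    linked. The salt is a placeholder -- store it separately from the data."""
    normalized = email.strip().lower()
    return hashlib.sha256((salt + normalized).encode("utf-8")).hexdigest()[:16]

response = {
    "respondent": pseudonymize("Ada@example.com"),  # stable token, not the raw email
    "consented_to_followup": True,                  # store explicit consent either way
    "score": 9,
}
```

The same email always maps to the same token, so you can track a respondent across survey waves without exposing who they are.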

7. Pick a time and cadence for soliciting feedback.

Is there an ideal moment in your user journey where it makes sense to ask for feedback?

As a general rule, the best time to ask for feedback is while the experience is still fresh in the user’s mind. Depending on your goals, you might deliver surveys:

  • Immediately following a purchase, using an in-app pop-up or providing a link in an email invoice.
  • Immediately following an important user event, such as usage of a new feature.
  • Proactively, prior to a user event. For example, you might survey customers who aren’t using a new feature to find out why they haven’t adopted it yet.
  • On an ongoing basis, by offering an open feedback form on your website. 

For example, it makes sense for Google Maps to ask for a restaurant rating shortly after you’ve added the restaurant as a destination in the app or for Uber to prompt a driver rating right after a ride.

Whatever you do, though, do not ask for feedback:

  • When new users have just arrived on your site for the first time—you won’t have enough information about them to contextualize their responses.
  • When users are about to perform essential actions, like checking out on the shopping cart page or signing up for a newsletter—this can prevent users from following through with those actions and interfere with business goals. 
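The timing rules above (survey right after key events; never on a first visit or mid-essential-action) can be expressed as a simple gate in whatever system triggers your surveys. A hypothetical Python sketch; the event names and the `visits` field are stand-ins for your own analytics data:

```python
# Hypothetical event names -- substitute the events your analytics tracks.
KEY_EVENTS = {"purchase_completed", "new_feature_used"}
PROTECTED_EVENTS = {"checkout_started", "newsletter_signup_started"}

def should_survey(user, event):
    if user["visits"] <= 1:        # first-time visitors: too little context
        return False
    if event in PROTECTED_EVENTS:  # don't interrupt essential actions
        return False
    return event in KEY_EVENTS     # survey while the experience is fresh

print(should_survey({"visits": 4}, "purchase_completed"))  # True
print(should_survey({"visits": 4}, "checkout_started"))    # False
```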

8. Consider where you place your survey.

Next comes where to deploy your survey. Your options may be limited by the survey tools you’re using, but here are a couple of things to consider when deciding where to deploy surveys.

🗺 Keep it contextual

If there’s a key moment to get feedback when someone is actively using your web or mobile app, ask them directly in the app through a modal or similar experience. If they miss a prompt, you could follow up with an email, judiciously.

Email-based support conversations are a natural fit for email feedback follow-ups. In the same vein, a chat or messenger conversation can easily end with a chat or messenger feedback request.

👍 Consider user preference

If you’re sending proactive surveys (as opposed to those triggered by user behavior), try to send them in the channels your users prefer. 

Email is a classic choice here! But for your users who aren’t subscribed to email or are more responsive via other channels, you may find chat, in-app messaging, or push notifications to be more appropriate. Keep in mind push is likely your most aggressive option, and you should closely watch your negative KPIs—like opt-outs—when using it to request feedback.

🧠 Pro-Tip from User Interviews’s Senior Product Manager, Paolo Appley, and Sprig's Staff User Researcher, Allison Dickin, in “A Framework for Continuous Research”: Make sure you surface the survey to the right users—people who are experienced with your platform in most cases—and don't interrupt a flow.

9. Test and QA your survey with your team.

Nothing messes with survey results quite like having to edit a survey post-launch. 

Make sure your survey is 100% workable and free of errors by testing it with your team before you send it live. 

Download these free survey quality assurance checklists from Rachell of the User Interviews team to use as reference. 

user feedback survey QA checklist by User Interviews
Initial Survey QA Checklist

user feedback survey QA checklist by User Interviews
Final Survey QA Checklist

10. Ask for feedback early and often.

As Roberta Dombrowski, former VP of Research at User Interviews, and Chris Lee, Product Designer at Sprig discuss in this webinar, the best way to know if your designs will resonate is by testing early, often, and with the right users. 

If you can, automate the delivery of these early surveys to save time and effort. Depending on the platform(s) you’re using for your survey(s), your ideal state may be seamless, or may take a little clever Zapier-ing, but you should be able to get these kinds of ongoing surveys to a largely autopilot state, freeing you to spend more time uncovering insight and making better product and business decisions.

And please make sure you’re not bombarding your users. Look at your ongoing survey program holistically, and take advantage of any frequency capping or other options available through the platforms you’re using to make sure you are not overwhelming anyone.

This may seem simple enough, but in many organizations, surveys may be owned by a variety of teams like support or marketing, so you’ll need to work across teams to make sure your surveys are implemented in such a way that gives you valuable feedback without annoying users. 
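One way to coordinate across those teams is a shared frequency cap that every survey-sending system checks before contacting a user. An illustrative Python sketch; the 30-day cooldown is an arbitrary example, not a recommendation:

```python
from datetime import datetime, timedelta

COOLDOWN = timedelta(days=30)  # example cap: at most one survey per user per 30 days

def can_survey(last_surveyed_at, now=None):
    """Return True if this user hasn't been surveyed (by any team)
    within the cooldown window."""
    now = now or datetime.now()
    return last_surveyed_at is None or now - last_surveyed_at >= COOLDOWN

print(can_survey(None))  # never surveyed before → True
```

In practice, `last_surveyed_at` would come from a store that support, marketing, and research all write to whenever they send a survey.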

11. Decide on a schedule for analysis.

The thing about ongoing listening methods is that they’re running all the time, so when should you stop and analyze what’s happening?

Again, it depends on your goals and the context of the survey.

If you’ve rolled out a brand new experience for the first time, you should be checking in very frequently, whatever that means for you. If you haven’t released anything major recently, you might review on a less regular cadence. 

Many services now integrate with email or Slack, so you can stay on top of the day-to-day somewhat passively, while doing a deeper dive on a more set, less frequent cadence.

👉 Learn more in the UX Research Field Guide: Analyzing User Research

12. Act on what you’ve learned.

As the feedback rolls in, you might identify quick wins in the form of bugs or small usability issues that can make a big difference to the user experience. 

Building processes and relationships to turn this insight into action is critical to the success of your ongoing user surveying initiatives. 

Make sure to document how the changes you’ve made based on ongoing survey feedback have improved the key metrics you’re tracking, both in terms of customer satisfaction and broader business goals (like retention or revenue).

📚 Related reading: How to Track the Impact of UX Research

Mixing other methods with continuous feedback surveys 

The insights you discover from user feedback surveys can illuminate key questions or areas of exploration for future UX research, and vice versa. Pairing surveys with other user research methods can help you develop a more nuanced understanding of your customers, ultimately leading to better products and experiences. 

You might also pair survey data with feedback from other teams and channels, such as support conversations, sales calls, and customer interviews. 

One way of easily accessing and aggregating this data is to develop what Sachin Rekhi, founder and CEO of Notejoy, refers to as a “feedback river,” or an open channel for anyone to get direct access to primary feedback on the product from across various teams and channels:

“This has typically taken the form of an internal company mailing list in Gmail or Outlook, but I’ve also seen it as a feedback channel in Slack or HipChat. I typically require all product managers, designers, and engineering leads to be on the list, but encourage anyone across R&D, marketing, sales, customer service, and more that’s interested to subscribe as well. The list usually has open write access as well for any internal stakeholder to contribute.

All feedback that is gathered across the various feedback sources is then encouraged to be shared in a reasonable aggregate form on this channel. For example, let’s say the product team conducted a set of customer interviews. They are encouraged to provide both links to interview recordings as well as summarized feedback on the channel. As another example, the customer support team usually has a designated person who sends a weekly customer feedback report on the channel with details of top issues that customers have been facing as well as links to reports for further details.”

This is a great way to start identifying patterns in user insights across your various feedback channels, including surveys. 

Examples of great ongoing feedback questions

The best user feedback survey questions are always the ones that help you answer your key questions and reach your particular goals—but if you’re looking for inspiration, here are some great templates, examples, and samples of great survey questions for collecting ongoing user feedback. 

The 2-question survey from Els Aerts, co-founder of AGConsult:

  • Who are you?
  • What brought you to our site today? Please be specific.

3-part continuous feedback form template from Matter

  • How would you rate your satisfaction with [PROBLEM AREA]?
  • How would you rate your satisfaction with [SOLUTION TO PROBLEM AREA]?
  • What, if anything, could we do to improve [PROBLEM AREA] further?

28 customer feedback questions for customer surveys from HotJar:

  • How would you describe yourself in one sentence?
  • What is your main goal for using this website/product?
  • What, if anything, is preventing you from achieving that goal?
  • What is your greatest concern about [product/brand]?
  • What changed for you after you started using our product?
  • Where did you first hear about us?
  • Have you used our [product or service] before?
  • Why did you choose to use our [product or service] over other options?
  • Have you used a similar [product or service] before?
  • How do you use our product/service?
  • How can we make this page better?
  • What’s the ONE thing our website is missing?
  • What, if anything, is stopping you from [taking action] today?
  • What are your main concerns or questions about [product or service]?
  • Thanks for [taking action]! How are you planning to use [product or service]?
  • How would you describe the buying experience?
  • Do you feel our [product or service] is worth the cost?
  • What convinced you to buy the product?
  • What challenges are you trying to solve?
  • What nearly stopped you from buying?
  • What do you like most about our [product or service]?
  • What do you like least?
  • What feature/option could we add to make your experience better?
  • How could we have gone above and beyond?
  • Net Promoter Score (NPS): how likely are you to recommend our products?
  • Customer Satisfaction (CSAT): how satisfied are you with our product/services?
  • Customer Effort Score (CES): how easy did (organization) make it for you to solve your issue?
  • Is there anything you’d like to add?

Continuous feedback collection questions from Qualaroo:

  • How difficult was this product to use?
  • Did you encounter any challenges when enrolling in the Data Science course? 
  • I noticed you didn’t use the commenting tool in your classroom. Why is that? 
  • Rate your experience using E-Scolere? 
  • How likely are you to take a course on another subject with E-Scolere? 
  • What subjects are you interested in learning more about online? 
  • How likely are you to recommend E-Scolere to a friend?

Tools, software, and integrations for continuous feedback surveys

Your options for collecting user feedback in the form of continuous surveys are vast. Some of the most popular tools for continuous surveys include:

  • Sprig: A tool to help you capture rich user insights to build better products. Sprig’s in-product surveys allow you to target specific users and get real-time feedback on your product experience. Plus, Sprig is integrated with User Interviews, so you can run studies with the confidence that you’re always testing with the right users, all within one streamlined workflow.
  • SurveyMonkey: A survey and feedback management solution that helps organizations create surveys, collect responses, and analyze results. SurveyMonkey’s AI-powered tools help anyone, from novice survey creators to the most experienced market researchers, create, launch and analyze surveys with ease. Learn more about the SurveyMonkey <> User Interviews integration.
  • Qualtrics: A stand-alone UX research platform for building, launching, and analyzing survey research. Plus, Qualtrics is integrated with User Interviews, so you can pair this flexible survey solution with the highest-quality respondents. 
  • Typeform: A market research survey tool with tools to create forms, surveys, quizzes, and videos that help users collect data and insights. With Typeform, you can create thoughtfully-designed forms that require no code. Plus, Typeform is integrated with User Interviews!
  • Chameleon: A digital adoption platform that gives modern SaaS teams the control, configuration, and customization to win with in-product UX. Product teams use Chameleon to create better self-serve experiences, build smooth user onboarding, and collect contextual, relevant feedback using microsurveys. Listen to Pulkit Agrawal, Co-founder and CEO of Chameleon, talk about best practices for onboarding on the Awkward Silences podcast. 
  • Appcues: A no-code platform that empowers non-technical teams to track and analyze product usage, and publish beautiful in-app onboarding tours, announcements, and surveys, in minutes. Hear Laura Powell of Appcues talk about making user testing fun, fast, and accessible on the Awkward Silences podcast.
  • SurveyLab: An online survey tool and questionnaire software that supports the survey creation process and automates response collection and report generation.
  • SurveySparrow: An all-in-one omnichannel experience management platform that bundles customer experience and employee experience tools such as NPS, Offline, Chat, Classic, and 360° Surveys which are mobile-first, highly engaging, and user-friendly.
  • Survicate: A fast survey tool for getting continuous customer insights. Survicate supports the collection and management of customer feedback for customer experience insights, marketing feedback, product experience and more.
  • Ethnio: A UX research recruiting, scheduling, incentives, and participant management tool with a popular web intercept feature for in-app surveys. Learn how Ethnio and User Interviews compare. 
  • Jotform: An easy form builder to create and publish online forms from any device. Jotform offers 10,000+ ready-made form templates, 100+ integrations to 3rd party apps, and advanced design features.

In some cases, you might also consider using conversational tools like Drift or product marketing tools like HubSpot to implement a continuous survey practice. 

🧙‍♀️✨ To explore these and other UX research tools in more detail, check out our 2022 UX Research Tools Map, an illustrated guide to the ever-changing world of user research software. 

In a nutshell

Continuous feedback surveys are a great way to keep a pulse on user sentiment, at every user touchpoint throughout the entire user lifecycle. 

By choosing the right type of feedback survey and implementing them carefully and efficiently according to your goals, you can gain a deep and adaptable understanding of your users. Ultimately, continuous surveys done right can help speed up product iteration cycles, leading to better, more competitive products and highly loyal customers. 
