How to Craft User Experience Surveys That Identify the Problems Customers Most Need Solved

User experience surveys can tell you what customers need from your company. But a survey’s success depends on asking the right questions.

UX surveys can be used for a range of purposes. But among their best and most popular uses — and the one we’ll be discussing in this article — is gathering intelligence on where and how your site or product might be improved in order to match:

  • What customers and prospects need and expect
  • What motivates them
  • What problems they’re facing.

To yield feedback that’s truly valuable, however, a survey has to ask the right questions. This sounds simple, but there’s more to it than meets the eye. And unfortunately, it’s where most surveys go wrong.

UX surveys that fail to ask the right kinds of questions can yield responses that are vague, misleading, inaccurate, or otherwise unactionable. It’s not unusual for them to have low response rates. Sometimes, surveys aren’t even directed to the audiences that are best positioned to answer them.

For a master class in creating and analyzing UX surveys that do ask the right questions — resulting in answers that UX professionals can use to improve the experience of their target audiences — we spoke with Els Aerts. 

Els is a co-founder of AGConsult, an agency that has worked on user research with companies such as Yoast and PricewaterhouseCoopers. She believes that UX surveys are most effective when they’re designed to reveal users’ pain points, zeroing in on problems so companies can adjust their websites and products to specifically address their target audience’s needs. In this article, we’ll share her insights so you can apply them to your own UX surveys. Specifically, we’ll explain:

  • The two basic UX survey types Els uses and how and when to use them yourself
  • The right kinds of questions and mistakes to avoid for each
  • How to analyze UX survey results.
Note: Looking for a specific audience to participate in your user research? User Interviews offers a complete platform for finding and managing participants in the U.S. and Canada. Tell us who you want, and we’ll get them on your calendar. Find your first three participants for free.

Two Effective Survey Types and How to Run Them

There are at least as many ways to format a UX survey as there are possible applications for the survey itself. But Els likes to keep things simple. In her experience, two basic types of UX survey, used in combination, often make for a very effective approach:

  • Website Surveys: used for visitors and prospects.
  • Email Surveys: mainly used for existing customers and subscribers.

For Els, whether a UX survey is conducted via website or email, the goal (getting audiences to reveal their problems, needs, motivations, and expectations) is roughly the same. And web and email surveys share some important common best practices, too:

  • Brevity: However you reach respondents, it’s important to be mindful of their time. Overwhelming readers with more than a handful of questions may irritate them, making them less likely to respond to the survey thoughtfully and truthfully, if at all. 
  • Neutrality: Even without meaning to, survey questions can bias respondents, nudging them toward one response or another and skewing the results in the process. This can be as simple as including an unnecessary adjective, like “leading [business type]” or “knowledgeable” in a question, predisposing readers to apply a positive frame to their response. Good questions avoid this tendency, maintaining neutrality to provide the best chance of getting accurate responses. 

An example of a loaded question. Unsurprisingly, the answers skewed positive.

The rating scale you use for closed-response questions matters, too. The way the question is phrased and weighted can make a difference in how participants respond.

  • Openness: Similarly, survey questions tend to yield more accurate insights when you ask open questions, allowing respondents to explain in their own words what they need or want, how they feel, and what problem(s) they’re trying to solve — as opposed to limiting possible answers, for example, by employing multiple choice questions (or other closed question types).
  • Specificity: At the same time, survey questions are most effective when they encourage audiences to be specific in their responses, making vague answers less likely. 

With these commonalities in mind, we’ll explain some best practices and common errors in user experience surveys.

Website Surveys

For anyone who wants to better understand their audiences’ behavior online, analysis tools like Google Analytics and heatmap software are indispensable. They can furnish valuable data about how people find and interact with your site. At the same time, they can only tell you so much.

Online surveys, which often appear in the form of pop-ups, can be understood as a way of making sense of the insights gleaned from user behavior analysis tools. Used effectively, they can help explain why visitors to your site are doing what they’re doing.

“Google Analytics tells you what people do on your website,” Els said. “Surveys tell you what people want to do on your website.” 

Executed wisely, they can provide valuable clues about how to adjust your site or product to better suit your target audience(s). 

The Two-Question Survey

An example of a two-question web survey on Yoast.

In the spirit of keeping things simple, Els often deploys a user experience survey template consisting of just two basic questions:

  • Who are you?
  • What brought you to our site today? Please be specific.

Deceptively simple, this survey template provides an opportunity for audiences to offer up extremely useful user feedback.

By asking respondents to identify themselves, the first question has the effect of verifying that the right target audience is arriving at your website through the intended portal — via a particular Google ad, say — and that they’re landing on the page most appropriate to their needs. 

If your site has separate pages designed for private individuals, businesses, and government audiences, for example, answers to this question can illuminate glitches that might keep potential customers from easily accessing the portion of your site that’s most relevant to them. (We should note that depending on how certain you are of your audience, this is one survey question for which multiple choice answers can be appropriate.)

In adhering to the rules of openness and neutrality, the survey’s second question invites respondents to identify the problem they’re dealing with and, either explicitly or implicitly, the solution they’re looking for. 

We can’t stress enough the importance of asking respondents to be specific. Without that addition, Els says, people sometimes give short, vague answers that don’t add anything to your sense of their needs or wants, e.g., “I was looking for information.”

On the other hand, specific responses can yield key insights, indicating, for example, that although you offer what visitors are looking for — a particular kind of stove, let’s say — they haven’t been able to find it. From this data, the UX team might conclude that perhaps the product photos or descriptions are unclear, or that the website’s design could be more intuitive. 

Specific responses to Question #2 can also surface surprising food for thought. An insurance company might learn, for example, that many visitors to their website are looking for a kind of insurance they don’t currently offer — and begin to consider whether they ought to begin offering it. 

Timing, Placement, and Follow-Up

There’s a widespread and not unwarranted assumption that pop-ups — whether they contain surveys or some other content — tend to annoy website visitors. But the timing and placement of a pop-up survey can make a huge difference in its response rate.

It’s one thing to greet visitors to your homepage with a pop-up, pestering them before they’ve even had a chance to get their bearings. It’s quite another to approach them at a moment when you have reason to believe that they’re frustrated — that they haven’t found what they’re looking for. 

Moments like these — on web pages where your analytics show steep visitor drop-off, or when visitors navigate away without adding anything to their cart — make ideal openings for short online surveys. The exact wording can be varied to suit the context, and although it’s far from an exact science, survey placement matters, too.

If you’re hoping to collect information from customers who decide to make a purchase or book an appointment, including the survey on their confirmation page is an option.

For example: despite anti-pop-up prejudice, a survey that appears in the center of the screen as a pop-up may well attract more respondents than one that sits at the bottom of a page as a static element, where visitors may never see it.

Els also emphasizes the benefits of routing survey respondents to a thank-you page with space for visitors to enter their email addresses. A surprising number of respondents often fill it out, providing a great resource for remarketing and other follow-up contact, such as interviews or user testing. 
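To make this concrete, here’s a minimal sketch of the two-question template wired to an exit-intent trigger — one common proxy for a frustrated visitor who’s about to leave. The markup, styling, and the /api/survey endpoint are illustrative assumptions, not a prescribed implementation:

```typescript
// Minimal sketch: show a two-question pop-up survey on exit intent.
// Assumes a browser environment; copy, styling, and the /api/survey
// endpoint are placeholders for your own implementation.

function renderSurvey(): void {
  const box = document.createElement("div");
  box.style.cssText =
    "position:fixed;top:50%;left:50%;transform:translate(-50%,-50%);" +
    "background:#fff;padding:24px;border:1px solid #ccc;z-index:9999;";
  box.innerHTML = `
    <form>
      <label>Who are you?
        <select name="who">
          <option>Private individual</option>
          <option>Business</option>
          <option>Government</option>
        </select>
      </label>
      <label>What brought you to our site today? Please be specific.
        <textarea name="task" rows="3" required></textarea>
      </label>
      <button type="submit">Send</button>
    </form>`;
  document.body.appendChild(box);

  box.querySelector("form")!.addEventListener("submit", (e) => {
    e.preventDefault();
    const data = new FormData(e.target as HTMLFormElement);
    // POST to your own collection endpoint (placeholder URL).
    void fetch("/api/survey", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(Object.fromEntries(data)),
    });
    box.remove(); // ideally, route to a thank-you page with an email field
  });
}

// Exit intent: fire once when the cursor leaves through the top of the
// viewport, a common proxy for "about to close the tab."
let shown = false;
document.addEventListener("mouseout", (e) => {
  if (!shown && e.clientY <= 0 && !e.relatedTarget) {
    shown = true;
    renderSurvey();
  }
});
```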

💫 Continuous user feedback surveys can help inform future UX research studies, improve product designs, and speed up iteration cycles. Explore best practices, tools, and examples for effective surveys in the UX Research Field Guide.

Email Surveys

For email UX surveys, which Els recommends for learning more about existing customers and subscribers, the same basic rules apply. Open questions, devoid of bias and requiring specificity from respondents, generally yield the most valuable results.  

Because email surveys build on some existing relationship with the audience, they can be longer than web surveys. But brevity remains important: It’s a signal to your customers that you appreciate the value of their time. To reinforce that sense, Els says, it’s helpful to explain to email survey recipients that the aim of the survey is to improve their customer experience of the product or service that they’re already using. 

That way, they’re primed to understand the survey as a tool designed to benefit them — and to be more likely to answer questions in detail. In addition to being longer, email surveys can ask more specific questions than web surveys, pegged to respondents’ specific experience of a product or service. Their answers can thus provide an up-to-date view of customers’ evolving needs, wants, problems, and motivations.

On account of these differences, email UX surveys have some best practices of their own:

Eliminating Bias

Because email surveys tend to be longer with more detailed questions, there are more opportunities to go wrong. When survey questions get more complex, it becomes easier to introduce bias, to ask leading questions, and to artificially limit respondents’ range of answers. 

Each of these mistakes makes getting accurate, actionable responses less likely. To avoid making them, Els suggests doing one or more internal test runs of email UX surveys, preferably distributing a sample survey to colleagues of various ages, genders, and backgrounds. Their feedback can highlight biases that the author did not intend, and which they could not otherwise identify themselves, suggesting how and where questions might be reworded.

Even meticulous internal test runs can overlook errors. So Els also recommends to clients that before emailing a UX survey to an entire audience, they send it to a small sample set. By analyzing the responses from the sample, they can verify that their questions are being interpreted as intended before emailing them to a wider readership.
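If your recipient list lives in a spreadsheet or database, the pilot step can be as simple as pulling a random slice of it. A minimal sketch — the list and sample size below are illustrative:

```typescript
// Minimal sketch of the pilot step: draw a small random sample from the
// full mailing list before the survey goes out to everyone.

function samplePilot<T>(recipients: T[], size: number): T[] {
  const pool = [...recipients]; // copy, so the full list stays untouched
  // Fisher–Yates shuffle, then keep the first `size` entries
  for (let i = pool.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1));
    [pool[i], pool[j]] = [pool[j], pool[i]];
  }
  return pool.slice(0, Math.min(size, pool.length));
}

const allRecipients = ["ann@example.com", "ben@example.com" /* ... */];
const pilotGroup = samplePilot(allRecipients, 50); // 50-person test run
```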

Targeting the Right Audience              

An image from Els’ presentation on surveys.  

To get the most useful results from UX surveys of existing customers and subscribers, Els says, be thoughtful about whom to target. The best way of doing this is to identify the particular purpose of the survey in question: What gap in your own knowledge are you trying to close? What is it you want to know? The answers will provide significant clues about which email addresses to add to your list of survey recipients. 

For example, if you want to know about your users’ onboarding experience, send a survey only to subscribers who’ve signed up in the last six months, rather than to every single user. The experience will be fresher in the minds of more recent sign-ups, making their customer feedback more useful than those from users who signed up years ago and are therefore unlikely to accurately remember the process or their feelings about it. 
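In practice, this kind of targeting is often just a date filter on your subscriber list. A minimal sketch, assuming an illustrative Subscriber shape and the six-month window mentioned above:

```typescript
// Minimal sketch: narrow an onboarding survey to recent sign-ups.
// The Subscriber shape and field names are illustrative.

interface Subscriber {
  email: string;
  signedUpAt: Date;
}

function recentSignups(subscribers: Subscriber[], months = 6): Subscriber[] {
  const cutoff = new Date();
  cutoff.setMonth(cutoff.getMonth() - months);
  return subscribers.filter((s) => s.signedUpAt >= cutoff);
}
```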

Similarly, if you have a SaaS product and want to know why certain subscribers haven’t signed in for a while, it’s best not to survey subscribers who are consistently active. It might annoy them — and make them less likely to respond to a future survey that is relevant.

Don’t have access to the right users for your research? User Interviews offers a complete platform for finding and managing participants in the U.S. and Canada. Tell us who you want, and we’ll get them on your calendar. Find your first three participants for free.

How to Analyze UX Survey Results  

If, as intended, a survey receives a lot of responses, you’re left with the task of sorting and analyzing a potentially confusing set of results. And when survey questions are open-ended — allowing respondents to describe their needs, wants, and experiences with great specificity — the task can be especially daunting, requiring a painstaking manual review. 

According to Els, however, this is simultaneously an advantage of open-ended questions. “You get to read the answers, and you have to absorb them in a way that is different than when you just see words and graphs,” she says. “It deepens your knowledge of what people’s problems really are.” When the time comes for you to collaborate on solutions with the company’s founder or UX designers or other stakeholders, you have an intimate understanding of the target audience’s pain points, making your input that much more valuable.

To understand the results of an open-ended UX survey, and to make them easily legible, Els suggests creating a tagging system, coding each response with a Main Task and, if applicable, a Sub-Task.

For example, a survey from the website of a city government might yield many responses about the library, expressing a variety of concerns: book return, hours of operation, fines. Each response would be coded with Library as a Main Task, and with a Sub-Task corresponding to its particular area of focus.   

An example of Els' analysis system.

You can code the results in Excel or Google Sheets. To prioritize among a broad field of Tasks, Els says, remember why you conducted the survey in the first place: to better understand your audience’s needs, wants, motivations, and problems.

Accordingly, respondents’ most urgent and/or common problems migrate to the top of your to-do list, where they can be evaluated as a function of what might produce the biggest improvements for your company or organization.
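If you’d rather tally the tags in code than in a spreadsheet, a short script can produce the same ranked list. A minimal sketch — the tagging itself is still done by hand, and the field names and “Main Task > Sub-Task” key format are illustrative:

```typescript
// Minimal sketch: tally hand-coded tags so the most common problems
// surface first. This only counts and ranks; a human assigns the tags.

interface TaggedResponse {
  answer: string;
  mainTask: string;
  subTask?: string;
}

function rankTasks(responses: TaggedResponse[]): [string, number][] {
  const counts = new Map<string, number>();
  for (const r of responses) {
    const key = r.subTask ? `${r.mainTask} > ${r.subTask}` : r.mainTask;
    counts.set(key, (counts.get(key) ?? 0) + 1);
  }
  // Sort by frequency, most common problems first
  return [...counts.entries()].sort((a, b) => b[1] - a[1]);
}

// e.g., [["Library > Hours", 42], ["Library > Fines", 17], ...]
```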

It’s worth noting here that a survey doesn’t need to get many responses in order to provide valuable information. For example, a small-town bakery distributing a survey is unlikely to receive a huge number of responses, simply by virtue of the relatively limited number of customers they serve.

Let’s say they distribute a survey during the COVID-19 pandemic, and seven out of 20 respondents say they want to know if the shop sanitizes doors and other surfaces regularly. Despite relatively few responses, the takeaway is clear: Even if the owners are already taking those steps, they need to communicate them clearly to their customers.

Conclusion

By using surveys to reveal the what and why of respondents’ problems, user experience researchers can point their employers and clients toward needed solutions, closing the gap between what customers are looking for and what is currently being offered.

Keeping this guideline in mind can not only help you include the right kinds of questions in your surveys, it can also help you avoid common errors that make results less accurate or less useful:

  • Asking “vanity” questions, often in the context of a customer satisfaction survey, that encourage audiences to be complimentary about your product or service but which tell you little about their needs or the substance of their experience. Net promoter score (NPS) surveys can easily fall into this camp unless you’re also asking meaningful questions alongside the score.
  • Asking questions better addressed through other means, such as digital testing, e.g., “How would you rate the speed of this website?”
  • Asking questions that invite people to predict the future, e.g., “How likely would you be to use X, Y, or Z new feature or product if it were available?” (Remember: UX surveys are great at identifying problems, not brainstorming solutions.)

Depending upon the complexity of the task at hand, a UX survey might be the only research tool required, an intermediate step, or merely a starting point — a conduit to more detailed subsequent surveys, one-on-one interviews, user testing, and the like. Of course, sometimes, no matter how well conceived, a survey doesn’t yield helpful feedback. Luckily, with online surveys, it’s easy to return to the drawing board, make adjustments, and quickly go back out into the field.   

Looking for a specific audience to participate in your user research? User Interviews offers a complete platform for finding and managing participants in the U.S. and Canada. Tell us who you want, and we’ll get them on your calendar. Find your first three participants for free.  
Olivia Seitz

Olivia is a content strategist at Grow & Convert who loves science, cats, and swing dancing. She enjoys a mix of writing, editing, and strategy in every work week.
