User experience surveys can tell you what customers need from your company. But a survey’s success depends on asking the right questions.
There are at least as many ways to format a UX survey as there are possible applications for the survey itself. But Els likes to keep things simple. In her experience, two basic types of UX survey, used in combination, often make for a very effective approach:
For Els, whether a UX survey is conducted via website or email, the goal (getting audiences to reveal their problems, needs, motivations, and expectations) is roughly the same. And web and email surveys share some important common best practices, too:
An example of a loaded question. Unsurprisingly, the answers skewed positive.
The rating scale you use for closed-response questions matters, too. The way the question is phrased and weighted can make a difference in how participants respond.
With these commonalities in mind, we’ll explain some best practices and common errors in user experience surveys.
For anyone who wants to better understand their audiences’ behavior online, analysis tools like Google Analytics and heatmap software are indispensable. They can furnish valuable data about how people find and interact with your site. At the same time, they can only tell you so much.
Online surveys, which often appear in the form of pop-ups, can be understood as a way of parsing insights gleaned from user behavior analysis tools. Used effectively, they can help explain why visitors to your site are doing what they’re doing.
“Google Analytics tells you what people do on your website,” Els said. “Surveys tell you what people want to do on your website.”
Executed wisely, they can provide valuable clues about how to adjust your site or product to better suit your target audience(s).
In the spirit of keeping things simple, Els often deploys a user experience survey template consisting of just two basic questions:
Deceptively simple, this survey template provides an opportunity for audiences to offer up extremely useful user feedback.
By asking respondents to identify themselves, the first question has the effect of verifying that the right target audience is arriving at your website through the intended portal — via a particular Google ad, say — and that they’re landing on the page most appropriate to their needs.
If your site has separate pages designed for private individuals, businesses, and government audiences, for example, answers to this question can illuminate glitches that might keep potential customers from easily accessing the portion of your site that’s most relevant to them. (We should note that depending on how certain you are of your audience, this is one survey question for which multiple choice answers can be appropriate.)
In adhering to the rules of openness and neutrality, the survey’s second question invites respondents to identify the problem they’re dealing with and, either explicitly or implicitly, the solution they’re looking for.
We can’t stress enough the importance of asking respondents to be specific. Without that addition, Els says, people sometimes give short, vague answers that don’t add anything to your sense of their needs or wants, e.g. “I was looking for information.”
On the other hand, specific responses can yield key insights, indicating, for example, that although you offer what visitors are looking for — a particular kind of stove, let’s say — they haven’t been able to find it. From this data, the UX team might conclude that perhaps the product photos or descriptions are unclear, or that the website’s design could be more intuitive.
Specific responses to Question #2 can also surface surprising food for thought. An insurance company might learn, for example, that many visitors to their website are looking for a kind of insurance they don’t currently offer — and begin to consider whether they ought to begin offering it.
There’s a widespread and not unwarranted assumption that, whether they contain surveys or some other content, pop-ups tend to annoy website visitors. But the timing and placement of a pop-up survey can make a huge difference in the rate at which people respond to it.
It’s one thing to greet visitors to your homepage with a pop-up, pestering them before they’ve even had a chance to get their bearings. It’s quite another to approach them at a moment when you have reason to believe that they’re frustrated — that they haven’t found what they’re looking for.
Moments like these — on web pages where your analytics show steep visitor drop-off, or when visitors navigate away without adding anything to their cart — are ideal occasions for short online surveys. The exact wording can be varied to suit the context, and although it’s far from an exact science, survey placement matters, too.
For example: Despite anti-pop-up prejudice, a survey that appears in the center of the screen as a pop-up may well attract more respondents than one that occupies the bottom of a page (where visitors may never see it) as a static web element.
Els also emphasizes the benefits of routing survey respondents to a thank-you page with space for visitors to enter their email addresses. A surprising number of respondents fill it out, providing a great resource for remarketing and other follow-up contact, such as interviews or user testing.
For email UX surveys, which Els recommends for learning more about existing customers and subscribers, the same basic rules apply. Open questions, devoid of bias and requiring specificity from respondents, generally yield the most valuable results.
Because email surveys build on some existing relationship with the audience, they can be longer than web surveys. But brevity remains important: It’s a signal to your customers that you appreciate the value of their time. To reinforce that sense, Els says, it’s helpful to explain to email survey recipients that the aim of the survey is to improve their customer experience of the product or service that they’re already using.
That way, they’re primed to understand the survey as a tool designed to benefit them — and to be more likely to answer questions in detail. In addition to being longer, email surveys can ask more specific questions than web surveys, pegged to respondents’ specific experience of a product or service. Their answers can thus provide an up-to-date view of customers’ evolving needs, wants, problems, and motivations.
On account of these differences, email UX surveys have some best practices of their own:
Because email surveys tend to be longer with more detailed questions, there are more opportunities to go wrong. When survey questions get more complex, it becomes easier to introduce bias, to ask leading questions, and to artificially limit respondents’ range of answers.
Each of these mistakes makes getting accurate, actionable responses less likely. To avoid making them, Els suggests doing one or more internal test runs of email UX surveys, preferably distributing a sample survey to colleagues of various ages, genders, and backgrounds. Their feedback can highlight biases that the author did not intend, and which they could not otherwise identify themselves, suggesting how and where questions might be reworded.
Even meticulous internal test runs can overlook errors. So Els also recommends to clients that before emailing a UX survey to an entire audience, they send it to a small sample set. By analyzing the responses from the sample, they can verify that their questions are being interpreted as intended before emailing them to a wider readership.
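As a rough sketch of that pilot-send step — the subscriber addresses and the 5% pilot size here are invented for illustration, not part of Els’s method — you could draw a reproducible random sample from your mailing list before the full send:

```python
import random

# Hypothetical subscriber list; in practice, export this from your
# email tool (these addresses are made up for illustration).
subscribers = [f"user{i}@example.com" for i in range(1, 501)]

# Draw a small pilot group (here, 5% of the list) to test the survey
# before emailing the whole audience.
random.seed(42)  # fixed seed so the pilot draw is reproducible
pilot_size = max(1, len(subscribers) // 20)
pilot_group = random.sample(subscribers, pilot_size)

print(len(pilot_group))  # 25 pilot recipients
```

Analyzing just these pilot responses should reveal whether any question is being misread before the survey reaches the wider list.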
To get the most useful results from UX surveys of existing customers and subscribers, Els says, be thoughtful about whom to target. The best way of doing this is to identify the particular purpose of the survey in question: What gap in your own knowledge are you trying to close? What is it you want to know? The answers will provide significant clues about which email addresses to add to your list of survey recipients.
For example, if you want to know about your users’ onboarding experience, send a survey only to subscribers who’ve signed up in the last six months, rather than to every single user. The experience will be fresher in the minds of more recent sign-ups, making their feedback more useful than feedback from users who signed up years ago and are therefore unlikely to accurately remember the process or their feelings about it.
Similarly, if you have a SaaS product and want to know why certain subscribers haven’t signed in for a while, it’s best not to survey subscribers who are consistently active. It might annoy them — and make them less likely to respond to a future survey that is relevant.
Don’t have access to the right users for your research? User Interviews offers a complete platform for finding and managing participants in the U.S. and Canada. Tell us who you want, and we’ll get them on your calendar. Find your first three participants for free.
If, as intended, a survey receives a lot of responses, you’re left with the task of sorting and analyzing a potentially confusing set of results. And when survey questions are open-ended — allowing respondents to describe their needs, wants, and experiences with great specificity — the task can be especially daunting, requiring a painstaking manual review.
According to Els, however, this is simultaneously an advantage of open-ended questions. “You get to read the answers, and you have to absorb them in a way that is different than when you just see words and graphs,” she says. “It deepens your knowledge of what people’s problems really are.” When the time comes for you to collaborate on solutions with the company’s founder or UX designers or other stakeholders, you have an intimate understanding of the target audience’s pain points, making their insights that much more valuable.
To understand the results of an open-ended UX survey, and to make them easily legible, Els suggests creating a tagging system, coding each response with a Main Task and, if applicable, a Sub-Task.
For example, a survey from the website of a city government might yield many responses about the library, expressing a variety of concerns: book return, hours of operation, fines. Each response would be coded with Library as a Main Task, and with a Sub-Task corresponding to its particular area of focus.
You can code the results in Excel or Google Sheets. To prioritize among a broad field of Tasks, Els says, remember why you conducted the survey in the first place: to better understand your audience’s needs, wants, motivations, and problems.
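If your coded results outgrow a spreadsheet, the same tagging-and-tally step can be sketched in a few lines. The response codes below reuse the city-government example; the specific tags and counts are invented for illustration:

```python
from collections import Counter

# Hypothetical coded responses: (Main Task, Sub-Task) pairs assigned
# during manual review of open-ended answers.
coded = [
    ("Library", "Book return"),
    ("Library", "Hours"),
    ("Library", "Fines"),
    ("Library", "Hours"),
    ("Parking", "Permits"),
]

# Tally how often each Main Task and each Sub-Task appears, so the
# most common problems rise to the top of the to-do list.
main_counts = Counter(main for main, _ in coded)
sub_counts = Counter(coded)

print(main_counts.most_common(1))  # [('Library', 4)]
print(sub_counts.most_common(1))   # [(('Library', 'Hours'), 2)]
```

Sorting by frequency this way is only a starting point; urgency and potential impact still require the human judgment described above.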
Accordingly, respondents’ most urgent and/or common problems migrate to the top of your to-do list, where they can be evaluated as a function of what might produce the biggest improvements for your company or organization.
It’s worth noting here that a survey doesn’t need to get many responses in order to provide valuable information. For example, a small-town bakery distributing a survey is unlikely to receive a huge number of responses, simply by virtue of the relatively limited number of customers they serve.
Let’s say they distribute a survey during the COVID-19 pandemic, and seven out of 20 respondents say they want to know if the shop sanitizes doors and other surfaces regularly. Despite relatively few responses, the takeaway is clear: Even if the owners are already taking those steps, they need to communicate them clearly to their customers.
By using surveys to reveal the what and why of respondents’ problems, user experience researchers can point their employers and clients toward needed solutions, closing the gap between what customers are looking for and what is currently being offered.
Keeping this guideline in mind can not only help you include the right kinds of questions in your surveys, it can also help you avoid common errors that make results less accurate or less useful:
Depending upon the complexity of the task at hand, a UX survey might be the only research tool required, an intermediate step, or merely a starting point — a conduit to more detailed subsequent surveys, one-on-one interviews, user testing, and the like. Of course, sometimes, no matter how well conceived, a survey doesn’t yield helpful feedback. Luckily, with online surveys, it’s easy to return to the drawing board, make adjustments, and quickly go back out into the field.
Olivia is a content strategist at Grow & Convert who loves science, cats, and swing dancing. She enjoys a mix of writing, editing, and strategy in every work week.