What makes a “good” user test question? Take inspiration from these 70+ sample user testing questions to ask in your next research project.
What makes a “good” user testing question?
In short, effective testing questions are those that prompt participants to provide useful insights without introducing bias.
But writing them is harder than you might think.
In this post, we’ve listed 70+ sample user test questions to use in your next research project.
Before we dig into all the example questions we’ve gathered for you, we wanted to talk about how to write good user testing questions.
To be clear, when we say “good,” we mean: questions that prompt participants to provide useful insights without introducing bias.
That last part—without introducing bias—is critical to gathering valid data. Learn more about how to identify and reduce bias in your research.
Below is some guidance for writing effective user testing questions. The Taxonomy of Cognitive Domain chart from Nikki Anderson at the UX Collective is a nifty cheat sheet for writing questions that get to the heart of what you’re hoping to learn.
💡 Note for those interested: The taxonomy of cognitive domain comes from the idea that there are six categories of cognitive ability: knowledge (information recall), comprehension (actually understanding what you can recall), application (the ability to use that knowledge in a practical way), analysis (the ability to draw insights out of given information), synthesis (the ability to create something new from what is known), and evaluation (the ability to make judgments about what is known).
There are two main types of usability testing questions: open and closed.
Neither one is inherently better or worse than the other—but using the wrong type of question at the wrong time can introduce bias and fail to get you the data you need.
As a general rule of thumb, use open-ended questions when you’re looking for qualitative data about feelings, thoughts, or behaviors, and use closed-ended questions when you’re looking for quantitative data with a focused set of possible answers.
📚 Related Reading: How to Write UX Research Interview Questions to Get the Most Insight
Usability testing governs any testing around how a user might work with or navigate the product you’ve created. It’s commonly used to make sure that navigation, onboarding, and other aspects of a site or tool work as intended. If there are any hiccups or confusing instructions, it’s how you’ll find out about them.
📌 Gearing up for your next user testing project? Take the pain out of recruiting participants with User Interviews. Tell us who you need, and we’ll get them on your calendar. Sign up today.
Many of the questions you ask prior to a user test can be covered in your screener survey. However, you may want to repeat some or all of them in-session to confirm the participant’s answers and ask follow-up questions.
The questions you ask prior to a user test should cover information about the participants’ demographics (if applicable) and current behaviors, experience, and attitudes.
Here are some examples of questions to ask pre-test:
The questions you ask during the session are the most important. In this section, we’ve laid out some effective questions to ask during a user test (categorized by test type), as well as some effective follow-up questions to help you dig for more information.
A product survey can be used to generate ideas for improving an existing product, testing an idea, or even getting feedback on a beta version. If you’re looking for quick feedback, you can ask multiple-choice questions. But it’s worth throwing in a few free response questions to see what extra info you can get from respondents.
Multiple Choice Survey Questions:
Open-Ended Questions:
In addition to these types of questions, you can list a series of features or aspects of the product you’re testing and ask participants to rate them on a given scale.
How we think someone performs a task and how they actually perform the task can be very different. And unless you perform a task analysis study, you don’t know what you don’t know.
Task analysis is best done before you’ve solidified the user flow for a new product. It helps you understand factors such as cultural or environmental influences on how someone performs that task. If being present to observe how people approach the task is critical, you’ll want to conduct an ethnographic field study. Otherwise, you can still get a wealth of information from user interviews.
Here are a few questions you might ask during task analysis:
Card sorting can be moderated or unmoderated. The benefit of a moderated session is that you can remind participants to think out loud, giving you access to the reasoning behind their sorting decisions. If all you care about is the end result, however, unmoderated is the way to go.
Most of the work in setting up a good card sort is done before the exercise starts. You’ll need to give participants some context as to what they’re doing ahead of time (at least for most studies).
Whether you give them preset categories for the cards or have them write their own is up to you.
If you do ask any questions during the process, consider these:
Many of the questions and scenarios we’ve discussed so far (and will continue to discuss in the usability section below) would be appropriate during beta testing. But there are some beta testing questions we haven’t covered, like how to ask potential customers about pricing.
Since we’ve already had a great conversation with Marie Prokopets about how she and FYI co-founder Hiten Shah failed and then succeeded with a product launch, we’ll let you read all of their beta testing advice and recommended questions instead of rehashing it here.
Website usability testing can be moderated or unmoderated. Conducting moderated tests means you can react to problems in the moment and better understand what the user is thinking. Unmoderated tests are convenient and can result in valuable insights as long as you leave good instructions and the testers explain what they’re thinking.
Preparing a focused test is arguably more important than the questions you ask during the session. But a few well-placed questions, especially if the participant isn’t as talkative as you hoped, can keep things on track.
Consider asking these web app usability testing questions:
During the testing session, keep an eye on how they respond to what’s in front of them. Does anything distract them from the specific tasks you gave them? Do they completely miss something you think is important? How quickly can they find what they need? Do they take a circuitous route to solve what you thought was a simple problem? How do they react to usability issues?
To be honest, most of the questions you need to ask for website usability testing are the same for mobile app usability testing sessions. While your testing environment may look a little different, the goals are mostly the same.
Still, there are a few extra questions worth mentioning:
As with website testing, pay attention to trouble spots — do they linger on a task you think should be easy? Are there places where the app is slow or annoys them? Do they make any verbal cues (“Ooh,” “Ah,” “Hmm,” and so forth) at specific parts of the flow you’re testing?
Sometimes, the best information you’ll get in the whole session comes from follow-up questions. Perhaps it’s just a matter of you mimicking what they just said (“The navbar disappeared ... ?”) and waiting for them to expound on that statement.
For the sake of variety, here are a few different ways of getting a participant to explain themselves:
Following a user test, you’ll mostly want to ask questions about the participant’s overall impressions.
Effective questions to ask after a user test include:
📚 Related Reading: How to Ask Great User Research Questions with Amy Chess of Amazon
Lastly, here are a few questions to avoid asking in your next user test:
The first four questions are leading, because they imply the desired answer within the question (e.g. identifying the "upgraded" feature will give participants the sense that it's the one they should like better).
These last three questions are poor because humans are notoriously bad at predicting their own behavior. Plus, there can be a real difference between what people think they want and what they’ll actually choose to solve a problem when the time comes.
You definitely won’t need all 70+ questions from this guide for each project you undertake, and you’ll probably need to write additional questions specific to your work. Hopefully, these are enough to get you started.
Even with the right questions and testing methodology, you won’t get very far without recruiting great participants. If you’d like to make recruiting quick and (relatively) painless, consider User Interviews.
Using our platform, you can recruit from nearly 3 million vetted participants simply by telling us which demographic characteristics matter to you, and by setting up a short screener survey based on behaviors your users exhibit. When you’re ready to launch your study, we’ll find as many participants as you need.
Or, you can bring your own panel and seamlessly manage all the logistics using our panel management and recruitment automation platform, Research Hub.
Sign up for a free account to get started right away, or visit our pricing page to learn about our plans for teams of any size.
Content Marketing Manager
Marketer, writer, poet. Lizzy likes hiking, people-watching, thrift shopping, learning and sharing ideas. Her happiest memory is sitting on the shore of Lake Champlain in the summer of 2020, eating a clementine.