
Surveys for User Research: Best Practices for Better Insights

Steal a page from our playbook: Sourced from the UI Research Guide, here's a list of survey writing tips, QA checklist items, and more!

“Surveys are overused and badly used. People should mostly just stop.” 

- Erika Hall, Co-founder of Mule Design, Awkward Silences ep. 21

We agree—surveys are over- and misused. But in the right hands, they can be a powerful tool for doing user research at scale.  

So why do surveys “suck”? 🤔 As simple as surveys may seem, there’s a lot of room for bias and poor design to sway survey data. Surveys require a structured approach to yield valuable data and results.

That’s why I took extra care to compile and document a list of survey best practices from our internal UX research playbook—a resource designed to enable non-UXRs at User Interviews to do quality research. In this case, that means crafting clearer, more intentional, and more constructive surveys.

In our ongoing effort to make this documentation available for wider use, we’ve adapted our guide to survey best practices for the blog—use it, bookmark it, adapt it to your own team’s needs, and share it with others to help us put an end to poorly designed surveys!

Here’s what we’ll cover:

  • 11 survey writing tips
  • Survey tools to help you get started
  • Survey QA checklist
  • Comprehension questions
  • The value of “good” surveys

11 tips for writing survey questions

1. Start with your survey goal: 

Before you can begin writing survey questions, you need to know what you're hoping to learn from the audience you're reaching out to. 

Like all research, survey studies need a solid research question behind them. This can be as simple as a sentence that articulates why you're creating the survey and what you're hoping to learn. 

For example: “I want to know which tools a researcher uses in their role.”

2. Identify your audience:

Who are you speaking to? What kind of participants are you looking to target?

3. Identify the core audience you would like to learn about. Reflect on the following:

  • What user groups am I looking to target? Will the users I select for my survey introduce bias at all?
  • What characteristics do they have? Job title? Demographics (if they are relevant to the user experience)? Income?
  • How do I intend to find users for my survey? 
  • How will users access my survey? Is it sent by email? Posted online somewhere?
  • Am I going to offer incentives for participants' opinions?
  • What is the level of statistical precision that I hope to get from my audience? Is my survey a pulse check, or is it meant to be more robust/representative of the population? (See the rough sample-size sketch after this list.)
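
If you’re not sure how much precision you need, a quick back-of-the-envelope estimate can help you decide between a pulse check and a more representative sample. Here’s a rough sketch (our own illustration, not part of the playbook) using the standard margin-of-error formula for a proportion; the confidence level and target margins below are hypothetical:

  import math

  # Standard sample-size formula for a proportion: n = z^2 * p * (1 - p) / e^2.
  # Assumes a large population and p = 0.5, the most conservative choice.
  def sample_size(margin_of_error, z=1.96, p=0.5):
      return math.ceil(z ** 2 * p * (1 - p) / margin_of_error ** 2)

  print(sample_size(0.10))  # ~97 respondents for a +/-10% pulse check at 95% confidence
  print(sample_size(0.05))  # ~385 respondents for a more representative +/-5% read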

4. Focus on asking closed-ended questions:

Open-ended questions (free response) require more time and effort to analyze than closed-ended questions. In general, you should not ask more than two open-ended questions per survey. (If you are looking for qualitative responses to round out your survey data, consider combining a survey with a series of interviews with a subset of participants. This will yield richer qualitative data than open-ended questions alone.)

Instead, use closed-ended questions that offer a predefined list of answer options, such as multiple choice, checkboxes, dropdowns, star ratings, sliders, and rankings. See here for more information about question types.
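
As a concrete illustration (our own sketch, not something from the playbook), a closed-ended question is just a prompt plus a predefined answer list, which is what makes responses easy to count and compare. A hypothetical question definition might look like this:

  # Hypothetical closed-ended question definition for a survey tool or script.
  # Respondents choose from the predefined options instead of writing free text.
  tools_question = {
      "type": "checkboxes",  # other closed-ended types: multiple choice, dropdown,
                             # star rating, slider, ranking
      "prompt": "Which tools do you use in your research role?",
      "options": [
          "Survey tools",
          "Usability testing tools",
          "Analytics tools",
          "Other (please specify)",
      ],
  }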

5. Keep survey questions neutral:

Asking leading questions can seriously bias your survey results. Be sure to write your questions in an objective tone to ensure you don't influence participants' responses:

  • Leading question: We think our customer service representatives are amazing; what do you think?
  • Neutral question: How helpful or unhelpful do you find our customer service representatives to be?

6. Keep a balanced set of answer choices:

The answer choices you include in your survey can be another potential source of bias. Adopt a balanced set of options in your questions (e.g., a symmetrical Likert scale) to help mitigate this.

  • Likert example: Very helpful, helpful, neither helpful nor unhelpful, unhelpful, very unhelpful

7. Ask one question at a time:

Asking participants two questions in one (double-barreled questions) can cause confusion and prevent respondents from communicating their true opinions and preferences. If you want to know two different things, ask two separate questions.

  • Original: How would you rate our customer service and product reliability?
  • Improvement: Break it into two questions:
      • How would you rate our customer service?
      • How would you rate our products' reliability?

8. Write clear instructions:

Be brief and straightforward when writing instructions for your survey. Be sure to add information about: 

  • What the survey is for and why it’s important
  • How you’ll guarantee confidentiality 
  • How long the survey will take and the deadline
  • Who to contact with any questions

9. Use familiar language and avoid jargon:

Be mindful of word choice and use words that will make sense to survey participants. Avoid any jargon that is not part of the respondents’ everyday language.

10. Include a question at the end to capture any points of confusion:

This can be as simple as an optional, open-text question that asks, “Was there anything else that was confusing or unclear during the survey today?” You can check responses to this as you field your survey so you can catch any errors in your survey early! (See the section on comprehension questions below for more examples.)

11. Test out and QA your survey:

There's no worse feeling than sending out your survey and finding errors after the fact. Share your survey with others on your team to get a second pair of eyes on it before sending it out to participants! (Keep reading to see the Survey QA Checklist section below for detailed QA instructions.)

Survey tools to help you get started

Now that you’ve mastered (or are working on) the art of writing survey questions, it’s time to find reliable survey tools. Here are some of the survey tools we recommend and provide integrations for:

If your team is just getting started with research tools, we recommend trying a few different tools to see which one suits your needs. You’ll find it easier to gauge which functionalities serve your survey needs best.

Want to feel confident that you’re recruiting good, legit participants who are who they say they are? User Interviews’ Research Hub now offers special survey pricing to make recruitment more accessible for teams running high-volume surveys. 

Scale your surveys with ease and leave the recruitment worries to us, because your great surveys deserve great participants, too.

💫 Continuous user feedback surveys can help inform future UX research studies, improve product designs, and speed up iteration cycles. Explore best practices, tools, and examples for effective surveys in the UX Research Field Guide.

Survey QA checklist

Before you send off the survey to the masses, it's important to QA (quality assurance) test the survey to ensure it's set up for success!

To test the same survey multiple times, you may need to try a different browser, clear your browser history each time, or update the survey settings to allow for multiple responses from the same person.

Here's what the process looks like:

1. Do a thorough QA yourself first (see checklist below) to ensure your survey is programmed properly and reflects your finalized draft.

2. Then, find at least two other people to help QA your survey. The more unfamiliar these people are with your study, the more realistic the “participant” experience is.

3. Once you’ve incorporated any feedback from reviewers, you can do a final QA to make sure the survey is ready to launch to your actual respondents. You can choose to fully launch your survey or "soft-launch" to just a few respondents to ensure clarity and proper functioning.

Want to save these checklists for your next survey? Download the full-sized image for the initial QA checklist here and the final QA checklist here!

Comprehension questions

Even after you launch your survey, it’s good practice to check in about whether or not participants find the questions clear and easy to understand. You may find that you want to ask some additional questions in your soft launch to ensure understanding. Below are some examples you can use as a starting point!

Example 1:

  • (Optional) Was there anything else that was confusing or unclear during the survey today? (open text)

Example 2: 

Thanks for your participation so far! We’d like to ask you a couple of additional questions to understand your experience with the survey today.

  • Overall, how easy or difficult was it for you to complete this survey?
      • 1 (Very difficult) - 7 (Very easy)
      • (Optional) Briefly, tell us why you gave the rating above. (open text)
  • In this survey, there were two example scenarios presented involving XYZ. How helpful or not helpful did you find these scenarios to be?
      • 1 (Not at all helpful) - 7 (Very helpful)
      • (Optional) Briefly, tell us why you gave the rating above. (open text)
  • Were there any features that you were asked about that you did not understand? If so, please list those below. (open text)

Whether you choose to do a soft or full launch, use the responses you gather from your comprehension questions to continually improve surveys you create in the future. These questions will help you fine-tune your surveys and achieve better survey completion rates.

The value of “good” surveys 

Whether you’re a UX researcher or a product manager, everyone should use research tools like surveys to get a taste of what it means to be more user-centric.

Research is a craft that takes expertise and years to master, but educating yourself and others in your organization on research best practices empowers your team to start doing awesome research on their own.

Let’s shift our focus from organizational research silos to sharing resources and empowering others across functions. Even if you’re just getting started with research and don’t know how to ask the right questions, we’re here with guides (like this one) to help you start on the right path.

Share these survey best practices with your team to help them see the value of good research practices so they can run their own unmoderated tests and research at scale. 

User Interviews is the fastest and easiest way to recruit and manage participants for research. Get insights from any niche within our pool of over 1.5 million participants with Recruit, or build and manage your own panel with Research Hub. Visit the pricing page to get started.

Rachell Lee
Copywriter at Seamless.AI

Rachell is an SEO Copywriter at Seamless.AI and former Content Marketing Manager at User Interviews. Content writer. Marketing enthusiast. INFJ. Inspired by humans and their stories. She spends ridiculous amounts of time on Duolingo and cooking new recipes.
