Note to the reader:
This part of the field guide comes from our 2019 version of the UX Research Field Guide. Updated content for this chapter is coming soon!
In the last chapter, How to Recruit Participants for User Research Studies, we introduced you to the topic of screener surveys. Well, we actually introduced that topic way back in the chapter about the UXR process, and again when we walked you through how to create a user research plan.
Why do we keep harping on about screeners?
Because they're important! Screener surveys are what stand between you and hordes of unscrupulous, unqualified, uncommunicative participants. (That's a bit dramatic, but you get the point.)
Screener surveys are essential tools for qualitative research. And while they sound simple, they're actually very easy to get wrong, which is why we've dedicated a whole chapter to them!
Screener surveys, or just "screeners," are surveys people take before participating in a research study. They're made up of a few questions designed to weed out the folks who aren't your intended audience and capture the ones who are.
You can think of a screener survey as a sieve that captures the people who hit all your "must-have" criteria and filters out the ones who don't quite fit the bill.
If you want the right participants, you've got to design smart screening questions.
That's not always as straightforward as you might think. You have to ask questions in a somewhat roundabout way to avoid leading people to certain responses, but also in a clear way to make sure you're universally understood.
The devil is in the details, but luckily there are some pro strategies that anyone can learn and put to use right away.
The guiding principles explained in this chapter will help get you there.
We covered this bit in previous chapters, and the advice here is the same.
Clearly defined goals and objectives are must-have requirements for any user research project. These goals (aka your reason for doing research) should be hammered out well before you start writing your screener survey. (If they're not, head on back to the chapter on planning research to get some clarity. Go on, we'll wait...)
This is the part where you consider your research question, imagine the ideal participant who can give you the answers you need, and identify the targeting criteria (the things that must be true about them) needed to qualify for your study.
Your targeting criteria will typically be defined using a mixture of demographic, psychographic, and behavioral characteristics.
So… how do you decide what criteria to target by?
We covered all of those points in finer detail in the last chapter. If you're still struggling to pin down who to recruit, we recommend revisiting those recommendations before moving on to the next steps.
Take a closer look at your targeting criteria. Do they include demographic criteria like age, gender, race, income, etc.? Do they need to?
Demographics are the low-hanging fruit of screener surveys, but these characteristics have their limitations.
For instance, where people live is important if you're doing an in-person study, or if your app will only serve certain locations. But if you don't have a clear reason to target based on geography… don't. The same thing applies to other demographics. In many cases, a person's gender or how much they earn per year won't determine how they interact with a product.
Our assumptions about these characteristics are prone to bias, which can invalidate your study and do real harm to the people your research will ultimately impact.
Also, it's just bad form to waste valuable screener questions on criteria that aren't absolutely essential.
Screening for psychographics and behaviors lets you group people based on how they live, what they value, and how they relate to your product or category. That's the juicy stuff!
Letâs say you want to test for accessibility with a mix of gender identities, age ranges, and educational backgrounds. In this case, adding demographic criteria will allow you to target a diverse audience.
Not every question on your screener has to result in an automatic in or out; some questions can instead be used to filter for a variety of participants as a final step. Accept anyone who could be a fit based on any given question.
Once you know the characteristics of your target participants and you've broken that down into specific criteria for how you'll identify the people who qualify, it's time to write the questions that will help you find the good'uns.
The language you use in your screener is important. The more clearly worded and specific your questions are, the less likely participants are to get confused and answer inaccurately. Leave no room for misinterpretation!
Similarly, make sure the multiple-choice options you provide are carefully worded. Being clear in your answer options is just as important as being clear in your questions.
Have you ever taken a survey where, on a certain question, you were forced to choose between two or more answers that applied to you?
To avoid putting your audience in that position, make sure your answers have clear borders without any overlap. For example, when asking for numerical values (age, size, frequency, etc.), make sure your values are mutually exclusive.
For less definitive answers, or for answers that can't be made mutually exclusive, ask participants to select the answer that is most true, or give them the option to select all that apply rather than a single answer.
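For illustration, here's a quick way to sanity-check that a set of numeric answer brackets is both mutually exclusive (no overlaps) and gap-free before your screener goes live. This is a hypothetical sketch; the bracket values are made up:

```python
# Hypothetical age brackets for a screener question.
# Each bracket is (inclusive_min, inclusive_max).
brackets = [(18, 24), (25, 34), (35, 44), (45, 54), (55, 120)]

def brackets_are_valid(brackets):
    """Return True if brackets are mutually exclusive and have no gaps."""
    ordered = sorted(brackets)
    for (lo1, hi1), (lo2, hi2) in zip(ordered, ordered[1:]):
        if hi1 >= lo2:
            return False  # overlap: a respondent could pick two answers
        if lo2 != hi1 + 1:
            return False  # gap: a respondent might fit no answer at all
    return True

print(brackets_are_valid(brackets))              # True
print(brackets_are_valid([(18, 25), (25, 34)]))  # False: 25 appears in both
```

Either failure mode causes real trouble: overlaps force respondents to choose between two true answers, and gaps force them to pick an answer that isn't true.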
Don't make prospective participants complete your entire screener before finding out they don't qualify. Eliminate unqualified people early.
Think of the process as a funnel: you're refining your participants, then refining them further. Or think of it like weeding an overgrown garden. The biggest, tallest, most obvious weeds come out first, simply because they're the easiest to grab. Start with the questions that are most likely to weed people out.
The easiest way to do this is to write out your questions, rank them in order of importance, and look for any interdependencies.
For example, if you're doing an in-person study, ask about location right away. Location here is a must, and must-have criteria go first.
Before diving into questions about how people use apps on their smartphones, find out if they use a smartphone at all. Then move on to the questions that tap into specific behaviors, interests, and preferences.
If you're not working with a recruiting service that gathers demographics for you, ask any demographic questions you need to ensure a diverse participant pool.
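To make the funnel idea concrete, here's a minimal sketch of a screener that asks must-have questions first and stops at the first disqualifying answer. The questions, city, and qualifying answers are all hypothetical:

```python
# Hypothetical screener ordered as a funnel: must-have criteria first,
# so unqualified respondents exit early instead of answering everything.
questions = [
    ("In which city do you live?",
     lambda answer: answer == "Austin"),             # must-have: in-person study
    ("Do you use a smartphone?",
     lambda answer: answer == "Yes"),                # prerequisite for later questions
    ("How often do you edit photos on your phone?",
     lambda answer: answer in ("Daily", "Weekly")),  # behavioral criterion
]

def screen(answers):
    """Return (qualified, questions_asked), stopping at the first disqualifier."""
    asked = 0
    for (question, qualifies), answer in zip(questions, answers):
        asked += 1
        if not qualifies(answer):
            return False, asked  # screened out early: no wasted questions
    return True, asked

print(screen(["Dallas", "Yes", "Daily"]))  # (False, 1): out after one question
print(screen(["Austin", "Yes", "Daily"]))  # (True, 3): qualified
```

The ordering is the whole point: if the location question came last, the Dallas respondent would have answered three questions instead of one before being screened out.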
You know how some folks add a "right?" at the end of every sentence, so that you have no choice but to nod or shrug in agreement? Right?
That's an example of "leading." Leading questions influence people to answer in a certain way.
It's a handy conversational device if, say, you're a dogged prosecutor in a courtroom TV series and the judge will allow it (for the drama, obviously). But leading questions have no place in user research, and definitely not in your screener survey. This is not the place to try to validate your assumptions. You'll end up with skewed results or the wrong kind of participants.
Here's an example:
A good way to tell whether a question is leading: check whether it includes a hint or excludes possible answers.
Another way to avoid leading questions is to provide a series of unrelated options as answers.
For example, if you want to screen for users who have a high level of concern around internet privacy issues, rather than diving right into questions about internet privacy, you can ask a non-leading question like: "Which of the following topics is most concerning to you regarding internet use in your life?"
Likewise, avoid yes/no or true/false questions, which tend to be leading. Participants might answer in the way they believe will entitle them to participate in the study. Whenever possible, replace these questions with multiple-choice options, or provide a scale for degree of agreement with a given statement.
Exception: in cases where a black-and-white answer is required (for example, when asking if a person is willing or able to participate under the conditions of your study), a binary question will be your best bet.
"Loaded questions" are similar to leading questions (and the two are often conflated) in that they push the participant to answer a certain way. Loaded questions do this by making assumptions that are implicit in the question itself.
Here's an example:
A good way to tell whether a question is loaded: check whether it includes strong language or excludes possible answers.
If you create multiple-choice responses, don't assume that you've presented the user with every possible option. Even the best survey designers have their limitations. As Gandalf once said, "even the very wise[st survey designers] cannot see all ends." 🧙
Include a "none of the above," "I don't know," or "other" option to account for any outliers.
Otherwise, you could end up with someone in your study who doesn't belong there because they were forced to choose an answer that didn't apply to them. Likewise, you might screen good participants out because they didn't quite fit the answers you provided.
Spare yourself the pain of having to drag answers out of a reticent participant by screening uncommunicative people out of your study.
Screener surveys help you to get more value for your time and money on a per-participant basis. Sometimes that means excluding certain people who otherwise perfectly fit your ideal audience profile.
Screen for expressive participants by asking "articulation questions." These are open-ended questions designed to test a person's capacity to communicate. If a person can express their ideas with depth of thought, they're likely to be a helpful participant.
Including open-ended questions also helps weed out "professional participants" who are just looking to make a quick buck by qualifying for any and every study.
A screener survey is meant to help you find the candidates who are a perfect fit for your study.
Giving away too much information about the purpose of your study (by, say, revealing the name of your company to non-users, or telling participants exactly who you're looking to interview, a real mistake we've seen) can devalue the screening process and make your research less effective.
And this advice doesn't just apply to your screener. The title and description you give your study, the way you talk about it when you're recruiting participants, and the things you reveal in the lead-up to the session itself: it all matters.
For instance, let's say you're doing research on (yet another) photo editing app for influencers and people who actively post photos on social media. You might tell participants it's a study related to social media habits. That way they have some context (which can help them decide to click into the screener), but they don't know which social media habits (editing photos) you're looking for.
This makes it harder for professional testers to guess what you want, and more likely you'll get authentic responses.
Make sure your participants are clear about what they're doing and what stage of the process they're at.
The screener survey is a sort of dress rehearsal, and it helps to let the participant know they're not yet in the final round. Be sure the candidate knows what they're in for if they do make it.
If there are any possible deal-breakers (NDAs, for example), let them know up front.
And of course, be clear that they won't be paid unless they make it through the screener and complete the actual study.
Finally, keep your screener surveys short and sweet. We've seen some screener surveys get so long that participants mistook them for a (paid) research survey! If you're looking for a rough guideline on length, try to keep your screener to fewer than 10 questions.
Remember, the point of a screener survey is to help you find the right participants for your research. This list of screener questions is meant to be used for inspiration, and to help you gut-check your own screener. It's not a library of general-use questions to copy and paste in all circumstances.
With that caveat out of the way, here are some sample screening questions to ask, depending on the type of criteria you're filtering for.
Ask employment questions when you want to screen for people with a certain level of familiarity with a particular industry, or exclude those who work for competitors.
Example question: What industry do you work in?
Answers: A list of industries (retail, IT, healthcare, education, etc.)
Format: Single select
Example question: Which category best describes your job function?
Answers: A list of job functions (marketing, accounting, engineering, product design)
Format: Single select
If you need to test with novices, experienced users, or some combination of each, ask about familiarity with a given product.
Example question: Please rate your experience with {name of product}.
Answer: A range from expert to novice
Format: Scale rating or single select
Example question: Which of these tools do you use for work? (Select all that apply)
Answer: A list of software products
Format: Multiple select
Asking about frequency of use or action is useful when youâre screening for users who regularly do a specific task, or who used to behave in a certain way and then stopped.
Consider defining terms like "often" (every day) and "rarely" (once a year) so there's no guesswork.
Example question: Please tell us how often you {name the task}.
Answer: A range of frequencies from often to rarely
Format: Scale rating or single select
Example question: When was the last time you {name the task}?
Answer: A range of time from today to never.
Format: Scale rating or single select
Just because someone meets your screening criteria doesnât mean theyâre actually going to be willing to participate, especially if your study touches on sensitive topics like health, income, lifestyle, marital status, etc. Ask participants directly if theyâre willing and able to answer personal questions.
Example question: This study will require you to share openly about {examples}. Do you agree to share honestly about these subjects?
Answer: Agree or disagree
Format: Single select
If you're conducting medical research, recruitment and screening are considered separate activities. Recruitment (in which you reach out to research candidates and tell them about the planned study) is a pre-screening activity that can be done without informed consent. But even your pre-screening process may have to be submitted to your Institutional Review Board (IRB) before you can proceed.
Before you gather protected health information or obtain medical records to determine study eligibility, you'll need patients to sign a consent form to proceed with screening activities. Your screening script for interacting with possible participants and gathering information also has to be submitted for IRB review.
For more information about IRBs, refer to the FDA website. For your institution's specific rules regarding screening and research procedures, refer to its IRB.
If you want, your screening procedures can include a phone call to the potential candidates who seem most promising. Double-screening like this typically isn't necessary, but it can be a good way to be absolutely sure that you're getting the right people to talk to during your study.
We've seen researchers use double-screening when the study they're doing is high-profile (visible to important stakeholders in their organizations) or when they have a highly specific research need.
People who promise to show up for your study and don't will cost you time, and likely a moment of discomfort with colleagues and bosses who are forced to sit around waiting. Save yourself some trouble by preparing your participants and yourself to avoid the dreaded no-show.
Also consider recruiting a few extra folks who you can call in at the last minute, if need be.
Long story short: envision your ideal participant (know who they are and who they aren't) and build a screener survey that filters in the right people to answer your research question.
The specifics of how to get there are outlined above. Just remember to keep an open mind as to who these study participants might be, and don't limit yourself with prejudgments rooted in demographics.
We'll leave you with a few rules of thumb: eliminate unqualified people early, avoid leading and loaded questions, always include a "none of the above" option, and keep your screener short.
Oh, and did we mention that you can build and fully customize screener surveys with User Interviews? Launch a research project and easily build your screener, and we'll give you 3 free participants to get started.