Protect the validity of your research by recruiting the right people. Learn how to set up effective screeners, ensure fit, and avoid fraud.
Most seasoned researchers have seen at least one or two unqualified participants slip through their screeners.
An unqualified participant is one who is unable to provide you with complete, accurate, unbiased, and useful data for your specific study topic. By contrast, good-fit participants are engaged, relevant, and verified.
💡 If you’re consistently attracting poor-fit participants, it usually signals an issue with your recruitment process—and it’s important you’re able to recognize these participants before, during, and after recruiting so you know when it’s time to make changes.
⚠️ Before we start, it’s important to note that there are nuances to why participants might exhibit some of these behaviors or characteristics.
For example, inconsistent or low-effort answers might simply be due to not understanding the question, with no intent of dishonesty. Other folks might fib their way through screeners and have many reasons for doing so—perhaps a monetary incentive feels like a good reason to tell a white lie to a stranger, or participants might be unaware that participating in studies they don’t qualify for is a waste of someone else’s time and money.
Instead of raising the alarm for one-off instances, look for a combination of these characteristics or repeated behaviors to signal an issue with your recruiting process (not the participant). This isn’t about excluding people you don’t like—it’s about disqualifying folks who aren’t relevant to your current study to protect the integrity of your research.
The credibility of your participants can have a direct impact on the quality of your research. Below are 8 key factors to look out for as you recruit research participants.
The first interaction you’ll have with potential participants is through the screener survey.
You need to follow screener survey best practices to attract and qualify on-target participants while weeding out those who aren't a good fit for your study. If you’re struggling to get qualified participants past this first step, it might indicate an issue with your screener.
Here are some best practices to keep in mind when building and QA-ing your screener.
⏸️ Note: While inconsistent answers can signal dishonesty, they can also mean the participant simply misunderstood the question. Double-check that your screener is clear, digestible, and accessible before jumping to conclusions.
If you want extra assurance or need to screen for attributes that can’t be conveyed through a survey, consider double-screening. Double-screening is when you directly call or message participants who got through the screener survey to confirm their qualifications and availability. You can also use this process to ask for clarification or elaboration on their survey responses.
Double-screening is one of the many features that User Interviews offers to streamline the recruitment process. User Interviews is the only tool that lets you source, screen, track, and pay participants from your own panel using Research Hub, or from our 3-million-strong network using Recruit. Try it out today by signing up for a free account.
If you’re looking for qualified participants, then you need to meet them where they are: in trustworthy recruitment channels.
Like you, participants are concerned about trustworthiness. You’ll be handling their personal data, sometimes asking them sensitive questions, and promising them an incentive that won’t be paid out until they’ve already devoted their time and energy to the study. So, for their own safety and peace of mind, many participants prefer to sign up for studies through legit, verified recruitment channels.
Here are some things to look for in a credible recruitment channel:
⚡ Curious about what kind of audiences you could recruit with User Interviews? Check out our 2023 Research Panel Report for a deep-dive into the participants who make up our research panel: who they are, how we source and vet them, and what other researchers have to say about recruiting with User Interviews.
Another great way to verify the credibility of your participants is to do what everybody does in the digital age—check out their social media accounts!
Depending on the recruitment channel you use, you may or may not have access to the data you need to do this. Many panels don’t provide any participant data (you just get a participant ID number) or else they charge extra for it (sometimes referred to as “appended data”).
If you use User Interviews for recruiting, we verify participant social media accounts for you. You get access to rich profile data (free of charge!) which tells you who the applicant is and whether or not their social media profiles have been verified.
Often, the participants you can recruit through dedicated recruiting channels have participated in one or two studies in the past.
Repeat participation doesn’t necessarily signal dishonesty or poor fit. In many cases, participants apply for more studies because they enjoy sharing their opinions, partaking in research, and making a little extra money on the side—this means they’re likely to be more engaged and communicative, which is what you want in a good-fit participant!
If you’re using a recruiting channel that provides insight into past participant behavior, look for signals of engagement and reliability, such as:
For example, User Interviews automatically tracks participant behavior, including no-shows, researcher ratings, and past screener applications. This information informs our matching algorithm, making it stronger, smarter, and more efficient at connecting researchers with their target audience.
In fact, only 0.3% of active participants in the User Interviews panel have ever been flagged as suspicious, and our no-show rate for moderated sessions is under 8%. That’s low compared to the industry average of about 11%, with no-show rates reported as high as 20%!
📊 Interested in learning more about engagement and quality of the User Interviews recruitment panel? Download the 2023 Research Panel Report for a data deep-dive into our consumer and professional audiences, details on how we manage our panel to ensure quality, and testimonials from both researchers and participants.
One of the best ways to verify participant credibility is to ask the researchers who’ve worked with them in the past.
However, not all recruiting channels collect participant feedback ratings from researchers. You might be able to ask your peers for suggestions for specific tools or agencies, but many recruiting channels leave you in the dark about researchers’ previous experiences with specific participants.
If you use User Interviews for recruiting, you can easily view feedback ratings from other researchers and leave your own feedback as well. When you mark a session as completed, our platform prompts you to leave feedback about the session, jot down notes for your team, and save top participants to re-invite to future studies.
In fact, 98.3% of sessions completed through User Interviews result in positive feedback. As Ariel F. said in a G2 review:
“We have had excellent success with participants from User Interviews — we always recommend to our clients if they require a quick turnaround because we know we will get the highest quality participants right away.”
It might sound obvious, but good-fit participants should be representative of your target audience.
The trouble is, targeting capabilities vary from channel to channel, and it’s not always easy to find exactly who you’re looking for (and verify that they are, in fact, who they say they are).
Some participants might be more convenient to recruit than others (e.g. your family and friends, people local to your area, or tester panels provided by the testing tools you already use), but that doesn’t mean they’re going to give you the best, most relevant, and unbiased data.
As Joan Sargeant, PhD, says in their article on participants, analysis, and quality assurance in qualitative research:
“The subjects sampled must be able to inform important facets and perspectives related to the phenomenon being studied. For example, in a study looking at a professionalism intervention, representative participants could be considered by role (residents and faculty), perspective (those who approve/disapprove the intervention), experience level (junior and senior residents), and/or diversity (gender, ethnicity, other background).”
To ensure a credible recruit, look for recruiting channels or tools with proven targeting capabilities across all the attributes you need—whether they’re demographic, psychographic, technical, or otherwise.
If you recruit participants using User Interviews, you can target audiences with specific demographic attributes like age and income, professional attributes like job titles and skills, and technical attributes like their smartphone operating system. In fact, our 3-million-strong panel spans 7 countries (and counting), 73,000 professional occupations, and 140 industries.
👉 Sign up for a free account to start exploring feasibility or learn more about who you can recruit with our smart targeting system in the 2023 User Interviews Panel Report.
To recruit a truly diverse and representative audience, it’s good to talk to a variety of personalities and make sure you’re designing your study with accessibility in mind.
However, different research methods require certain participant characteristics to run successfully. For example:
Along with these method-specific behavioral requirements, you’ll also always need participants who can:
There is some nuance to recruiting for personality, of course—filtering out all the shy people obviously isn’t going to provide you with a representative sample of your target audience, and it’s important you’re not making value-based judgements about different personality types to avoid introducing your own bias. Plus, disengaged participants can signal other issues with your study, like an incentive that’s too low or unclear instructions.
Even so, you and your participants need to be able to understand each other on some level. If you aren't already, make sure you're designing your study to accommodate accessibility guidelines (such as the WCAG standards from W3C); enlisting translators for participants who speak different languages; and providing clarity throughout the process to ensure that all participants have what they need to engage with the study.
Beyond that, any other necessary characteristics can be zeroed in on through a combination of:
Sometimes, despite all your best efforts, an unqualified participant might get into your study. At this point, it’s critical to identify them so you can recruit someone else in their place or, at the very least, exclude their data from your final analysis.
While you’re conducting sessions with participants, you’ll want to look for:
As we’ve mentioned throughout this article, it’s always possible that some of these behaviors indicate other issues, such as unclear instructions. Don’t jump to conclusions, but do use these signals as an opportunity to double-check your recruitment process to verify that you’re attracting folks who fit your target audience.
Unqualified participants are those who are unwilling or unable to provide you with complete, accurate, unbiased, and useful data. They can slip into your study through any number of mistakes or oversights, from a poorly designed screener survey to simply choosing the wrong tools.
That’s why User Interviews exists: to provide stronger recruiting tools, higher-quality participants, and more powerful automation than any recruiting solution on the market. We’re also super transparent about the kinds of people and professions represented in our 3M+ panel, which is why we created the 2023 User Interviews Panel Report.
Download the UI Panel Report to evaluate the depth and quality of Recruit for your research needs, with:
The report should give you a good idea of who’s represented in our growing panel—but it only scratches the surface. The best way to gauge the feasibility of User Interviews for your specific recruiting needs is to sign up for a free account and launch your first project.
Product Education Manager
Marketer, writer, poet. Lizzy likes hiking, people-watching, thrift shopping, learning and sharing ideas. Her happiest memory is sitting on the shore of Lake Champlain in the summer of 2020, eating a clementine.