
The User Research Incentive Calculator

A data-backed calculator for user research.

Trying to get quality insights from bad-fit participants is like trying to squeeze water from a stone—no matter how strong your research skills or plan are, you're just not going to get very far.  

One of the best ways to ensure that you're attracting the right participants to your study—the folks who can give you the insights you need to make important business and product decisions—is to offer them the right incentives.

Our User Research Incentive Calculator can help you figure out the right amount to offer your participants, depending on factors like audience and study type, remote or in-person status, and session duration. Our recommendations are based on data from nearly 20,000 completed research projects, which you can explore in more depth in The 2022 Research Incentives Report.

Keep reading for an overview of the different calculator inputs and variables.

📌 Think you can skip out on incentives? Think again. Here's why incentives are non-negotiable for UX research (and how to offer them on a shoestring budget).

Incentive calculator variables and inputs

Research takes time and effort for someone to participate in, and what that time is worth changes based on what you’re asking for in your study. 

We ask about a number of different variables that help us make a better recommendation for your study. This section explains what we mean by consumer, professional, moderated, unmoderated, remote, in-person, study type, duration, and more. 

🔬 Methodology

With help from our Analytics team, we analyzed a sample of nearly 20,000 completed projects recruited and filled with User Interviews within the last year. 

We then synthesized this data to come up with our incentive recommendations for different study types. Since every study is unique, we have given an optimal range (e.g. $90 - $200/hr) for each segment, as well as a single value that represents the midway point of that range (e.g. $145).

To do this, we looked at the average, most common, and second-most common incentive rates for each segment. The smallest number of the three became the minimum “optimal” value, while the largest became the maximum “optimal” value in our range. (We rounded these numbers to the nearest multiple of 5.)

Then, we averaged those two values to find the midway point (for anyone looking to be smack dab in the middle of the market). When the number of projects in a given segment was too small to rely on the average and most common numbers, we calculated our recommendations to the best of our ability. 
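The steps above boil down to a small calculation. Here's a sketch in Python (the function name and the example numbers are illustrative, not from our actual analysis pipeline):

```python
def incentive_recommendation(average, most_common, second_most_common):
    """Sketch of the range-and-midpoint calculation described above.

    Takes a segment's average, most common, and second-most common
    incentive rates; returns (low, high, midpoint).
    """
    def round5(x):
        # Round to the nearest multiple of 5
        return int(round(x / 5) * 5)

    values = [round5(average), round5(most_common), round5(second_most_common)]
    low, high = min(values), max(values)       # the "optimal" range
    midpoint = (low + high) / 2                # smack dab in the middle
    return low, high, midpoint

# Hypothetical segment: average $143/hr, most common $90/hr,
# second-most common $200/hr
low, high, mid = incentive_recommendation(143, 90, 200)
# low = 90, high = 200, mid = 145.0
```

This matches the example range above ($90 - $200/hr with a $145 midpoint).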

👉 You can read more about our methodology and explore additional data points like no-show rates and qualified-to-requested participant ratios in The 2022 Research Incentives Report.

Looking for the original 2019 incentives report? You can still find that data in the Ultimate Guide to User Research Incentives.

B2C (consumer) vs. B2B (professional)

Consumer (B2C) participants

When we say B2C or consumer, we’re talking about participants who were requested on our platform without any job title targeting. Of course, consumer participants often have jobs, and may also be recruited as professionals in other studies.

For example, if you’re running a study on grocery shopping and you need to talk to people between the ages of 25 and 35 who are the primary shoppers for their household, you’d be targeting consumers. 

You’d also be targeting consumers if you want to talk to people who have recently purchased a specific kind of lawnmower, maintain a garden, or live in towns of less than 30,000 people. If they need to be landscape architects, we’re talking about professionals. 

Professional (B2B) participants

Professional participants, on the other hand, are people who have been recruited through User Interviews with targeting based on their job titles and skills. These folks are more difficult to match, and often expect more compensation for their time. 

For example, professional participants might include people who are:

  • Landscape architects
  • Cardiologists
  • VPs of Marketing
  • Uber drivers
  • Product developers

These are all professionals, not to be confused with “professional participants” who try to participate in studies as a primary source of income. Learn the steps User Interviews takes to vet and review our participant audience to ensure a high-quality pool.

Moderated vs. unmoderated

Moderated sessions

Moderated sessions are those in which a researcher, designer, product manager, or other moderator observes—either in-person or remotely—as the participant takes part in the study. Generative interviews, focus groups, moderated usability tests, and field studies are all examples of moderated research activities. Moderated studies often produce qualitative data.

Unmoderated sessions

In an unmoderated session, the participant completes the activity on their own, without live observation by a researcher or moderator. Surveys, unmoderated usability tests, tree tests, diary studies, and first click tests are all examples of unmoderated tasks. These studies often produce quantitative data.

Remote vs. in-person

Remote studies

A remote study is any study that doesn’t require the participant and the researcher to be in the same physical location. The session can be held over video chat, a phone call, or a recorded submission. 

In-person studies

An in-person study requires the participant and the researcher to be in the same physical location. These could involve having the participant come to your office for an interview, hosting an in-person focus group, or conducting ethnographic research. In-person research can also include things like on-the-street research, or in-store shop-alongs.

Study types

1-1 interviews

Interviews are conversations with a single participant, in which a researcher asks questions about a topic of interest to gain a deeper understanding of participants’ attitudes, beliefs, desires, and experiences.

Note: We’ve used the term “interviews” since those are by far the most common type of 1:1 moderated studies that we see; however, this bucket may also include moderated usability testing with an interview component, card sorts, etc.

Focus groups

Focus groups are moderated conversations with a group of 5 to 10 participants in which a moderator asks the group a set of questions about a particular topic. 

Note: Similar to "interviews", we've used “focus group” as an umbrella term for any moderated research session with multiple participants, and this bucket could include things like co-design workshops as well. 

Multi-day studies

Multi-day studies are moderated (or partially moderated) studies that span the course of days, weeks, or even months—such as diary studies or long-term field studies. These methods produce rich, longitudinal data but require a higher time commitment from participants that often translates into higher incentive payouts.

Unmoderated tasks

Unmoderated tasks are completed by participants (independently, and often at their own pace) without the real-time observation of a researcher or moderator. There is no distinction in this report between remote and in-person unmoderated studies (the latter being rare and not adequately reflected in our dataset).

Session duration

Session duration refers to the length of time required for a research session. 

Our calculator breaks down duration times into 5 buckets, depending on study type and sample size:

  • Less than 15 minutes
  • 15–29 minutes
  • 30–60 minutes
  • 1 hour or more
  • Multiple days

📚 More definitions

To make sure we’re all on the same page, let’s take a moment to define the terms we’ll be using in this report.

Incentive

An incentive is a reward that you offer to participants in exchange for taking part in your research. Incentives can be anything, from a cool hat to cold hard cash. They encourage people to apply to your study, help you recruit a wider (or, in some cases, more targeted) pool of participants, and thank participants for their time. 

This report is focused on cash and cash-like incentives (like gift cards), since they are the most commonly offered types of incentives according to the 2022 State of User Research Report. (Plus, they’re easiest to quantify.) If your ideal participants are less likely to be motivated by cash-like incentives—or your organization doesn’t allow you to pay research participants—you can still use the insights in this report to extrapolate an appropriate non-cash incentive for your study, relative to other types of studies with higher or lower rates of remuneration.

Completed sessions

Completed sessions are research projects that were scheduled, held, and finished successfully. In other words, completed sessions are the building blocks for larger, closed research projects. 

Optimal range

Optimal range represents the minimum and maximum amounts among the following three values (rounded to the nearest multiple of 5): average, most common, and second most common incentive amount. 

👉 Distributing incentives doesn't have to be a hassle. User Interviews is the only purpose-built recruitment automation and panel management tool that lets you source, screen, track, and pay participants from your own panel, or from our 3-million-strong network. Sign up for a free account to get started.

Katryna Balboni
Content Director

Content marketer by day, thankless servant to cats Elaine Benes and Mr. Maxwell Sheffield by night. Loves to travel, has a terrible sense of direction. Bakes a mean chocolate tart, makes a mediocre cup of coffee. Thinks most pine trees are just okay. "Eclectic."
