May 3, 2023
4 participant recruiting methods, 10 tools, dozens of fresh insights. Find out how much ROI you can get from popular UX research tools.
Editor’s note: Since initial publication on March 31, 2023, we have uncovered additional information about a leading all-in-one platform’s pricing which we have used to update this report. As with the previous version, these cost calculations are based on discovery calls with actual buyers, publicly available information, and good-faith estimates.
A study by Forrester found that every dollar invested in UX brings $100 in return—that’s a staggering 9900% ROI.
There’s little doubt that making design and business decisions based on high-quality insights can earn you money in the long term. But we’ve got to be frank here: UX research can be expensive. And we’ve got good reason to argue that not every dollar spent on UXR delivers the same return on investment—a lot depends on the tools and methods you use along the way.
It’s March (almost April) 2023. And right now, against a backdrop of economic uncertainty, UX research teams are stuck in a catch-22 of inflated UX demand and researcher burnout. We’re asked to do more with less time, less energy, and smaller (or, at the least, more scrutinized) budgets. In turn, we’re asking our tools to do more, too.
But which tools are up to the task? 🤔 In a seemingly ever-growing landscape of UXR software (our most recent UX Research Tools Map included over 230 logos), how do you determine which tools offer the best value? Should you opt for a layered, integrated stack made of multiple tools or a single “all-in-one” platform that promises an end-to-end solution?
In the following report, we take a close look at the cost of different user research solutions—not just the sticker price of a SaaS subscription, but also the labor cost in terms of both time and money.
For starters, we compared four methods of research recruiting (User Interviews, other recruiting tools, agencies, and DIY).
Next, we compared the cost of a popular “all-in-one” platform against integrated user research tool kits to understand how each approach stacks up.
(Don’t worry—you won’t have to take our word for it. We’ve covered our methodology and data in the sections below.)
✅ User Interviews offers high ROI for targeted, low-volume recruiting. For teams that need an average of 10 B2B participants/month, UI offers an ROI of between 87% and 168%. For 50 B2C participants, the ROI is between 92% and 283%.
📈 User Interviews is the most economical recruiting method at scale. Recruiting 100 qualitative participants per month with User Interviews costs $5,050 (B2C) to $5,390 (B2B) less than our leading competitor, $8,587 to $11,787 less than an agency, and $5,580 to $16,452 less than DIY.
⚖️ Agencies can actually be cheaper than DIY recruiting—but they’re more expensive than recruiting tools like User Interviews. Recruiting 20 B2B participants with an agency saves you $810 and over 63 hours compared to DIY recruiting—but still costs $2,268 more than User Interviews.
💰 A single all-in-one tool can be vastly more expensive than 5 best-in-class tools combined. It costs less money to do research with 50 participants using User Interviews, Figma, Zoom, Sprig, and Lookback than it does to conduct 5 moderated sessions with an all-in-one ($2,008 vs $2,645). And for 100 monthly sessions, you’re looking at cost savings of $38,300 per month—or a staggering $459,600 per year (1037% ROI).
👉 TL;DR: Compared to other self-serve recruitment tools, DIY recruiting, agency recruiting, and “all-in-one” UX research tools, User Interviews is the most economical way to recruit participants—especially when integrated with other best-in-class tools in a layered UX tech stack.
👀 Sign up now for a free account and see for yourself!
First, a quick note on bias.
In this report, we will be comparing the cost and benefits of User Interviews against alternative solutions. As such, we acknowledge that we are not wholly unbiased.
But we’ve done our best to be objective in our calculations, and have made an effort to present our methodology as transparently as possible. We feel confident that the numbers in this report are realistic estimates of the potential cost savings and ROI offered by each tooling option.
We hope it serves as a useful guide for teams of all sizes as you evaluate your options and helps you articulate a compelling business case for investing in the tools your team needs to do quality research at scale.
B2C vs B2B: In this context, B2B refers to participants chosen based on their industry, job title, or professional expertise. Both B2B and B2C participants may be screened for demographic or behavioral criteria.
No-shows: Participants who accept a study invite but never show up to the session.
Labor cost: The cost of the time researchers actively spend on recruitment, based on the average US salary of UXRs and people who do research.
👇 Now, here’s a step-by-step progression of how our 2023 ROI report came together.
As with any research project, the first thing we needed to do was define our approach. For this report, we decided to focus on qualitative recruiting and research.
When we talk about the cost and ROI of User Interviews in this report, we’re talking about our best-in-class recruitment solution, Recruit.
We were interested in understanding not only the ROI and cost savings of User Interviews compared to recruiting alternatives, but also how layered tool stacks that include our own solution, well, stack up against all-in-ones.
But simply comparing the sticker price of each solution doesn’t illustrate the full value of tools. There are all sorts of hard-to-quantify factors—like participant quality, the flexibility of a modular tool stack, etc.—that we couldn’t factor into our calculations (we talk about those in the “What the quantitative data doesn’t tell us” section below).
We did, however, work out some formulas for calculating the impact of time spent, no-show rates, and multiple users on the actual cost of each solution, which we’ll get into below.
Given the scope of our report, we worked out that we would need the following pieces of information in order to calculate cost savings and ROI:
Meeting this first data requirement sounded simpler than it was. For one thing, we had to work out the cost of each tool for any given number of participants. For another, SaaS pricing can be frustratingly hard to untangle.
User Interviews costs
The User Interviews cost per participant is based on the session price of the most appropriate pricing tier for the volume of participants required, as of the time of writing (March 2023).
For example, if you need 12 participants on average each month, the cost per participant = $26. That is the session price for our Essential subscription, which costs $3,900 per year for 150 sessions (at a monthly cost of $325 for 12.5 sessions on average). And the cost of B2B recruiting is twice that of B2C, so the UI B2B price multiplier = 2.
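As a quick sanity check, the per-participant price is just a tier’s annual cost divided by its annual session allowance, doubled for B2B. A sketch using the Essential plan figures above:

```python
def per_participant_price(annual_cost, annual_sessions, b2b=False):
    """Session price for a subscription tier; B2B recruits cost 2x B2C."""
    price = annual_cost / annual_sessions
    return price * 2 if b2b else price

# Essential plan: $3,900/year for 150 sessions
print(per_participant_price(3900, 150))            # 26.0 per B2C participant
print(per_participant_price(3900, 150, b2b=True))  # 52.0 per B2B participant
```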
Recruiting agency costs
We leaned on some insider knowledge about agency pricing to work out a per participant cost. We used $100 as the agency cost per B2C participant and $150 as the agency cost per B2B participant. (The agency B2B price multiplier = 1.5)
Competitor recruiting tool costs
Next, we used the published pricing tiers for top User Interviews competitors, calculated the cost for any given number of participants, and used the average for our competitor per-participant price estimate.
DIY recruiting costs
With DIY recruiting, there’s no tooling cost. The cost of recruiting yourself is based on time and salary estimates (see below for calculations).
User Interviews is an ecosystem player; our panel is testing tool agnostic, meaning you can recruit participants for research with whatever testing tools you have in your stack.
Cost of user research tools
To show you what a layered solution looks like and how it compares to an all-in-one platform, we also worked out the cost of UXR tools for things like interviews, usability testing, card sorting, prototype and concept testing, and more. Those tools are Figma, Zoom, Maze, Sprig, Lookback, Miro, Optimal Workshop, and Usability Hub.
Cost of a popular all-in-one platform
As for the all-in-one platform that we’d be pitting these layered stacks against, we decided to use a major user testing platform as our foil.
This proved… challenging. That’s because their pricing is extremely opaque.
Through speaking with actual buyers, we’ve learned that plans often start around $50,000—and that some companies are paying nearly $1 million per year for use of their platform + panel. (Yes, you read that right.)
In the absence of official pricing information, we have relied on insights from discovery calls, competitive research with actual buyers, publicly available information (including comments in Reddit threads, like this one in r/UXResearch and this one in r/userexperience), and some educated guesswork.
Here’s what we can ascertain: Pricing for major all-in-one platforms scales in a way that, to quote one Reddit user, is “hella complicated.” According to users of one platform:
Some users mention volume discounts, so we factored that into our math.
Now, we’re just making an educated guess about how these discounts might work, but let’s say that the starting package of $15,000 includes the platform itself and 40 moderated sessions with panelists (at a rate of $300 per session). Let’s assume that the per-session cost drops by $2/session for every additional 60 sessions.
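To make that guesswork concrete, here is one way to model the hypothetical discount schedule in code. The block-of-60 rounding is our own assumption, not anything the platform has published:

```python
def est_session_rate(total_sessions):
    """Hypothetical volume-discount model for an all-in-one platform:
    $300/session baseline, minus $2/session for each additional full
    block of 60 sessions beyond the starting 40. Pure estimation."""
    extra_blocks = max(0, total_sessions - 40) // 60
    return 300 - 2 * extra_blocks

print(est_session_rate(40))   # 300
print(est_session_rate(160))  # 296 (two full extra blocks of 60)
```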
Using this model, we worked out the following (estimated) pricing tiers for an enterprise-grade all-in-one solution.
🧮 Estimated subscription costs for an all-in-one solution
Disclaimer: Once again, these are estimated rates and represent the hypothetical costs of an enterprise all-in-one UXR platform.
We used these rates as the basis for our calculations, but we assume that most folks will pay somewhat more or less than this. If you’ve been given a quote from UserTesting or another all-in-one (or are already a customer), you could simply replace these numbers with your quote to get your actual ROI.
(Psst—we’re working on an interactive cost savings calculator to help you do this math! Subscribe to our newsletter to get notified when it drops.)
For these inputs, we made educated guesses based on the amount of active user effort each recruiting method requires.
In other words, these estimates don’t include the full length of time it takes to fill a study with quality participants—the hours, days, or weeks you might spend waiting for the right folks to respond to your call. (On that score, agencies are notoriously slow—they can take weeks to fill even small studies. At User Interviews, on the other hand, we specialize in fast participant matching—the median time to first recruit is just 2 hours, meaning you can fill a study with quality participants in a matter of hours.)
When it comes to the hours researchers actively spend on recruiting with each method, our assumption was that DIY would take the longest by a long shot, followed by self-serve tools like User Interviews and competitors, followed by recruiting agencies and built-in tester panels.
Here’s a summary of our assumptions and the time estimates we used in our calculations (⏳= Time spent actively recruiting/reviewing participants.):
User Interviews and other recruiting tools: We automate scheduling, participant emails, incentive payouts, etc—but the more participants you need, the more time it takes to review recruits.
⏳ 2.5 hours for the first 10 participants, +15 minutes for every additional 10 participants.
DIY: You’re doing everything yourself, without an existing user panel to tap into.
⏳ 1.15 hours for every B2C participant; 3 hours for every B2B participant.
Agencies: The labor cost is minimal because you’re paying a premium to outsource recruiting.
⏳ 2.5 hours for the first 50 participants, +15 minutes for every additional 50 participants.
All-in-one's built-in tester panel: Built-in tester panels promise fast, high-volume recruitment (vs. the targeted recruitment you’d look for in qualitative studies).
⏳2.5 hours for the first 50 participants, +15 minutes for every additional 50 participants.
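Under these assumptions, active recruiting time can be sketched as a small function. The method names and the block rounding (partial groups of 10 or 50 count as a full block) are our own interpretation; no-show padding is handled separately in a later section.

```python
import math

def active_hours(method, n, b2b=False):
    """Estimated hours actively spent recruiting n participants,
    per the time assumptions above (no-show padding excluded)."""
    if method == "diy":
        return n * (3.0 if b2b else 1.15)
    if method == "self_serve":  # User Interviews and similar tools
        blocks = max(0, math.ceil((n - 10) / 10))
        return 2.5 + 0.25 * blocks
    # "agency" and "panel" share the same estimate
    blocks = max(0, math.ceil((n - 50) / 50))
    return 2.5 + 0.25 * blocks

print(active_hours("self_serve", 50))      # 3.5 hours
print(active_hours("diy", 11, b2b=True))   # 33.0 hours
```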
On teams with multiple users of UXR tools, we assumed that the workload would be partially shared, so the labor cost wouldn’t simply increase 100% for each additional user. For multiple users, we estimated a +25% increase in time spent for each additional user—so on a team with 3 users, the time multiplier = 1.5. If you have 8 people regularly doing this work, that multiplier = 2.75.
Labor cost for multiple users = (single-user labor cost) x (1 + (number of users - 1) x 0.25)
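Expressed as a tiny helper (a sketch of exactly the assumption above, nothing more):

```python
def team_time_multiplier(users):
    # each user beyond the first adds 25% to single-user recruiting time
    return 1 + (users - 1) * 0.25

print(team_time_multiplier(3))  # 1.5
print(team_time_multiplier(8))  # 2.75
```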
In order to understand the dollar value of each hour spent on recruiting, we needed to work out the average salary for the typical users of these tools.
To do this, we used Glassdoor to pull the average annual salary for mid-career professionals (folks with 4-9 years experience) in 9 US cities.
The job titles we searched for were: UX Researcher, Research Operations Manager, UX Designer, Product Manager, Product Marketer. And the cities (which we chose because they are tech hubs for which Glassdoor had enough salary data to be “confident” in their estimates) were: Atlanta, Austin, Boston, Chicago, Denver, NYC, San Francisco, San Jose, Seattle.
Average salary of people who do research: $134,632 (≈ $64.73/hr, assuming 2,080 working hours per year)
Sometimes participants don’t show up. When you recruit with User Interviews, we automatically recruit extra participants to account for the average no-show rate. With DIY, there’s no such security—so it’s best practice to recruit additional participants in the event that some folks ghost ya.
According to data from projects completed with our platform, the average no-show rate for moderated studies (between Q4 2022 and Q1 2023) was 10%. That means for every 10 participants you actually need, you should recruit 11.
Average no-show rate for qualitative studies: 10%
Once we had all our data assembled, it was time to crunch some numbers.
To calculate labor cost—which reflects the time UXRs actively spend on recruiting—we multiplied the average hourly salary by the hours spent (per the time estimates above).
Labor cost of recruiting n participants = (average hourly salary) x (hours spent recruiting n participants)
For example, the labor cost of recruiting 10 participants with User Interviews is: $64.73 x 2.5 hours.
If we’re looking at a scenario with multiple users, we have to multiply by +25% for each additional user (so for a 3-person team using UI to recruit 10 participants, the formula would be: Labor cost = $64.73 x 2.5 x 1.5).
Next, we had to calculate the cost of recruiting n participants with different solutions. The calculated cost includes both the sticker price for recruiting n participants and the labor cost of doing the same.
Cost of recruiting n participants = (Sticker price for n participants) + (labor cost for n participants)
In actuality, the formula for calculating the cost of each solution looked slightly different.
Cost of recruiting with User Interviews
The cost for recruiting B2C participants with User Interviews was calculated based on labor cost and subscription tier pricing:
Cost to recruit with User Interviews (B2C) = (UI labor cost) + (per participant price x number of participants)
For B2B, we factor in a 100% price increase:
Cost to recruit with User Interviews (B2B) = (UI labor cost) + (per participant price x number of participants x 2)
Cost of recruiting with an agency
We used the same formula to calculate the cost of recruiting with an agency (but remember that the user’s time cost scales at a slower rate and the per participant cost is higher.)
The cost for recruiting B2C participants with an agency is therefore:
Cost to recruit with an agency (B2C) = (Agency labor cost) + (per participant price x number of participants)
For B2B, we factor in a 50% price increase:
Cost to recruit with an agency (B2B) = (Agency labor cost) + (per participant price x number of participants x 1.5)
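The User Interviews and agency formulas share one shape: labor cost plus a per-participant sticker price times a B2B multiplier. A generic sketch (the parameter values in the example are illustrative, not quoted rates):

```python
def recruiting_cost(n, per_participant_price, labor_cost, b2b_multiplier=1.0):
    """Generic form of the formulas above: tooling cost plus labor.
    b2b_multiplier: 2.0 for User Interviews B2B, 1.5 for agency B2B,
    1.0 for any B2C recruit."""
    return labor_cost + per_participant_price * n * b2b_multiplier

# e.g. 10 B2B participants at a $26 session price with ~$162 of labor
print(round(recruiting_cost(10, 26, 161.83, b2b_multiplier=2.0), 2))  # 681.83
```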
Cost of DIY recruiting:
With DIY recruiting, the labor cost scales very rapidly—+1.15 hours for every B2C participant and +3 hours for every B2B participant. For this method, we also had to factor in no-shows.
The cost of DIY B2C recruiting is therefore:
Cost of DIY recruiting (B2C) = (per-participant labor cost) x (number of participants + no-shows)
For DIY B2B recruiting, we apply a 2.6x time multiplier (3 hours ÷ 1.15 hours ≈ 2.6) to the first half of that formula:
Cost of DIY recruiting (B2B) = (per-participant labor cost x 2.6) x (number of participants + no-shows)
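Putting the DIY pieces together (the average salary, the per-participant hours, and the 10% no-show buffer), a sketch of the calculation using the report’s figures:

```python
import math

AVG_HOURLY = 64.73    # average hourly salary of people who do research
NO_SHOW_RATE = 0.10   # average no-show rate for qualitative studies

def diy_cost(n, b2b=False):
    """Labor-only cost of DIY recruiting, including no-show padding."""
    hours_per = 3.0 if b2b else 1.15  # this ratio is the ~2.6x multiplier
    recruits = math.ceil(n * (1 + NO_SHOW_RATE))
    return hours_per * AVG_HOURLY * recruits

# 10 B2B participants -> 11 recruits x 3 hrs = 33 hrs of salaried time
print(round(diy_cost(10, b2b=True), 2))  # 2136.09
```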
Cost of recruiting with other tools:
The formulas for our competitors’ pricing get a little hairy because of differences in pricing structure. For example, one company charges for each additional screening criteria, while another bases their pricing structure on the incentive amount (we assumed a B2C incentive of $80/hr and used the average B2B incentive of $100/hr in that case).
We averaged the price per participant across competitors and used the UI labor cost value to work out an average cost for n participants.
Once we worked out the cost for n participants according to each variable (recruiting method, number of users, B2C or B2B), we could calculate the cost savings and ROI of switching to User Interviews.
First, we had to calculate the cost savings:
Cost savings = (cost of recruiting n participants with UI alternative) - (cost of recruiting n participants with UI)
And from there, the ROI can be worked out using a simple ROI formula:
ROI = (cost savings) ÷ (cost of recruiting n participants with UI)
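Both formulas together, as a small helper (the input costs in the example are made-up placeholders, not figures from the report):

```python
def savings_and_roi(alt_cost, ui_cost):
    """Cost savings and ROI (as a percentage) of switching to UI."""
    savings = alt_cost - ui_cost
    return savings, 100 * savings / ui_cost

savings, roi = savings_and_roi(alt_cost=2500.0, ui_cost=1000.0)
print(savings, roi)  # 1500.0 150.0
```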
The formulas above helped us work out the cost of recruiting participants… but recruitment is just one step in the user research process. Conducting interviews, usability tests, and other types of UX research requires additional tooling.
You probably already have some UXR staples in your tool stack; Figma, Miro, and Zoom/Google Meet/MS Teams, for example, are fixtures in many organizations.
🔎 But let’s say you’re starting from square one. You’ve got no existing subscriptions. How much will it cost to assemble a tool stack that lets you conduct a variety of user research? And how does that compare to an all-in-one platform that promises to solve all your research needs?
To answer these questions, we needed to work out the cost of popular user research tools at different scales. That is, which subscription would you need to conduct an average of 5, 10, 20, or 150 moderated studies per month?
Here’s a (very abridged) glimpse of what those pricing charts looked like:
(We’ve been having dreams about rows, columns, and formulas for weeks.)
Once we had those rates worked out, we were able to add up the costs of different tool stacks (User Interviews + Maze + Zoom, for one example. Or User Interviews + Lookback, for another).
Then, using the same cost savings and ROI formulas above, we compared different tool stacks against the monthly cost of our all-in-one example.
Cost savings = (all-in-one monthly subscription price + labor cost for n participants) - (cost of doing research with n participants with tool stack)
ROI = (cost savings) ÷ (cost of doing research with n participants with tool stack)
We also calculated the cost of these tool stacks for different team sizes (1 user, 3 users, 8 users) using the following formula:
Labor cost for multiple users = (single-user labor cost) x (1 + (number of users - 1) x 0.25)
Using the formulas above, we analyzed the cost of 4 different recruiting methods (User Interviews, DIY, agencies, and other recruiting tools), depending on how many participants a user is looking to recruit on a monthly basis.
Here’s what we learned 👇
Take a look at the chart below. You’ll see that the cost of recruiting 5 to 10 participants is somewhat similar, regardless of which method you choose—but the cost rises sharply for all but User Interviews as the size of the recruit increases.
If you’re recruiting with an agency, it will cost you $7,037 on average (B2B + B2C) to recruit 50 participants. For 100 participants, you’re looking at $13,944. For 200 participants, a soaring $27,710 on average.
If you’re recruiting on your own, the average (B2B + B2C) cost of recruiting 50 participants comes out to $7,387. For 100 participants, you’re looking at $14,774. And if you decided to recruit 200 participants on your own (😳), those user hours could add up to $29,548.
The average (B2B + B2C) monthly cost of recruiting with our leading competitors is $4,562 for 50 participants, $8,977 for 100, and $17,809 for 200 participants.
Meanwhile, with User Interviews, the monthly cost of recruiting 50 participants comes out to $2,102 (B2C+B2B), including the labor cost. For 100 participants, you’re looking at $3,757 on average; for 200, you can expect to pay $7,069.
In short, as the size of your recruit increases, you’d be spending thousands of dollars less with User Interviews than you would with an alternative solution.
Now let’s look at how this breaks down for B2C and B2B recruiting. (Remember that the cost includes labor cost of time spent actively recruiting).
Let’s say you need to recruit 20 B2B participants in an average month. With a competitor solution, you’re already looking at nearly $1,000 in extra recruiting costs compared to the amount you would pay with User Interviews. With an agency, you can expect to pay an additional $2,200 for the same recruit. And DIY-ing it could run you over $3,000 in extra costs.
The story is similar for B2C recruiting, although you’ll notice that the DIY approach fares better here in terms of ROI. Not great—but better.
That’s because recruiting general consumers is much less time consuming than trying to target by occupation or professional expertise—especially at scale. Still, you can expect to pay roughly $1,600 more for a 20-person B2C recruit than you would with User Interviews; for 100 participants, your additional cost may be over $16,000.
As your recruiting needs scale, DIY, agency, and competitor methods all offer diminishing returns.
Let’s say you need to recruit 20 B2B participants in a month. Making the switch from DIY recruiting to User Interviews results in an ROI of 258%. The comparative ROI increases to 292% for 50 participants, 335% for 100 participants, and so on. This isn’t too surprising since DIY recruiting is so expensive compared to most tools.
When switching from agency recruiting to User Interviews, the ROI for 20 B2B participants is 190%. For 50 B2B participants, you’re looking at an ROI of 209% compared to agency prices. For 100 participants, the ROI is 240%. As with DIY, the numbers just grow from there.
Compared to other recruiting solutions, User Interviews offers an ROI of 140% for 20 participants, 164% for 50 participants, and 194% for 100 participants per month.
The charts look similar for B2C recruiting 👇
✨The more research participants you need in a given month, the more money you’ll save with User Interviews compared to DIY, agency, or alternative tooling methods. ✨
Now let’s take a closer look at how User Interviews compares to each of these methods.
Research teams looking to recruit between 10 and 20 participants in a month can save an average of $713 for B2C recruiting and $719 for B2B recruiting with User Interviews compared to the average rates of leading competitors.
For 50 to 100 participants, research teams save an average of $3,773 for B2C recruiting and $4,013 for B2B recruiting with User Interviews compared to other self-serve tools.
Teams that do a lot of research (like, 200 qualitative participants per month), are looking at potential cost savings of over $11,000 for B2B and over $10,000 for B2C.
And the threshold for positive ROI starts with a single participant (on our PAYGO plan).
If you need 10 participants a month, User Interviews offers an ROI of 47% for B2C and 87% for B2B. ROI only grows as recruit size increases—for a monthly recruiting need of 300 participants, you’re looking at an ROI of between 142% (B2C) and 243% (B2B) when you use User Interviews instead of a competitor solution.
Unless you genuinely need 4 or fewer B2C participants or just a single B2B participant, handling recruiting on your own is simply not the most economical or time-efficient approach.
For 10 to 20 participants, research teams save an average of $659 for B2C recruiting and $2,245 for B2B recruiting with User Interviews compared to a DIY approach.
Teams that recruit between 50 and 100 participants per month can save an average of $4,141 for B2C recruiting and $12,281 for B2B recruiting with User Interviews, compared to recruiting on their own.
We genuinely don’t know why or how you would recruit 200 participants a month on your own. But if you did, you’d be paying between $16,376 (B2C) and $42,720 (B2B) for every month you were recruiting. That’s between $11,507 and $33,450 more than you’d pay with User Interviews.
With this method, the threshold for positive ROI starts at just 2 participants for B2B and 5 participants for B2C.
By the time you’ve recruited 10 participants in a month, you can expect an ROI of between 68% (B2C) and 168% (B2B). That number keeps growing as your recruit scales; for 300 participants you’re looking at a return on investment of 271% (B2C) to 408% (B2B).
Out of the three alternatives, switching from recruiting B2B participants on your own offers the highest average ROI of 345%.
With DIY recruiting, there’s no tooling cost (well, you may want to invest in a Calendly subscription and an espresso machine, but that’s another story). The cost of recruiting all boils down to the number of salaried hours it takes a user to recruit any given number of participants.
For 10 B2B participants, you can anticipate spending about 33 hours recruiting on your own. Compare that to the 2.5 hours you’d spend with User Interviews or the 2.5 hours with an agency and the downside of this method becomes obvious.
And if you plan to recruit 50 B2B participants in a month, you better not be planning to do anything else with your month—a 50-person B2B recruit will take you an estimated 165 hours on your own. With User Interviews, on the other hand, it would still only take about 3.5 hours.
💰How much is your time worth?
Say you’re a product manager looking to recruit 20 B2B participants for some continuous research.
Remember, the average no-show rate is 10%, so you’ll need to recruit 10% more people than you need—in this case, 22 participants. If we run with the assumption that each B2B recruit takes up to 3 hours, the entire process could take 66 hours—more than a week and a half of full-time work.
According to Glassdoor salary data for 9 US cities, the average hourly salary pay for a mid-career PM is $73.20. That means that those 66 user hours will cost you (well, your company) $4,831.
On the same PM salary, the actual cost of recruiting 20 B2B participants with User Interviews is just $870—or a cost savings of $3,961 and as much as 63 hours of work.
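The arithmetic in this example can be checked in a few lines (the figures are the ones quoted above):

```python
import math

target = 20                          # B2B participants needed
recruits = math.ceil(target * 1.10)  # pad for the 10% no-show rate
hours = recruits * 3                 # ~3 hours of DIY work per B2B recruit
pm_hourly = 73.20                    # mid-career PM, Glassdoor average
cost = hours * pm_hourly

print(recruits, hours, round(cost))  # 22 66 4831
```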
Recruiting with an agency can be more economical than DIY recruiting at scale, but it’s still much more expensive than User Interviews or similar self-serve options.
Teams that recruit 10 to 20 participants on a monthly basis will save an average of $1,242 for B2C recruiting and $1,677 for B2B recruiting with User Interviews, compared to recruiting agencies.
For larger recruits of 50 to 100 participants, research teams stand to save an average of $6,428 for B2C and $8,814 for B2B.
Making the switch from agency recruiting starts to yield a positive ROI from the very first recruit (roughly 30%).
Teams that need to recruit an average of 10 participants per month are looking at an ROI of between 127% (B2B) and 159% (B2C); for 20 participants, the ROI of recruiting with User Interviews over agencies is between 190% (B2B) and 240% (B2C).
For a 50-person recruit, User Interviews offers an ROI of 209% (B2B) to 283% (B2C) compared to a typical recruiting agency. For 100 participants, the ROI is between 240% (B2B) and 329% (B2C).
As with other methods, the returns scale with the volume of participants required—by the time your recruiting needs scale to 300 participants a month, User Interviews delivers an ROI between 294% (B2B) and 403% (B2C).
That’s an average ROI of 249% for B2B recruiting and 339% for B2C recruiting.
💸 See the cost savings for yourself. Sign up with a free account to make the switch to User Interviews today.
A solid recruiting solution is just one part of the user research toolkit.
The most basic UXR stack requires a video conferencing platform (for interviews), a whiteboard/prototyping tool (for prototype testing, scrappy card sorting, moderated studies), and word documents/spreadsheets (for notetaking and analysis). And a participant recruitment tool, of course.
But the more research you do, the more sophisticated your tools need to be. In addition to the tools mentioned above, a complete UXR toolkit will typically include a platform for running moderated, qualitative studies (like card sorting, prototype testing, and moderated usability tests) and one for unmoderated usability testing and surveys.
🗺️ Explore your options: Check out the 2022 UX Research Tools Map, a complete guide to the user research software landscape
Now, you might think that buying a single tool for all your research needs would be cheaper than multiple subscriptions—but in most cases, you’d be wrong.
As we found when we added up the costs for different tool stacks and compared them to an enterprise all-in-one platform, a modular toolkit of best-in-class recruiting and testing tools is not only a more flexible option, but can often be the more economical one as well.
Anyway, here’s how different UXR stacks stack up.
According to the 2022 State of User Research Report, the majority of UX research teams utilize multipurpose, collaborative tools like Miro, Figma, and Zoom. And you can do solid qualitative research using just these tools.
For example, User Interviews + Miro + Figma or User Interviews + Figma + Zoom offers:
The costs above are based on a single user of a free Zoom Basic plan, paired with a Miro Business plan ($16/user/month) and a Figma Professional plan ($12/user/month). Both tools offer a free version, but the limitations make them impractical for serious UX research.
The two stacks are more or less the same, pricing wise. Combined with User Interviews, either pairing would save you over $2,000 when researching with 5 participants, compared to an all-in-one panel. For a 20-participant recruit, you’re looking at a cost savings of over $7,600. Need 100 participants a month? Switching from an all-in-one could save you nearly $40,000.
Perhaps you’ve already got a Zoom subscription and a tool like Figma or Miro at your disposal—those costs aren’t a factor here and you’re just looking to suss out your options for a paired recruiting + testing solution.
Here’s what your cost savings could look like:
You’ll notice that the ROI of each stack peaks around 100 participants per month, with an average ROI of 1366% (cost savings around $38-40k per month—or $456,000 to $480,000 a year).
Assuming that the all-in-one platform offers a volume discount, we might expect this ROI to start diminishing with very large numbers of participants. If you need 300 qualitative participants a month (3600 per year), switching from an all-in-one will deliver a positive ROI of around “only” 600-700%.
By pairing User Interviews with Maze, you get:
Compared to a major all-in-one platform, and based on the rates above, you could save over $5,000 on a 10-participant study recruited through User Interviews. For 20 participants, you would save over $7,600. To research with 100 participants in a given month, your cost savings could be nearly $40,000—an ROI of 1466%.
Note: Maze pricing calculations are based on the most affordable subscription for your needs. If you’re recruiting 30 or fewer participants per month, you’ll probably do just fine on their free plan. Otherwise, we applied a flat rate of $75/month for their Professional tier for up to 180 participants per month, or an estimated $150/month for a custom, high-volume plan.
With User Interviews + Optimal Workshop, you can do:
If you’re recruiting 10 participants each month, you can save over $4,800 compared to our all-in-one example alone. For 20 participants, the cost savings are over $7,400; for 50, over $14,600. With a monthly recruiting need of 100 participants, you’re looking at monthly cost savings of over $39,000 (1392% ROI).
Note: Optimal Workshop pricing is based on a flat rate of $208/month for a single user (for 3-5 users, $191/user/month; for 6-9 users, $174/user/month).
With User Interviews + Lookback, you get:
Plus, we integrate!
To recruit and conduct research with 5 participants on a monthly basis, this stack will save you over $2,100. For teams that do research with an average of 50 participants per month, the monthly cost savings of User Interviews + Lookback vs. our all-in-one example is $14,558 (800% ROI).
Note: Lookback pricing is based on the number of sessions you conduct per year. Their plans start at $25/month for 10 annual sessions—for an average of 5 per month, you’re better off with their $149/month plan (100 annual sessions). For recruiting and research needs that exceed the limits of their published pricing tiers, we estimated a $900/month cost for unlimited sessions.
With User Interviews + UsabilityHub, you get:
UsabilityHub bills itself as the “Swiss Army knife” of UXR tools—it’s a much more affordable “all-in-one” testing platform that lets you bring your own customers.
With an average monthly recruiting and research need of 10 participants, this tool pairing will save you nearly $5,000 a month. In the event that you’re looking to conduct user testing with 100 participants (recruited through User Interviews) on a monthly basis, you’re still looking at a cost savings of over $39,300 (1466% ROI) compared to our all-in-one example alone.
Note: UsabilityHub offers a free plan, but you’re limited to 2-minute tests with a maximum of 15 testers. We based our pricing calculations on their $75/month Basic plan, which comes with a 5-minute duration limit but no participant limitations. For $175/month, you can remove this time limit altogether.
With User Interviews + Sprig (another integration partner), you get:
To recruit and conduct research with 5 participants on a monthly basis, this stack will save you over $2,100. To talk to an average of 10 people per month, you’d pay nearly $5,000 more for an all-in-one than you would with User Interviews + Sprig (the ROI of switching at this scale is 737%). For teams that do research with an average of 100 participants per month, the monthly cost savings of User Interviews + Sprig vs. our all-in-one example works out to around $39,200 (1409% ROI).
Note: Sprig pricing is based on their $175 Starter plan.
Finally, let us consider a scenario in which you’re building your user research tool stack from scratch and are evaluating the costs of a complete, modular testing kit against an all-in-one.
How does our all-in-one example compare against a 5-tool stack of, say, User Interviews + Figma + Zoom + Sprig + Lookback?
With User Interviews + Figma + Zoom + Sprig + Lookback, you get:
At this point it won’t surprise you to learn that the single platform costs more than all 5 tools combined. The ROI peaks at 100 participants per month (1037%, or $38,300). That’s a cost savings of $459,600 per year.
Looked at another way, the annual cost of a single all-in-one platform is 60 times greater than the cost of 5 integrated, best-in-class tools put together.
It’s no great secret that many teams save money on SaaS subscriptions by sharing seats. But a single login can only support so many users. What do the cost savings of layered solutions look like with 3 users? 8 users?
With a single user, conducting 20 moderated sessions a month with User Interviews + Figma + Zoom offers an ROI of 1086% compared to an all-in-one platform. On teams of 3, this stack offers a 953% ROI. For 8 users, that ROI drops to a still-impressive 744%.
Recruiting and conducting research with 50 participants per month using a fully-loaded research stack (UI + Figma + Zoom + Sprig + Lookback) offers an ROI of 716% for a single user, 679% ROI for 3 users, and 605% ROI for an 8-person team.
What if you need to recruit 100 participants a month for a variety of usability tests and moderated studies? Compared to our all-in-one example, User Interviews + UsabilityHub offers an ROI of 1466% for a single user, 1406% for 3 users, and 1260% ROI for a team of 8.
Not too shabby, if we do say so ourselves.
So far, we’ve been talking about cost and ROI in quantitative terms. But that’s not the full story. Maximizing the ROI of UX research tools isn’t just about how much money you’re saving.
In this next section, we’re going to look at some of the additional factors that can influence your decision to invest in a new tool for user research. We’ll discuss the ‘hidden costs’ (and potential ROI) associated with:
User research is a fast-evolving field, and as a cross-analysis of our annual State of User Research report data shows, teams are scaling quickly. The tools you invest in should be able to bend, not break, as your research needs change.
Here are some of the ways we believe integrated, modular tool stacks win against all-in-one platforms:
Let’s dig into these arguments👇
All-in-one tools claim to have everything you need to conduct user research.
But bundling features and solutions into a single package often means you’re getting a rigid and costly solution, when something simpler and more flexible would have served you just as well.
With an all-in-one solution, you’re often stuck with a pre-packaged set of features you might not ever use or need. Or even more frustratingly, an “all-in-one” may be missing a key feature that you do need to scale your research. You don’t have a say in which features you want to pay for when it comes to all-in-ones—and that can cost you.
Modular tool stacks (like the ones we cover above) offer a smarter, more flexible approach than all-in-one platforms because they give you more control over how you invest your resources.
As we’ve mentioned multiple times in this report, it’s hard to get a good read on a “typical” subscription price for many all-in-one platforms. But according to one online review,
“Their plans come with limits on the number of participants per study, if you want more participants per study you have to pay more. Instead they give you this annoying workaround where you can do an unlimited number of studies, which means you have to send them out in batches and then combine the data later, which can get tedious.”
On top of that, you’re paying for both the testing tool and their panel of testers—meaning you’re locked into a fixed annual rate for all the features that come with the platform, regardless of whether you use them or not.
At User Interviews, we want to make it easy for both small and large teams to recruit the high-quality participants they need. Need 5 participants for a one-off study? You can pay as you go. Planning to do research with 300 participants a month? We’ve got a subscription plan for that.
Our pricing tiers scale with your needs, and become even more economical as you grow.
👉 Explore our pricing plans for Recruit or sign up now and try it out for yourself.
Let’s talk about switching costs. Not just the monetary cost of switching from one tool to another, but also the time, stress, and frustration of being “locked in” to an all-in-one tool.
There are many types of lock-in, but Wikipedia’s definition sums them all up nicely: lock-in is something that “makes a customer dependent on a vendor for products and services.”
When you invest in an all-in-one solution, you become increasingly reliant on that one tool. You’re locking yourself into long, fixed financial contracts, a platform that isn’t easily replaced, and loads of business data embedded in a single system.
Trying to migrate from such a platform becomes messy—like a divorce where all a couple’s assets are tied together and nobody can remember whose grandmother gave them that beautiful coffee table.
Just ask this one Reddit user:
“[**] is painfully hard to use with external panels, overpriced, has critical issues with unmoderated prototype testing on mobile, and has nowhere near the feature set of its competitors… [but] I'm forced to use this platform over far superior tools and it turns usability testing (a process I really used to love) into a bit of a nightmare.”
With an integrated tool stack, you’re less beholden to a single solution. It’s easier to walk away when the situation becomes untenable.
Switching from one integration to another or adding a new tool to your research toolbox becomes easier and more cost-efficient with tools that support API-ready technology, open source software, and multiple data sources.
👉 Visit our integrations page for the full list of tools we integrate with. And if you don’t see an integration that meets your UX research needs, you can also build a custom integration with our API.
“You do research because you want to have impact. […] If you don’t have high-quality participants, that’s kind of a non-starter […] If you're not talking to the right people, it's really hard to make the right decisions.”
— JH Forster, SVP of Product at User Interviews
All-in-one solutions offer access to a large number of (typically) consumer participants, often at an attractive per-participant price.
The idea of having your tester panel and testing tool all in one place is certainly appealing. And there’s no denying that the low price per tester offered by UX testing platforms is extremely tempting.
But unless your research practice consists entirely of high-volume, unmoderated testing, at some point you’re still going to need to recruit high-quality participants for targeted, qualitative studies. And that’s where generalist tools simply fall short—targeted recruiting just isn’t their forte.
Take it from this user in r/UXdesign:
“This is a common problem with u[***].com panelists. They aren’t regular users because they have essentially become professional website feedback providers over their time doing tests.”
Or this review of another popular “all-in-one” that also struggles to maintain the quality of their participant panel:
“If your company can avoid them, look for an alternative now. PS - Their participant panel has a really high incidence rate of cheaters too”
So what are the alternatives? You could try:
… or you could sign up for User Interviews and start talking to the participants you actually need. Not only does User Interviews boast a pool of 2.4 million vetted participants, we also offer unmatched participant quality with flexible pricing and unbelievably fast, targeted recruiting (we’re talking hours, not weeks). Best of all, it’s free to get started.
If you made it this far in the report, give yourself a pat on the back.👏
You’re now equipped to make a very solid argument in favor of the best-in-class recruiting and testing tools your team deserves.
Of course, the best way to make a business case for investing in certain tools is to substantiate the business value of user research in the first place. Namely:
📕 Find more statistics on the value of user research in: 32 User Experience Research Statistics to Win Over Stakeholders
Here’s a summary of each recruiting method to refresh your memory👇
And a quick review of all-in-ones vs. integrated tech stacks 👇
We think we’ve done a pretty good job of showing how a stack that includes User Interviews can save you time and money when recruiting participants for moderated research at scale. (And that’s without even talking about how we offer high-volume unmoderated recruiting and the #1 user research CRM and panel management software on the market, Research Hub.)
If you care about participant quality and investing in a flexible tool stack that can grow with your needs, make the switch to User Interviews today. Sign up for a free account and start talking to participants (the right ones) in a matter of hours.
Have any questions or feedback? Drop us a line at rachell [at] userinterviews.com, or katryna [at] userinterviews.com.
Katryna Balboni, Content Director at User Interviews: Content marketer by day, thankless servant to cats Elaine Benes and Mr. Maxwell Sheffield by night. Loves to travel, has a terrible sense of direction. Bakes a mean chocolate tart, makes a mediocre cup of coffee. Thinks most pine trees are just okay. "Eclectic."
Content Marketing Manager
Content writer. Marketing enthusiast. INFJ. Inspired by humans and their stories. She spends ridiculous amounts of time on Duolingo and cooking new recipes.