The 2025 Research Budget Report

180 research budget stakeholders reveal where dollars are being spent and saved.
Download the dataset

Introduction

These days, budgets are a sore subject. Businesses have been bracing for a possible recession for years, and it seems like research teams are often the ones facing cuts as a result. In our 2024 State of User Research report, participants reported more layoffs and fewer dedicated Researchers and ReOps specialists. And yet, despite these cuts, something surprising also emerged: At most companies, buy-in for research overall remained strong–and has been steadily growing over the past few years. 

“It’s a strange time where we are needed more than ever, and business leaders acknowledge the value of our work, yet there is resistance to invest,” wrote one State of User Research survey participant.

Money matters, but the word “investment” is far too vague. It’s usually based on a gut-feel comparison with other departments, not on solid facts. So how can teams know if they’re underfunded when there’s no clear data or benchmark showing what typical research spending looks like today?

Once we started asking that question, we realized no one has really taken a broad look at research budgets. (A quick Google search shows that the most commonly cited numbers still come from our 2021 State of User Research report!)

Knowledge is power–and current, accurate knowledge is even better. So, we surveyed 180 people who either control or have insight into research budgets at organizations of all sizes and research maturities. We asked about how their budgets are set, how they’re used, how people feel about them, what’s changing now, and what they expect in the future.

Down below, we’ll cover:

  • The state of research budgets, looking at concrete numbers from budgets for last fiscal year, as well as standard company and team profiles for each budget tier.
  • How budgets look this fiscal year, plus how maturity plays a role in their year-over-year change.
  • The average budget allocation across all research budgets, the average tool expenditure, plus estimated total budget allocations for each budget tier. 
  • How satisfied people are with their budgets, how it relates to their role, and how impactful they think their budgets are.
  • How budgets get set and used, plus what research expenses are the hardest to justify.
  • The comparison between real and ideal budget allocation, plus how budgets would be affected if they were doubled or halved overnight.

We’ve aimed for a comprehensive view of research budgets, but we know your needs may be more specific. That’s why we’re sharing the full dataset, so you can explore how budgets vary by industry, company size, budget-setting department, self-assessed maturity level, and more.

How to use this report: 

Use this report as a benchmark to see how your team stacks up against others with similar budgets. It might help you make better decisions on how your research dollars are spent, spot areas for improvement, or even make the case for greater investment.

  1. Practitioners: See how your research budget stacks up against others in terms of both external and internal line items.
  2. Leaders: Understand how industry, company size, department, self-assessed maturity, and other factors affect research budgets.
  3. All audiences: Understand how priorities shift when budgets are tight or flush.

Key Takeaways

Findings

At first glance, research budgets don’t look as bad as the headlines suggest. Yes, some budgets decreased year over year, but most companies kept their budgets steady or even increased them a bit.

Still, a common challenge arose: the biggest budget hurdle for research is often deciding between growing the team and investing in improved efficiency. Headcount takes up nearly one third of the average research budget, and many teams find it hard to cover salaries that run two or three times higher than other research expenses while also paying for the tools and resources needed to do the work. Many participants also said research salaries seem high compared to other research costs.

This sentiment matches what we found in our 2024 UX Research Salary Report: fewer ICs and managers are making over $150,000 than before. This trend is seen elsewhere too. In 2024, the BBC reported that salaries for new tech jobs in the U.S. are stagnating or even dropping. A recent Dice report showed entry-level tech roles are being hired at lower pay year after year. However, that same report found that companies are investing more in retraining skilled professionals who haven’t yet hit top pay. 

With ongoing talks about AI and democratization, many teams are trying to do more research with fewer dedicated hires. They’re focusing on tools, different methods, and training to improve efficiency before adding new research roles. That shift suggests the challenge isn’t just temporary—it’s a long-term strategy. For many, the answer isn’t simply “hire more,” but “invest wisely” in the people or tools that best link research to business results.

So let’s take a closer look at the data to understand how we arrived at that big picture:

Research budget sizes are spread across teams

But those with low maturity are more likely to have smaller budgets.

We asked respondents about their total research budget last fiscal year.1 In this report, we will refer to budgets in four categories: low (<$25K), low-mid ($25K-$100K), upper-mid ($100K-$500K), and high investment ($500K+). For the sample overall, budgets were fairly evenly distributed: the plurality of research budgets (29%) fell in the lowest investment group, one fifth each fell in the low-mid and upper-mid groups, and just under one fifth (17%) were in the highest investment group. That even spread largely held when looking at industry, maturity, and company size.

bar chart titled What was your total research budget last fiscal year?

For the most part, budget size didn’t correlate with maturity. For the highest and middle maturity teams, budgets were fairly evenly distributed across the four budget categories (low, low-mid, upper-mid, and high). The only exception was low maturity teams: 73% of those who rated themselves a 1 or 2 on research maturity fell into the lowest budget bucket—suggesting that less mature teams may still be building their case for funding.

1. Last fiscal year/this fiscal year: Not every budget resets in January. We based our baseline on last fiscal year’s budget so it would be a static number for every participant. When we asked for comparative forecasts about the current fiscal year, we referred to the organization’s budget currently in use.
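If you plan to slice the shared dataset by these tiers, here’s a minimal Python sketch of the bucketing described above. It’s an illustration, not part of the report’s methodology; the function name and the treatment of exact boundary amounts are assumptions, since the survey collected budgets as ranges rather than exact dollar figures.

```python
def budget_tier(total_budget_usd: float) -> str:
    """Bucket a last-fiscal-year research budget into the report's four tiers.

    Thresholds follow the tiers named above; how exact boundary values
    ($25K, $100K, $500K) are treated is an assumption.
    """
    if total_budget_usd < 25_000:
        return "low investment (<$25K)"
    if total_budget_usd < 100_000:
        return "low-mid investment ($25K-$100K)"
    if total_budget_usd < 500_000:
        return "upper-mid investment ($100K-$500K)"
    return "high investment ($500K+)"


# Example: a $180K budget lands in the upper-mid tier.
print(budget_tier(180_000))  # upper-mid investment ($100K-$500K)
```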

The typical profile of a research budget, by budget size

Click through the graphics to get a good sense of the companies, teams, and research strategies behind each budget size.

Research budget profile for low investment (<$10K-$25K)
Research budget profile for low-mid investment ($25K+-$100K)
Research budget profile for upper-mid investment ($100K+-$500K)
Research budget profile for high investment ($500K+)

Budgets are staying the same or growing year over year

Only 17% of participants said that their research budget shrank.

We asked participants how last year’s budget compares to this fiscal year’s budget so far (if it was still being finalized, we asked them to estimate). A plurality of participants (41%) said their budget for this year was about the same as last year’s, indicating stable funding for a large portion of teams. Another 35% of respondents said this year’s budget grew. On the other end of the spectrum, 17% said their budget shrank compared to last year.

bar chart titled How does the research budget for this fiscal year compare to last year’s?

Of the respondents who said their budgets grew significantly, 31% said they had achieved the highest level of maturity, aka “user-driven” status. Of those who said their budgets shrank significantly, none had achieved this status, suggesting a correlation between higher maturity and budget growth.

Headcount, tools, and participant recruitment eat up the majority of research budgets 

Most teams pay for 2-5 tools with their research budget.

On average, 71% of budgets were spent on three things: headcount, tools, and participant recruitment. When looking at the entire sample size, the highest budget cost went to headcount, eating up nearly a third of budgets (32%). The next highest expenditure category was participant recruitment, with a fifth of budgets allocated to recruitment services and participant incentives. And rounding out the top three expenses was tooling/platforms, with teams spending nearly one fifth (19%) of their budgets on this category.

bar chart titled How would you estimate last year’s fiscal budget was allocated across the following categories?

We asked participants to estimate how many of their tools or platforms are paid with research funds. The majority (58%) of participants said that their budget accounts for 2-5 research tools.

bar chart titled How many tools or platforms are paid from the research budget?

Okay but… in actual numbers, please?

We’ve converted these percentages into dollar figures by budget tier, using the average allocation, to help you compare with your own expenditures. (If you’d like to run the same math on your own numbers, there’s a quick sketch after the note below.)

Research budget breakdown - Low Investment (<$10K to $25K)
Research budget breakdown - Low-Mid Investment ($25K+ to $100K)
Research budget breakdown - Upper-Mid Investment ($100K+ to $500K)
Research budget breakdown - High Investment ($500K+ to >$1M)
*A note on salary as a component of budget: In our 2024 UX Salary Report, we found that the median salary for junior to mid-level UX specialists ranged from $31,800 to $110,250 USD, depending on country and location. (For this inaugural report, we did not collect location data.) For lower budget tiers that do not have enough budget allocated to cover a full salary, we assume that headcount/staffing comes from another budget or multiple budgets (e.g., Product or Design), or that the team relies on individual contractors on a part-time or per-project basis. We are planning to investigate this further in future reports!
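If you want to run the same conversion against your own budget, here’s a minimal Python sketch. The headcount (32%), participant recruitment (20%), and tools (19%) shares come from the averages reported above; the catch-all “everything else” bucket and the $60K example budget are illustrative assumptions, not figures from the dataset.

```python
# Minimal sketch: convert average allocation percentages into dollar figures
# for a given total research budget. Shares for the top three categories come
# from the report; "everything else" is the remainder (assumed for illustration).

AVERAGE_ALLOCATION = {
    "internal headcount/staffing": 0.32,
    "participant recruitment/incentives": 0.20,
    "tools/software": 0.19,
    "everything else": 0.29,  # vendors, training, events, other (assumed split)
}

def dollar_breakdown(total_budget: float, allocation: dict[str, float]) -> dict[str, float]:
    """Multiply a total budget by each category's share of spend."""
    return {category: total_budget * share for category, share in allocation.items()}

if __name__ == "__main__":
    # Example: a hypothetical team in the low-mid tier with a $60K budget.
    for category, dollars in dollar_breakdown(60_000, AVERAGE_ALLOCATION).items():
        print(f"{category}: ${dollars:,.0f}")
```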

A slim majority (51%) are satisfied with their budget

But participants were more likely to be very satisfied than very dissatisfied. 

We asked survey participants “How satisfied are you with the size of your research budget?” The majority of respondents (51%) were either satisfied or very satisfied with their budgets, 27% were neutral, and roughly 21% said they were dissatisfied or very dissatisfied.

We also asked participants to estimate their company leadership’s satisfaction levels. Participants estimated that their leadership were more satisfied with the research budgets than they were themselves.

comparative bar chart titled How satisfied are you with the size of your research budget?

If participants were grading the relationship between budget amount and perceived impact on an academic scale, they’d give it a C (70%)

We asked participants to rate the relationship between research budget and research impact, with 0 being no relationship and 100 being a perfect relationship. Participants rated a strong relationship between budget amount and perceived impact, with a median score of 70.

When we broke down responses by budget size, a clear trend emerged: the bigger the budget, the greater the perceived impact of research within the organization. One exception stood out—teams with budgets between $100K and $500K reported a slight drop in perceived impact. This dip may reflect a transition phase: organizations at this funding level are likely hiring their first dedicated Researcher. As teams scale their capabilities, growing pains are likely to emerge.

Line graph titled How would you rate the relationship between budget and research impact?

When Research sets the budget, teams are most satisfied—and believe they’re having the most impact

But the C-Suite still holds the purse strings for most.

We asked participants which team or department sets the budget. The C-Suite set the budget for the largest share of respondents (29%), but Research or Research Operations (20%) and Product or Design (18%) also commonly set the budget.

Pie chart titled Which team or department sets the budget?

Surprisingly, who set the budget had no real effect on budget size. However, it did affect satisfaction. Respondents were most satisfied with their budget when Research or Research Operations set it, with 71% saying they were satisfied or very satisfied and only 17% saying they were dissatisfied. Satisfaction dropped 9 percentage points when the C-Suite set the budget, but dissatisfaction only grew 2 points. Respondents were most likely to be dissatisfied when Product or Design set the budget, with satisfaction dropping to 48% and dissatisfaction rising to 21%.

Given that 39% of our respondents sat on Research or Research Operations teams, this higher satisfaction when those functions set the budget may reflect a sense of ownership and alignment rather than purely objective outcomes. When looking at responses from those who didn’t sit on these teams, sentiment becomes more apathetic: nearly a third (31%) of participants were neither satisfied nor dissatisfied with the budget.

table showing team setting budget x satisfaction and dissatisfaction rates

We also asked which team or department controls the research budget. We found that Research or Research Operations teams were most likely to be in control of the research budget (36%), followed by C-Suite or Executive (23%), then Product or Design (15%).

bar chart titled Which team or department controls the research budget?

When we asked respondents to rate the relationship between budget and research impact, we saw slight variation according to who controlled the budget. The relationship was strongest when Research or Research Operations controlled the budget (73), followed by C-Suite/Executive control (70) and Product/Design (64). When we removed Research/Research Operations team members from the sample, the pattern held: the strongest relationship was still with Research/Research Operations control (74), followed by C-Suite/Executive (68) and Product/Design (66).

table showing team controlling budget x strength of relationship between budget and project + standard deviation

We asked participants how often their research budgets were reviewed and/or adjusted. A little over one third (38%) of participants said budgets are reviewed or adjusted annually. However, a quarter said budgets were set on an ad-hoc basis, with adjustments made as needed or based on project scope.

bar chart titled How often is the research budget reviewed and/or adjusted?

Of those groups, those who set their budgets annually were slightly less likely to grow their budget year over year (-4%) and slightly more likely (+4%) to shrink it than those who set their budgets on an ad hoc basis.

table showing budget spend period x budget grew YoY + budget shrank YoY

Hardest budget justification

In an open-response question, we asked participants what they thought was the hardest expense to justify spending research budget on, and why. The top three hardest expenses to justify were:

1. Tools and software - mentioned in 22.5% of responses

Exemplary quotes:

“It’s so difficult and time-consuming to determine and communicate ROI for UX tools. Often, executives don’t see the big picture until the software has been purchased and is being put to use, but it’s nearly impossible to get to that point without serious trust and deference to a UX team.” 
“Management thinks that you can just do it by yourself. They’ve already paid you to be the expert. Why do you need tools? Why do you need software?” 

2. Exploratory research (aka non-evaluative) - 15.6%

Exemplary quotes: 

“Foundational field research is most difficult to justify spending because of the high cost (recruitment, incentives, local moderation, translations, travel) relative to an in-lab usability study. Additionally, the research typically isn’t tied to a specific business decision or feature launch, so stakeholders don’t see the ‘tangible’ benefit.” 
“The hardest thing to justify spending a research budget on is something that feels too far ahead of its time—the kind of moonshot idea that sounds cool but has no clear path to practical results. It might be fascinating, and maybe even groundbreaking one day, but when you’re sitting in front of a budget committee, it’s tough to explain how that’s going to help anyone within the next 5-10 years. It usually comes down to that tension between visionary research and measurable ROI.”

3. Headcount - 13.1%

Exemplary quotes: 

“We rely on product teams to forecast their needs for research but most are not mature enough in their understanding of the role of UXR to include us throughout the lifecycle. So they scramble at various points to find project funding to pull us in. If research was embedded in the lifecycle we could hire more people because the teams would include the resource in their estimates ahead of time.”
“A lot of UXR has been cut and those who leave are not backfilled. Instead, vendors and contractor workers are hired but they aren’t given proper tooling or background and the research suffers. It’s hard to justify a FTE because they are so expensive with benefits, etc. So we hire more and more contract people and overseas vendors.”

Quotes have been edited for length and clarity.

Budget allocation is incredibly close to the ideal

But balancing headcount growth against other opportunities is often tricky.

The biggest gaps between how research budgets are actually allocated and how participants would ideally allocate them were in headcount, where participants wanted to spend an average of 3% less of their budgets, and training, where they wanted to spend 2% more.

comparative bar chart titled Last fiscal year's budget allocation vs ideal

But if we focus on how frequently participants said they would spend less or more on each category, a more nuanced story emerges. While the average ideal allocation for headcount was lower, slightly more respondents said they wanted to spend more on headcount than said they wanted to spend less. Respondents also wished more of their budget went to training/development (39%), tools/software (36%), and participant recruitment/incentives (35%).

table showing budget item x spend less + spend more

Dream big, spend smart

With double the budget, teams would try to increase research’s impact exponentially.

In another open-ended question, we asked participants to imagine that their research budget doubled overnight. How would they deploy those funds? Participants would increase investment in the following three areas: 

1. Tools/software - mentioned in 26% of answers

Exemplary quotes: 

“The more tools, the faster the research gets concluded.” 
“Better data systems, automation, testing environments—things that make the whole team faster and smarter.”

2. Headcount - 24%

Exemplary quotes: 

“With another researcher, we could reduce the research load on our product managers, who are struggling under the weight of doing their own research alongside their other priorities.” 
“Recruiting specialized talent would enhance our expertise and broaden project capabilities. This investment would ensure that we can scale operations and handle more complex projects without overburdening current staff.” 

3. Participant recruitment/incentives - 12% 

Exemplary quotes: 

“I’d expand participant recruitment efforts—especially for hard-to-reach or underrepresented user groups—and increase incentive offerings to ensure better engagement and richer, more inclusive data.” 
“I would invest in more participant recruitment and incentives to ensure diverse and high-quality data.” 

Zendesk used User Interviews to consolidate 5+ tools into 1, save 2.5+ hours per project on recruitment alone, and improve compliance and transparency in participant outreach.

Explore the full case study.

Half the budget, twice the focus

Faced with budget cuts, teams would try to do more research with fewer resources.

Lastly, in an open-ended question, we asked participants to imagine what they’d prioritize if their research budget was cut in half overnight. Participants would manage by prioritizing: 

1. Leaner types of research/methods (mentioned in 19% of answers) 

“I would prioritize high-impact, tactical research that directly supports key product decisions and business goals. This means focusing on evaluative studies like usability testing, concept validation, and rapid feedback loops–research that helps teams make immediate, informed choices and avoid costly mistakes. These activities typically require fewer resources and deliver fast, actionable insights that keep product development aligned with user needs, even under tighter constraints.” 
“Customer panel and in-depth interactions with profile customers.” 


2. Tools and software (17%)

“I’d use tools and AI to focus on research efficiency.” 
“We would prioritize tools that allowed the most people to complete studies. We would no longer experiment with new or cutting edge tools.”


3. Headcount (14%)

“Personnel is the backbone of research. Preserving key internal headcount would be critical to ensure ongoing projects continue without disruption. Efficiencies could be sought through process improvements, but the expertise of our core team remains non-negotiable.” 
“We can be scrappy if needed, but layoffs go beyond losing capacity – morale and culture shifts dramatically, and it’s difficult to recover.”  

Research teams should plan for the future

So yes, buy-in for research is getting higher each year, and often investments are too. (That said, the data in this report is shaped by responses from those currently engaged in research work, which may influence the overall perspective presented.) But investment often comes in smaller increments that don’t always add up to new full-time headcount, and it’s not clear that they ever will, as tools become more impactful and democratized.

At the same time, teams know that to have research, you must have dedicated researchers. The question many budget holders and controllers are asking today is how many researchers are most effective.

Ultimately, Varun Murugesan, co-founder of Apple and Banana, said it best: being good at scrappy user research is like being the Jason Bourne of research—effective with what you have, even when you’re shot down by a tight research budget or limited in bandwidth. When budgets grow or shrink, the types of research and support change, but research is still happening within organizations and will continue to happen.


Key Terms, Methodology, & Sample

Key Terms:

research/Research teams: In this report, we will use “research” (with a lower-case r) as the catchall term for research that happens in an organization. We will use Research/Research Operations teams (with a capital R) to refer to those with a dedicated Research department.

Participant: The folks from Research, Design, Product, Engineering, etc. teams that responded to our survey.

Budget: The total amount allocated to spend on research, including headcount, tooling, recruitment/incentives, outside vendors, training/development, events, etc.

Research tools: Tools/software/equipment used for research and paid from the research budget. Does not include tools that come from other teams’ budgets.

Setting budget: Determining (or working with finance to determine) the spendable amount for the fiscal year. 

Controlling budget: Approving purchases made against the research budget.

Research budget allocation: How budgets are divided up among the following categories: internal headcount/staffing, tools/software, participant recruitment/incentives, outside vendor support (excluding recruitment), training/professional development, retreats/events, and other (swag, appreciation, gifts, etc.). The allocation must total 100%.

Methodology & Sample

From March 31 to April 8, we collected responses via SurveyMonkey after promoting the survey via our weekly newsletter (Fresh Views), a dedicated email send, social media, and an in-product slideout; we posted the survey in research-related groups on LinkedIn and Slack, and members of our team and friends within the UX research community shared the survey with their own professional networks. To be included in the survey, participants needed to either control the research budget (either alone or with others), or have visibility into the research budget. If a participant did not know an answer, we asked them to skip the question and adjusted the corresponding sample size for analysis. Each response was manually inspected for fraud. Qualified participants received a $5 incentive for their responses.

Our final data set consisted of 180 qualified respondents with the following demographic breakdown:

Visibility into budget: 

We first asked respondents “Which of the following best describes you?” Participants who were not sure or did not have control or visibility into the research budget were disqualified. Of the qualified participants, 52% controlled the budget (either alone or with others), and 48% had visibility without control. 

Of this visibility/no control group, 41% knew how the budget was spent, 27% could roughly estimate spend, 22% could closely estimate spend, and 10% knew the general budget amounts.

pie chart titled Which of the following best describes your visibility into the research budget?

Role level: 

Next, we asked participants to best describe their current role level. About a fourth each of those surveyed were individual contributors such as strategists and specialists (27%) or managers/senior managers (26%). About a fifth each were staff or principal ICs, or C-level, director, or VP roles. And a tenth were either advisors/consultants (4%) or other (6%), including freelancers and cross-disciplinary roles.

chart titled Which of the following best describes your current role level?

Team: 

We asked respondents what team they currently sit on. If they were on a combination team, we asked them to choose what team reflects their work most of the time. 

Nearly two-fifths of respondents (39%) sat on a Research or Research Operations team. More than a quarter (26%) reported into either UX, Product Design, or Product Management. Another quarter reported into either Engineering, Data Science, or Insights (13%), or Sales, Marketing, or Customer Success (13%). Finally, a tenth of respondents sat on other teams (4%) or were the executive/founders of their organizations (6%). 

pie chart titled On which team do you currently sit?

Company size: 

We asked participants to roughly estimate how many employees their company has. We will refer to company size in three groups: small businesses (1-100 employees), medium-sized businesses (101-1,000 employees), and large enterprises (1,000+ employees). Two-fifths of respondents were from small businesses, a little over a third were from medium-sized businesses, and roughly a quarter were from the largest enterprises.

pie chart titled Roughly how many employees does your company have?

Industry: 

We asked participants to best describe their company’s industry, using Indeed’s 19 types of industries. The most represented industry was Computer and Technology, with 31% of respondents working in tech. Many respondents were also from advertising/marketing, education, healthcare, and finance.

bar chart titled Which of the following best describes your company’s industry?

Maturity:

We also had respondents rate, in their own opinion, their current team or department’s level of research maturity. We used NN/g’s maturity model, paring the six UX maturity levels down into three tiers:

  • Low Maturity: (1) Absent/ (2) Limited – UX is either ignored or nonexistent, or it is rare, done haphazardly, and lacking importance.
  • Middle Maturity: (3) Emergent/ (4) Structured – UX work is either functional and promising but done inconsistently and inefficiently, or the organization has semisystematic UX-related methodology that is widespread, but with varying degrees of effectiveness and efficiency. 
  • High Maturity: (5) Integrated/ (6) User-driven — UX work is comprehensive, effective, and pervasive, or the organization is dedicated to UX at all levels, leading to deep insights and exceptional outcomes. 

Roughly two thirds of respondents ranked themselves in middling maturity. A little less than one third ranked themselves as functioning at the highest maturity. Less than a tenth of respondents said they were at the most basic levels of maturity.

pie chart titled Which of the following best describes your current team or department’s level of research maturity?

End note:

The 2025 Research Budget Report was conducted by Liz Steelman and Ben Wiedmaier. Analysis was carried out in Google Sheets and SPSS with the support of Maria Kamynina.

This report was written by Liz Steelman, with contributions from Nick Lioudis. Illustrations and graphs were created by Jane Izmailova. The webpage was designed and built in Webflow by Holly Holden.