While operational work is intrinsic to research, Research Operations (ReOps) has evolved from informal practice into a distinct professional discipline—small in scale but growing in strategic importance.
To understand the State of ReOps today—what works, for whom, and what’s next—we partnered with Listen to conduct unmoderated interviews with 21 dedicated ReOps professionals. We explored the current ReOps experience, recent changes, and future outlook.
Professionals described how AI has transformed and expanded their remit—and they’re conflicted about it. While 9 of 21 find the work interesting or exciting right now, 10 of 21 struggle with mounting demands for faster, larger-scale research, often without adequate resources to support it.
12 of 21 specialists report AI has greatly changed their roles—shifting focus from tactical tasks to strategic management and tool governance.
Organizations increasingly distinguish between “customer discovery enablement” (for PwDR) and strategic research (for dedicated researchers).
From unreliable AI-generated data quality to participant fraud, ReOps specialists face increased governance responsibilities.
17 participants reported scope growth, while 12 saw flat or declining resources (n=21). Whether this feels like opportunity or strain may relate to their company’s research buy-in.

Before reviewing the findings below, here's important context from our State of User Research survey about dedicated Research Operations teams: 76% of respondents with a dedicated ReOps presence were at enterprise companies (>1,000 employees), and 41% said their organization had just one ReOps team member.
Keep these organizational differences in mind—they shape the challenges ReOps specialists face. For instance, navigating enterprise bureaucracy and democratizing for 100+ people requires different approaches than supporting a team of 10.
See the Appendix for more details about our sample.
AI has already transformed the research world. Our State of User Research survey found that 80% of research professionals already use it in their research workflow—a 24 percentage point increase from 2024. And our follow-up study indicates that ReOps specialists experience the impact of this widespread adoption: 12 of 21 specialists said that AI (and other technological advancements) have greatly changed their work—automating some of the more tactical aspects of project management, increasing the time spent managing strategic initiatives like longer-term planning, and enabling faster research at scale.
Rather than requiring specialists to manually navigate spreadsheets, schedule interviews, and send out emails, AI-enabled CRM tools automatically facilitate steps like consent form collection and tracking, and put guardrails on participant outreach. In fact, 11 of the 18 surveyed ReOps specialists responsible for recruiting participants reported that tools have changed their approach.
“It freed us up to do some of the other things that had been consistently backburnered because we were doing full-service recruitment,” said one participant about their recruiting tool. “We were actually able to do some things that we definitely weren’t able to do even with more people.”

User Interviews is your one simple tool to source, screen, schedule, and incentivize participants for your research.
Now, what are ReOps specialists doing with all that time freed up from more tactical work? Ten out of 21 participants mentioned that their work has become more strategic—moving from tactical project-based tasks to managing overall research programs. For example, they now work cross-functionally to integrate research more robustly across the business—whether through agentic AI tools and workflows or by building coalitions to gain increased buy-in for integrating research into the organization’s longer-term strategic planning.
Another theme echoed by participants was spending more time on tool management. Eight out of 21 participants in our qualitative study mentioned tool management as a change in their role—whether dealing with vendors, integrating new tools with existing ones to automate workflows, creating onboarding and governance plans for users, controlling access, or advocating for budget.
“My role has become more and more focused on which tools we use and enable. I’ve always worked closely with procurement, tooling, and agencies, but that’s taken up a bigger portion of my time and also my team’s time,” said one participant.

Although most are integrating external research-specific tools into the research workflow, a couple (2/21) are even using general-purpose AI tools to build their own solutions and increase their impact.
“We’re transitioning from building systems that feel more document-based and visual to actually engineering systems that we could only have done before by asking for engineering help through an API—which wasn’t accessible,” said one participant. “Being in Research Ops right now is all about adaptability and figuring out how these agentic AI tools and workflows can work, using AI to assist with those tasks, and then determining how to bring value.”

Not all are convinced their day-to-day work will stay on this management track. A handful (4/21) of specialists who see their work as more strategic mentioned that they see this work as transitional. After AI is more cohesively integrated across the organization and questions about policy and processes are answered, they might return their focus to improving tactical tasks.
“There are times when I wish I could be more deeply involved in the day-to-day work of connecting things than I have been in the past, but I also feel like the state I’m in right now is temporary,” said one participant. “I think as many companies reconcile with the advent of AI and what it means for their organizations, we’ll get past this point.”

Democratization is now mainstream: 71% of the State of User Research 2025 respondents said their organization has people who do research (PwDR). Historically, we've heard from ReOps professionals that democratization brought increased demand and interesting challenges into their roles as they worked to effectively scale research. In our follow-up study, 13 of 21 ReOps specialists indicated that AI tools are increasing the demand for research in their organization or surfacing more research opportunities—whether pursued by dedicated Researchers or PwDR.
“Previously, it was pulling teeth to get a product manager, who is under pressure to deliver quickly, to spend time analyzing feedback, writing a research report, and preparing a presentation,” said one participant. “Having access to AI tools and embedding them in our research process, particularly in the feedback analysis and research insights generation stages, has definitely made a difference. It ensures that the people conducting research are actually able to draw conclusions from their work and share those learnings with the organization.”

As demand for research has grown, so has the understanding that not all research should be democratized. Twelve of 21 ReOps professionals said their company had shifted its perspective on research democratization over the past few years. Generally, organizations now understand that to increase the scale of research at a company, PwDR need to be enabled to handle lighter research—like talking to customers and validating designs—with guardrails in place, while dedicated Researchers’ time is best saved for studies with wide scope and/or complexity. One participant even named the difference as “customer discovery enablement” versus Research, noting that while these activities should have distinct boundaries, they should align under a shared vision.
“I think that Research Ops is actually driving that shift in the change of perspective. A lot of us fully understand what Researchers can do and what they want to do. We’re protecting, for lack of a better way to say it, the Research practice [...]—not trying to replace them,” said one participant.

Eight of 21 participants said that managing support for strategic democratization enablement programs alongside dedicated Researchers’ studies has unlocked a vision of a more strategic route for Research Operations’ future, one focused on enabling the collection of insights and elevating them throughout the organization—not only via dedicated Research initiatives. This includes:
“The feedback collected through formal UX research is just the tip of the iceberg,” one participant said.

AI’s capabilities come with new challenges. AI excels at seamlessly connecting workflows and significantly reducing tedious administrative tasks. But it can't do everything—especially when accuracy and expertise are required. While the vast majority of researchers are using AI in their research workflow, the State of User Research found that 91% are concerned about the accuracy of output.
While AI tools have automated processes like collecting consent forms and scheduling interviews, newer features like automated tagging and analysis transform raw data in ways that raise questions about reliability and trustworthiness.
“While you encourage people to use AI to analyze the feedback data, they also need to review the output of the AI to make sure it resonates with what they heard from those interviews. They need to avoid putting out something that is a result of an AI hallucination,” said one participant.

ReOps professionals in democratized organizations have historically helped PwDR increase rigor through best practices and legal and ethical guardrails (like obtaining consent forms) to reduce risk for the organization. Now they face an additional layer of quality control: preventing AI misuse. Of the 16 ReOps specialists surveyed who work on democratization or self-service, 12 mentioned an increased focus on governance and quality control.
“We don’t want a situation where people just go wild and start falling down on governance,” said one participant.

Beyond scrutinizing outputs, ReOps specialists are also managing data privacy and security considerations for inputs. ReOps professionals—especially those in organizations with strict data policies—must ensure that AI-enabled tools have guardrails to prevent processing personally identifiable data or using it to train models, and that data is stored properly.
“Tooling is my biggest problem right now, and it's not about finding the tool; it's getting my company to approve the tool that is the problem,” said one participant.

Quality control concerns aren’t only driven by researchers’ AI use. Four out of 21 ReOps specialists in our study reported that combating increasing participant fraud is a primary focus in participant recruitment and management. Incidents include participants using AI to circumvent open response screeners and even using AI during live sessions.
“We're also encountering a lot of junk data from AI throughout the process, making it increasingly difficult to find quality participants in many areas,” said one participant.

User Interviews’ advanced fraud detection model is trained on 20 million unique data points to proactively safeguard your study.
To counteract fraud, ReOps teams are increasingly focusing on participant quality. Some tactics include recruiting more niche profiles, employing external vendors for participant recruitment, and hand-screening or double-screening participants. Four out of 21 ReOps participants said that their work is actually more hands-on now than it had been in the past—personally sending emails and text messages as well as calling individual participants to validate their profiles, increase show rates, and maintain quality.
“I’ve kind of gone back to the basics of research operations, focusing on manual recruitment and the old ways of calling and getting to know future participants in a more human and authentic way. It’s been really interesting to return to the bare roots of what research operations was like,” said one participant.

The most common concern among Research Operations specialists? Pressure to do more—often with fewer or the same resources. Seventeen of 21 participants said they were asked to increase their scope—whether that be in operational complexity, vendor and participant management, PwDR scale, AI experimentation, participant and data quality measures, or niche recruitment projects. Additionally, 12 of 21 respondents said that they had fewer or the same resources to meet these expanded demands. Examples of constrained resources included budget, ReOps staff, timelines, panel reliability, oversight, and recruitment channel efficacy.
“There is a push and pull between doing more with fewer resources and the pressure to use AI tooling to automate tasks, even when the quality bar isn't there,” said one participant. “Additionally, the need to integrate tools and onboard them requires a significant amount of time.”

While the pressure is widespread, how it affects a Research Operations specialist’s view of the future may relate to the level of buy-in for research within their company. In our follow-up study, nine out of 21 ReOps specialists brought up their company's buy-in—whether they have it or lack it—as an indicator of how it feels to be in ReOps right now. (For broader context, the State of User Research 2025 found that slightly more than half of respondents say their leadership supports research.)
When leadership recognizes not only the increased demand for research but also the opportunity for more strategic infrastructure to support and bring value to the business, ReOps specialists often see an exciting future—with opportunities to facilitate AI integration, train models on research data, and enable democratization across the company. Eight out of 21 participants mentioned that they see growing recognition of the importance of ReOps.
“I feel encouraged and inspired by the trajectory of the field, as we are seeing more of these roles and more people in the forefront, at least in my small section of the world,” said one participant.

Three of 21 ReOps specialists even stated that they are seeing organizations rehire for ReOps positions after previous layoffs over the past couple of years.
“I can see now that people are realizing, ‘Okay, maybe I regret [laying off ReOps],’” said one participant. “Maybe we should have a Research Operations [team] here to ensure we have reliable data to make decisions.”

However, when leadership doesn’t see this more strategic opportunity for ReOps to grow alongside AI as valuable to the business, the future doesn’t seem so bright. Six out of 21 participants brought up the question of how many roles will be available in ReOps in the near future.
“I think the future involves one Research Operations person handling many tasks with the help of AI,” said one participant who is the only ReOps specialist at their organization.

While ReOps specialists know that the tactical work of facilitating research at companies will most likely always be there, some worry that their improvement of research systems and processes will eventually make many of their jobs redundant.
“We don’t shape the future of the company. If you’re not supporting strategically important projects, then you’re not involved in important decision-making and parts of the product journey,” one participant said. “We are seen as service providers and not important agents in the whole process. Research is not really valued at my company anymore. I think the C-level has always been skeptical anyway—they don’t like science.”

The opportunity areas for ReOps are evident. AI has allowed ReOps specialists to increase efficiency in their own tasks, freeing up time to work on more strategic initiatives. It has also increased demand for research across organizations—beyond dedicated research teams—increasing visibility and opening up greater opportunities for collaboration and driving business impact. However, given AI’s unreliable and nascent nature, governance and quality control work is accumulating across ReOps specialists’ workflows. Overall, this has expanded the scope of work for most ReOps professionals, and many are taking it on without additional resources.
While this pressure is stressful, the outlook of ReOps specialists generally depends on whether their company demonstrates buy-in to the value of research. On one hand, those at companies that see the increased work as an opportunity to bring more value to the business with more strategic infrastructure often feel more excited or optimistic about the future. On the other, those without this buy-in often feel more pessimistic or anxious about their longer-term job prospects.
The future may be uncertain for some, but that doesn’t mean ReOps professionals need to wait and see what happens. One strategy recommended by many professionals we’ve spoken to at User Interviews (and reinforced by participants at companies with strong research buy-in) is taking proactive steps to build effective relationships and track impact.

Research pro Meg Pullis Roebling shared her strategy for getting stakeholder buy-in on an episode of Awkward Silences. While her approach comes from her work at large organizations, the basics can be used at a company of any size.
Listen to stakeholders to understand their resistance. Find your allies: You won't have to convince everyone at once. Identify the one PM or UX designer who gets it and wants your help. Rather than trying to get in the room with stakeholders, bring them into yours. The best way to build trust is to show your work—invite them in and show them how you do your day-to-day job.
Focus on the tactical and show value quickly. While this might not immediately get you that additional Research Ops hire, it will demonstrate what you and your team are capable of—setting you up for longer-term success.
Once you have momentum, you can “push your advantage” to grow your team and resources. This requires tracking your numbers and making them visible to stakeholders.
Tip: Learn how to track the impact of your research, then use our UX ROI & Impact Calculator to turn your impact metrics into a calculated return on investment and see how your team is making a financial impact.
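For a rough sense of the arithmetic involved, here is a minimal sketch of a simple ROI calculation. It assumes a basic (value minus cost) divided by cost formula; the function name and example figures are illustrative and not taken from User Interviews’ calculator.

```python
# Illustrative-only sketch of a simple research-ROI calculation; the formula
# and figures below are assumptions, not the calculator's actual model.

def research_roi(value_generated: float, program_cost: float) -> float:
    """Return ROI as a ratio: (value generated - cost) / cost."""
    if program_cost <= 0:
        raise ValueError("program_cost must be positive")
    return (value_generated - program_cost) / program_cost

# Hypothetical example: a study that helped avoid $150,000 in rework
# and cost $30,000 to run.
roi = research_roi(value_generated=150_000, program_cost=30_000)
print(f"ROI: {roi:.0%}")  # ROI: 400%
```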
If your organization doesn’t yet have a dedicated ReOps team, here are some key stats from our State of User Research 2025 data to start the conversation on the value of adding ReOps:
Research teams enabled with dedicated ReOps specialists also track key qualitative and quantitative indicators at higher rates than those without.
However, it’s important to note that while ReOps may better enable tracking, these teams are mostly at enterprise companies and may have more formal requirements around tracking compared to teams at smaller orgs.
The majority of ReOps teams (63%) are less than five years old. Building a mature team takes time.
In our State of Research Strategy report, we found that, in the case of dedicated Research teams, 69% of those at organizations with 50+ Researchers said their team was older than six years, and 56% of respondents at companies with 100+ Researchers said their teams had been established for more than 11 years.
Relative to their Research counterparts, ReOps is a newer discipline. This means that scale will come with time, but also that there's ample opportunity to help shape what it becomes moving forward.
People who do research (PwDR): Non-Researchers whose titles fall into other categories (Design, PM, Marketing, etc.).
Researchers: With a capital “R,” people whose titles include UX/Product/Design/User Research, or similar terms. Also called “dedicated Researchers” or UXRs.
researchers: With a lowercase “r,” a catchall term for people who do or support research at least some of the time (a.k.a. all our survey participants).
ReOps specialists: People whose titles include Research Operations (Ops) or similar terms.
Research teams: With a capital “R,” a collective term for dedicated Researchers and ReOps professionals.
research teams: When written with a lower-case “r,” everyone involved in research at an organization, whether or not they report to a Research department.
Of respondents who noted they had a dedicated ReOps presence, 76% were employed at enterprise companies (>1,000 employees), 19% were at SMBs (50-999 employees), and 5% were at emerging companies (<50 employees).
Two-fifths (41%) of respondents with a dedicated ReOps function said their organization had one ReOps team member. About a third (31%) had between two and five ReOps employees. 17% had between 6 and 19, and 11% had 20+.
For more information from the sample of State of User Research 2025, check out the report.
The following is the sample make-up of our follow-up study with Listen, which included 21 ReOps professionals.
How much of your time is spent supporting other people who do research (i.e. those who conduct research for your organization but it is not their primary function?) (n=21)
1-24%: 3
25-49%: 3
50-79%: 7
80-100%: 8

How much of your time is spent supporting dedicated Researchers (i.e. those who primarily conduct research for your organization?) (n=21)
0% (I never do this): 4
10-24%: 2
25-49%: 2
50-79%: 4
80-100%: 9

Which industry category best describes your current company? (n=21)
Information Technology (IT) & Cybersecurity: 4
Finance & Insurance: 5
Technology & Software: 4
Other: 8

How many people work at your organization? (n=21)
2-49: 2
500-999: 4
1,000-4,999: 6
5,000-9,999: 4
10,000+: 5

Are there any dedicated researchers at your organization? (n=21)
Yes: 15
No: 6

How many dedicated Researchers are at your company? (n=15)
1-5: 3
6-19: 6
20-99: 6

How long has your company had a dedicated Researcher(s) or Research team? (n=15)
2-5 years: 6
6-10 years: 4
11+ years: 4
I don't know: 1

How many dedicated Research Operations specialists are at your company? If you don’t know, provide your best estimate. Do NOT include the number of dedicated Researchers indicated previously or other people whose primary job involves research. (n=21)
1: 9
2-5: 8
6-9: 2
10-19: 2

How long has your company had a dedicated ReOps specialist or Research Operations team? (n=21)
0-1 years: 4
2-5 years: 16
6-10 years: 1

Are there any other people who do research at your company? (These are people who often conduct research at least once a month, or spend 10% or more of their time on research. Do NOT include the dedicated Researchers or Research Operations specialists indicated previously.) (n=21)
Yes: 16
No: 3
I don't know: 2

How many other people do research at your company? If you don’t know, provide your best estimate. Do NOT include the dedicated Researchers or Research Operations specialists indicated above. (n=16)
2-5: 3
10-19: 3
20-49: 3
50-99: 3
100+: 4

Which department head do you/your team report to? In other words, in which department would your role “sit” in an organizational map of your company? If your team “rolls up” into another department, please select the latter. For example, if Research is its own team but the Research leader reports to the VP of Product, please select “Product.” (n=19)
Product: 7
UX/User Experience: 5
Design: 4
Other: 3

What country do you currently live in? (n=21)
United States: 12
Brazil: 2
Canada: 3
Other: 4

Which of the following most accurately describes your role? (n=21)
Individual contributor (junior to mid-level): 5
Individual contributor (senior to staff/principal): 7
Manager/team lead: 7
Other: 2

How many years of research experience do you have? (n=21)
No research experience: 2
3-5 years: 7
6-10 years: 8
11-15 years: 2
16+ years: 2

Which of the following statements most accurately reflects the primary way you first acquired most of your user research skills and knowledge? (n=21)
I learned on the job: 10
My university degree involved research in UX, HCI, Anthropology, Psychology, Sociology, Human Factors, Statistics, or a similar discipline: 10
I am self taught: 1
This report was written by Liz Steelman, with contributions from Morgan Koufos and Nick Lioudis. The webpage was designed and built in Webflow by Holly Holden with graphic support by Jane Izmailova.
The 2025 State of Research Operations Follow-Up survey was conducted by Liz Steelman in partnership with Listen Labs. From October 28 to November 6, 2025, we recruited ReOps specialists via members of our team and friends within the ReOps community sharing the survey with their own professional networks. Respondents were screened via User Interviews and 21 responses were completed via Listen Labs. Qualified respondents received a small incentive for their participation upon completion. Supporting analysis was then carried out in Google Sheets and Listen Labs.
The 2025 State of User Research survey was conducted by Liz Steelman and Morgan Koufos. Analysis was carried out in Google Sheets with the support of Maria Kamynina and Jessica Hays Fisher.
From July 25 to August 9, 2025, we collected 485 qualified responses via social media, our weekly newsletter (Fresh Views), and an in-product slideout; we posted the survey in research-related groups on LinkedIn and Slack; and members of our team and friends within the UX research community shared the survey with their own professional networks.
We are extremely grateful to our partners and everyone from the UXR community who shared the survey and contributed to the success of this report. Special thanks to Listen, Cha Cha Club and UXArmy!
Above all, we are indebted to the participants who took part in our surveys. Thanks again!
Published: December 3, 2025
Want more insights? Read the full State of User Research 2025 Report.
Most teams are guessing at strategy. Yours doesn’t have to. Download our Strategic Recommendation Checklists to see where to invest, what to fix, and how to level up—tailored to Early-Stage, Growth, or Mature research teams.