How to build a research operations practice that can withstand change, pivot quickly, and speed up your user research process.
[2:14] Research operations as connective tissue within an organization.
[8:45] To make your operations scalable, build the smallest operable system first.
[16:34] Slack's Rolling Research Program.
[21:54] How Joey knows if his research ops program is working.
[24:48] Trends in participant recruitment during the pandemic.
[28:08] Research ops as a signal that user research as a field is growing in importance.
[36:08] How to measure success in research ops.
[36:42] Checking your biases in civic research.
Joey Encarnacion is the Senior Research Operations Lead at Slack. He has been working in research operations since 2017. Prior to joining Slack, he was the Program Manager for Experience Research Operations at Airbnb. He’s a black belt cat herder and loves bringing organization to chaos.
Joey: [00:00:00] I think for me it's definitely very people-oriented. Talk to the people that you support. Are they really finding it valuable? Are they not finding it valuable? And then don't just talk to them, but really dig in. Do your research.
Erin: Hello everybody, and welcome back to Awkward Silences. Today we're here with Joey Encarnacion. He is a Senior Research Ops Lead at Slack. Thanks for joining us, Joey.
Joey: Thanks for having me.
Erin: [00:00:51] I got JH here too.
JH: [00:00:54] Yeah, I think Erin and I are both Slack power users, so I know that's not the topic, but I'm sure it...
Erin: [00:00:58] Uh, I am a Slack apologist to the core. I love Slack. I really do.
Joey: [00:01:04] Wow. That's cool. You know, I've used Slack before at previous roles, never quite as extensively. But I've grown to enjoy its capabilities quite a bit more while working here.
Erin: [00:01:19] It's mainly the emojis and everything else.
Joey: [00:01:23] That's the best one, honestly.
Erin: [00:01:24] The best one. They're pretty good. But Joey, we're here to talk about, let's call it agile research ops. Research ops is a pretty hot topic, at least in UX research circles these days. And you know, you can build a system that is fragile, or a system that can withstand change.
And this has obviously been a year of tons of change, seismic change. And today we want to talk about agile user research ops: how do you build operational systems for research that can adjust to life as it actually happens? And how have you done that in your experience, Joey? So, I know one of the things you talk about when you talk about research ops is research ops as connective tissue.
What does that mean?
Joey: [00:02:14] Yeah, connective tissue. That's a term that I borrowed from biology. Connective tissue is one of, I think, four or five types of tissues that you find in animals, and generally you find it in between other types of tissues, hence the term connective tissue. And really what I mean by that is I think that research ops methods should be focused on connecting disparate systems, both within and outside of research.
And what I mean by that is, think about all the tools that you have at your disposal as a research org. Maybe you use User Interviews, maybe you use UserTesting, you use Zoom, you use Slack, you use email.
What are the ways that you, as an operations team, can connect those pieces to make sure that the work that your researchers, or the people that you're supporting, are doing doesn't require any heavy lifting between the systems? How do you essentially erase the seams and create a seamless space, right?
Like if you want to make something seamless, you got to focus on the seams and the seams are where those tools connect together.
JH: [00:03:28] Nice. Is part of that as well, like, I think of connective tissue, again, not a biologist, but holding things together, connecting it, obviously in the name, but also being a little flexible. So if something is drifting away, it kind of keeps it together to some degree. Obviously at some point you can have injuries and things like that, but is that part of research ops as well? Like, we're fighting with this tool a little bit, or we're switching here, and we're just kind of keeping it glued together somehow.
Is that part of the metaphor?
Joey: [00:03:51] Yeah, I guess you could have that as part of the metaphor also. I don't think that I thought of it that way, but now that you bring it up, yeah, I think that's a really good point. You know, a lot of what you do in any operations team, not just research operations, is flex to meet the needs of the people you're supporting.
And in that way, yeah, we are connective tissue in that way as well.
Erin: [00:04:13] I like the flex reference, no pun intended, connective tissue doing all sorts of things. And like connective tissue, there are these inner layers connecting together, like real tissue, our bigger tissues. Right. And it's a great point, because those tissues without connection can't actually function.
We talk about silos in organizations, right? Where this is happening over here, and this is happening over here. And maybe there's some value in that, but if those, I'm going to mix metaphors now, but if they can't talk to each other, right, you're obviously minimizing the impact you can have, and worse.
And when we think about different teams trying to work together to solve user pain points, having those layers that are focused on connecting these existing systems is especially important. Right?
Joey: [00:05:03] Right. And I think that's one of the biggest revelations that I've had in sort of transitioning into working in the COVID era: one of the big places that research ops can provide value is in helping your research team interact better with the different product teams, the different design teams that they work with.
So in a sense, as a research operations team, you're not just supporting research, you're supporting every team that interacts with research as well.
JH: [00:05:33] Part of this, right, is making the connection between two tools or systems. Like, you know, we need Zoom to talk to something else, and we've identified that as being beneficial. But it's also knowing what to connect, right? So figuring out the connection is one piece, but also, does this need to connect to this or not? Like, how do you think about that part, to make the right connections, so to speak?
Joey: [00:05:50] That's a great question. I think for me, most of it comes down to, I take a very design-oriented approach to that. So I look at the actions that people want to achieve, and then I look at the steps that they currently have to take to achieve those actions. I think a good example of this is when I was working at Airbnb.
At Airbnb, we had labs that were fantastic. You would walk into a lab, you would hit a record button on the video mixer, and you would record all of the footage and all of the audio that you wanted. I took a look at that and noticed that what was happening is that researchers would record everything in the room.
Then they would have to go grab those recordings, take those recordings, and upload them to a transcription service, and potentially other services as well. And I looked at that flow and I told myself, wow, that's a lot of steps to just get the thing that you actually want, which is, for example, the end state: the transcript.
So how do we remove those boundaries and just say, okay, after you record, you should just get the transcript, right? Like, if we were to dream of a perfect world, you'd never have to do any sort of file moving or anything like that. It only takes a few clicks, but it's still, you know, jarring to your workday to have to take some time to do that.
So what we did is we worked with a couple of different vendors, a couple of different engineering teams, and we built pipelines where everything that was recorded was automatically uploaded to a different file system, and then from there automatically transcribed, so that researchers only had to hit the record button and they would get the transcripts automatically.
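The pipeline Joey describes, where hitting record eventually yields a transcript with no manual file moving, can be sketched as a chain of handlers in which each stage automatically triggers the next. Everything below is a hypothetical illustration: the function names, the storage URI, and the stubbed vendor calls are assumptions, not Airbnb's actual system.

```python
# Sketch of an automated recording -> storage -> transcription pipeline.
# The vendor calls are stubs; a real pipeline would hit a file store
# and a transcription service's API.

def upload_to_file_store(recording: dict) -> str:
    """Stub: push the raw recording to shared storage and return its URI."""
    return f"s3://research-recordings/{recording['session_id']}.mp4"

def request_transcript(uri: str) -> str:
    """Stub: send the stored file to a transcription vendor."""
    return f"transcript for {uri}"

def on_recording_finished(recording: dict) -> str:
    """Single entry point, fired when the lab's record button stops.

    Chains upload and transcription so the researcher's only action
    is pressing record; the transcript comes back automatically.
    """
    uri = upload_to_file_store(recording)
    return request_transcript(uri)

transcript = on_recording_finished({"session_id": "usability-042"})
print(transcript)
```

The point of the shape is that the researcher-facing surface is one event, not three tools; everything after `on_recording_finished` is the "seam" the ops team owns.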
Erin: [00:07:41] When you think about, you know, the folks you're working with and their workflows, right, and what are the repeated tasks, and how do we make those faster, better, easier, more streamlined, those sorts of things: how do you think about designing those systems? Like, how heavy they need to be, or how much work needs to go into building the system, versus, is that workflow just going to change next month? How do you think about how heavy or light to make those systems, and how that might impact how hard those systems themselves become to change as they need to change, to adjust to, you know, the kinds of research you're trying to do, for example?
Joey: [00:08:26] That's a really good question. So I use design methodology in terms of thinking about what we should be doing, but in terms of how we should be doing it, I actually do take a pretty agile approach to most of these things. So, you know, really what we do is, when we're building out systems, we build the smallest operable system first.
And if over time it becomes something that is a mainstay, something that people want to use constantly, something that needs to be evergreen, then we bolster it up and we reinforce it and say, okay, let's make sure that it's really sturdy and doesn't fail very easily. But when you take that approach, it also means that if it turns out to be something a little bit less evergreen, something that maybe is used to address a problem that is more transient, more temporary, then it's really easy to move on from it and dismantle it and just say, well, it was good while it lasted. Let's move on to a new system instead.
And actually we have started to build out a system like that here at Slack to address some of the needs of our product teams.
Right now we have more product team demand for research than our embedded researchers can supply. And so we're working on our first iteration of a system called rolling research here at Slack, where we intake requests from product teams and try to give them turnaround research within two weeks start to finish, sometimes as fast as 48 hours, usually somewhere in between. And yeah, we started off really small, with just a very particular use case.
For transparency, we are currently in a great partnership with UserTesting. It's just been really good and really scalable so far, I want to say. We started the partnership at a much smaller scale, just to see what the demand was like and how people were going to interact with it.
And now, as we've seen the demand grow quite a bit and people's satisfaction with the program go way, way up, we're working on, okay, how do we make this a little bit more solid? How do we make this something that's a little bit more evergreen? And I think that's a big mistake that a lot of people tend to make, where you try to come up with the perfect solution first and you spend all this time building out something, and then it turns out, oh, it looks like we're going in different directions. And now you've lost all that time.
JH: [00:11:14] For sure. How do you think about the other side of it? So I agree with all of that, right? A very iterative, figure-it-out-as-you-go approach; that's kind of how I'm wired as well, and I do a lot of that at User Interviews. But there is the hidden cost of always changing systems or changing tools.
Like, I've played around with different ways to do product documentation and stuff, and I do think we're getting better over time, but then you're like, hey, where's that doc from 2020 that we did? It's like, oh, that was in a whole different system. Do you run into that at all? Is that tough? Or do you have a change log, or how do you balance that side of it?
Joey: [00:11:41] Yeah, that's a great question. You know, right now we don't really have a change log. That's a great idea, something to think about. But I think knowledge management as a whole is a really difficult thing. And so in some ways we are trying to apply this agile methodology to figuring out how we should approach knowledge management, which is kind of a weird, ouroboros way of us circling back on that.
But essentially what we've found in research, shout out to Mike here at Slack, is that in terms of knowledge management in particular, or I guess building up systems of documentation, or deciding on what systems to use, really what most people do is they just ask other people.
That's the most common thing people will do. They'll say, oh, there was this document, let me just ask someone else. Right? And oh, it was in a separate system, let's figure out where it was. And sometimes it just gets lost, but the essence of that document probably still lives with someone in the company.
What was on that document probably still lives with someone in the company. It's this concept of knowledge just worming its way through an organization and sticking with people, even though there's no real document or source of truth for it anymore. And so one of the things that we are looking at is this idea of, okay, well, what if you just did away with the idea of having things written down in documents?
Right, a crazy, crazy idea. But what if you had a way to manage knowledge without actually having it written down anywhere? Which is kind of a crazy thing to think about, but one of the ways we are thinking about doing that is building out a bot system, where a bot manages that knowledge, and you ask questions to the bot and the bot gives you the answers.
Of course the knowledge still has to be stored somewhere. But it's kind of doing away with this idea of "there is a document, go read it."
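As a rough illustration of that bot idea: knowledge lives in a store rather than in documents people have to locate, and a bot answers questions directly. The sketch below is purely hypothetical; the questions, answers, and naive keyword-overlap matching are illustrative stand-ins, and a real Slack bot would sit behind the Slack API with proper retrieval.

```python
# Toy knowledge bot: answers questions from a store of facts instead of
# directing people to documents. Retrieval is naive keyword overlap.

KNOWLEDGE = {
    "how do i request a rolling research study":
        "Submit the intake form; typical turnaround is 48 hours to two weeks.",
    "where are interview recordings stored":
        "Recordings are uploaded automatically and transcribed for you.",
}

def answer(question: str) -> str:
    """Return the stored answer whose key shares the most words with the question."""
    q_words = set(question.lower().split())
    best_key = max(KNOWLEDGE, key=lambda k: len(q_words & set(k.split())))
    if not q_words & set(best_key.split()):
        return "I don't know -- try asking a teammate."
    return KNOWLEDGE[best_key]

print(answer("Where are the recordings stored?"))
```

The design point survives even with this toy matcher: the interface is a question, not a filename, so renamed or migrated documents stop mattering to the asker.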
Erin: [00:14:30] Yeah, that's really interesting. Slight tangent, but just thinking about, you know, research management systems and all this tagging and information architecture around, how do you organize these insights? So they're evergreen and accessible by people who might want to access them years into the future, as part of a completely different research question from a different team, when someone's already done the research. How do you find it?
And the systems you can think of to do that really rely on people spending a lot of time tagging things in a consistent way, or really having this unified vision of metadata. And I do think there's an interesting world in the future of knowledge management that takes advantage of the full promise of AI and just lets the robots make sense of all of it.
JH: [00:15:17] It's tough too, because the language at an organization changes over time, too, right? Like, you used to use this term for something, and now you use this acronym, and that morphs. Which just happens in general, but it makes it really tough.
Joey: [00:15:29] Yeah, I hate acronyms. Acronyms to me are focused on usability for the creator, not the consumer.
It's like, yeah, I shortened the number of characters I have to type, but it also means that anyone reading this has no idea what it means.
Erin: [00:15:47] Yeah, but I have to spell out what that acronym means every time I type it.
So right. How much
JH: [00:15:51] but the dumbest one I ever came across from a former company was a, there was an acronym P, which stood for something. But then somebody also made a version that was called P three X, like three PS. And it was like, you don't need two acronyms for this thing. It's the same number of letters.
It doesn't even save us
Erin: [00:16:06] So good.
Joey: [00:16:07] What does it mean?
Erin: [00:16:09] Okay, so let's back up a second. You were talking a little bit about how research ops is serving through the rolling research program, right? Where, I'm inferring, it sounds like you have kind of embedded researchers on the one hand, and then a central body of researchers who can help with reactive requests.
How are you structured, and how does research ops fit into that?
Joey: [00:16:33] That's actually a really good way of putting it, reactive. So our research team has a small body of embedded researchers who are embedded in different product teams, and then we have our ops team. It's important to note that the researchers also include survey scientists and data scientists, so not just qual. The ops team is myself and one other person, Nicole Anselmo, she's great. And our rolling research program is a partnership with UserTesting, utilizing their professional services team.
So, you know, we were racking our brains, like, okay, how do we scale up doing research? You keep hearing that product teams want to do more research than we're capable of doing right now. And we thought about a bunch of different options. One could have been a disaster, or it could have been great, we don't really know yet: let's give product managers and designers keys to templatized studies and just drop them in there and say go. We figured we could be comfortable with our stakeholders doing the research. Where we found we were a little bit uncomfortable was more along the lines of synthesis, just because it's easy to fall into traps if you're not a trained researcher: interpreting things in a particular way, maybe not checking your biases, maybe not understanding that the insight you're getting is potentially limited to the population that you're talking with, population biases as well.
And so we played around with that idea. And then, as tends to happen when you're in operations, we landed on: okay, Joey, you will have templates, and you will program things into the templates, and then we will trust your ability to interpret this stuff. You've been around research long enough, you've done research yourself. And so we ran with that for a little while. That was our very small pilot program. And then when we moved into, okay, the demand here is actually very high, how do we scale this up even more? We looked into different professional service partnerships that approached more of a service model rather than an embedded model.
And then we landed on UserTesting. So it's almost like we have a mini research team that is a service model within the embedded research team.
Erin: [00:19:08] Is that primarily usability testing or what sorts of
Joey: [00:19:12] Yeah. Great question. So we try in rolling research to focus more on the evaluative side of research rather than the generative side of research. Mostly to keep turnaround times tight. Mostly because also because we want to make sure that the generative type of things, that we're doing stuff that tends to be more foundation on more nebulous research questions.
Those are a little bit harder sometimes to look at, if you don't have the context that you would have working at Slack. And so we wanted to make sure that we weren't also overstressing our partners at UserTesting and saying like, Hey, can you tell us about how you know, how do people think about channels in Slack?
It's just such a broad question. And so we try to keep it more along the lines of like, okay, Hey when users use the search bar, what happens?
JH: [00:20:05] question, just on a different note, circle back to something a little bit you mentioned always trying to take a design approach to being really iterative and understanding where you can make the path smoother and stuff.
The flip side of that, and then this happens in product work all the time, right. Is people who are kind of your stakeholders. So product managers, designers, whoever, I assume just come to you sometimes with like specific asks, right? Can we just get this feature coming into this? Do you like always unpack those to understand the need?
Or are there times where it's just like, they give you a pretty specific request and you're like, yeah, this fits in. Like, let's just do it.
Joey: [00:20:31] That's a great question. We do always take the time to sit down and unpack. We never jump into a request without first really understanding what the request is. Which is funny, because we love automation here at Slack. I love automation in general, and we do have this very nice flow that folks go through to request support from us. But we always take the time, even a five-to-ten-minute meeting, to sit down and say, hey, this is what I think you wanted. Is this what you wanted?
Just to get that sign-off. And then, you know, from an operational standpoint, it's also kind of a CYA, cover your...
JH: [00:21:10] Ass. can swear on this.
Joey: [00:21:11] Yeah. Okay.
Cool. All right. Awesome. I used to be a kindergarten teacher, so I like really try to watch my language here. But yeah, it's also just a CYA for like, okay. You know, I checked in with you on what your request was and you said it was this, and now you can't tell me that what I delivered wasn't what you wanted.
Erin: [00:21:32] Yeah. So you talked a little bit about rolling research, and I'm curious, it's always good with research ops to make it more tangible, right? In terms of some of the changes you've made, or ways that you've specifically thought about building systems that are able to be changed: how do you know when it's time to change a system? What are some examples?
Joey: [00:21:53] Okay. Oh boy. Yeah. Let's tackle the second question first. How do you know when it's time to change a system? One of the big things that I've learned in my time in working in an operational capacity for a long time not just in research ops is metrics. If we're not measuring what you doing, then you'll never know if it's changing.
And so one of the things that we like to do is measure the number of hours we're spending on projects. We're measuring how many projects we're getting every week. We are taking a look at a longer timescale, like, well, like if we look at the number of projects that we did every week over the entire year, are we seeing trends?
In terms of what quarter is the busiest? Are we seeing a downward trend overall? Are we seeing an upward trend overall? And I think that's really the only way to really find out when you need to change something that isn't bunking you over the head. Sometimes you'll go through a process and you'll see, like, someone will give you the feedback, like, Oh, that was terrible.
Right immediately. And yeah, you don't need long trend lines to examine what happened there. But sometimes you'll build a system and it's seems like it's running smoothly, but it turns out over time engagement is dropping off and you kind of have to figure out why. And that's kind of where you kind of have to put your research hat on and start talking to people, asking people like, Hey, what's going on here?
You submitted five projects last month, but now you haven't submitted any, is that just due to the cadence of your work? Is it because you were finding diminishing returns in using the system like what's going on? I think measuring things is very important.
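The kind of trend check Joey describes, weekly project counts examined over a longer timescale, can be as simple as fitting a least-squares slope to the weekly numbers and flagging when it turns negative. A minimal sketch with made-up counts:

```python
# Fit a straight line to weekly project counts and report the trend.
# A sustained negative slope flags dropping engagement worth investigating.

def trend_slope(counts: list) -> float:
    """Least-squares slope of counts against week index (projects per week, per week)."""
    n = len(counts)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(counts) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, counts))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

weekly_projects = [5, 6, 5, 4, 3, 3, 2, 1]  # hypothetical intake per week
slope = trend_slope(weekly_projects)
if slope < 0:
    print(f"Engagement trending down ({slope:.2f} projects/week): time to go talk to people.")
```

Note the slope only tells you that something changed, not why; the follow-up is the qualitative step Joey describes, asking the teams what happened.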
JH: [00:23:37] Yeah, the time metrics you mentioned make a ton of sense for what you're doing in research ops. How do you actually track time, though? Is it something people have to log manually, or do you have something clever there? Because that seems like a hard problem in and of itself. How do you know that it's actually accurate?
Joey: [00:23:51] Yeah, that's a that's tough. I actually rely on my partners at UserTesting to track that time there for that particular example. In terms of how many hours they've worked on a project. And I think there are some you know, there's some potential for reporting error there. But at the same time, it's better to have some idea than no idea, like your best guess idea versus
JH: [00:24:17] Right.
you're probably in the right, like order of magnitude and trends and stuff. If you're just being mindful about time, week to week you're gonna, you're gonna have a pulse on it. That makes sense.
Erin: [00:24:24] Let's talk about recruiting one of our favorite topics. But I know that's a kind of core area of research ops focus and one for you. And just curious yeah. What your experience with recruiting has been as Slack has evolved and as the methods of research you're doing have evolved from, in person to remote more perhaps during the pandemic or otherwise
Joey: [00:24:48] Yeah. You know, I joined Slack in kind of the middle of the pandemic. And so kind of my view here is skewed from my, my last couple of months at Airbnb moving through my beginning months at Slack. It's been. It's been kind of hard actually, I think you know, as you go as we all, I, right now we're all starting to feel this burnout pandemic, burnout, I would say
Erin: [00:25:13] starting
Joey: [00:25:15] we're just starting to feel it. And I mean, I'm sure some of us have been here the entire time.
JH: [00:25:21] that the term?
Erin: [00:25:22] Yes.
Joey: [00:25:22] Yeah. Languishing. Yes. Yes. We've all been languishing. And so, you know, reaching out to people it's. It's I would say that response rates have gone down just a little bit. But for the most part at Slack it's interesting because we are B2B mostly folks are pretty comfortable with zoom.
Folks are pretty comfortable doing things remotely. And so for the populations that we talk with, it actually hasn't been too bad. I know that at Airbnb, it started to get a little bit more difficult switching everything to remote, because it is B2C, business to consumer, and not everyone that we recruited was super familiar with Zoom.
Although I would say probably most people are super familiar with Zoom now. And so, yeah, it was hard and now it's a lot less hard, but I think that's because everyone in the world has become more attuned to Zoom and remote communication.
JH: [00:26:18] Yeah, the video chat literacy has been kind of a silver lining of this whole thing. It's it is cool that everyone now has some sort of baseline comfort with some of these tools.
Erin: [00:26:27] Yeah, they used to be one of the main complaints we would get from researchers is, you know, participant, they don't know how to assume they don't have it downloaded yet. They can't share their screen. The, you turn on the camera a lot less of that now. Not for the best reason, but you know, people know how to use them.
Awesome. What other changes have you seen happen in your so you've and it's, like you said, in the middle of the pandemic, you joined
Joey: [00:26:51] Yeah. I joined Slack in August of last year. And you know, it's really interesting. We did support a couple of recruiting projects upfront. And now it seems like most of the work that we're doing. Doesn't require super custom recruits. I think anyone listening that is in research ops understands that sometimes there are recruits that are super custom meaning I want to talk to users who have X, Y, and Z attributes.
That we know they have in our database. So you go through the process of getting a query written, whether you read it yourself, or you have a data scientist or survey scientist do it for you. And then you go and say like, okay, I'm going to recruit from this pool of people who we know are, I don't know using an Airbnb example because it's probably a little bit more accessible for folks.
We know these people are hosts in California. Right. Like, we know that there has in California because we see that on our backend. And I've seen that has dropped off and I'm not exactly sure why why as, but I think we're all trying to focus more on general types of research research with people who are anyway, who's using Slack right now, not necessarily specific types of Slack users.
JH: [00:28:04] Nice and on a slightly different note, but I feel like research ops is a signal that research as a whole is maturing, right? Like it's become a big enough thing that people now want to like optimize it and streamline it. Is that kind of your perspective as well? And how do you see that playing out from here?
Joey: [00:28:20] yes, that is I completely a hundred percent share that perspective. And I think that research ops is where design ops was a couple of years ago. And I think research as a whole is kind of in a similar position to design several years ago, where we're just getting to the point now where companies are starting to realize, Oh, it's good to talk to our users in an organized way.
Right. Where, you know, if you looked at companies a couple of years ago yeah. Oh, it's good to actually have thoughtful approaches to design. And so, you know, as research follows the pathway that sort of design went through, I think this in the same way research ops is sort of like following in that path where I want to say like five years ago, there weren't really any research operations jobs.
Maybe jeez flat circle and I'm sort of like mixing up my timescales, but yeah.
You know, some, some number of years ago, there weren't even any research operations jobs. I want to say you know, at Airbnb we had a pretty big research operations team. And when we had this exercise where we were looking at other companies, like, okay, like, what are these other companies doing? couldn't really find anyone that had a research ops team. When we were doing that exercise, we were looking at Facebook. We were like, Oh, okay, well, what is Facebook doing? It's like, yeah. You know, Facebook is massive. Like what is Google doing? What is Microsoft doing? But companies that were at Airbnb size, we weren't finding any research operations folks to talk to.
But I'm seeing that change. I'm seeing not that I'm looking for a job, but I'm seeing job postings and things on LinkedIn all the time now about. Research operations and also research. And I think as research matures research operations is also going to follow that up as well. And it's really cool being in sort of this space where there aren't really any rules.
I feel like, I mean, there's the obvious rules of like, don't leak PII of course, like GDPR followed CCPA. But but in terms of like how you do things. there's no like playbook. You do what works for you and what works for your team.
JH: [00:30:41] Yeah. Do you have any like bets or forecasts of things that might come in the next year or two for D
Joey: [00:30:47] Yeah. You know, I think that what we're going to see happen with research ops is that it's going to be more focused on reconciling how product teams work with how researchers work. And this is a trend that I've seen over time. If you think about the way product team works it tends to have sprints tends to work really agile to use that term again.
And if you think about that you'll have like a sprint that's like two weeks off. What research can you get done in two weeks? Right? If you look at your researchers, tend to be PhDs tend to come from academic backgrounds, tend to enjoy having lots of rigor around the work that they do to have a high level of confidence in sort of the insights they're providing.
And those two types of timelines really clash.
And I think that as research ops matures, it's going to move on from "okay, we're doing participant recruiting," "okay, we're providing technology support," to "okay, how do we really get these two groups to gel and work together?" And I think we're starting to do that here with rolling research, and saying, okay, I'm not a researcher, but I also understand that the product team wants to move fast.
How do we balance the speed with the rigor that the research team also wants to have?
Erin: [00:32:16] I think you're right. And I also think it's one where opposite trends can kind of happen at the same time, right? Where, on the one hand, we talk a lot about being very oriented around speed, and, you know, if you can't find the participants you need to talk to quickly, you'll either not do the research or you'll talk to bad participants.
Either way, it's not a great result. And that's particularly acute for what you're talking about: these two-week cycles, this embedded-in-product, evaluative sort of research. But at the same time, we know a lot of organizations, particularly smaller ones, can get the budget and the buy-in for doing that kind of research, at least in small doses, right?
Where it's like, well, we can at least run a usability test on a prototype, and tools like Figma and Zoom make it so easy to do that. But I think you're also starting to see more organizations say, well, what about the discovery research? What about making sure we're getting ahead of where the market's going and, you know, lifting our head up and doing more of that opportunity-finding research too?
And so I think it just depends on where organizations are a bit, right? In terms of what they're leaving out of their research diet. To use the connective tissue example: where are there gaps in the research that's holding the organization together? And probably different organizations could stand to recalibrate in one direction or the other.
Joey: [00:33:47] Yeah, I agree completely. And you know, I think part of the evolution of research ops, and we kind of chatted about this before, is not just helping the research team, but also connecting the research team to other teams and building resources for product teams.
So that product teams understand when they should be doing generative research. If your product team wants you to do generative research, or if you're in the middle of a sprint, what are your options in terms of research you can do, and how should you get those things done? I think that's something I've seen missing generally from companies I've chatted with and looked at: that level of resourcing for teams other than the research team. "How should we work with you?" is a common question. And so I think that's a big space that research ops is going to move into.
JH: [00:34:47] Yeah, it's interesting. I feel like from the research side, people will always kind of approach it as why, what, how, right? Like, what's the opportunity? Why do we need this? What's the pain point for somebody, what do they need? How can we actually solve it? You hear that framework sometimes.
But if you look at the tool ecosystem at the macro level, we've really gone how, what, why — in the sense of how being how we build stuff, with AWS and Heroku and GitHub, it got really easy to build things in a way that used to be really hard. And then the what: the design tools got great, with Sketch, InVision, Figma, all this other stuff.
And now finally the why, and research, is maturing. It's actually now starting to flip back around and go the way that I think people think it should go, but it just got so easy to build stuff that everyone kind of raced to build stuff. And now it's turning around. At least, I don't know if that makes sense, but that's how I think of it.
Erin: [00:35:32] Definitely a circle.
Joey: [00:35:35] Time is a flat circle.
Erin: [00:35:37] Yeah, I like it. Coming back to that, Joey: zooming out, when you think about research ops, you talked about metrics, right? And the importance of measuring the various systems you've built or managed or changed. How do you know if you're doing a good job in research ops?
Right? In such a nascent field, one that is growing as research grows. Is there a hierarchy of research ops excellence? How do you think about it? What are the success metrics for research ops, and how do those evolve as research ops evolves?
Joey: [00:36:13] I think it's going to be different depending on what team you're on, what you're supporting, and what company you work for. I think everyone is going to have different things that they're aiming for. You know, in some cases it might be, oh, we did X number of recruits. In other cases it might be, oh, out of this number of recruiting requests, we supported this percentage of them.
Or it could be something like, oh, it's been this many days since an incident in the lab. Right? There are a lot of different things you could look at there. But for me, I like to keep it really simple, and I just like to talk with my researchers. I talk to them: okay, what are the problems you're facing?
Are the things that we're putting together working for you? Because at the end of the day, that's really what it comes down to. Even though numbers give you a direction, the impact that you have is on people. And so maybe the numbers are bad, but you talk to people and they're like, oh, I love it. It's great.
And to me, that's success — which is not always good when it comes to performance reviews, especially since performance reviews tend to be very metrics oriented. Okay, well, how come there were only this many recruiting projects? Well, there were only this many submitted, I don't know.
I think for me it's definitely very people oriented. Talk to the people that you support. Are they really finding it valuable? Are they not finding it valuable? And then don't just talk to them, but really dig in. Do your research.
JH: [00:37:45] Yes. This is random, but are you just an optimizer in your life in general? Like, if you walk in your house, do all your lights turn on? Is your kitchen perfectly laid out? Or is this just a work thing for you?
Joey: [00:37:53] This is a good question. I'm not really a huge optimizer. But I think that stems from me not caring so much, at least in my personal life. Weird thing to say, but, you know, I'm a pretty simple person in terms of what it takes to make me happy. Good food, family time. So certainly I optimize for those things, or more so just having time to do those things. But super optimization? Sometimes, yes, if I'm faced with a problem that's really annoying. But for the most part, I don't find most things annoying.
Erin: [00:38:37] That's very in line with what you said earlier, right? You automate the things you don't enjoy so you have time to enjoy the things you do enjoy, like talking to users, for example.
Joey: [00:38:50] Yeah. Yeah. I think something that a lot of people can pick up, both in a professional and a personal sense, is just letting things go. Right? Especially in a professional context, there's this idea that there's a problem we have to fix. But you should always ask yourself that question first.
Like, do we really have to fix it? Is it really a big problem? You don't know until you look into it. I think that goes back to something we were talking about earlier: when someone brings you problems, do you dig in? It's always important to dig into the problems.
Like, is it really a problem? No, it's not really a problem: you ran into it once and we were never able to reproduce it.
JH: [00:39:41] Time is helpful on that too, right? Like sleeping on something for a day or two, and you come back to it and think, ah, maybe this wasn't that big a deal.
Joey: [00:39:46] Yeah,
Erin: [00:39:47] Yeah, totally. Awesome. Joey, thanks so much for joining us.
Joey: [00:39:56] Thanks for having me. Yeah, it was fun.
Carrie Boyd is a UXR content wiz, formerly at User Interviews. She loves writing, traveling, and learning new things. You can typically find her hunched over her computer with a cup of coffee the size of her face.