
Optimizing Your User Research Tool Stack for ROI with Daniel Loewus-Deitch and Leo Smith

Choosing tools that your team will actually use, that reduce the time you spend on repetitive tasks, and that save your team money.

Carrie Boyd

With so many research tools on the market, it can be hard to nail down exactly which ones are right for your team. This week on the pod, Erin and JH chatted with Daniel Loewus-Deitch and Leo Smith, who are the Directors of User Experience and Research, respectively, at a large insurance company. They wanted to learn more about how Daniel and Leo choose the tools with the best ROI for their team. 

Daniel and Leo have spent a lot of time building out their tool stack. Since they have a lot of experience working for large organizations with many people conducting research and even more consuming it, it was important to them to get it right. In this episode, they talked about how they evaluate the ROI of tools, the summit they assembled to identify the tools their team could and would use, and how important it is to leave your assumptions at the door when tool-hunting. 


[1:44] We check in with how everyone's hanging in there during quarantine

[5:46] How Daniel organized a UX summit to learn more about the tools his team was using.

[10:15] Leo used to spend 20-30 hours just on recruiting.

[12:53] Sometimes the simplest tool is the most effective.

[14:23] It's important to consider how accessible the tools you're using are to everyone on your team. Even if you choose the best tools, your team won't use them if it's not easy to do so.

[16:23] How the team defined their user-centered design process, mapped tools to the right parts of it, and moved forward from there.

[20:43] Why Leo and Daniel's team prefers a customized toolset over an all-in-one solution.

[24:07] Applying the thinking behind design systems to a larger ecosystem is helping the team build a better toolkit.

[33:56] The shiny new tools everyone is eager to try.

[38:58] Usability test the tools your team will be using.

[42:09] Going for an all-in-one tool is like going to the Cheesecake Factory: lots of choices, all pretty mediocre. Choosing a few specialized tools is like going to a farmers market: less choice from each vendor, but better results.


Daniel and Leo shared a few guiding principles for building out your UX research tool stack. Here they are:

1. Gather input from everyone who participates in research

A tool that no one uses is useless. Even if it’s the coolest, most wonderful tool out there, if you can’t get your team to use it, it’s not going to help you improve your research process. 

To avoid prescribing tools no one would use, Daniel and Leo held an entire summit to choose the best research tools for their team. They gathered twenty UXers at their organization and dove into the tools they were using to do their research. Daniel and Leo wanted to learn about the tools people weren’t talking about. What were these UXers using to make connections between tasks? What did they use to communicate findings to other teams? Were there tasks or tools they were using that Daniel and Leo didn’t know about? 

The summit helped Daniel and Leo answer their questions, identify common tools that everyone knew how to use, and build a tool stack that was more accessible to the entire team. 

You may not need an entire summit to identify which tools your team is using and which ones will be the most useful for your organization as a whole. Have a meeting with researchers and designers in your organization, and walk through the research process with them, talking through which tools everyone uses at each step. This can help you understand not only the tools your team can agree on, but also how your team is approaching research projects and which tasks they complete along the way. 

There may be tasks that don’t fit neatly into the research process, but are getting done anyway. That’s ok! They’re an important part of your team’s workflow, and it’s good to know about everything that’s happening along the way. There also may be instances where your team is using tools you already have (and possibly pay for) for creative solutions outside their actual intended purpose. We have a little bit of this in our research process, using Zapier and Slack to send automatic notifications to a Slack channel every time a research session is scheduled. It helps us coordinate research sessions so that everyone has an opportunity to volunteer to take notes, and makes working out sessions with multiple moderators easier. 

To give you an example of how to map your tools to your process, here’s a quick framework of our research process, filled in with the tools we use for our company-wide research at each stage. You can copy and adjust this framework to better reflect your team’s process to make evaluating your own process easier. We’re a fully remote team, and always have been, so if you’re looking for good remote tools, this is what works for us! 


Planning & Setup

Identifying the research question, setting up the study, and gathering information we already have that can help us answer the question.

  • Google Drive: We take meeting notes, write briefs, and organize our thoughts here. We also keep track of our notes and findings from previous research projects in a shared Google Drive folder. 
  • Notion: Some of our team members like to write and organize their initial thoughts in Notion instead of Google Drive. 


Recruiting & Scheduling

Finding participants for research and scheduling them for sessions. 

  • User Interviews: We use our own product to source participants for all of our studies, duh! We filter our own participants by customer segments and by when they last participated or were invited to research, to avoid oversampling the same users again and again. User Interviews also allows us to sync calendars, which is really useful for scheduling sessions when multiple people are attending. We typically have a moderator and a note-taker in each session.
  • Zapier and Slack: We use a Zapier integration that identifies “confirmed session” emails and pushes a Slack notification to our company-wide research channel. For projects where multiple team members are moderators, we reply with a 🙋‍♀️ emoji to volunteer as the moderator for that session, or a 📝 to volunteer as a note-taker. Anyone can volunteer to take notes on any project, and we coordinate details between the moderator and note-taker in threaded comments on the original message. This makes our process a little easier and more hands-off, and we pay for Zapier and Slack for other reasons. Adding this step helps our research go more smoothly without paying for anything else, but it’s not vital. 
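If you're curious what that automation amounts to under the hood, here's a minimal sketch of the same notification step as a standalone script. This is purely illustrative: the webhook URL, message wording, and function names are all hypothetical, and in our real setup Zapier handles this without any code.

```python
import json
from urllib import request

# Hypothetical Slack incoming-webhook URL; in our actual setup,
# Zapier posts to the channel for us, so no code is involved.
WEBHOOK_URL = None  # e.g. "https://hooks.slack.com/services/..."

def session_message(participant: str, project: str, when: str) -> dict:
    """Build the Slack payload announcing a confirmed session."""
    text = (
        f"Session confirmed for *{project}* with {participant} at {when}. "
        "React with :raising_hand_woman: to moderate or :memo: to take notes."
    )
    return {"text": text}

def notify(payload: dict) -> None:
    """POST the payload to the webhook (a no-op while WEBHOOK_URL is unset)."""
    if not WEBHOOK_URL:
        return
    req = request.Request(
        WEBHOOK_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    request.urlopen(req)

msg = session_message("Participant 12", "Onboarding study", "2:00pm ET")
notify(msg)
```

The emoji shortcodes mirror the 🙋‍♀️/📝 convention described above; Slack renders `:memo:`-style codes in webhook messages.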

Conducting Sessions 

Actually conducting the research sessions for your study. What you need for this may change depending on the type of research you’re conducting. 

  • Zoom: We almost always use Zoom to conduct our qualitative research. Our whole team is used to using it for our regular company meetings, so it’s an easy tool to use for research sessions too. We record every session and store it in a shared Google Drive folder. 

Taking Notes & Organizing Findings 

Keeping track of what happens during your research sessions and organizing those findings for later. 

  • Google Drive: We take notes in Google Docs, using this note-taking template. We take a few minutes to organize themes after each session in Google Sheets, using a spreadsheet like this one. When it’s time to present our findings to the team, we scan through the theme spreadsheet and put together our findings in a Google Slides presentation. We keep everything organized in a shared Google Drive folder throughout the process, so it’s all already in one place when we finish our study. 
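One lightweight way to copy and adjust this framework is to write it down as a plain stage-to-tools mapping. This is just a sketch: the stage labels are our own shorthand for the steps above, and you'd swap in your team's stages and tools.

```python
# Our research stages mapped to the tools we lean on at each one.
# Copy this and replace the stages and tools with your team's own.
TOOL_STACK = {
    "Planning & Setup": ["Google Drive", "Notion"],
    "Recruiting & Scheduling": ["User Interviews", "Zapier", "Slack"],
    "Conducting Sessions": ["Zoom"],
    "Notes & Analysis": ["Google Drive"],
}

def tools_for(stage: str) -> list:
    """List the tools used at a given stage (empty if the stage is unknown)."""
    return TOOL_STACK.get(stage, [])

def stages_using(tool: str) -> list:
    """List every stage a tool appears in; overlaps like these are exactly
    what's worth mapping before you evaluate an all-in-one replacement."""
    return [stage for stage, tools in TOOL_STACK.items() if tool in tools]
```

For example, `stages_using("Google Drive")` surfaces that one tool already spans planning and analysis, which is useful context when you're deciding whether a new specialized tool is worth adding.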

2. Choose specialized tools over all-in-ones

When it comes to choosing tools that have the highest ROI, picking a few specialized tools over one generalized tool may not seem like a big cost saver, but often can be. Daniel and Leo stressed how intricate the research process is, and said that for them, generalized tools don’t cut it. They’re mediocre at a lot of things instead of excellent at any one thing. Daniel compared it to going to the Cheesecake Factory vs. a good local farmers market. 

“It's kind of like The Cheesecake Factory where you go in and you get this massive menu, and then you know, you've got your burritos and you've got your stir fry and all these different things. And it's like, Oh wow, this is great. I can get anything I want. 
But they're all kind of, eh, you know, a mediocre experience. Whereas if you go to like, say a farmer's market, you've got all this great diversity and everyone is like an artisan in their own particular domain, and they're doing the best possible spaghetti sauce that you've ever tasted in your life or, you know, whatever it is.
And so those are the ones where you go and find those specialized tools to target gaps that you have in your process. And sometimes you can come out with a better result that's cheaper overall.”

Research is a robust process, full of intricate and overlapping steps, depending on the scope of your work. If there were an all-in-one solution that solved the exact needs of all teams, with flexible packages to get just what you need, our conversation with Daniel and Leo wouldn’t have happened in the first place. In fact, we probably wouldn’t have a company! User Interviews started by focusing on one particularly painful part of the research process: recruiting participants quickly and affordably. Because we’ve focused so much of our effort on making that process as smooth as possible, the median time to find a qualified participant from our marketplace is just 2 hours. A more generalized solution, which has to cover the entire research cycle, may not be as good at recruiting quality participants quickly, even though it provides tools for more parts of that cycle.

3. Focus on the beginning and the end of the research cycle

The two most time-consuming parts of research are finding the right participants and analyzing all the data you collect. Daniel and Leo stressed that, if you can’t use tools to help you along the whole research process, focus on the beginning and the end. These stages are the most tedious, and saving time here can lead to huge savings for your team. 

Leo walked us through how they audited their own recruiting process to demonstrate to stakeholders that spending money on a tool upfront led to massive time savings for their team.

“We crunched the numbers. We literally did, we were looking at how long on average it was taking us to recruit for a study. And it was anywhere between 25 and 40 hours in total across multiple people, across different teams.
And then we looked at how long it took us with a recruiting service. [With] the recruiting service, there's still some time overhead. You still have to write the screener and you know, I still do the double recruiting, I call it, you know, the followup to message with people to make sure they are the right people before approving them.
But generally speaking, that was an 80% time saving. And that's what we presented to stakeholders and it was so clear, you know, in terms of what this tool was going to cost us versus how many hours it was going to save. It was so blindly obvious that it was going to be a massive return on time invested and return on investment.”

According to data from Payscale, the average User Experience Researcher makes about $89,126 a year, or roughly $43 an hour. In Daniel and Leo’s example, that means they were spending anywhere from $1,000 to $1,600 just on the time it takes their researchers to recruit participants for one study. That’s time they could be spending on doing actual research, identifying new problems to solve, or digging deeper into their findings. If you can use a tool to save some of that time and money, it’s a no-brainer. 

At an 80% time saving, we’re talking about $800 to $1,280 recovered on a single project. P.S. A monthly subscription to User Interviews, which includes unlimited participant recruiting, costs just $300 a month 🤑.
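The arithmetic behind those figures is worth making explicit. A quick sketch, using a rounded $40/hour (Payscale's $89,126 salary works out to roughly $43/hour before rounding) and the 25-40 recruiting hours Leo cited:

```python
HOURLY_RATE = 40                 # researcher cost per hour, rounded down from ~$43
HOURS_LOW, HOURS_HIGH = 25, 40   # recruiting hours per study, per Leo's audit
TIME_SAVED = 0.80                # the 80% time saving Leo measured

def recruiting_cost(hours: float, rate: float = HOURLY_RATE) -> float:
    """Dollar cost of researcher time spent recruiting for one study."""
    return hours * rate

def dollars_saved(hours: float, rate: float = HOURLY_RATE,
                  saved: float = TIME_SAVED) -> float:
    """Dollars recovered per study if a tool removes `saved` of that time."""
    return recruiting_cost(hours, rate) * saved

# Cost of manual recruiting per study: $1,000 to $1,600
low, high = recruiting_cost(HOURS_LOW), recruiting_cost(HOURS_HIGH)

# Recovered at an 80% time saving: $800 to $1,280
saved_low, saved_high = dollars_saved(HOURS_LOW), dollars_saved(HOURS_HIGH)
```

Compare those per-study savings against a tool's subscription price over however many studies you run per month, and the ROI case more or less writes itself.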

4. Usability test the tools before you commit

Daniel and Leo also stressed the importance of “usability testing” each tool you’re considering before you commit to paying for it year-round. This can help you avoid paying too much for a tool, ensure your team will use it, and see how it works with your existing research process. 

Integrating a new tool into your research workflow can be a heavy lift, and you want to make sure you’re doing all that work to achieve a better outcome. Leo suggests using the tool you’re considering on at least one research project before committing to it full time.

“Usability testing the tools is key, right? We've got to not only practice what we preach, but it's vital to get a sense of how it's going to fit in with your workflow because you just don't know until you get in there and try it. And you know, I mean, the all in ones can be a great solution, depending on which organization you work for. We actually found that rather than focus on an all in one solution, which may offer some benefits in some places, but less benefits in others.
Look at existing processes. Look where the bottlenecks are. Look at what's taking the most time. Look at the things you like about your existing process. Maybe you've got some really cool ways of doing stuff, using, you know, just regular everyday tools that work just fine. 
You don't really need to change it, but there's some other places where you know there's a bottleneck. It's taking too long. You just wish this stuff will be automated. So maybe focus in on those bottlenecks, those places where it's taking a lot of time. And then look for the specialized or all the specific tools that focus on just that piece, to help automate those particular parts of the process.”

You may find there’s a key part of the new tool you’re testing that doesn’t quite fit with your existing process, or that it doesn’t actually make the problem you were hoping to solve any better. Taking the time to usability test the tools you’re hoping to add can reveal these issues and help you avoid adding a tool that won’t help you. Most UXR tools offer a free trial of some kind, which you can use to do your test project. 

Building your UXR toolkit

Now that you know how to evaluate the ROI of UXR tools, here are a few resources to help you build the right toolkit for your team. 

Our UXR tools map is a comprehensive map of research tools. It groups tools by their use case, and shows where tools have overlapping functions. Daniel and Leo actually created a modified version of it to help them think about how their tools interact with each other and build their toolkit. 

Our evaluation of the best video tools for remote research. We chatted with researchers about their favorite tools, and crowned a winner. 

Our guide to choosing a great user testing tool. We walked through how to choose the right tool for your team, and compared 30+ tools. 

Our remote user interviews toolkit. We covered the best tools for remote user interviews, and talked through pricing, how they interact, and what you can do to get research done on a budget.  

A full rundown of UXR tools, pricing comparisons, and reviews from real live humans. 

Tools mentioned in the episode

Loom is great for recording your screen and sharing it with your team. It can also be used for usability testing. 

Dovetail is a research repository tool that makes it easy to organize and analyze your insights.  

Lookback is a great tool for conducting usability tests and taking notes live. 

Descript is a video, audio, and text editor. We’ve recently switched to it for editing the podcast, and I am absolutely in love. It does really good automatic transcription and makes pulling clips incredibly easy. 

Rev is a transcription tool that provides really accurate, done-by-a-human transcription. 

Zapier is a workflow automation tool that helps us make connections between apps that don’t naturally talk to each other. 

Userbit is a research repository tool that makes it easy to code your analysis into deliverables. 

Dedoose is a research repository tool that makes the academic process of analyzing research more collaborative. 

About our guests

Daniel Loewus-Deitch is the Director of User Experience at a large insurance company. He has over 18 years of experience in UX, and has worked at companies like IBM and Microsoft. Daniel is interested in holistic wellness and technological harmony 💻🎵.

Leo Smith is the Director of User Research at a large insurance company. He has over 19 years of experience in UXR, and has worked for companies of all shapes and sizes in roles ranging from research to design. Leo is also a certified Hatha Yoga Instructor 🧘.


Erin: [00:00:37] Hello everybody, and welcome back to Awkward Silences. We're here today with Daniel Loewus-Deitch and Leo Smith. They are directors of user experience and user research, respectively, at a well-established insurance company that focuses on employment benefits solutions.

Today we're going to talk about building your UX research tool set something we love. Talking about here at user interviews and are very excited to get their perspective on. So thanks so much for joining us.

Leo: [00:01:10] Thank you for having us.

Daniel: [00:01:11] Yeah. Thank you very much. So you know, we're big fans of this show, and also a. We happened to, use user interviews as well and have lots to say, so this will be a great conversation.

Leo: [00:01:24] But we weren't paid to be here by user interviews.

Daniel: [00:01:27] Yes. Just happens to be.

Erin: [00:01:30] Yes, I was going to say, no one, no one is being held, you know, held under duress to talk. And yet, you know, by way of a segue, we are sort of all being held captive in our homes right now, actually, by the coronavirus. So how is everybody hanging in?

Leo: [00:01:47] Wow. Well, Daniel and I were actually just talking that, you know, we, we'd go out for walks in the evening or cycle rides, and it's nice to see so many people, you know, out and about just doing, I don't want to say basic things, but, but you know, it seems that more people are interacting with their environment in the sense of just going for walks, playing with the kids in the garden, you know, stuff like that, which is just, it's kind of really nice to see, rather than just kind of the everyday rat race, I guess.

Daniel: [00:02:13] Yeah, even though we're using more technology than ever, with Zoom and Zoom parties and all that kind of stuff, I find that we're connecting in a much deeper way, you know. So we're having these, you know, get-togethers, virtual get-togethers, where we're really getting to talk with people that we don't usually connect with, that are across the country, across the world, and it's not just texting anymore. It's not just, you know, posting on your Facebook wall or, you know, Instagram or whatever. And so I find this has real positives, and, like, as Leo said, just getting outside and getting fresh air has taken on new meaning and become, you know, something we look forward to every single day.

My girlfriend and I will, you know, do two-hour walks, and that's before we do our workouts at home. So I feel like from a health standpoint, ironically, as long as we're able to avoid this virus, knock on wood, I haven't felt healthier in a long time, and I'm just feeling more kind of focused and productive and in, in a good space.

Leo: [00:03:08] But that all said I never thought I'd actually missed the morning commute. There are, there are times where you kind of just want to be in your car and maybe, you know, maybe you just kind of need to virtually get in your car and drive to work and back. But yeah.

Daniel: [00:03:21] Another amazing thing is, you know, we've just been really surprised at how our company, as large as they've been, was able to turn the key really just overnight and have, you know, thousands of people suddenly working remote, and it's been really smooth, surprisingly. I mean, I, I work remote as, as the norm, but you know, for many people in our company, the majority, this is, this is very new for them.

And so that's been pretty impressive.

Erin: [00:03:48] Yeah, for sure. I think of it too as a sort of, you know, it's like an elimination diet, right? You know, when you're, like, allergic to something but you don't know what, you kind of, like, strip it down to, like, nothing and then add things back in. There's something about, like, no, you can't do that thing where you, like, see people in real life or,

like, touch them, or anything; that's off the table. You're just left with all this other stuff, and you kind of, again to your point, like, if you know you're able to stay healthy and all of that, really, what is it I miss about the stuff I used to be able to do, and what do I like about this new stuff I have to do?

And in a sense, it'll be interesting to see the lives we choose to have when we get back to having more freedom.

Leo: [00:04:29] Yeah, that's all right. That's a great point.

Erin: [00:04:32] JH, how about you? How are you doing?

JH: [00:04:34] I feel like I've got to say I'm doing well now, given everyone else is thriving here. No, I'm, I'm hanging in there. We have a pretty good, you know, flexible situation, my wife and I, so that helps. I think once we get to actual spring or nicer weather in Boston, that will help a lot. We had a week, a week or two ago.

It was just like rainy and cold the whole week. And I felt myself that week going a little crazy. But I think when we get the sun out and I can go outside regularly and stuff like that, I think, I think it'd be really nice.

Erin: [00:04:59] Fantastic. Well, speaking of remote technology and calm technology, let's do a hard pivot into talking about, you know, technology to support research. And of course today that research is 100% remote, at least from, from what we're seeing and supporting. So, you know, take us from the beginning. I know you've both spent a lot of time exploring and using different tools to help with user research.

And there are certainly a lot of them on the market these days. How do you go about kind of figuring out, you know, what, what sort of stuff you need to support your organization and your product  and so on.

Daniel: [00:05:39] Yeah. Do you want me to start with this, Leo? And then, 

Leo: [00:05:42] yeah. You stop.

Daniel: [00:05:43] All right. So, we started doing this UX summit. We, we've got a team of about twenty-some people, and, we get together.

the first time was physically, the second time we actually did virtually, even before this, this coronavirus, situation. But, what we did in those was, was really exciting because instead of just having a group of leadership. you know, a couple people sit down and say, these are the tools we're going to prescribe to the team.

We, we did some activities in the same way that you would do with, like, a design studio, right? With your clients or with your, you know, project teams, where we sat down and we really tried to understand what people are doing in the shadows, what they're doing, you know, kind of, you know, Marine-style guerrilla techniques in the trenches as they're working on their projects.

Uh, just trying to get by without having a formalized system. You know, we're just, you know, we're, we're a young group when it comes to UX maturity, and I shouldn't say young, 'cause we have a, we have a lot of experience, but just in terms of, you know, having not been there very long. And so over this past year, year and a half or whatever, we've really made a lot of progress in trying to understand what people are using, what they're trying to do.

And that bottom-up approach really helped inform, not only giving us an idea of tools that we didn't know about, but also the things that people are trying to do. And I think that was really helpful. We started actually using, I think it was an infographic that you guys had created. It looked like a subway map, and we were pleasantly surprised that a lot of the tools that are on there, which, you know, you guys went pretty far in depth on, we, we had heard about one way or another from the team.

But it really comes down to alignment and methodological focus, you know, in, in how we see ourselves doing research. And I think there's not a one-size-fits-all for that. And it also depends on the special domain needs that you have, you know, for the industry that you're in. You know, for us, working with benefits and claims and things like that, it's very different than, say, like, retail, where you might have very different needs in how you collect research findings and how you kind of track analytics and that sort of thing. And that leads to, you know, Leo's expertise, when he was brought in as the head of not only our lab, but the overall research efforts that we do across our projects.

Leo: [00:07:54] And that's, and that's a great segue into, you know, when I started at the company, Daniel came over to my desk and he basically said, well, what do you think about a tool that automates recruiting and testing? And my immediate response was, I'd be very interested in that for unmoderated testing to augment, you know, the moderated research capacity.

And that was my kind of main focus. I was actually pretty skeptical in the beginning about using online recruiting services, recruiting panels, you know, in terms of getting quality participants, and in terms of, you know, what I've, or what we've, come to call professional participant syndrome, where, you know, they've participated enough that it becomes more like their job, and, and that they're, you know, therefore filling a role when they participate in your research.

And, and, you know, with certain services that I've tried, I've kind of noticed that sometimes I don't really feel like I'm getting, you know, an authentic user as opposed to, you know, a professional participant. But as we kind of went down this journey, it began to become increasingly clear that having a good automated recruiting tool or a good recruiting panel or a good recruiting service was just massive in terms of time savings.

You know, because recruiting, I was lucky for some of my career in UX. I, I worked in the public sector, and, and we had, you know, we were kind of graced with the fact that we had many people who, who were willing to participate in our studies. We were able to self-recruit pretty easily. We'd put out, you know, email blasts with surveys and, and people could sign up and give us a bit of that

demographic data. So we, you know, we could filter that to get the right participants later on. In that particular instance, they weren't even allowed to accept incentives. They just wanted to do it to help out. And it was really pretty straightforward and it didn't take a whole lot of time, but that typically isn't the norm.

And, and, you know, in, in many other settings, it can take, you know, 20 to 30 hours to, to do that recruiting piece. When you, when you think about it, you know, it often involves more than just one group. You know, oftentimes you have to go across marketing, you have to go across the UX team, getting sales, getting a list of customers, and then working with marketing to send out an email blast and crafting all of that.

And then, and then, you know, doing the scheduling. And typically anywhere from 20 to 30 hours is pretty typical just for recruiting. And that's a huge chunk of time. So, so when I actually started to delve more into these automated recruiting tools, the time savings really became apparent: things that were taking, you know, 20 to 30 hours

were taking two hours, and that's just the massive time savings that then allows for the actual stuff you want to do, which is, which is the research.

Daniel: [00:10:39] Yeah. You know, and it really, the recruiting is the long tail of everything. There's kind of two long tails. There's the beginning, where there's the recruitment, the tracking, the scheduling, you know, reminders and all that. And then the other side, obviously, is the analysis and reporting. And those are things, you really have to think about those bookends when it comes to your tool set, because those are the things that are going to keep you from being able to do all this great lean user research that, you know, we've been developing.

I'm really excited, actually, about the program we've been building out, with our ability to rapidly respond in ways that we'd never been able to before with the kind of recruitment that Leo is talking about. And so this has opened up a whole new world for us. We want to be able to present the impression that, you know, research isn't a bad word, and that we can get amazing things back within, you know, a few days, a couple of days, you know, whatever it is, and turn it around and then turn that into, you know, true design recommendations that are informed and prioritized.

And, and this has allowed us to do that by really thinking thoughtfully about our tool set. Another thing that Leo mentioned is that cross-functional aspect. So one of the things that he's organized is a biweekly sort of get-together, you know, where we have marketing, we have CX, customer experience, we have all of our UX folks and anybody else who's doing any sort of research across the company.

And that's really important because we need to understand what kind of things they're doing from a, you know, say a survey or a focus group standpoint, and how that can tie in with the things we're doing. And where can we, what kind of tools are they using, and how can we transfer data across those?

JH: [00:12:15] How do you, when you're like, kind of looking at different tools, how do you maintain the balance where, you know, like there's a cost, right? To switching tools and there's a cost to playing around with new stuff. But obviously finding the right tool represents tremendous efficiency. Like you were just describing.

Like, how do you make sure that you're not letting it kind of get away from you and getting into, like, diminishing returns, where, you know, it's the shiny object and you're just kinda chasing down new tools, but making sure you're actually finding things that are, like, adding value and helping, you know what I mean?

Erin: [00:12:40] Not that anyone would ever do that.

Leo: [00:12:42] that anyone would ever do that.

JH: [00:12:44] Because I get accused of this all the time. So this is a very, 

Leo: [00:12:46] Shiny object syndrome is definitely a, an easy route to go down. And, and that's a great point. I mean, you know, to a certain extent, sometimes the simplest tool can be the most effective. So, you know, for, for a lot of things, especially when it comes to research, you know,

Microsoft Office stuff like PowerPoint and Excel lend themselves very well to research. Most user research is qualitative. There's obviously quantitative stuff as well, but with qualitative, you're looking for patterns. And a great way to tabulate patterns is in, is in Excel. And then, you know, if you're looking to kind of ramp that up a bit, you can look at tools like Airtable.

So, so look around at what tools are currently available and what everyone else is using. Sometimes the best tool is the one that you have and the one that everyone else has, so that then you can more easily collaborate. Another great example there: in a previous position, I worked with a company who, they were using PowerPoint to do

prototyping. At first I thought, well, that's a bit weird. And then I got into PowerPoint and realized just how much functionality is in PowerPoint, to be had, to do really good early-stage prototyping, to throw in front of participants early on, and then use the PowerPoint as a specification communication tool as well.

So it's doing double duty without having to delve into more expensive tools. It really is being aware that the latest isn't necessarily the greatest. Sometimes the latest and shiniest is just a revamp of existing tools, and sometimes the only way to find that out is to give them a try.

Daniel: [00:14:23] Yeah. And I think there's this democratization effect of that too, because when you have people who already have those tools and are familiar with them, it allows more people to get involved, which is always a good thing, for the most part. A good example of what Leo's talking about is these things called rainbow sheets, which allow you to track findings and see how many participants in a given study, or across studies, said a similar sort of thing or had a similar sort of issue, and then categorize those. And this is really just a Google sheet or an Excel sheet, and we evolve it over time; we add different columns and things like that.

And people use different versions depending on what works for them. But that's a really good example of what Leo's talking about that has just become a ubiquitous tool, allowing us to much more rapidly collect our data and then analyze it.
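A rainbow sheet is really just a tally of which participants hit which finding. For readers who like to see the mechanics, here's a minimal sketch of that same tally in code; the participant IDs and observation notes are made up for illustration, not taken from the episode:

```python
from collections import defaultdict

# Hypothetical session notes: (participant, observation) pairs,
# as you might jot them down during a round of moderated sessions.
observations = [
    ("P1", "couldn't find the search bar"),
    ("P2", "couldn't find the search bar"),
    ("P3", "confused by the dashboard labels"),
    ("P4", "couldn't find the search bar"),
    ("P4", "confused by the dashboard labels"),
]

# Rainbow-sheet style tally: for each finding, which participants hit it.
findings = defaultdict(set)
for participant, note in observations:
    findings[note].add(participant)

# Print findings sorted by how many participants shared them, most common first.
for note, who in sorted(findings.items(), key=lambda kv: -len(kv[1])):
    print(f"{len(who)}/4 participants: {note}")
```

The spreadsheet version does the same thing with one column per participant and a row per finding; the value of either form is that the most widespread issues float to the top.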

Erin: [00:15:14] And it sounds like there are two parts to finding a tool, right? One: what is the problem we're trying to solve with the tool? And two: does this tool solve the problem? And then there are lots of sub-questions for each. If I heard you right, it sounds like when you endeavored to find an ideal set of tools for your specific organization, you almost did an internal user research study: what problems are people trying to solve right now, and how are they solving them? Which I love when people do, bringing user research methods internally. And it sounds like recruiting was certainly one of those, which backs up why we have a company, because we know there's a ton of pain around recruiting when it comes to user research.

What were some of the other areas you found, in the dark corners but sort of prevalent everywhere, where there's a lot of need for people to find solutions?

Daniel: [00:16:18] I mean, one of the things we tried to do was really define our user-centered design process, and we even developed a nice pretty diagram that we posted all over the wall. And we tried to not just do a generic one that we pulled off Google Images, right?

Cause there are a million of those: do an iterative process, first you go and do discovery, then you design, and then you test it, and all that. We really wanted to get a little deeper than that, and it really helped for us to map those tools onto the different phases of the user-centered design process.

And the thing we found, as we got into looking at all-in-one solutions versus creating our own customized tool set, was that there's this heavy focus on the validation side and not so much support for discovery, for being able to capture those really important artifacts and things that you might be talking about.

Even if you aren't putting a proposed interface or a production interface in front of somebody yet, and are still doing that recurring requirements gathering. And then there's the analysis and reporting, of course. So that's that other long tail I talked about earlier, where you have the synthesizing of raw notes, tagging and categorizing them, and trying to get them as quickly as possible into a form that, when you're dealing with qualitative data, can be usable and actionable and turned into real design recommendations. Those can be used to help stakeholders see the pain that users are feeling, pulling out the really hard-hitting quotes and clips and things like that, and then bringing those into deliverables such as personas, journey maps, and findings reports, and figuring out the right balance of how pretty to make it versus how lean to keep it so you can keep moving along.

Leo: [00:17:57] You know, coming back to one of your original questions: you may start off thinking you want to go in a certain direction, you have a certain problem to solve, but you may find out that that wasn't actually the problem you had. I'm thinking about myself, where I thought it would be really good to introduce unmoderated into the mix. And I still think we want to do some more of that, but in a sense, with unmoderated, and Dan and I have had discussions about this, to get the insights you still have to go back and look at the recordings, unless it's purely quantitative stuff you're gathering, like whether they completed the task and where they clicked.

You still have to go back and look at the recordings to get the qualitative stuff, and even then, obviously, you don't have the advantage of being able to ask follow-up questions. So while in the beginning I was most interested in that, I ended up being perhaps least interested in it by the end of this process.

So I ended up not being where I started. And the other piece of it, again coming back to looking at tools that are already available: they can do some of this stuff pretty well. An example of that is Microsoft Teams. A lot of companies now are moving into Teams, and for those listening in who are not familiar, although most probably are, it's basically video conferencing with chat, and you can set up channels for different interests, different groups.

And for some of the lean research we've been doing, we're trying to keep it all in one channel. So we create one channel for one round of research, and then we use Teams to do the remote video conferencing and the recording, and the recording comes into the channel. So you have your recordings in the channel, you have your script in the channel, you have any artifacts in the channel, you have your high-level findings in the channel, so that one channel becomes the whole round of research, including the recording. It's kind of cool for keeping it all in one place, and then you move to the next channel for the next round of research, and you can see this longitudinal development of the findings as well. It doesn't offer massive amounts of functionality, but it's really good if you're perhaps looking at a more lean approach and you want to keep everything in one place, you want that collaboration, you don't want to spend tons of money, and you want to use existing tools. So always look at existing tools and see what they can do.

And I think, and Dan will talk more about this, it's not about developing a tool set that you have to stick to rigidly. It's about coming up with different ideas for the tool sets. It's like having that toolbox, and depending on the problem, you pull out a different tool or a different set of tools for that particular project that work well together.

Daniel: [00:20:42] That's a good segue into the all-in-one solution versus the customized tool set, because the all-in-one solution tries to prescribe a lot when it comes to workflow, right? And the ones that do that may be helpful for teams that maybe don't have the same years of experience building a user research program, because it gives them some structure around it, which is really great.

But we found that restrictive on our side, because we really wanted to tailor things more towards moderated, for example, and there's all that focus on unmoderated. We discovered the dirty truth, as Leo was saying: unless you're collecting metrics, with a tool like maybe Loop11 that's able to automatically collect some useful metrics without you even watching those videos,

you still have to sit through all of those videos, and that's going to take just as much time as if you were doing it in real time. And then there's also the way you have to use a particular tool in a given project. We don't try to prescribe a process where every time you do a project, you do it this way. You have to think on your feet. So we're always taking tools that we have, let's say the Miro whiteboarding tool, or Axure, or whatever, and using them in new, innovative ways depending on what the stakeholders need at that time.

It's something where you can't ever get to a point where everything's completely standardized, because you want to be lean. You want to keep evolving the process for the particular situation. So you have this tool set, all these different tools you can use, kind of like a carpenter or a specialist, but depending on the constraints you run into in your environment, in a given project, you're going to have to tweak and get really innovative and creative on the fly.

Erin: [00:23:10] So for the master carpenter, to use the metaphor, having all these tools and knowing how to use them gives you tremendous flexibility. Me, I'm familiar with hammers, and that's about it. When you're in your organization and you work on finding tools to support a number of people, not just one potentially expert carpenter, how do you think about the right balance of having customizable tools that are really good for certain things and not others, and then communicating to others how and when to use them? Does that introduce complexity, or is there a way to make it approachable if you end up having a lot of tools in the arsenal?

Daniel: [00:24:02] So one of the ideas that Leo and I have tossed around: everyone knows about design systems now. Design systems have become the big buzzword, right? We're actually working on building ours out, but as we started to do that, we started thinking: can we think bigger than a design system?

You know, you've got color palettes, and you've got fonts, and you've got various interactive components, and you're trying to describe when to use those, what the best practices are, the do's and don'ts, and then have samples of those with different implemented examples. But user experience, the way I like to describe it to people who don't really know it, is research-informed design, really.

And you can't remove the research from the design or the design from the research and separate them out. That's one reason I'm a real fan of people being both researchers and designers, not necessarily being divided off and only doing research or only doing design, because you have to really understand that translation.

And so within our design system, we're actually talking about calling it a user experience system, or digital experience system, where we would include a research system component. And that might include exactly what you're talking about, Erin: the best practices, when to use this or that in given situations, how to use different methodologies,

and when one would work better than another, maybe unmoderated versus moderated, or a diary study that goes across many days or weeks versus doing something within an hour. And then having things like the templates we talked about earlier, or persona and journey map assets that you can just grab, right?

Knowing where to grab all those resources, those tools, and those assets, having them all centralized in one place, and then tying those in with the design system so that people understand how the research led to the guidelines within the design. We're still experimenting with that and building it out, but we think it's going to have a lot of promise, and hopefully more and more people will be thinking that way.

JH: [00:25:59] I think you touched on it a little bit there: knowing how to use the tools and knowing the fundamentals always gives you so much leverage, you know what I mean? If you think about a professional golfer or something, right,

and gave them a crappy five iron from a garage sale or something, and I had a bag of clubs with all the latest and greatest, they're still gonna beat me, cause they know how to use the tool better. You know what I mean? So at some point the tools are really important and do help,

but there's just this aspect of the fundamental skills, of knowing how to use the tools you have, that is really huge as well.

Leo: [00:26:28] It really is. And to build on that, doing show-and-tells. In these cross-research-team meetings that we have, people do show-and-tells about the project they worked on, what tools they used, what successes they had, that type of stuff.

That can really help build awareness of which tools work for different cases.

Daniel: [00:26:53] Yeah. And there's also the idea of mentorship, of having dedicated mentors for a particular tool.

So, having particular individuals on the team who have raised their hand and said, I'm going to be a mentor for this tool, whom people can come to and ask advice on how to use it the right way, and who keep up templates for it in libraries. And the same thing on the research side.

So if you're somebody who uses User Interviews a lot for recruiting, or, say, Lookback for capturing your sessions, or whatever tools you're using, have people assigned to each tool who can be reached out to at any point by somebody who's just joined the team, or maybe somebody outside the team who wants to leverage some of these tools we brought in, that sort of thing.

I think that can really go a long way toward knowing how to use a tool the right way and not just being overwhelmed by all the features and the shiny-object stuff we talked about earlier.

Leo: [00:27:47] And I think, you know, in my spare time I do photography, and over the years there were a few years where I literally thought going out and buying a $3,000 camera was gonna make me a stellar photographer. And lo and behold, it didn't. And all these people said, it's not the camera, it's the photographer.

And I'm like, yeah, yeah, yeah, no, it's the camera. But it really wasn't. And I think it's the same with any tool. Sometimes you have to look at the basics: what are you trying to do? So Excel is a great example. Say you're using Excel to do grounded theory. Grounded theory comes from the social sciences, where you take the qualitative data, you look for patterns, and then from the patterns you create hypotheses and theories. Some kind of tabulation to find those patterns is really all you need, and Excel, in many cases, provides that.

Erin: [00:28:40] So look, I don't know about you guys and your business, but in ours, what we're hearing from a lot of clients right now is ROI, cost cutting. These are important things right now, right? So we've been thinking a lot about how, in the UX and design space in general, people are thinking about that.

And has that shifted in terms of the value UX can bring to an organization, broadly, but also right now? And when you think about tools, and what tools are going to empower yourselves and your teams to do great research in that context, how do you think about the ROI of a tool? Or do you think of it that way?

Every tool has a price, and it's meant to have some sort of benefit. So do you try to quantify that? Is it more of a qualitative, spiritual matter? How does that factor into your decision making?

Leo: [00:29:42] Well, I'll start off with a great example, and then Dan will come in with some others. Is that good, Dan?

Daniel: [00:29:48] Yeah, absolutely. Go for it.

Leo: [00:29:50] So with the recruiting one, we crunched the numbers. We literally did. We looked at how long on average it was taking us to recruit for a study, and it was anywhere between 25 and 40 hours in total, across multiple people, across different teams.

And then we looked at how long it took us with a recruiting service. With the recruiting service there's still some time overhead. You still have to write the screener, and I still do the double recruiting, as I call it, the follow-up message with people to make sure they are the right people before approving them.

But generally speaking, that was an 80% time saving, and that's what we presented to stakeholders. It was so clear, in terms of what this tool was going to cost us versus how many hours it was going to save, that it was so blindingly obvious it would be a massive return on time invested, and return on investment.
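Leo's back-of-the-envelope math is easy to reproduce. Here's a quick sketch of the calculation; only the 80% saving and the 25-40 hour range come from the conversation, while the hourly cost, study volume, and tool price are placeholder assumptions you'd swap for your own numbers:

```python
# Back-of-the-envelope ROI for a recruiting tool, with illustrative numbers.
hours_diy = 30          # hours to recruit one study by hand (25-40 hour range)
time_saving = 0.80      # the roughly 80% saving Leo describes
hourly_cost = 75        # assumed fully loaded cost of a researcher hour, in dollars
studies_per_year = 20   # assumed annual study volume
tool_cost = 15_000      # assumed annual tool cost, in dollars

hours_saved = hours_diy * time_saving * studies_per_year
dollars_saved = hours_saved * hourly_cost
roi = (dollars_saved - tool_cost) / tool_cost

print(f"{hours_saved:.0f} hours saved, ${dollars_saved:,.0f} "
      f"against ${tool_cost:,} tool cost -> ROI {roi:.0%}")
```

With these placeholder inputs the tool pays for itself more than twice over; the point is that the pitch to stakeholders reduces to three numbers: hours saved, what those hours cost, and what the tool costs.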

so that's the concrete one. Dan, do you want to jump in more about the other ones?

Daniel: [00:30:48] Of course. Yeah. I think that's a great example because it ties into those long-tail bookends of doing the recruitment and then doing the analysis and reporting. If you target those, you're really going to get the most ROI, because those are the places where you're spending the most time and not able to do as much research as you would like, because it's just taking longer.

And then of course you have pushback. So if you can show the time savings across the entire study, from end to end, that's really important. There are also lots of costs when it comes to recruitment, whether you look at the number of hours people are spending on it or you're having to outsource it to a recruitment agency. It's really nice to be able to go onto a project team and say, hey, the only thing that's going to cost you is the incentives for the participants. Let's say we spend $60 for an hour of their time, or whatever; that's way better than the hundreds or thousands of dollars you're going to be spending each time you do a recruitment with an outside agency, if you don't have a special tool and process set up for that. But there's a whole thing, especially in a large company, of trying to get these tools approved, particularly when you're dealing with research and talking to people intimately about their pain points and their experiences. And us being an insurance company, that is even more important when it comes to PII and PHI and everything. So we have to be really thoughtful about how protected and how secure the tools are when it comes to capturing that, and what we do with those recordings afterwards.

And just that whole process; that's a big part of getting things approved. The other thing you have to consider is how you're going to roll these things out. How many licenses are you going to get initially? How many people are actually going to be doing the research? So you're not just buying it for the whole team and hoping they use it. Do a staged approach, figure out if it catches fire, and add on as it becomes more popular and people realize the value of it.

And consider enterprise plans: is it worth it to have single sign-on, is it worth it to have it set up on-prem versus in the cloud, and all that kind of stuff. I think people are starting to recognize the convenience of doing things in the cloud a lot more and starting to trust it more.

But these are all conversations you have to have for each tool, really looking not only at how they've set things up technically, but also at what their policies are in terms of data retention and everything. So those are all important things, maybe kind of tangential to ROI, but they can be barriers to getting tools in place if you're not careful and you're really cavalier and just say, hey, we want this, this, and this, and hope for the best, cause you might just get shut down right away.

JH: [00:33:37] I feel like we've been pretty good here in terms of talking about what's the underlying problem you're solving, what you already have that you can use, et cetera, et cetera. But what are the shiny objects? If you had some free time, what are some tools or apps you'd want to check out and play with?

I feel like it's fun to actually reference them by name. Is there anything you're eager to give a whirl?

Daniel: [00:33:56] Absolutely. So first of all, when it comes to capturing, we have a lot of interest in Lookback, because of its collaborative, real-time note-taking. That's really exciting when you have a lot of observers and you're trying to figure out how to get everybody involved and collect data from the different perspectives they might have as they're watching somebody go through something or listening to the responses. Being able to get clips very easily that you can share instantly, create highlight reels, things like that.

That's not something we do a ton of, and we'd like to do more of it if it were easier, when it comes to using video in that way. Another one that's kind of a new kid on the block is UserBit, and that's one I just stumbled upon. They're really trying to target the analysis and reporting side in a really innovative way.

They're making it much easier to code all of your qualitative data and then get it directly into deliverables in a much faster process, cause those findings reports can take a long time to create, those personas, those journey maps, anything that involves communication. And they're also trying to target the longitudinal tracking of data across various studies.

So you don't just do a study, toss it in the drawer, and then forget about it. How do you look across studies and build a knowledge base over time that you can keep going back to, making sure that the peripheral information you're learning, things you maybe weren't trying to capture in that study but that might be useful down the road, is somewhere accessible? And I think one of the things that's really important as you look at some of these tools: they have a lot of exciting features up front, but what's really important for them to be usable, and it's funny because these are tools for UX, yet sometimes they're not the most usable themselves, is really just the basic interaction of them.

Going back to that coding of data, for example, I have kind of an interesting story. There are these academic tools out there that a lot of universities use for analyzing qualitative data; they've been around forever. And there's a company called Dedoose that came out and started to innovate on that by making it very collaborative and making it web-based, that sort of thing.

And it was really cool, because you could highlight segments and tag them with various categories, like, this is a pain point, or this is related to dashboard design, or whatever. You could highlight them with all sorts of things that would allow you to abstract your data later.

The challenge with it, though, was that there was so much mouse use required. And when you're going through reams and reams of raw notes and transcripts, it becomes really difficult when you're having to use the mouse the whole time and do all these extra clicks.

And I loved the tool so much that I decided to just send them a different design. I called it a quick code widget. I didn't know them at all; I just sent them an email and said, hey guys, really love this, but it's killing me because there are no keyboard shortcuts.

There's no easy way to do this rapidly, and my hand is about to fall off because I've been coding all this data and it's really inefficient. And I said, just take a look at this, tell me what you think. And it was really awesome, because I ended up talking to the CEO, and he jumped on it and diverted his dev team to building out this quick code widget.

And it got the best response. Everybody was like, wow, this is great, this is so much better. It was one of those things you wouldn't think is really flashy, but you have to focus on that basic usability. I think another tool that does that really well is Airtable.

They make it very rapid to do that same sort of coding and data entry, by having all sorts of type-aheads and not making you use the mouse as much to do that sort of thing.

Erin: [00:37:41] Awesome. What else do you think about? So we've talked about how you figure out what tools are going to solve real problems, point solutions versus all-in-ones, where you're going to get your best ROI, training teams how to use tools, some tool crushes, new kids on the block.

But how do you make it all fit together? What are other things you need to think about in terms of integrations? Common mistakes folks might make? You mentioned usability of the tool. Some tools have free trials, others don't, right? And that tends to be kind of price-point dependent.

What's the sales process going to look like? What security concerns do we have? You talked about that a little bit. But when you think about the ROI on your own time too, if you're procuring a tool: how much time is it going to take me to figure out if we want to use this tool or not, and how deep do I need to get into trying it out to really understand if it's going to be a good solution for us?

So, balancing all those intangibles and other things to think about: once you've identified your methodology for doing research and where your real needs are, how do you start really building out that stack?

Leo: [00:38:55] So I think usability testing the tools is key, right? We've got to practice what we preach, but it's also vital to get a sense of how a tool is going to fit in with your workflow, because you just don't know until you get in there and try it. And, I mean, the all-in-ones can be a great solution, depending on which organization you work for.

We actually found that rather than focus on an all-in-one solution, which may offer some benefits in some places but fewer in others,

look at existing processes. Look where the bottlenecks are. Look at what's taking the most time. Look at the things you like about your existing process. Maybe you've got some really cool ways of doing stuff using regular everyday tools that work just fine; you don't really need to change them. But there are other places where there's a bottleneck.

It's taking too long, you just wish this stuff were automated. So focus in on those bottlenecks, those places where it's taking a lot of time, and then look for the specialized tools that focus on just that piece, to help automate those particular parts of the process.

Daniel: [00:40:05] And yeah, just adding to that: the trials are really important, because without those, you can't really suss out the little challenges that could become much bigger than they seem. For example, one of the things we found in one of the tools we were testing, in one of our real-world experiments, was not being able to adjust screeners or adjust our study after we had launched it. Or having really rigid things on a screener, where every question has to have an accept and a reject, but you can't have just sort of FYI ones, so you can't do the follow-up Leo was talking about with that double screening. It seems like a small feature, but it makes a huge impact on the quality of the people you get in your studies. Those are the kinds of things you don't figure out if you're just doing things hypothetically, maybe looking at a tutorial or a demo and playing with it for a few minutes.

You really need to have it over the course of a whole study. So any company that's able to provide a longer-term trial... and we really had to fight for that, because these companies were used to maybe giving you seven days, or 14 days, and then they wanted you to either buy it or not.

But we said, no, we really need to be able to sit with this and try it out in a legitimate sort of way before we can make a decision that makes sense for our team.

Erin: [00:41:33] Yep. Preaching to the choir. I like to try before I buy myself. 

Leo: [00:41:38] And I think...

Erin: [00:41:39] It's hard. It's hard with some tools. Yeah.

Leo: [00:41:42] But that's the crux, though, Erin: you've got to usability test tools, right?

Erin: [00:41:51] Yeah. And when you think about buying, you're buying solutions, not features, right? You're buying use cases, not a list of stuff on a pricing page grid. And it's very hard to know how well a solution executes against your real needs without trying it out.

Daniel: [00:42:09] And the problem with that is, that's the way things are sold. The more features I can put on the homepage of my product site, the more I can point to as a salesperson, right? Look how many more features we have than the other guy.

But really, and I hope I don't offend any Cheesecake Factory lovers, it's kind of like the Cheesecake Factory, where you go in and get this massive menu. You've got your burritos, you've got your stir fry, all these different things.

And it's like, oh wow, this is great, I can get anything I want. But they're all kind of, eh, mediocre experiences. Whereas if you go to, say, a farmer's market, you've got all this great diversity, and everyone is an artisan in their own particular domain, making the best spaghetti sauce you've ever tasted in your life, or whatever it is.

And so those are the places where you go to find the specialized tools that target gaps in your process, like Leo was talking about earlier. Sometimes you can come out with a better result that's also cheaper overall, especially if you don't need some of those extra features.

Leo: [00:43:17] So I guess, approach the whole thing as a user research project. Where are your bottlenecks? Where are the gaps? It sounds so commonsensical, but it's so often missed, right? Practice what we preach in our own processes.

JH: [00:43:36] Yeah, totally.

Erin: [00:43:39] Parting thoughts? What else? For anyone listening who's made it this far and is ready to get out there and refresh their stack, what else should they know?

JH: [00:43:53] I've got a random plug. There's a YouTube channel called Essential Craftsman. It's this older carpenter who goes into insane detail about his process and his tools, but it's almost philosophical. So if you're somebody who likes tools and likes to think about that stuff, give one of those videos a whirl. They're pretty interesting.

Daniel: [00:44:13] That's fantastic. Yeah. I find that digging into other design domains, anything where you're producing something, and learning about their process, how they think, and what they use their tools for, really helps open up your mind a little bit.

One example: I once hired an architect who decided she was going to move into the UX realm and switch gears in her career. It was amazing what she brought to the table, in the way she thought about physical spaces, the tools she used, and the way she specced things out,

and then how she brought that perspective into the way we did things on our team when it came to virtual spaces. So think big. Don't get caught up in buzzwords. Don't get caught up in "everyone's doing this thing now, so we must find the same solution."

Make sure it's tied to what you're trying to do and how you're trying to do it.

Leo: [00:45:09] And yeah, look at what you've currently got. Look around at the tools that are already prolific in your organization. What can they be useful for? And the collaboration piece is key. Like with everything in life, there are trade-offs.

Sometimes you trade off a little extra functionality to go with a tool that's prolific in the organization, because it's going to be much more collaborative.

Erin: [00:45:36] The last thing I would say is, don't confuse an internal process or way-of-working need with a tool need. There's so much that can be solved by thinking critically about how you want to work and how you can work efficiently, and then letting the tools help you do that.

A lot of problems can be fixed by just rethinking the work, how it's done, and how it's prioritized, starting there, and then figuring out what tools can help.

Leo: [00:46:06] I think that's probably one of the key things, Erin: literally map it out. Do a workflow analysis of your current research workflow. We're all moving toward a more lean, agile approach, so look at the places where you can remove load from your existing processes without any tools, and then, with what's left over, look for the tools that can make that quicker and easier.

Daniel: [00:46:34] Yeah. 

And I'm curious about you guys, Erin and JH. At your company, what have you found from trying to create a tool set for the things you're doing? Any particular takeaways on your side?

Erin: [00:46:48] Yeah, well, we dogfood, as they say, of course, so we use User Interviews quite a bit. Beyond that, I'm a big fan of what you were saying about the free, everywhere tools. We use Slack, we use Zoom, we use Google Calendar. We use Zapier to integrate all sorts of things with all sorts of other things, in really scrappy, hacky ways.

And then we try some of the shiny objects that are free or cheap to try out, just to see. So we've played with Descript, which we use for our podcast and which is also good for transcription and time coding for user research. We've played with Dovetail, with Rev, with Lookback. We're always experimenting with things that aren't going to force the entire team to work in a different way, but could make what we're already doing work a little bit better, and then we just iterate from there. So, JH, what have I left out? What are some of your favorites?

JH: [00:47:48] I don't know. What I've been thinking about a lot lately is something one of you mentioned earlier: being more thoughtful about how I go about rolling out or exploring new tools. I think I've been guilty in the past of bursting in Kool-Aid Man style and being like, check this out, we should all use this now.

Now I'm trying to be a little more mindful: playing with it more myself first, maybe grabbing a smaller group of people to kick the tires on it, and deciding whether it's actually helpful. So process-wise, I've been trying to be a little more thoughtful. I like to tinker around with stuff in my free time, but I don't need to disrupt the whole team if something's actually not that useful.

You know what I mean? So I'm trying to be better at that.

Daniel: [00:48:26] Yeah, I've made that mistake myself. You've got to get that buy-in, right? That user adoption. You can't just force it; it's got to be organic.

JH: [00:48:35] Yeah. There's a note-taking tool called Roam that I keep seeing mentioned, so I want to play with that. But that's the only one on my radar at the moment.

Erin: [00:48:43] Okay. Well, gentlemen, this has been great, and I think folks will get a lot out of this. We sure love talking about tools, so this was fun to do. I'm going to stop recording.

Leo: [00:49:00] Can we cut? We can't finish with an awkward silence.

Carrie Boyd

Content Creator

Carrie Boyd is a Content Creator at User Interviews. She loves writing, traveling, and learning new things. You can typically find her hunched over her computer with a cup of coffee the size of her face.
