How to Interview Customers Continuously with Teresa Torres of Product Talk

Talking to more customers, more often can be the key to product success.

Teresa Torres is a master of continuous interviewing. As a product discovery coach and founder of Product Talk, she works with teams of all shapes and sizes to help them build better stuff. Part of that is talking to customers all the time, and establishing a cadence that keeps customer needs top of mind. In this episode, she talks to Erin and JH about what it takes to establish a continuous interview practice, shares some tips for doing better interviews, and encourages everyone to get out there and start talking to customers.

💫 Continuous user feedback surveys can help inform future UX research studies, improve product designs, and speed up iteration cycles. Explore best practices, tools, and examples for effective surveys in the UX Research Field Guide.

Listen to the episode

Click the embedded player below to listen to the audio recording. Go to our podcast website for full episode details.


[3:31] What's continuous interviewing all about anyway?

[6:39] How Teresa became a continuous interviewing champion

[10:01] Continuous interviewing may not be for all orgs and all contexts, but it can work really well at the team level

[11:52] Focus on the frequency of your interviews, not the number of interviews.

[14:03] Continuous interviewing is a constant reminder that customers know their world best.

[14:56] Automate your recruiting process first

[15:54] Offer an incentive that's actually valuable to your customers

[17:03] Make customer interviews a part of your weekly schedule, just like any internal meetings you might have.

[22:21] Throw away the discussion guide

[23:12] Teresa shows us how to do an active listening interview

[28:12] How screener surveys can help you find the right people to talk to

[31:40] Don't stress if every interview doesn't go to plan

[35:55] Map everything on an opportunity solution tree

[37:07] Make your synthesis visual

[40:38] Check out Teresa's course on continuous interviewing

[42:00] It's all about the magic lightbulb moments



Erin: Hi, everyone. And welcome back to Awkward Silences. Today we have a special guest, Teresa Torres. She's a product discovery coach and the founder of Product Talk. And today we're going to talk about continuous interviewing. So we know that most of you out there want to be talking to your customers more, and arguably, no one knows more about how to actually do that than Teresa. So thanks so much for joining us.

Teresa: Thanks for having me.

Erin: J.H. is here too.

JH: I am also here. We talk about habitual and continuous research a lot, and so I'm super excited to see where that kind of overlaps with our own definition of it, and where it diverges.

Teresa: Yeah. Great. I'd love to hear what you're doing today too. We can kind of build on that.

JH: Perfect.

Erin: Fantastic. Teresa, you have a really interesting background. We were talking a little bit before about your academic background, which bridges kind of HCI and psychology and linguistics and learning and organizational change. And you've worked in house at a variety of companies, and now you focus on product coaching. So you're bringing a lot of cool kinds of experience to bear, and have been focused on this topic of continuous interviewing, continuous discovery and delivery, for a while. So we're going to talk about that. Just to start from kind of the beginning, what is continuous interviewing all about? What are we trying to accomplish with this continuous interviewing stuff?

Teresa: Yeah. I'll start with the continuous piece. I think we've seen over the last 15, 20 years, a lot of growth in user experience design and user research. But a lot of that growth has been fueled by a project mindset, where when we start a project, we do some upfront research. And then over the course of the project, we reference that research, so there's this idea that research is a phase that happens at the beginning. Once we've kind of come up with personas, or our fancy research decks, we then use that to go and make decisions.

Teresa: I think what we're finding as we kind of move towards more of a continuous improvement mindset, both on the delivery and the discovery side, is that there's always questions that could benefit from customer feedback. And so it's really adopting more of a continuous mindset, where instead of saying, "We're going to have a research phase at the beginning, and then move onto doing," it's really stretching that research phase out to be continuous throughout both discovery and development. And that only works if we make our research activities smaller, so that we can do them continuously, rather than upfront.

Erin: And so practically, are we doing delivery and discovery literally at the same time? Or are different people doing different things? Is everything always in a kind of state of overlapping-ness? What's that look like?

Teresa: Yeah. This is a big kind of ongoing debate. Right? I think the first part that's great is we have this distinction between discovery and delivery, which we didn't used to have, and really just elevating the work that we're doing to decide what to build, so that's sort of all in their discovery realm, and putting it on equal par with what we do on delivery to scale and build production quality products. And then I don't think they're phases, so I think the simple view is we start with discovery, and then we do delivery. I think the challenge with that perspective is that we learn as much by delivering as we do by interviewing. So as soon as we release something, and we measure the impact, we're going to uncover gaps. And then that should be feeding back into the discovery process.

Teresa: And then I'm sure you've both had the experience, and all the listeners have had the experience of you go into delivery thinking you're going to build something a certain way, and then you hit a technical constraint, or it's going to take longer than you thought, and you're changing the scope, and you're changing how you might do things. I think it's just as important to keep that discovery track infused as you're building, just to make sure that all those teeny tiny decisions that we make every day, that we're infusing as many of those with customer feedback as we can.

Erin: I love that, where it's like delivery is discovery. That's part of what you're saying. And then I'm wondering if the opposite is true. Discovery is also delivery in a way. Right? You think of delivering a product as that's the real delivery. But you're delivering discovery that helps you deliver, so this idea that it's all, not all the same, obviously, but intertwined.

Teresa: Yeah. I think what's hard is I think the distinction between the types of work is really important because it has elevated discovery work. But I do think it's all the same at the same time. So if you deliver the wrong thing, have you really delivered? So I do think delivery is discovery. You haven't really delivered anything if nobody uses what you built, so I do think it's important that they stay really closely coupled.

JH: Yeah. That makes a ton of sense. What is the origin of continuous interviewing? Did you see a team working this way, and it just kind of snapped? Or did you see a persistent trend in problem with how teams were working, and this emerged as a potential solution? How did you find your way into this as an approach?

Teresa: Yeah. This is a hard question for me to answer because I know I've been influenced by a lot of different things. There wasn't one moment in time where I said, "Oh, here's a team interviewing every week, and that's magical." It really started as early as my college experience taking design classes through an HCI program, and just really learning sort of the underlying principle of infuse as much of your process with customer feedback as possible. And then at various stages in my own career as a full-time employee, I was better or worse at that.

Teresa: I will tell you, in probably my toughest position, when I was a startup CEO, the way that I found sort of solid footing was by just spending as much time with our customers as possible. And I actually talked to a customer every day in that role. And it really just solidified for me how important it is for product teams to be talking to customers on a regular basis. So some time after that, and that was in 2011, I just started talking publicly. I would say things like ... I was at Target one time, and I was on a panel.

Teresa: And I said, "If I worked at Target, I would talk to my customers every day," because I can go to a Target store and find them easily. A lot of people thought I was crazy because that seems like too much. But if you think about an interview as an hour, maybe talking to people every day is too much. But if you can do an interview in five minutes, and then if you work across the street from a Target store, and you're a Target employee, I think it's crazy not to have those five minute conversations every day. And so it just kind of grew out of that, and I just started to realize for a lot of teams, based on where they are, every day's probably terrifyingly scary. And every week felt like the right amount of scary.

JH: Nice.

Erin: Who out there ... We're hearing about this kind of continuous interviewing a lot more. I know you talk about Agile and kind of the evolution of how product teams work. And you're seeing more teams kind of say, "Well, we don't have to follow Agile to the letter of the law. But the spirit of it, of kind of continuous learning and shipping, is what we want to preserve and make it work for our organization." Are there any organizations you're seeing just kill it with continuous discovery in particular?

Teresa: I don't think we're at the point where I can say this company, across the board, does continuous discovery really well. I think at the organizational level, there's so much complexity, that may be the wrong place to look. I do see lots of teams doing this well. And I think we see this across the industry. So for example, we know that Google has popularized OKRs, but I can tell you because I've worked with Google teams, not all Google teams do OKRs well. Right? We know that Spotify popularized the Spotify model of squads and chapters and guilds. And I've worked with Spotify, and I've seen that not all teams at Spotify follow that model well.

Teresa: So I think expecting a whole organization to do this really well is a pretty high bar. I hope we get there someday. But I think what's nice about looking at it at the team level is every individual can look at: How do I help my team get better at this? And then you don't have to worry about: Is my organization the perfect context to do this? Because I don't think there is an organization that is the perfect context to do this.

Erin: That relates to my next question, which is: Who's a good fit for this? Is it everyone? If not, who's not? You talked about how you shouldn't let the confines of your organization restrict you from trying to make progress on continuous interviewing. Are there some starter conditions that are good to have in place to get going with this?

Teresa: I think it was Jeff Gothelf who just wrote an article about how not every project needs to be agile. I don't think we need to use continuous discovery for every single type of product project we might work on. But I would say for teams that are new to this, your judgment of when you need it is not going to be very good. Right? So I'd encourage teams that are new to it to over index, maybe do too much discovery to compensate for the bias that we think we're probably right when we're not. But if you're doing things like redesigning your reset password flow, and there's not a lot of risk in that initiative, I don't think you need to spend weeks doing a ton of discovery.

Teresa: I think the key is to look at, our goal with discovery, whether it's project based discovery or continuous discovery, is to try to mitigate the risk in building the wrong thing. And so how much discovery you're doing is a trade off between what could go wrong if this was the wrong thing to build. And then how much discovery do we need to do based on how much risk there is, and how much risk our organization is willing to stomach?

JH: Just to make sure I guess we level set for the listeners, the name continuous interviewing is pretty self explanatory, interviewing people often, continuously. What does it actually mean or look like though? Does that mean weekly? Does that vary team to team? Is there a baseline definition that people should have in their head when they hear the words continuous interviewing?

Teresa: Yeah. I use the benchmark of interview a customer at least once a week. But I really do also encourage a continuous mindset about this. If you're interviewing once a quarter, maybe focus on getting to once a month. If you're interviewing once a month, maybe try to do every other week. And even if you're interviewing every week, I would even push to: Can you interview a couple times a week? The metric that I like to look at from an interviewing standpoint is: Can you reduce the cycle time between interviews?

Teresa: People often say, "I've talked to six customers. Is that enough?" I think the number of people you talk to is the wrong metric because for some things, you can talk to one person, and you get ... If you have a glaring usability problem, and the first person you interview helps identify that, well, you don't really need to talk to another person. You realize that, hey, we labeled this thing wrong. Other things are more complex, and you might need to talk to more people. But I think this metric of reducing the cycle time between customer touchpoints encourages the right behavior of: How do we just talk to customers as frequently as possible?

Teresa: And then what does it look like? It could be, I really like to play with this idea of: What is a customer interview? We tend to think about it as, I need to have a three page discussion guide, and it's going to be an hour long, and it's this really formal research activity. But I actually think we can do five minute customer interviews. It just depends on what we're trying to learn, and that based on what you're trying to learn can dictate what's the length and volume of interviewing that you do.

Erin: Yeah. So let's pretend I'm on a team, and we're not doing any of this. Probably most organizations are doing some, and probably most are not exactly where they want to be. But let's say we're starting from scratch, and we want to be talking to customers more. We could start from kind of learning objectives. That could bring us back to that project mindset, perhaps, too quickly. Or we could start from this goal of getting to the once a week, or something in between. But how do I kind of marry this idea of there's this frequency component and this habit component? And then there's also this: Well, what am I actually trying to learn? How do I combine any of that into a, let's just get started?

Teresa: Yeah. I think the place I would start is with the frequency. Can you just talk to a customer every week? What you talk to them about obviously matters. But what I have seen is that when teams talk to customers more often, even if they do everything wrong, they get exposed to how often they're surprised, and how often they misunderstand something, and how often what they're building isn't going to work. And I think that's the value of continuous interviewing, is really just being reminded on a regular basis that our customers will always know their world better than we possibly could.

JH: This feels like one where a team hears this and they love the concept, and it makes a lot of sense to them. But then it's a question of, you almost raise objections and stop yourself before you start. Well, we can't do this, or we can't ... What are the common objections that teams bring up as they start to explore this and try to get into it?

Teresa: Yeah. I think the biggest one I hear is, I'm not allowed to talk to customers. I don't know how to find customers. It takes three weeks to recruit customers. I really think the biggest barrier to continuous interviewing is getting somebody to talk to on a regular basis. And so the very first thing I have teams do is I have them automate their recruiting process. And this is going to be really tailored to your organization, but I can give some examples. For consumer sites, it's really easy. Recruit from your product.

Teresa: You've probably seen products do this, where they pop up an interstitial offering some sort of value in exchange for an interview. I worked with a company, Snag a Job, they're a job board for hourly workers. So they offered $20 for a 20 minute interview. So when you went to their home page, they popped up an interstitial. It just said, "Hey, do you have 20 minutes for us? We'll pay you $20." Now what's nice about that offer is 20 minutes is a small amount of time. And for an hourly worker who's likely making minimum wage, $20 for 20 minutes is a high value.

Teresa: Enterprise companies can do this as well. If you're Salesforce, and people work in your product all day every day, this type of recruiting will also work. $20 probably isn't going to cut it, though. In fact, I think for enterprise clients, cash is rarely the right reward, so you have to look at: What's something valuable that you can offer? And it could be anything from inviting them to an invite-only webinar. It could be giving them a discount on their subscription for a month. It could be giving them access to a premium helpline. You've got to just think about: What is outsized value for the ask that you're making?

Teresa: And then for companies that are brand new, and they don't have a product, or just the nature of their service, they may not be able to recruit through their product, so there's other strategies. You can use your sales and account management teams to help you get in front of customers. You can set up customer advisory boards, where you require that your participants participate in one on one interviews. So there's a lot of ways to do this, but I think the key, the goal that I tell teams to shoot for is you want to wake up on Monday morning, show up to work and look at your calendar, and there's already an interview on it, without you having to do anything, so that conducting an interview is the same as going to all the various staff meetings that you go to throughout the week.

JH: All right. A quick awkward interruption here. It's fun to talk about user research, but you know what's really fun is doing user research. And we want to help you with that.

Erin: We want to help you so much that we have created a special place. It's called, for you to get your first three participants free.

JH: We all know we should be talking to users more, so we've gone ahead and removed as many barriers as possible. It's going to be easy. It's going to be quick. You're going to love it, so get over there and check it out.

Erin: And then when you're done with that, go on over to your favorite podcasting app and leave us a review, please. We're talking about interviewing existing customers. Do you ever extend this kind of ongoing continuous process to prospective users, or people who look like your customers, for those kinds of research?

Teresa: Yeah, definitely. I use customer broadly. It could include current customers, potential customers, churned customers. Obviously, current customers are the easiest people to talk to. Churned customers are probably the hardest. But I think all three groups provide unique value.

JH: It feels like another objection I could imagine coming up is, okay, we've automated this recruiting. I have two interviews on the books for this week coming up. But it's maybe not the right persona for the thing we're working on. And if I had known that we were going to be working on this project this week, I would've actually recruited a different profile from our user base. Is that something that comes up? Should people worry about that? Or should they just have the conversation and worry about fine tuning it later?

Teresa: Yeah. So yes and no. I think when you first get started, talk to anybody. Kind of chip away at this bit by bit. Obviously, in an ideal world, you're going to want to recruit, like the Snag a Job example, where you can put an interstitial up. You can use a screener with that interstitial. If you're asking your support teams to help you recruit somebody, you can give them screening questions. And if you automate your process in a way where you can decide on those questions on Monday and still have somebody on your calendar by Thursday, it gives you a little bit more of that week over week flexibility of who's the best type of person for me to talk to this week.

Teresa: That's obviously going to take a little sophistication in that automating the recruiting process. So I think step one is just get someone in the door every week. And if this week's conversation isn't the most optimal, know that you're going to do another one next week. And then with time, you can refine those automated recruiting processes to be even more and more fine tuned.

Erin: Now let's say I'm on a product team, and we've got some designers. Maybe we have a researcher, maybe not, probably got an engineer, maybe some PMs, et cetera. In this process, are you trying to get everybody on the team talking to somebody once a week?

Teresa: In an ideal world, I want to see the trio, so a product manager, a designer, and an engineer, do the interview together. Here's what's hard: we want to balance not overwhelming the participant, so we don't want 20 people in the room, with getting as many different perspectives as possible, because the engineer is going to hear very different things in the interview than the product manager, just because of our prior knowledge and expertise. Different elements of the interview are going to be salient to different people. And so to capture as much value and insight from the interview, we want as many of those perspectives represented.

Teresa: Now if that's hard from a scheduling standpoint, you can record your interviews. But so often, people record their interviews, and nobody else listens to them. So I think the key is to cultivate this mindset of, we want a cross functional team evaluating interviews.

Erin: Cool. Okay. So it's interview time, my calendar's loaded up with people I can talk to. I scheduled a 20 minute interview. Teresa told me to. Now what do I do? What do I talk to this person about?

Teresa: Yeah. I think the biggest mistake people make when asking questions is they develop these two or three page discussion guides. It's a whole bunch of what questions, or how questions, or why questions. I call these direct questions, or speculative questions. Here's the thing, because of cognitive biases, and just because as humans we all want to put our best foot forward, when we ask someone a direct question, we're going to get their best self answer. So for example, if I ask you what you like to watch on Netflix, you're going to tell me that you like to watch documentaries. And you're going to leave out the fact that you watch reality TV.

Teresa: And it's not that you're lying to me. Right? You probably do watch documentaries on occasion. But because I asked you a direct question that is encouraging you to speculate about your past behavior, which is something no human being is good at, your brain is immediately going to think about an answer that reflects who you aspire to be. And so it's going to think about the one time you watched a documentary, and you're going to tell me about this great documentary you watched.

Teresa: So what I encourage teams to do is throw away the discussion guide. I want you to come up with one interview question. Now you can have some warm up questions, like tell me about yourself. Or you can ask quick fact questions, like again with Netflix, you might want to know how long they've been a subscriber. Although, don't ask that in an interview because it's in your database. But you might have some warm up questions, just to build rapport. But the meat of your interview is what I think can be driven by one question. And it's a tell me about a time when question.

Teresa: So with Netflix, it might be, tell me about the last time you watched Netflix. If I'm looking for something more specific, it might be, tell me about the last time you binge watched Netflix. Here's the key. I want to collect a specific story because that's what's going to give me reliable feedback. So if I ask you about the last time you watched Netflix, you're not speculating about your behavior. It's a lot easier if you tell me about a specific time. We can role play this a little bit. Erin, if I asked you, tell me about the last time you watched Netflix.

Erin: Yeah. I'll do it. I was passively watching Descendants Two, which my daughter really wanted to watch. Do you guys know this movie?

JH: I do not.

Teresa: No.

Erin: This is all of the children of the evil queens from Disney movies. And they are in this moral dilemma because they want to be good guys, but they're fighting against their genes. And there's lots of music, and everyone has blue hair. And I was working on my laptop while Stella was watching this.

Teresa: Okay. Perfect. Erin actually did a better job than most people will do. But she still didn't tell a very good story. And I'm not trying to pick on Erin in this. My job as the interviewer is to pull out the story. So what most people are going to say when you ask that question is they're going to say, "Oh, I was watching Descendants Two with my daughter." Okay, cool story, bro. Right?

Teresa: Now the interviewer has to do the work to excavate the story. Right? So I'm going to ask. Let's role play this a little bit. So Erin, you're watching the Descendants Two with your daughter. It sounds like you were working at the same time. Set the scene for me. Where were you?

Erin: I was in my living room.

Teresa: Okay, so at home in your living room watching on a TV.

Erin: Mm-hmm (affirmative). Yep.

Teresa: And when was this?

Erin: This was last night.

Teresa: Last night, okay. And you were watching with your daughter.

Erin: Yep.

Teresa: And so tell me, I'm just trying to imagine the scene, so you're on your laptop in front of the TV doing some work.

Erin: That's right. I've just put my younger daughter to bed, so Stella's going to watch a little of the Descendants Two. She's going to go brush her teeth. I'm going to use watching 10 more minutes of Descendants Two as a carrot to get her to brush her teeth. And I'm going to use that opportunity to get a little work done and check out this funny movie she's watching.

Teresa: Okay. Perfect. You guys see, I can continue here. One thing I want to highlight is I didn't actually ask Erin a question. I just repeated back what she told me. Right? So you're in your living room. You're on your laptop. You're watching the Descendants Two. I didn't even have to ask a question. I'm just using active listening principles of, I'm going to summarize what I heard and verify that I got it right. And then that's going to encourage the participant to keep telling the story.

Teresa: Now I can choose to drill in anywhere I might be interested. So for example, if I'm on a team focused on helping people figure out what to watch, I might say, "Oh, tell me how you chose Descendants Two." Right? If I'm on a team that's focused on the in story experience, I might ask something like, "Oh, so you're jumping in and out of the movie. Did you ever miss parts?" So there's lots of places where I can dive in, depending on what I'm trying to learn. But the key here is we're exploring those what, how, why questions in the context of a specific story. So you're not speculating about your behavior, you're answering those questions in the context of a specific instance, and that's going to make them much more reliable.

Erin: Right. And you're starting small. Yeah, you're not like, "All right. Tell me everything," and asking all these sub questions and so on.

Teresa: Yeah. And then it also feels a lot more like a conversation. And it's going to build much better rapport. Here's the thing, if I ask you a why question like, "Why do you work while your daughter's watching TV?" We don't ask ourselves those questions. We just do things. Right? And so you can come up with an answer to that question, but it's not necessarily reflective of your behavior. And that's the key in interviewing, is staying out of this realm of speculation and trying to stay grounded in specific stories.

JH: I was role playing taking notes during that, by the way. I was a part of that too.

Erin: Do you let your would be participants, your future participants, know what you want to know when you're booking them for these interviews? In other words, how much do you want to reveal or hide in terms of what you're trying to discover? You used examples of maybe I want to learn how you discover new content, or how you X, Y, or Z. What's the right level of, kind of, here's what we're trying to do you want to share with them?

Teresa: Yeah. I think there's a distinction here. You want to share enough to make sure you're getting the right person in the room. For example, if you want to learn more about binge watching, you want to make sure the person has at least binge watched before, so that would be a screener question. You would probably want to ask them something like: What's the most number of episodes you've watched in a single sitting? What's hard about that, again, is you might get an ideal answer, and their answer may not be very reliable. So to me, that's reasonably okay as a screener question. It's not a very good interview question. So the distinction there is the screener question is: Am I getting the right person in the room? The mistake would be treating that answer as if it were a fact.

Teresa: So if in my screener, you told me the most episodes you've watched is four, I don't want to now go make decisions based on the fact that a bunch of people told me they watched four episodes in one day. I probably want to look at my database for that answer. Right? So I don't know that I would tell them, "I'm going to ask you to tell me about the last time you watched Netflix," because I don't really want them crafting their story before they arrive, because again, that's when that sort of best self is going to start to come out. And I want a real, messy story. But I would think about: What are the right screener questions to ask to make sure I'm talking to someone that's relevant to what I need to learn?

JH: Just to get a baseline understanding, when you are having people automate this and make it a recurring habit and cut down on the cycle time between conversations, are you always using it for this kind of discovery type probing? Or will you have teams who take advantage of the time scheduled for a usability session? Or do you do those separate? Is that a different type of thing? Or can that fit into this too?

Teresa: It can fit into it as well, but I always encourage teams to do a generative discovery interview every week. And if you're also doing a prototype test in that interview, that's fine. So in this example, I would still start with, tell me about the last time you watched Netflix. I would get you to tell me a story, and then I would test the prototype in the context of that story. So instead of giving you a made up, canned task, I would say, "Okay. Now imagine that it was last night, you were watching Descendants Two with your daughter, and you had this instead. Walk me through what would've changed in this specific instance."

JH: Okay. Yep. That makes sense. And you've talked a lot about lowering the barrier to entry here, and doing quicker sessions, even 20 minutes, whatever it may be. How do you deal with the fact that different people kind of give different types of responses? Some people are one word answers, and other people, once you get them going, will tell a story for 10 minutes. How do you make the best use of time when people are kind of going on and on when you ask them to recount a specific story?

Teresa: Yeah. Well, first of all, very few people are going to go on and on. I mean, I know there are those people out there that exist. But especially in an interview context ... We have this expectation that conversation is 50/50, you say something, I say something. So if all you're doing is asking a question, you're going to get a short answer, and that's where this sort of excavating the story comes out. There are going to be times where people tell a part of the story that's just not that relevant to you. We saw that a little bit with Erin. She went off and started getting excited about Descendants Two, and she sold me on it. Now I want to go see that movie. But that's not what I was trying to learn. Right?

Teresa: So the key is you never want to interrupt your interview participant. And even if what they're telling you is totally irrelevant, they're telling you it because they care about it. And if they care about it, you need to care about it. One of the things that's nice about reducing the cycle time between customer touchpoints is you can have a total dud of an interview, and it's not that big of a deal because you're going to do another one really soon. So I think if you keep that in mind, it helps with, hey, we only have 10 minutes, and you went off on this thing I don't really care about. That's okay because I have another interview on the books later this week or early next week.

Erin: Yeah, awesome.

JH: Do you encourage teams to right-size their schedule or cadence here around maybe their other product development frameworks or processes they're using? So if they're doing two-week sprints, does that change how you approach this at all? Do you know what I mean? Do you try to get them into some sort of harmonious lockstep? Or is this its own thing, and how you make tickets and schedule them is its own thing, and they can be separate?

Teresa: Yeah. I don't get too caught up on the delivery rituals of this because I feel like they're so unique to each team. This is where I think Agile has gone off the rails a little bit, and it's not the Agile mindset, it's the scrum ceremonies. And we're forgetting the retrospective, where we're supposed to change the process to better meet our teams' needs. So really according to the Agile mindset, no two teams should be doing the same thing because we're continuously improving the way that we work. So I focus a lot on: Where does this fit in your sort of team rituals? Because I feel like that should be completely up to you.

Teresa: I do like to see that, if you're only talking about delivery in your daily stand-ups, you're probably overemphasizing delivery over discovery. What you're learning in discovery should show up somewhere. I do think what you're learning in discovery fits at the start of a planning meeting pretty well. I'm not suggesting that they're totally divorced and that you shouldn't be thinking about this. I just feel like it's so custom. It should be so custom to each team. I don't really like to give guidelines around it other than do it continuously.

Erin: You started to talk about this. But what do you do with this data, this qualitative data, after a week, after a month, after a year? You probably are working on shipping different things throughout those time periods. Maybe you're on a squad, or one of many teams within your larger team, within your larger company. Who knows? To your point, every team is different. How does this information synthesize across teams or people over time and add up to something?

Teresa: The first thing I do is I have teams synthesize each interview one at a time. Because we're interviewing continuously, it's not like we're doing half a dozen interviews, and then going into a conference room and affinity mapping what we're learning. Right? We have to be doing ongoing synthesis. The way that I teach that is I have teams do a one-page interview snapshot. It's just meant to be a visual depiction of: What did we learn in this interview? And then that snapshot becomes almost your index. A lot of the teams that I work with, some of them literally print these out and put them in a binder in their workspace, so that when they have a question that comes up, they can flip through all their past interview snapshots and look at: How often are we hearing this? What are the interviews we need to go back and revisit? Who do we want to talk to, to learn more?

Teresa: It's a way to make your research really accessible and easily reference-able. And so most teams are doing some combination of some physical artifacts around this, but also a digital version that they can tag and search. That's the first piece. And that document is: What is everything we hear and learn in this interview? And then I think about interviewing as a way to discover opportunities, so customer needs, pain points, desires, wants. And you're capturing those directly on your interview snapshot.

Teresa: Then I have teams periodically, maybe every couple weeks, look across all their interview snapshots and say, "What are the most common opportunities that we're hearing?" And I have them map those on this visual structure I use called an opportunity solution tree. It's just a way for a team to map out the opportunity space related to the current outcome they're working on. For example, again we'll run with the Netflix example, if I'm trying to increase the minutes watched, if I'm a consumer team focused on getting subscribers to watch more, I would look for all the opportunities across my interviews that, if we addressed them, would drive that outcome. And then they're mapping that on their opportunity solution tree.

Teresa: You might be interviewing week over week, creating snapshots as you go, and then periodically, as you're starting to see, oh, there's new opportunities that are emerging that we haven't captured on our tree, you would continue to update that document. And then that becomes sort of your living road map of here's where we are, here's what we could do, here's where we're going.

JH: Yeah. Just to backtrack for a second, you mentioned when you put together that document after an interview, you mentioned making it visual. And I always think of the artifact that comes out of an interview as a long, rambling set of notes in a Google Doc. What about it makes it visual? I guess, what does that actually look like? I'm having trouble picturing it.

Teresa: Yeah. I think there are a couple reasons for it to be visual. One is when we visualize things, it's easier for us to remember them. And then two, the goal of a snapshot is to be an index to our research, and you're not going to read a full page of notes, so that's not a good index. There's a few ways we do this. I always like to encourage teams to put photos of who they interviewed if they can. If you're in the healthcare space and you're talking about really sensitive topics, you're probably anonymizing your snapshot. But if you're Facebook, you probably can put a photo of your consumer on your ... And then different companies have different policies about personally identifying information, so of course, you have to follow those policies.

Teresa: But you can use their company logo. Or if they talk about their favorite movie, you can include a cover of that movie. But the big piece that makes it visual is when you collect a story, you can actually draw that story. So if you're familiar with customer journey mapping and experience mapping, you can draw out the flow of the story. And doing that, one, helps you really understand the story. But it's also then a very nice, quick visual reference. And then if you look at those drawings across lots of stories, it helps you to start to see patterns across what sound like very distinct stories, but all share a common structure.

JH: Cool. That's awesome.

Erin: I wish I could draw. I have this wish a lot.

Teresa: Yeah. Here's my comment on drawing. I draw like a kindergartner. I think when we talk about drawing as synthesis, it doesn't really matter what your drawings look like. Your goal is not to create a piece of art. Your goal is, you're drawing to think. I think we all have the ability to do that. And then obviously, the more you do it, the better you'll get at it. And actually, what I love is that sketchnoting is becoming really popular. So there's a lot of tools and videos and books about how to get good at sketchnoting, so those really help. And then Christina Wodtke has a book out called Pencil Me In, which is drawing for business people who don't draw. And it's just amazing.

JH: Nice. That's a good title. I like that.

Erin: Awesome. I'm going to check that out. What questions do you get the most in the classes that you teach, that you're constantly answering? What are you tired of saying over and over again? And say it one more time.

Teresa: The first one we talked about already, which is collecting stories in interviews, and really getting away from those direct speculative questions. And then the second, I think the hardest piece for people, is synthesizing what they're learning, and doing it in a way that it's actionable. I tell people, if there was one takeaway from this whole interview, it's to recognize that interviewing and the synthesis of interviewing are skills. And they're skills that need to be practiced. So what's hard about this is we think about interviewing as, oh, I'm just having a conversation. And you are just having a conversation, and the more it feels that way, the better you're going to interview.

Teresa: But all the things we talked about, about collecting a story, digging to excavate that story, reflecting back what you heard instead of asking question after question, all of those are skills that just take practice to get comfortable with them. I will pitch, I have a course on this that gets people practice, called Continuous Interviewing, that hopefully we'll include a link for. But whether people take that course or not, I think the key is just to recognize it's a skill. The more you do it, the more comfortable you're going to get with it, and to just think about: How can you be deliberate about focusing on the specific pieces? Collecting stories, reflecting what you heard, synthesizing through drawing, and really just looking at each one of those as a skill that you can build over time.

Erin: Awesome.

JH: Yeah. That's great advice.

Erin: Well, I think that's a wrap. I'm going to ask one last question that I like to ask people in general, which is: What do you love about user research? What's your favorite part about it?

Teresa: I love that even though, when we do a lot of user research, it's easy to feel confident, to be like, "Oh, I know my customers. I'm learning what they need." But if you do user research well, and you always cultivate this mindset of what is surprising me in this interview, you will always find something that is surprising. No matter how much research we do, no matter how much time we spend with our customers, we can't completely know them. And I feel like the goal of user research should be to find those moments. And so I love them because it's just a good indicator that you're doing your research well.

Erin: Thanks for listening to Awkward Silences, brought to you by User Interviews.

JH: Theme music by Fragile Gang.

Erin: Editing and sound production by Carrie Boyd.

Carrie Boyd
Former Content Writer at UI

Carrie Boyd is a UXR content wiz, formerly at User Interviews. She loves writing, traveling, and learning new things. You can typically find her hunched over her computer with a cup of coffee the size of her face.
