
Using Session Replay Tools to Supercharge Your User Research with Elyse Bogacz

Seeing what your users are doing in real sessions can help stakeholders build empathy and add to your existing research practice

Carrie Boyd

This week on Awkward Silences, we talked to Elyse Bogacz. She's worked on product at places like Drift and Runkeeper, and now she's VP of Product Design at NDVR. To add more depth to her user research, she uses session replay tools like FullStory and analytics tools like Hotjar to understand how users interact with products in real life. She walked us through how she uses session replays to build empathy with stakeholders, what she does to protect users' privacy, and how she uses session replay to reach out to the right users at the right time.


Highlights

[1:11] Tools like FullStory and Hotjar have helped Elyse learn important things about users at early-stage startups

[4:43] You only get a limited amount of time to speak with each user; use it wisely

[6:20] How Elyse uses session replay to decide who to reach out to for user research

[9:35] Actually seeing users struggle in session replay helps stakeholders build empathy

[16:40] There's no replacement for a one on one chat with a user, but replays can be a good icebreaker

[19:46] How privacy and GDPR play into all this

[25:21] Session replay is not screen recording

[28:59] How Elyse keeps track of all the insights that surface

[31:33] How to cope with the backlog

Transcript

Erin: Hello everybody and welcome back to Awkward Silences. We are here today with Elyse Bogacz. She is the VP of Product Design at NDVR. She also has extensive experience at Drift and Runkeeper. And she's an adjunct professor of product design, teaching seniors at the Massachusetts College of Art and Design. So she knows a thing or two about product design.

Erin: Today we're going to talk about how you can use passive data tools such as FullStory and complement those with your user research to get a fuller picture of what's going on with your users. And of course, to design products to make them very, very happy. So thanks for joining us today, Elyse.

Elyse: Happy to be here. Thanks for having me.

Erin: Got JH here too.

JH: Yeah, I've never really actually been in an organization where we took good advantage of tools like this. So I'm super excited to hear stories about how they can be deployed and what you can learn from them.

Erin: Get the full story.

Elyse: Oh yeah.

JH: Oh, look at that.

Erin: But I'm going to get those sound effects going.

Elyse: Yeah, so I'm kind of accidentally an early employee at a lot of the companies that I've been at. And you know, when you're super early you don't have a whole lot of customers. If you do, you wear out access to them pretty quickly.

Elyse: So tools like FullStory and Hotjar and those types of passive tools are a really good supplement to the actual user conversations, I've found. So I would consider them essential to any part of a user testing toolkit, even at small or large companies.

JH: Just to get the landscape, like are those the two main players? Those are the two I always hear of, FullStory and Hotjar. Are there other ones that kind of fit in this space at all?

Elyse: Yeah, I mean like so Crazy Egg does some sort of a session replay and I think they do optimization. I feel like they used to be known for A/B testing and that's evolved a little bit. There's also things that you wouldn't think of as like passive feedback collectors.

Elyse: So when I was at Drift, and I've seen this done on other websites a lot more lately, like sentiment polling is something that we're used to. Like if you have a conversation with a rep, you get like this little poll that's like, how was your conversation? And there's like an emoji from like really angry to like super happy.

Elyse: But at Drift we used to use our own tool at the time. I've seen more companies kind of building out bots on live chat platforms. Not just for customer support, but to kind of be a part of either a new feature rollout or if you're using like a new part of the service. A bot will kind of be like, "Hey, how's that going for you?" Maybe if you engage it'll ask you a few follow up questions.

Elyse: I've also used bots as a way to build user-testing opt-in pools. Which is super useful and kind of a lot nicer than having to email a bunch of people and be like, "Hey, I'm Elyse and this is why you should care about anything that I'm saying. Can I bother you further for like 30 minutes in the next couple of weeks?" With bots, you can kind of set that up and let people opt in automatically. Then kind of remind them when you reach out via email like, "Hey, you said you would do this thing. Any chance I can grab some of your time?"

Erin: Nice. So, how long have you been working with tools like this and how did you sort of get started figuring out how they can be useful? Because I know for me, it's like what's going on with these heat maps and just like figuring out how to actually use these and make sense out of them can be a little bit of a learning curve.

Elyse: Yeah, I think you see heat maps being used more so on like marketing websites or like single page web apps. So I got started using these things probably, maybe six years ago, six/seven years ago. FullStory was actually like a game changer kind of in the early days of Drift. So we had no customers and we were working to get customers like every step of the way. We were super, super early. It was literally just when we were building the live chat portion of the product. There was nothing else to it.

Elyse: Sort of what we found is that the people we talked to all the time became kind of biased. The super-fan customers end up not being really good target user testers because they have all of these power ideas or they get really excited about telling you what you want to hear. So when you're super early, either no one wants to talk to you or the people who do want to talk to you, you wear them out very, very quickly.

Elyse: I think of access to users like you get a set amount of tokens per user. Maybe let's say it's five, right? And passive tools like FullStory let you hold onto those tokens a little bit longer. So when we implemented FullStory into the product, we really focused on using it for onboarding. We would spend time during lunch reviewing. It would be like story time. We would all sit around with our lunches and instead of talking we would queue up some FullStory sessions and we would watch those.

Elyse: It's kind of a nice way to surface certain things that you wouldn't otherwise know through any types of data warehousing. You're not going to know that somebody's got a little tripped up on your copy if you're just looking at like Mixpanel data, right?

JH: Mm-hmm (affirmative).

Elyse: So we would look at these things. They have this feature called Rage Clicking, which I think is like a fun feature. But some people rage click, some people don't. I think the biggest indicator is when somebody is like trying to use something and you're watching them like not be able to use it. You're sitting there and this anxiety is building and you're like, "Oh my God. I designed this thing and this user is having a really hard time with it. This is surfacing, like this is something I need to revisit."

Elyse: So it helps you build a list. Some of the things maybe you can address right off the bat. But sometimes you need to be able to say like, "Okay, here's the list of things that I saw that are kind of interesting sticky spots in this version of the product or this area of the product." And now I'm going to reach out to users who I know have recently interacted with that interface and I'm going to kind of ask them about these specific things.

Elyse: If you think about usability testing, you have a prototype that is very specific and you're testing a very specific thing with users. FullStory can give you a list of specific things to talk through. Because conversations just for the sake of conversations with customers can be really all over the place. You can go into a rabbit hole very quickly.

Erin: When you're using FullStory, you're finding people who are having issues. Are you literally reaching out to those same people? Or using other data sources to find people who use those features?

Elyse: Well, I think it depends. So, it depends on the sensitivity of the area that they're working in. So obviously at NDVR, we're in the fintech space, so like some of these things are a little bit more sensitive. We really want to make sure that we're protecting people's privacy and data. We're going to be using the privacy screening features on those. So we won't actually be able to see real metrics inside of people's accounts.

Elyse: But, if it's something like an onboarding, FullStory surfaces those users. If it's a registered user, you have their information and you can reach out to them directly. You can do it right, in a non-creepy way. You don't need to reach out to somebody and say like, "Hey, I was watching your session."

Erin: Right. No, that's what I'm wondering. Yeah, how do you get from A to B?

Elyse: Yeah. Well, so normally what I'll do is I'll put together a list of users who I think would be good. You guys know this with user interviews. Like you reach out to 50 people and maybe only 20 respond and maybe only 10 agree to talk to you. Then maybe out of those, only eight show up, something like that.

Elyse: So I'll make a list of people who have been in that area and sometimes it's really easy to do it with FullStory. A lot of times, I'll just ask like a backend engineer on my team to be like, "Hey, can you pull a list of emails for users that have done like X, Y or Z." And with like some simple SQL we can get that list.
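The kind of list-pulling Elyse describes really is a few lines of SQL. Here's a minimal sketch against a toy in-memory database; the table, columns, and event names are invented for illustration, not from any real Drift or NDVR schema:

```python
import sqlite3

# Toy events table standing in for a product analytics store.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_email TEXT, event_name TEXT, occurred_at TEXT);
    INSERT INTO events VALUES
        ('ada@example.com',   'opened_color_picker',  '2020-01-06'),
        ('ada@example.com',   'completed_onboarding', '2020-01-06'),
        ('grace@example.com', 'opened_color_picker',  '2020-01-07'),
        ('alan@example.com',  'completed_onboarding', '2020-01-07');
""")

# "Users that have done X, Y or Z": everyone who touched the color picker.
rows = conn.execute("""
    SELECT DISTINCT user_email
    FROM events
    WHERE event_name = 'opened_color_picker'
    ORDER BY user_email
""").fetchall()

emails = [email for (email,) in rows]
print(emails)  # ['ada@example.com', 'grace@example.com']
```

From there, the list feeds straight into outreach emails, expecting the drop-off rates Elyse mentions above.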

Elyse: Then I just send an email and I just hope they book some time. A lot of times people do. Especially if they have a hard time with something. They want to talk about it and they want to feel acknowledged. They want to know that someone is listening to their pain and hope that person is able to fix it.

JH: Have you found a benefit? Like watching these sessions the way you described, hanging out and eating lunch and kind of going through these replays. Is it that there's almost some motivation or urgency created within the team to fix issues that you observed? Because for me, when I've played around with these tools, to your point about the quantitative data, it's like 10 people hit this onboarding flow and four of them dropped off. It's hard for me to get developers or designers excited about four people, even though that's a lot.

JH: Whereas, when you watch one session of somebody stumbling around the screen and clicking and really not getting it, there's something about that. That just like creates this feeling of like, "Oh man we got to fix this. This is like really bad." Is there a piece of that is at play here?

Elyse: Oh my God, absolutely. So like something that I think about all the time, is like how do you build empathy across the team in non-designers. So I'm a designer but I don't necessarily work with other designers day to day. I spend most of my days with engineers and product people. Part of my job as a designer is, and I think yours too, John, like working in product there's nothing more important than understanding the user. Like that is the most critical thing.

Elyse: I think sometimes we kind of disadvantage our engineering colleagues by not bringing them along for the ride when we're doing user interviews. They don't get an opportunity sometimes to actually see the pain that the customer has. They just have to take my word for it. And you know, that really only goes so far.

Elyse: So when you're watching and you have your whole team and your engineers sitting around and you're eating lunch and you're watching these things, they're kind of like Netflix short stories. People who aren't normally exposed to users, even though they're building the software, they start to understand and they're like, "Oh my gosh, that's horrible." And also, "This is an easy fix. I can spend 20 minutes and I can get this fixed." It's really a way to get these small things sorted out very quickly without the whole add-it-to-the-backlog step. Because the backlog is where things go to die.

JH: Yeah. Yes. Some are very cringey to watch.

Erin: Before these lunches, or any kind of activity where people are gathering and kind of watching these together, how do you find good ones to watch? Are you doing a random sampling, or is someone prescreening them, or?

Elyse: Yeah, that's a good question. So random sampling, and usually I go for longer sessions. FullStory has this feature where it autospeeds through any dead time. So if you were to open a session, start something and then walk away from your computer for like 45 seconds and then come back, FullStory will just kind of skip you through that dead space where they're not doing anything. So it's really easy to just kind of quickly move through the set.

Elyse: The good news about tools like this, as a designer it's not that hard to implement with a little bit of help from some engineering people on your team. They have paid sessions, obviously, if you're doing kind of like more targeted segments. But their free platform is like a thousand sessions a month, which you know, that's a lot of data and no designer is going to watch all thousand. So if you're trying to just get started, that's more than enough data to get started.

Erin: Is it pretty obvious to everyone in the room? Like what's sort of happening? Like rage clicks or just frustrated or, "Oh no, they can't figure it out." Is that all kind of obvious? I'm imagining because we're eating lunch, we're all, like, spitting out our food, "Oh my God. I think that was terrible."

Elyse: Yeah. I mean, I think it's funny, sometimes yes, sometimes no. Which is why, for the sometimes-no ones, you need to kind of file them away and follow up. Because there's certain things, so I think about levels of user testing. This is kind of relevant. At NDVR we have that really hard persona that's tough to get face time with because they're super busy. They're kind of doing a bunch of different things. They're either a professional or they're some kind of executive. This is someone whose time is super, super valuable and they're not going to... I don't want to waste any of my tokens with those people. Those are super, super valuable.

Elyse: So, the way I think about certain levels of user testing is there's like a range. Usability is kind of the basis. Humans at any level can decide if something is usable or not. And then there's comprehension. Do we understand what's happening in this screen and why? And then there's desirability. Desirability is something really only like the target audience of your product can give you.

Elyse: But the other two things, usability and comprehension, normally you can assess with normal individuals who maybe are familiar with your space or have something in common with your user but aren't a perfect user. Then usability is something that I think can really be answered on its own sometimes with FullStory.

Elyse: At Drift, here's a really good example, we were working on this onboarding. I think we designed a little widget that allows you to pick a brand color and it would automatically restyle, like reskin the page as kind of like a first wow moment.

Elyse: I think we had this really complicated color picker. People were just having a really hard time using it. We were watching them like select a color and then it wasn't saving and it wasn't updating the page. So two things right there. There's a bug where if someone makes a selection and it's not restyling the page, like that's something we need to fix.

Elyse: And two, no one's really trying to pick their own color. We can probably, in the onboarding, just give them a little palette set of six to eight colors and just let them click one and that will make it a lot easier. So those are kind of two levels of insight from a usability perspective that we don't need to actually talk to users about.

Erin: So we're bridging these tools with talking to users. When do you talk to users? We talked a little bit about when you find opportunities and you kind of reach out. Can you talk about some of your experiences having done that, going from one to the other?

Elyse: I think there's kind of two levels to this. So obviously, the first one I've already talked about a little bit. If someone is struggling in your product and you can see them struggling and you want to understand why or what they're trying to do. So sometimes reaching out to those users directly will get you a conversation.

Elyse: But then talking to users in different ways, like sentiment polling or if a user opts in for a live chat session. So we roll something out, for example. The first couple hundred users that come to it are engaging with it. They might get like a bot pop-up that's like, "Hey, how are you liking this so far?" If they answer "not great", we want that bot to follow up and understand why. Sometimes they'll reply, sometimes they won't. Sometimes they'll reply with something that's like a little bit cryptic. But there's always value in digging in more.
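The bot branching described here is simple to sketch. The rating strings and follow-up copy below are made up for illustration; a real chat platform would drive this through its own bot API:

```python
from typing import Optional

# Lukewarm/negative ratings get a follow-up question; happy users don't.
# Ratings and copy are invented, not from any real Drift playbook.
FOLLOW_UPS = {
    "not great": "Sorry to hear that! What were you trying to do when it fell short?",
    "okay": "Thanks! Anything that would make it better?",
}

def follow_up(rating: str) -> Optional[str]:
    """Return a follow-up question for a poll answer, or None to stay quiet."""
    return FOLLOW_UPS.get(rating.strip().lower())

print(follow_up("Not great"))  # asks why it fell short
print(follow_up("great"))      # None: no follow-up needed
```

The replies collected this way then become the warm leads for the direct outreach Elyse describes next.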

Elyse: So I think the goal in those sessions is to be able to reach out directly and say, "Hey, I got your feedback. I work on this directly. I want to really understand like where you're coming from and what you're looking for. Can I grab 15 minutes of your time?" And a lot of times people are really open to that.

Elyse: There's really no replacement for having one-on-one conversations with your user. But I think about these tools as sort of like icebreakers. Ways that you can have an easier entry point to having those conversations with your user, instead of just a cold email that they're not expecting or have no context for.

JH: Can you go the other way with tools like this? So we kind of were talking about how FullStory and tools like that kind of sit in the middle between qualitative and quantitative like learning and insights with users. We've been exploring, we see people struggling on the screen, we can outreach and then go talk to them. Which is pushing more towards the qualitative end. Can you do it the other way? Like with 2000 recordings on this part of the experience, can you pull out any quantitative analysis from these tools as well?

Elyse: That's a good question. I'm sure there is a way. I haven't played around with that at all. Normally we're using something like Mixpanel for activity data or some kind of data warehouse system. If you have a backend engineer helping you, they can probably just like pull stuff right from Redshift.

Elyse: But you know, I can kind of mess around in Mixpanel and get some easy answers. Usually we're doing event tracking and event logging for other reasons and that's a richer dataset. The tools that work for large sets of data do it really well. Whereas something like FullStory is really good at just session replay, so I only really use it for that. It's the same reason I use Principle for prototyping, even though Sketch has it. Principle is just the best at it. So why would I use Sketch's prototyping when I can have lots more freedom with this other tool?

JH: Right. So the other kind of data tracking tools, event tracking, just do it better. So just rely on those for those cases.

Elyse: Yeah, they just do it better. When you're building out your features and there's certain things that you want to make sure that you're tracking and you can set up for those. It's just, you're taking advantage of a system or whatever the process is that your team has. I think a lot of companies do it differently. But you're basically taking advantage of something you already have. Versus trying to do some kind of more in-depth integration with a third-party tool.

JH: The big thing with tools like this that comes to mind right away, is around privacy. In terms of, how do you let users know that this is happening? How do you make sure you're protecting sensitive information? Are there any best practices, or like things you've seen implemented, to do that well?

Elyse: I haven't seen anyone do this super well, but I totally agree. Especially with the space that I'm in right now with FinTech, there needs to be a level of transparency. I think people need to be able to opt-in or out of some of these things. How we're going to implement that is something that I'm starting to figure out right now. But I totally agree that privacy is the most important thing.

Elyse: If you're going to be collecting data on your users, I mean GDPR is like the new thing, where users are able to have you delete all of their data right off the bat. I feel like we'll probably go towards that a little bit more in the US.

Elyse: So one team that I know does this really well is PillPack. I think their process for this is actually really impressive. Because they're a pharmaceutical company, all of that data, anything that is personally identifiable is like super, super sensitive. No one on the team should have access to that. So they put together like a really in-depth privacy process where any designer or anyone on the team who's like maybe looking at a user account has no idea who that person is. There's nothing about that profile that can give that data away. But they're able to actually look at real information, which is really important.

Elyse: There's this sort of Catch-22. Where as a designer, or as someone who's working on product, you want to be looking at real information so that you can make the best choices for those circumstances. If you're looking at fake data that's not super useful. But, we need to protect the privacy of our users.

Elyse: FullStory has this feature where you can go in and you can kind of tag certain areas. If you have a dashboard, for example, you can go and you can tag metric one, metric two, metric three and it'll screen them automatically. So if you're watching back a session, you won't actually be able to view those numbers. They'll be kind of blurred out. It does the same thing with passwords. I think it does the same thing with any JavaScript snippets that are on the page.
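Conceptually, the screening Elyse describes amounts to replacing the text of tagged elements before a session is stored or replayed. Here's a minimal sketch of that idea; the selectors and event shape are invented for illustration (FullStory configures this through its own exclusion rules, not code like this):

```python
# Elements tagged as sensitive: their captured text gets blanked out
# before replay. Selector names here are hypothetical.
SENSITIVE_SELECTORS = {".metric", ".account-balance", "input[type=password]"}

def screen_event(event: dict) -> dict:
    """Return a copy of a captured DOM event with sensitive text blurred."""
    screened = dict(event)
    if event.get("selector") in SENSITIVE_SELECTORS:
        # Replace every character so length leaks, but content doesn't.
        screened["text"] = "\u2588" * len(event.get("text", ""))
    return screened

event = {"selector": ".account-balance", "text": "$12,345.67"}
print(screen_event(event)["text"])  # block characters, same length
```

The key property is that the replay viewer never receives the raw values at all, which is why tagging has to happen at capture/config time rather than as a cosmetic blur.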

Elyse: There's definitely ways that you can protect a user's identity. And I think that's something that, as designers, we have to do. We can't skimp on it; that's super important. But I think this is still a space that people are figuring out.

Elyse: Here's a flip question, have you seen anyone do it really well? I'd love to know. I'm always on the hunt for examples.

JH: No. I asked because it's been a real debate within our team. Some of the engineers have been very hesitant to deploy it. We were in a similar boat of, we handle other teams' customer information and so we want to be really careful and protective of that.

JH: You know, some engineers have been really like, "This is essentially almost like key logging. We're not comfortable with this." Then other people are much more on the other side of the camp of, "This is so helpful and we can improve the experience so dramatically." How do we strike the right balance? I think it's something we'll look into further.

JH: I know FullStory, I think, has some options now where you can prompt consent from the users before the screen recording kicks in. Like to your point, you can hide fields and stuff. So I think if you get it configured right, you can kind of strike the right balance. But we haven't had enough time yet to dig in and actually make sure we resolve all those issues before we move forward.

Elyse: Yeah. I think this all kind of pulls back to trust with your users. I mean, you guys have a lot of technical users. I've worked on products that have a lot of technical users. If you just kind of right click and inspect, you can see the FullStory snippet installed in the header. If we, as a company, haven't let you know that's happening, you might be like, "What? Why are you recording my information?" Trust might be broken. So how this is handled, I feel like, always needs to err on the side of over communication.

Erin: Yeah, it is tough. And I do think that these privacy and user experience trade-offs are only going to become more complex and plentiful as GDPR and other regulations tick up, and as, at the same time, companies attempt to know their users better and become more customer-obsessed.

Erin: Are the pervasive "Hey, we use cookies" notifications helpful for privacy transparency? Or are they just a really annoying user experience? Or are they both? Do those notifications have buried within them, if you actually click through to the privacy policy, do they divulge, "Hey, we're using FullStory. Hey, we're doing X, Y, and Z"? I mean, you could just imagine trying to make fully transparent every sort of script or technology that's potentially a privacy concern. How do you actually get that through in a way that isn't like a thousand popups for every technology that we're using on our website?

Erin: I think there are interesting questions. Like how do you let people know? How do you know what they care about? And I do think there'll be more sort of UX research on this question of understanding. Forget about the European Government, that's regulation. But like what do users care about? What do they want to know about privacy and how do they want to know about it? So that hopefully you can kind of thread that needle of delivering a great user experience while keeping people's privacy top of mind and letting them know what's going on. It's a lot to do at the same time.

Elyse: Yeah. I mean the whole thing is so nuanced. I think your point about cookies, I think as users kind of across the board, we accept cookies. Like we know that most sites have them. I think we know that most sites are tracking our movements like in a logging perspective.

Elyse: But I think where we get really weirded out, and this is just because the technology is new... FullStory, it's not screen recording. So just to make sure that's clear. It's session replay where they like recreate it. I think they recreate everything with like JavaScript. It's not like they're recording your screen.

JH: Yeah, that's a good distinction.

Elyse: But it's really weird. You know, when you're talking to your friend about something and then like later in the day you get served like an Instagram ad. Like this whole like video-voice technology that's starting to track users, it's kind of creepy because it's new and we're adjusting. Like we've already adjusted to cookies and kind of like the broader tracking. But like the voice tracking is really weird. Because as a user you're like, "Oh my gosh. Can I just talk openly without worrying about like Facebook or Instagram listening to me?"

Elyse: But at the same time, I follow a lot of female entrepreneurs on Twitter, and I recently saw this thread, a conversation that was starting to happen, around the idea that someone had revoked all of their data. Now they're getting served totally random ads that they don't care about and are not personalized to them.

Elyse: So it's like, where is the balance? Like we want to see things that are personalized and relevant to us to like make our lives a little bit easier and discover things that are interesting. But at the same time, we don't want anyone tracking us. So I mean, this whole thing is super nuanced. I'm really excited to see how it unfolds.

Elyse: One of the things that I talk about with my students, is how do you make sure you're moving forward in an ethical way with everything that you're doing? If you're asking a user to connect their bank account to your app, like you need to explain why. If you're asking a user for this permission, you need to explain why.

Elyse: You can't just be asking with no context. Like if you need someone to give you permission to your contacts, the reason is like, oh, it's easier to add your friends this way. It's a better experience. Like whatever that reason happens to be, you need to make sure that the user understands it.

Erin: Right, and then do that and not do...

Elyse: Yeah, and then actually, yes. Then actually do that and not do something else.

JH: Do you guys want to know what my big prediction is? That there's going to be a huge privacy scandal at some point around browser extensions. Because you basically give them permission to see everything in your browser and your internet activity, and people just do it passively. I think there's going to be one that comes out eventually where this innocuous browser extension has actually been doing really shady stuff, and it's going to be a big thing. That's my prediction.

Elyse: Oh wow. Well, you know what's really funny? I was having this conversation with a coworker about the Honey acquisition by PayPal. Really all they're buying is the data. They're not buying Honey as a tool. They're probably just buying them for the data, is my thought. But I think that's probably right, John. I'm kind of nervous for who it is. I hope you don't have it installed.

JH: Cool. A question, if I could just go back a little bit. You mentioned, you watch these things during lunch. You see something that's really cringey. The team's motivated, maybe a quick fix goes out. That's a really satisfying loop.

JH: But the other part of product and design and fixing things, is there's sometimes issues you encounter that are lower priority. You're like, "Ooh, this wasn't great. I want to log this so we can revisit it, but like we can't fix it now." What do you do with the session replay? Do you have like a library of ones like, this is where somebody hit an issue with this feature. When we are going to prioritize fixing that, here's all the stuff we can go back to. Or is that not really a part of it?

Elyse: So I have a Trello board that I keep certain things in. But the way I think about stuff is, if it's a real problem, it's going to resurface multiple times. It's not just going to go away. If it's a real problem, it's going to keep coming back, so it's going to come up again. So I might stick it in a Trello board. But honestly, I feel like if it's not something we're going to work on within the next couple of weeks, or it's not really a priority, I might just let it slide.

Elyse: Because when I'm in the mode of like going through these sessions, I'm looking for high priority things. This is like that idea of a rabbit hole. Like progress for the sake of progress and shipping for the sake of shipping. If you're just putting out small fixes all of the time, if they're not super important or users don't actually care about them, they're not noticed and they don't add up to much. But, if someone is not able to do something in the product that they should be able to do, like something's broken, that's something that needs to be fixed.

Erin: Yeah, the whole backlog thing is really tough. It's like we have all this research, we're not going to act on it now. Some of it we are, "What do we do with it?"

Erin: You know Maggie? You were talking about Drift before, with the story time. We had talked with her about that on an earlier episode. But she'd done an episode recently on just, like, "kill your backlog." Which is really compelling and scary. It's like, "Ah, what do we do with all this stuff we know we should do?"

Elyse: Yeah. Because I feel like if a backlog is too emphasized, you might come to... So at NDVR, I work on a team with engineers and a product person, and what we work on is obviously client-facing. But then we have an engineering team that's solely working on the backend and dev ops and these other things. If we're working on something and we need some of their support, we go to them and say, "Hey, we need X, Y, and Z. This is our timeframe."

Elyse: It's not, but if their answer was, "Oh well, you know we have this backlog." It's like, whoa. Well, like that's not a great answer, right? Because it's blocking our team from doing things. Not that I don't like the agile framework of working. But product teams do need to be agile and kind of adapt to each other's needs and be supportive. Backlogs just become this weird like order of, we're marching in this direction. We're just going down this list and why are we doing this? We don't know anymore, but this is how we do it.

Elyse: Like there's never really a good reason for over-indexing on the things in a backlog. A lot of times, some of the things that end up in there, ultimately, they're not that important and they never come up again. So why would you fix something that's low priority when there's six or seven things high priority that are blocking users that typically are floating around.

Erin: Right. The tricky bit is making sure the things that do come up over and over again get surfaced in the absence of a backlog. Do you have a good system for that?

Elyse: I'm just a Trello queen. I have lots of Trello boards. I usually only have two or three things that I'm focused on in my left column, and everything else to the right I don't even pay attention to. If that column is empty, it's because I have nothing to add into it; I'm constantly pruning it. I might add some things that I want to look at, but if it's something that's going to be revisited within the next couple of weeks and might change anyway, I just delete it all.

Elyse: One of the things I do when I'm designing, if I'm revisiting something and we have a little bit of space to evolve it, is ask myself the question, "If none of this was here, what would I do?" A lot of times some of the answers are the same as what's already there, and sometimes they're not.

Elyse: If I can just forget about my backlog and forget about the restrictions that I have, it allows me to kind of innovate and design a little bit better. Versus trying to fit five pounds of apples in a three-pound bag.

JH: Yeah. I think some sort of expiration date on stuff in the backlog is a nice balance. An issue comes up, you can log it, but if nobody touches it or comments on it in the next two months or whatever, it just disappears. Then, to Erin's point, the ones that keep coming up or keep getting additional information added to them stick around, and the ones that don't get naturally pruned out. Feels like a nice middle ground.
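The expiration rule JH describes is simple enough to sketch in code. This is a hypothetical illustration, not anyone's actual tooling: item titles, the dictionary shape, and the 60-day window are all made up to match the "two months or whatever" idea.

```python
from datetime import datetime, timedelta

# Assumption: each backlog item tracks its most recent touch (edit or
# comment), not its creation date. Items that keep attracting comments
# refresh `last_activity` and survive; untouched ones age out.
EXPIRY = timedelta(days=60)

def prune(backlog, now=None):
    """Return only the items touched within the expiry window."""
    now = now or datetime.now()
    return [item for item in backlog if now - item["last_activity"] <= EXPIRY]

backlog = [
    {"title": "Rescheduling bug", "last_activity": datetime.now() - timedelta(days=5)},
    {"title": "Tweak empty state", "last_activity": datetime.now() - timedelta(days=90)},
]

print([item["title"] for item in prune(backlog)])  # → ['Rescheduling bug']
```

Running this as a scheduled job (or a manual weekly sweep) gives exactly the middle ground described: logging stays cheap, and recurring problems prove themselves by staying fresh.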

Erin: Yeah.

Elyse: So you mentioned that you guys have a whole bunch of research and a backlog, and obviously User Interviews, as a platform, is still pretty new. I mean, I remember talking to your CEO maybe five years ago when you were just getting started. So you guys have done this kind of homegrown, from-inception thing. What types of things are you doing to test? How are you keeping track of the most high-priority things that need to get done?

JH: For passive stuff, like insights you learn through customer support or wherever, we've deployed a product board for that. So it is a little bit of letting stuff pile up: somebody said they had trouble messaging, somebody had trouble with this rescheduling feature, somebody asked how this works. We can organize them around themes or tie them to specific parts of the app.

JH: With the idea being that we don't take action on all those things. It's more that you get a heat map of where comments are coming up. Like, "Oh, we're getting a lot of comments around scheduling," or, "Oh, we're getting a lot of comments around this."

JH: So it's not an exact marching order, and it's not the only way we prioritize things; we don't just go, "What has the most smoke around it? Do that next." But it's a really nice input. Then when you do decide to work on something in there, you can click into it and read all the comments you've associated with it. You get this pretty rich, quick context of, "People, when they're doing this, are thinking about this." All this richness, just off the shelf.

JH: So it's not like a backlog in the sense of, it's not like a bunch of features or designs or whatever that we're committed to doing. But it's more of just like organized snippets from users in like logical groupings that we can go back and parse when we decide to work in an area. So that's like the biggest thing that we've probably done over the last two years or whatever.

Erin: Yeah.

Elyse: Right. And it probably helps you very quickly get caught up on why this thing matters in the first place.

Erin: It's good. It's a lot of qualitative, contextual data too. It's pretty useful.

JH: Yeah. For a real example, when you create a project on our site right now, you can't add any of your teammates to it to help you out with it, which we know is a huge pain point. So we're working on that now. When we kicked it off with the developers, we said, "Hey, everybody read through these like 70 comments we have from people who've expressed frustration or a pain point here, just to internalize that." That was what we did on Day One with the engineers.

JH: It was super helpful because like it really runs the gamut. Some of it is just like, I just take screenshots and just like Slack them to my teammates. I just need people to be able to see it. Like that's all I need. It's like we're pretty low bar.

JH: Then other people are like, "I need somebody who can come in and actually like edit it and like make other changes." Or, "I want someone to build a comment on it." And so then we get to start to parse out like what is the right solution? And then we can actually go talk to those people too since we know who said what and all that.

Carrie Boyd

Content Creator

Carrie Boyd is a Content Creator at User Interviews. She loves writing, traveling, and learning new things. You can typically find her hunched over her computer with a cup of coffee the size of her face.

More from this author