Sometimes, to learn how to do better UX research, you have to fail
Early in Alec’s career, he was convinced that the only research method worth using was user interviews. He wanted to be able to sit down and have a conversation with each and every person he was going to be researching. Surveys, unmoderated testing, etc. were too cold, and often didn’t give him the context he wanted.
After some time working with great teams, full of both researchers and people who do research, Alec has realized this wasn’t the right way to think about it. He couldn’t just write off all the other methods of research and only do user interviews forever. There were contexts that called for unmoderated testing and even surveys. In fact, there are many instances in which those things are better suited to learning the information you need than user interviews.
Though Alec was wrong to think that user interviews were the be-all and end-all of research methods, he did take something valuable from that line of thinking. He now tries to incorporate that curiosity for context into every type of research he does.
Alec still sees these kinds of changes in perception happening through The UXR Conference. This year, he said, there were many speakers that people didn't agree with wholeheartedly. That's a good thing, because it opens up an opportunity for dialogue and brings us closer to an answer that is right. It forces people to reexamine their way of thinking and find evidence to support their arguments, as Alec did with his views on research methods.
Being open and honest about what you do and how you do it shouldn't be a radical position. But it is. It's hard to be transparent, especially with stakeholders and teammates. It's hard to get in front of a room of your peers and say "I did this big project, but I don't have a clear answer," or "The research showed the direction we're going in is wrong." Research digs into some really fuzzy, difficult-to-define, or just plain difficult questions. So sometimes, the answers we find are also a little difficult to nail down.
Alec advocates for having a transparent conversation about this when you start your research. This can help your team nail down what you should be focusing on, or just to talk through some of the things that are most important to your project. This makes it easier for everyone to stay on the same page when the research does get difficult.
JH also offered some good advice on transparency. He said remaining transparent throughout your process makes it easier for everyone to get on board with whatever experiment you're doing, regardless of the outcome. Letting everyone in on your thought process makes it easier for them to understand why you're doing what you're doing and how you're providing value to your team, even when you fail.
When I was in college (studying copywriting), we constantly did exercises to improve our creative writing skills. We would write nonstop for 15 minutes, jot down as many headline ideas as we could think of, and even sit around in circles writing one-word-at-a-time stories. Researchers can also benefit from creative challenges and exercises to improve their practice. Jing Jing Tan, one of the speakers at The UXR Conference, uses a plotting exercise to enrich her user interviews. This involves the researcher and the participant drawing on a whiteboard together, creating an interactive environment and an artifact the researcher can use in their analysis.
Alec and JH took some creative inspiration from product teams, who use a “retro” meeting to take some time to reflect on how their team is doing. Alec has used a format in which everyone writes something in “happy/sad/mad” columns. This forces people to write down negatives they may not voice otherwise, and helps teams make meaningful changes to the way they approach problems together.
At User Interviews, we also have a retro meeting. When the company was only six people, four of whom were the product team, it seemed weird to leave two people out of the meeting. As we’ve grown into a team of 24, retro has stayed a team-wide meeting. Once a month, everyone gets an opportunity to write down what’s going well and what’s not. Once our answers are all submitted, we take an hour to sit down as a team and discuss what we shared. A lot of great developments have come out of our retro meetings, like a monthly wellness stipend, weekly founder updates, and even our culture handbook.
You can get creative through different research methods, retro meetings, or just by approaching your research projects in a different way. The important thing is to keep iterating, growing, and pushing the boundaries of what you can do with your research, even if you’re wrong sometimes.
Alec: [00:00:00] If you're batting a hundred percent on all your points of view, you're not trying hard enough, right? You're working on stuff that's too easy.
This is Erin May. I'm John-Henry Forster. And this is Awkward Silences.
Erin: [00:00:28] Hi, everyone. Welcome back to Awkward Silences. We are here today with Alec Levin, and he is one of the cofounders of the UXR Collective's Strive conference that just happened recently in Toronto. Um, this is a conference that started as a meetup and that we've been hearing awesome things about from our client community.
And so we wanted to pick Alec's brain about what's going on with UX research right now, and what do we want to see happening in the future. So thanks so much for joining us, Alec.
Alec: [00:01:05] It's my pleasure to be here. That was very flattering introduction. I'll take it.
Erin: [00:01:12] Yeah. Uh, JH is here as well.
JH: [00:01:13] Yeah. I'm having some serious FOMO about this conference. We keep hearing about it.
Alec: [00:01:16] Oh, you got to, got to get there next year for sure.
Erin: [00:01:20] Alright, on that note. So today we're going to talk about, uh, I'm excited about this one. So failure is so hot right now, right? If you, um, read up on or participate in any of the startup dialogue. And that's a good thing, uh, but we don't hear as much about failure in terms of being a leader within the UX research space. And so Alec is opening, I think, a cool conversation about, um, how we can be braver in our methods and our approach to UX research, in the sense of: seek to fail so that you can, um, learn what you really think.
Um, so this whole idea came up when we were talking about how you get a lot of applications from people who want to talk about a variety of things, uh, at the Strive UXR conference. And we were talking about, what are you seeing? What are you not seeing? So talk to me about what you aren't seeing, and why we need to hear more talk about failure.
Alec: [00:02:28] Yeah, pretty much everyone who spoke at the conference, uh, spoke through, like, a public speaker application form, which is cool. So we got to see what was on a lot of people's minds. Very happy to say that we're, like, moving past the whole "stakeholder management is our key talking point" part of research. I think that discourse has been beaten to death a few times, and we're starting to move into some newer topics. One of the main themes that really came through this year was: is it okay as a researcher to have a point of view, or is my job to be, like, objective?
And I think a lot of people are starting to think that, you know, it's okay for me to come into the room with an opinion based on the evidence that I've gathered. And I think that's great. One of the areas that I'm really hoping to see more conversation about is, like you said, just being more comfortable with being incorrect about things. So it's wonderful that now we're starting to talk about having a point of view and fighting for that point of view based on our interpretations of what we're seeing in the field.
Um, but even if you just think about how much growth the research space has had in a really short amount of time, from really being these old-school usability labs (lab coats, white drywall, and a two-way mirror), like, you know, that was the reality for quite a while, to something that's a lot more sort of embedded in people's lives, with these ethnographic approaches, um, spending time visiting people where they work and live and all that kind of stuff. So even in just a short amount of time, we've seen the field evolve a lot. And as new platforms emerge, right, new interfaces come out, it's going to require new methods, it's going to require new thinking. Um, and I think it's important for us to challenge ourselves, you know, as we try these things out, to take a point of view with limited information.
And try some stuff and see if it fails, like, and write about it and tell other people what you think. And, you know, maybe find out that you didn't know all that you thought you did, and that your point of view is wrong. But that's great. That's a learning opportunity for everyone.
Do you see failure, in the UX research sense, as: I went in with an opinion, and we learned that that opinion was wrong, and so now we're smarter based on the feedback we learned? Or are you suggesting more of, like: we tried this new technique, or we came up with a new method to try to get some insights, and we didn't learn anything because it didn't work? Or is it both? I guess I'm just trying to kind of double down on what failure means in this context.
It's a great question. I think it's both. But I mean, I think the way, at least the way a lot of the startup world talks about failure, is, let's say, borderline unhealthy. Um, you know, like, strive to burn yourself out by working 800 hours a week, and it's not working out.
And then that's okay, and that's what failure looks like. I would even just pivot it to saying it's kind of like: it's okay to be wrong. Um, so whether that's, you know, maybe you were lacking some context that you weren't aware of, and when you go into a meeting arguing for a point of view, someone else in the room brings up some work that they did that you might not have known about, and that changes your point of view.
I think it's still good to go into that room arguing for what you think. Um, and the same thing on the method side. Uh, all these methods were designed by fallible human beings, and it's sometimes easy to get swept up in, you know, "this is the right way of doing things." And at some point it became the right way of doing things, perhaps, but I'm sure before that there were many iterations of it that were less right, let's say.
I remember one thing for me, and perhaps other people have had a different experience, but learning to become a researcher in Toronto, there seemed to be a consensus that in order to do an interview, you needed, like, a pretty thorough script, and you had to stick to that script in order for it to be, you know, good research.
And I think that has kind of changed in the last few years where we realized like, ah, you know, qualitative research requires a lot of interpretation anyway, and building trust is really important. And it's really hard to build trust if you're reading from a piece of paper in front of a, uh, you know, a research participant.
So, you know, these things changed. And I think a lot of the time there are a lot of folks in the research community who feel a little timid, right? Because we know how much we don't know, and that can sometimes be a blocker to us trying to move something forward or take a point of view, because we don't have as much information as we'd like. But I think it's okay.
Like, we're a supportive community. People are gonna say you're wrong about things, hopefully in a polite way, and then, like I said, everybody learns. That's right.
It's interesting to me, right, you're talking about how researchers tend to, and, you know, not to put everyone in the same group, but you're in it to learn. And so there's this, um, humility to that, this kind of starting-off point of "I'm just going to assume I don't know everything." Um, but you're saying, in addition to that, why not start with an opinion, even if you're going to be wrong later? Um, and it seems to me that the value of that is, uh, not only can it be useful to have a sort of straw man out there for everyone to react to and debate and try to validate or not, but to figure out your own thinking, right, and to kind of lay your assumptions bare, as opposed to keeping them quiet, where they can maybe be more dangerous. Um, have you found that to be the case?
I mean, I think that's totally spot on. It's interesting, I was having a conversation at work the other day. We're building a new product, and it's super hard, and there's a tremendous amount of uncertainty.
And we're having this conversation with the engineering team, and they're like, well, we don't want to be idle. Like, there's things that, you know, we want to be building. How much certainty do you need in order for us to build this out? And there's a whole bunch of factors that play into this, and I said, basically, like, we don't need a hundred percent certainty about what we're making. We don't even need 80% certainty. I'm saying I'm not passing any requirements onto you unless I have, like, 15% certainty, right? 20% certainty. I just need to know what we're even doing. And sometimes that's hard when you're making something new.
And so I think it's not about starting off with an opinion based on nothing. A lot of times, even just our own experiences are good starting points, right? There's information there. There's evidence there. Now, it's not the most sound evidence in the world. It's not like you designed an experiment and ran it and that's your starting point.
But again, I think that's okay, to start with 10% of the information and put a claim out there, again, welcoming criticism, welcoming different points of view, even something as simple as "Well, that worked in this context, but there's a whole bunch of other contexts where this won't work, and here's why." I think that's a really healthy thing.
Um, and even at our conference, I know there were times that people did not necessarily agree with what was said, right? And dialogue came out of that, and that's a really healthy thing. Uh, you know, either you end up with more informed reasons for why you think what you think, right, because you found those arguments not very persuasive, but now you have a better understanding of what the other side of the coin is; or you changed your mind because you found them very persuasive, and now you're open to new ideas. Like, there's no loss there. But we just have to get comfortable with being incorrect, being wrong.
And I think a lot of the time we like being right, because we spend all of our time digging into evidence and coming up with points of view based on a lot of information that we spend time gathering and whatnot. But sometimes it's just okay to be wrong.
Yeah, you're sort of one-upping the old "get comfortable being uncomfortable" mantra. It's like, no, be comfortable being wrong, you know?
Yeah. Okay, uncomfortable, fine, I can do that, as long as I'm not wrong. God forbid.
So yeah, I like that kind of one-up. Like, an undercurrent of all this, right, is, um, humans are not good at imagining the fact that some of our current assumptions, things we take as fact, might turn out to be incorrect. Right? Um, which is weird, because if you think back just a few years, and you can do it in, like, health or nutrition or whatever, it's like: can you believe that we thought it was a good idea to put BPA in water bottles, or that people used to think smoking was healthy, or whatever?
It seems so obvious in hindsight. Um, but there's definitely things that we do right now that we don't know are bad, right? And two years from now, five years from now, we'll look back and be like, man, I can't believe in 2019 we thought it was okay to blank. Um, and what you were just saying with, like, methods and the evolution of having a defined script versus being a little bit more improv in some of the sessions, and how that's evolved in a pretty short span. It's like, there are things we're doing today within the research world that a few years from now we're going to look at and say, like, oh man, I can't believe we used to do it that way.
And what are those things? And so to find them, I think you're going to have to challenge and get some stuff wrong and push on the edges. I don't know if that makes sense, but it's just something humans aren't good at.
Yeah, I couldn't agree more. I mean, it's funny with the script thing, 'cause this was earlier in my research career, but it was like a hill I was willing to die on.
I was pretty sure that scripts were bad for research. I mean, obviously there are contexts in which that's not true, so I'll offer that caveat. Um, but I remember going in as a participant for a research project. These people had a script; they were building this new website for this program that I was a part of.
I had joined the program as a startup, and they were going through their script and asking me questions after, you know, they showed me sort of this prototype and whatnot. And we came to the end of the session, and I couldn't help but feel like none of the things that I thought were important were even talked about.
Right? Like, all those questions that you asked, and maybe you had reasons for asking them, but the things that really mattered to me, and that I thought were useful information to have for what they were building, never entered the conversation. And so at the end of it I just said, okay, cool, but I like this program and I want to see you guys be successful, so here's, like, a five-minute rant of what I actually care about and think. And I think, I don't know, I could be wrong here, but I think that was more useful than the scripted questions that they had. Right? Because every question you have, you put it on the agenda because you have a point of view that it's important, right? And a lot of the time in research, we don't know yet what's important, and that's why we do this qualitative, contextual inquiry, whatever you want to call it. Um, so yeah, that was one thing. And I don't know if this'll end up being something that I'm right about in the long term, or, again, in some scenarios yes, in some scenarios no, but I just remember at the time being like, I'm pretty confident about this. But who am I?
Yeah, I guess there's the meta piece of: you could be wrong about being wrong, I suppose, right? If you want to get really loopy.
Yeah, totally. I mean, I could be wrong about this point of view that I have, but for sure I'm happy to be wrong, right?
Like, you know, maybe there's a natural progression of knowledge and practice that is more thoughtful, and that's the way we should be doing it, and maybe that's the consensus, and I'm open to it. But I really think that we can do better. I think we can do more, um, just by getting comfortable with being incorrect about things that we think.
To that point, if you're trying something really new, whether it's a method or just, like, an idea about what's going on with your customers or whatever, uh, with the idea that, you know, you're totally self-actualized, right, and "I know I might be wrong about this": in terms of, tactically, those all-important stakeholders we need buy-in from, right? Um, when you think about the risk of being wrong, uh, you might bruise your ego, you might look bad, you might suffer politically in your organization, all these horrible things that could happen. Are you framing these new things that you want to try as experiments, as "Hey, I might be wrong here, but hear me out," or are you kind of quietly doing this wacky stuff on the side? What's your approach to getting some of this stuff that might be wrong out there and remaining effective?
I mean, I think transparency is totally cool about this stuff. I think it's okay saying, like, I'm really not super confident about this, but here's what I think.
And here's how I want to move forward, and here's why we should do this. Um, I'm now in a product role, and I've spent a lot of time in research, doing research in startups with, like, product people who have not figured it out, let's say. Um, but one of the biggest challenges they have is they think they've figured it out. They think they understand it. They think that they have more knowledge than they do, and it's often what causes a lot of really poor choices. Um, I worked in one company where none of the product managers spoke to one of their target users for, like, a year. Um, and they had, like, months and months of roadmaps: here's what feature we're going to build, then the next feature, then the next feature.
And it doesn't acknowledge any of the uncertainty, any of the limitations to what human beings can know when you're doing something new. So from my point of view, it's saying: here's what I think, and here's what I want to do. And yeah, this could be wrong, but this is how we learn.
This is how we get better as an organization, as a product team, as a group, as a whatever. And if you get a lot of pushback on that, it's probably not the healthiest place in the world to work, right? Because if everybody around you thinks they already know everything, as a researcher, what are you even doing there?
Right, they already have all the answers they want. What value are you bringing to the table?
Yeah, for sure. It's a good question. "Right now, not much," it seems. This whole thing feels like an area where there's a lot of lessons to learn from other creative disciplines. What comes to mind, right, is, like, exercises you see people do, whether it's writing, and you give people a list of nouns that they have to use, or photography, and you make them do different weird techniques as an exercise, or painting, and you're using different materials. Like, it feels like in other creative areas, part of the learning process is, you know, intentionally trying to force you to do things in nonstandard ways and, um, just, like, force yourself to create output and try new techniques and see what happens. Because, you know, they're disciplines that really value novelty and new approaches and new perspectives. And, you know, not all of those things are going to apply to user research, but it does feel like there's probably exercises or things that you could structure just to almost force people to be creative, or force people to try to do things in a novel way.
Um, and maybe that helps people, you know, get comfortable with it. I don't know if that's, you know, early in careers and you make it part of the training, or it's something teams do as exercises periodically, um, mixed in with their more traditional approaches that take care of the day-to-day work.
Um, but I don't know if that makes sense, but it feels like there's lessons to learn from other areas here, where they really value this type of stuff.
Yeah, totally. I mean, um, do you guys do retros at User Interviews? Like, every week or every other week or so?
We do. Yeah, we do. Um, we do a full-team one every month, which is really cool, so people who are not on the product development side of things get to say their piece just about the company at large. And then we do one every two weeks within the product and design kind of trio.
And do you guys use, like, columns, like happy, sad, mad, or start, stop, continue, and you put stickies up and all that stuff?
We're fully remote, so we do the virtual sticky sort of thing. Um, we collect responses, uh, through a Google form, then cluster some responses into areas and let people kind of speak, say their piece, and call out trends and dive into areas. So, um, similar, but just different tools.
Yeah. I mean, I think retros are a really interesting sort of case study, um, for researchers and for people in general about the importance of trying new ways of doing the same thing. So, you know, for me, in my role, there's a few people that I have regular one-on-ones with, um, on the product team.
And, you know, we'll talk about, like, what are the challenges that are going on, what's going well, what's not, all this kind of stuff. And it's this one-on-one conversation; we have a great relationship, it's all good. Yet sometimes it takes, like, a retro, where we're saying, okay, we want to have a column of things that we're sad about and a column of things that we're mad about, to actually have people feel comfortable enough to write these things down and talk about them.
And it's the same thing. Like, there's nothing different, conceptually, between it being a one-on-one versus it being a retro-style thing. It's just a different way of approaching it. But the information that it produces, and the benefit to our team, is, like, super high when we take the time to have this different kind of structure to the approach: we're going to talk about things we're happy about, upset about, and, uh, mad about, and we're going to take 10 minutes and write these things down, and then we're going to talk through them all, one at a time, as a group.
What do you think about that?
Yeah, I think it's one of the best meetings we have, to be honest. And it started as a whole-company thing, largely just because there were only six of us and it felt weird to exclude two people from the, you know, product and dev one. So we just did it company-wide, and, um, a lot of great changes, honestly, in how we all interact and how the team communicates, in a bunch of other areas, have come from those comments.
Um, so it's been super cool. We've even had a retro of the retro, which is one of my favorite meta moments.
Yeah, we're very good at meta. And we've mixed up the retro format, too.
That's awesome. But I mean, even the principle of it, or just the way, you know, you can get feedback in a one-on-one setting.
And it's not as comprehensive, and often not as insightful, as doing one of these retros where you have this kind of structure to it. And what does that teach us about research methods? Right now we're so focused, a lot of the time, on user interviews, right? Um, on a lot of these sort of more standard ways of gathering information.
Um, even at the conference, one of our speakers was talking about these plotting exercises that she's done with, um, attendees, or sorry, participants, and how she's used them in order to get a better understanding of how people's lives look. Right? It's the same sort of thing: it's still two people in a room, but instead of someone typing on a computer, furiously taking notes, or, like, you know, whatever it is, they're at a whiteboard together, and they're producing an artifact together, and it makes it easier sometimes to talk about some of the nuances of your experience.
That's really cool.
Do you have, um, things within user research right now that you think we're doing wrong, or that we're wrong on? Do you have, like, a hit list of areas where you might have a nonstandard point of view?
The script one was a big one for me in the past. Um, maybe I'll tell you something that I thought I was right about, but was definitely super wrong about. This was earlier in my career, but I was pretty convinced that the only method that really mattered was user interviews.
Right? Like, surveys were stupid, you know, they were poorly designed and poorly used and whatnot. Um, a lot of these, like, testing tools with people you didn't know were bad, because you didn't have the historical context. I was pretty convinced that literally the only thing that mattered was getting in a room and talking to somebody. I've since come around to saying, like, that is wrong. That is not a correct point of view. However, what I learned from that was to put more emphasis and value on understanding people's context before you talk to them. Like, it didn't make sense to me that you would ship out a product or prototype or whatever it is to somebody when you had no idea who they were.
I had no idea what their familiarity with technology was, whether they understood what you were doing, whatever, to get feedback. That made no sense to me. Right? And there are some very financially successful businesses that do this. Right? And it's a very common practice to design something or build something and ship it out to a bunch of strangers that you don't know anything about.
And that seemed crazy to me. So I think that part I still feel right about. But, you know, surveys are valuable tools. They're just often used very poorly, right? And my point of view that they're bad tools was definitely wrong. Definitely wrong.
Yeah, I think there's an important piece to this, um, if you're willing to be wrong.
And so you have this initial opinion, and then, you know, sometime later you have a different point of view on that same topic. If you don't take the time to explain to people what information and what data helped you change your mind and evolve your perspective, it does kind of come across a little bit as, like, a flip-floppy type thing of: last month this guy was talking about this thing, and now he's talking about this, and, like, he's just so inconsistent. Um, whereas if you take the time to at least take people on that journey of "I thought this because of these assumptions, I learned these other things, and now I believe this,"
I think you actually come across as much more credible, and people want to, you know, work with people like that, who are open to changing their minds when they have new evidence. But if you don't share the evidence, or share how that evolution happened, um, I think people can have a pretty different reaction to the fact that you've changed your mind, if that makes sense.
Yeah, I couldn't agree more. I think the whole "I don't want to be wrong as a researcher because I might lose credibility with my colleagues" thing, like, I don't buy that for one second, right? Your job is to figure out really mysterious, enigmatic, hard problems, hard questions. Um, like, if you're batting a hundred percent on all your points of view, you're not trying hard enough, right?
You're working on stuff that's too easy. Um, you know, and qualitative research is really well suited to solving these really hard, messy, wicked problems. And if you're not struggling with any of these things, and you're just able to figure it all out that fast, like, you know, are you really adding that much value?
Are you really making a dent? I don't think so. And taking people along that journey helps them, like you were saying, it helps them understand how to think, which is even more important than what the actual results of your study are. Right? You know, even before we do any research,
we should try and have a point of view of what kind of information would help us make a decision, or help us figure out what the next step is. And sometimes it's totally fine to say, look, this is just such a new project, we don't even know; we just need to talk to people in this space and listen to what we hear.
But oftentimes you actually can say: you know what we want to know? Is this onboarding experience consistent and clear and easy to understand? Right, so there's an evaluative part about the onboarding experience. We want to make sure that we understand this person's historical experience with these kinds of products before they even see it.
Right? You know, you can pull those pieces out, and you walk them through how you're building a framework to evaluate or test or whatever it is that you're working on. And when it comes back and, you know, you're wrong about X or you're wrong about Y, then it's very easy to understand how, again, you've created a lot of value.
You've improved everyone's understanding of what they're working on, and now you just get to make amends and take the next step and build that knowledge base. Like, I don't think there's any chance that you can lose credibility with your colleagues by just doing your job honestly and intentionally and trying to push the boundaries.
Agreed. Thanks for listening to Awkward Silences, brought to you by User Interviews. Theme music by Fragile Gang. Editing and sound production by Carrie Boyd.
Senior Content Creator
Carrie Boyd is a Content Creator at User Interviews. She loves writing, traveling, and learning new things. You can typically find her hunched over her computer with a cup of coffee the size of her face.