
Being Data-Driven vs. Data-Informed with Hannah Shamji, Head of Research at Copyhackers

How to let data inform your decisions instead of drive right through them.

Carrie Boyd

There's a lot of data out there. Keeping track of Google Analytics, NPS scores, site metrics, usability test results, industry data, and everything else can be downright overwhelming, which is why Hannah Shamji, Head of Research at Copyhackers, likes to say she's doing data-informed work, not data-driven work.

For Hannah, her team, and her clients, working with tons of data can be overwhelming. Since you can usually find at least one graph to support a research point, it's important to put data in context. Hannah outlined how she gets in the zone with large amounts of data, puts things in context while doing her best to stay unbiased, and frames data around her research questions.

Highlights

[2:12] The difference between being data-informed and data-driven.

[6:21] Why it's important to put data in context and pull from many different sources.

[9:25] How Hannah approaches data through the lens of her research question.

[16:40] How Hannah tries to build data narratives that tell both sides of the story.

[23:21] Digging deep into data is a little bit like meditating.

[27:07] Hannah, Erin, and JH chat about data and COVID-19. (This episode was recorded on April 24, 2020.)


About our Guest

Hannah Shamji is the Head of Research at Copyhackers, where she helps clients create great, data-informed copy and marketing strategies. She blends qualitative and quantitative research to tell client stories.

Transcript

Erin: [00:00:41] Hello everyone, and welcome back to Awkward Silences. We're here today with Hannah Shamji. She is the Head of Research at Copyhackers, and she's also a podcast host of her own. Today we're excited to talk about being data-informed versus data-driven.

So, how to really make data useful to you and not let it drive the ship, so to speak. We're excited that you're here to talk about how qualitative and quantitative data can work for you to make better decisions. So thank you so much for joining us.

Hannah: [00:01:16] Yeah, thanks for having me. I'm really excited, and I'm particularly excited about this topic. It's a little loosey-goosey, which I kind of like diving into, but I'm glad we decided on it. And thanks for having me.

Erin: [00:01:30] Yeah thanks for joining. We've got JH here too.

JH: [00:01:32] Yeah, excited to learn how to not get bullied by data, not let it push me around.

Hannah: [00:01:39] That's awesome.

Erin: [00:01:40] Yeah. Fantastic. So to that point, right, not being bullied by data, let's start from the premise of this episode: what's wrong with being data-driven? We hear about being data-driven all the time. And here at User Interviews, we take a broad interpretation of data. A lot of times folks equate data with quantitative data; for our part, we see data as being both quantitative and qualitative. But in any case, what's wrong with being data-driven, or with letting data drive you?

Hannah: [00:02:12] Yeah, I think it's such a nuance, right? We can use that term so freely, and it's definitely a buzzword, especially now with more and more big data. Who wouldn't want to be data-driven, or have data they're leaning on to make decisions? I don't know that there's something necessarily wrong with being data-driven, but I do think there's an important difference between data-driven and data-informed.

And it's really helpful to know where you are on that spectrum. The trouble with being data-driven, without necessarily knowing what that means, is that you are looking at data and using exactly what the data says to drive a decision. So essentially, there's not really a middle interpretation.

It's, hey, we're looking at the data, and as a result of the data we're doing X, Y, and Z. Whereas data-informed is typical, and more common, with multiple data points, especially when the data doesn't have a clear trajectory, when it's not super linear or telling on its own, or when different data sources are saying different things.

And then you need to figure out how to stitch them together. That may or may not be through some sort of statistical or predictive model, or it may just be you using your data sense and seeing how things fit. But being informed is more of: you're not driven directly by the data.

You know that there's a space between the data and you making some hypothesis or assumption and then taking a particular action. So I would say that in a lot of cases, people are not directly driving from the data; they are being informed by the data. And it's just really helpful to know where you are on the spectrum, so that you can have the proper feedback loop.

If you're being data-informed, and let's say you perform some kind of marketing or write some kind of copy, which is a lot of the work we do at Copyhackers, and it doesn't get the results that you want, well, then you have some room to go back and address the hypothesis.

Whereas data-driven is a much more narrow mindset, at least in the space I'm specifically referring to, which is, again, that marketing and copywriting space. So it's not wrong. It's just helpful to know where you sit on the fence, so you know where that line is, if that makes sense.

Erin: [00:04:49] Yeah. And you mentioned, sort of, the seven signs you might be data-driven, right? We could do a quiz. But one of those being: you really only looked at one source, and that source is probably quantitative. And one of the things you can do with one source and one chart is make it say whatever you want.

If you're looking at one thing, or you pick the one that validates your hypothesis, and often you're not doing this on purpose, but maybe you want to get to a decision quickly, then maybe it's pretty easy to find the existing data that can help you do that.

And then, yeah, then what happens? If that data didn't take you where you thought it was going to go, what are you left with?

Hannah: [00:05:35] Right. And we see this so much with the pandemic data too, right? There's a lot of misinterpretation, or massaging of graphs that will say one thing, and then someone else will come back and say, hey, this was actually misleading. There's just so much room to maximize the data that you want to highlight and minimize other data, especially when you're playing with one particular source.

So it's just a very fine line, and it's helpful to know where you sit on that, and what you're looking at in terms of sources, if that makes sense.

JH: [00:06:12] It's almost like when you say that or like the trend you seem to be observing or highlighting is when people are data-driven, it's almost like reductionist where they're finding one data point or one source. And then using that to move forward. Whereas I think what you're saying is be a little bit more open to the kind of like a collage of there's a variety of sources and you want to be well-informed across all of them to move forward.

Is that, is that like a fair summary?

Hannah: [00:06:36] Yeah, absolutely. And it depends, too, on what your goal is. In the case where, say, at Copyhackers we're evaluating the performance of an email campaign, well, then we're specifically looking at performance data. But when you're doing something that requires multiple inputs, and there are user tests and user interviews and surveys and polls and quantitative data and Google Analytics and all of those pieces, absolutely, you do need more of this curious lens, figuring out where and how the pieces fit together, versus that one-track source. So for sure, yeah, that was right on the money.

JH: [00:07:21] Where does the term data-driven even come from? I feel like I've heard it a million times, and I just say it and know it, but I don't have any idea how it came to be the phrase.

Hannah: [00:07:28] Yeah, I don't know. My background prior to marketing was in public health, and data was big then, but evidence-based was the term that was really big. That has a more academic feel to it. I think data-driven has just come out of a lot of this big data, and access to mass amounts of data through social media.

And it just feels like a really catchy thing, right? It's great to be able to make a point that you can lean on data with, because suddenly you're not making the point, you have all of these numbers, and who can argue with numbers? So, yeah, I don't know the exact roots, but it's definitely caught on like wildfire, for sure.

Erin: [00:08:13] Yeah, and maybe we could use some examples, but I think there's a sequencing piece to this too, right? When you think about your human judgment's relationship to data, to users, to decisions, what should come first, and how do those things fit together? So for instance, you can imagine a company dashboard full of charts with your most important KPIs, right? And you want to monitor those: are they going up and to the right? Are they doing what you want them to do? And if they're not, of course, now we want to understand why.

And I think the data-driven mentality maybe doesn't dig into the why enough, to understand what the data's really telling us, and instead takes a kind of reactive approach. Something like that. So what's the sequence, in terms of how we become more data-informed, letting that data work for us, versus reacting to it without understanding why the data is moving in the direction that it is?

Hannah: [00:09:24] Yeah, I think the way I try to approach it is always with these pillar research questions. When you have data that isn't performing the way you want, or doesn't look the way you want it to, it's absolutely common and normal to be reactionary, and to jump to trying to find that band-aid solution, or just fix the immediate problem.

Or, in some cases, to look to the data that's easiest to access, what's at your fingertips, when that may or may not be very helpful for the bigger problem you're trying to solve. So the way I typically approach it, and the way we do it at the agency, is always with those research questions.

So whatever the problem is, how does it manifest in terms of the question that you want to answer? Taking a step back from what data do we actually have access to, what data are we looking at, what data should we look at, that precursor is always: well, what's the problem?

What's the question that we need to answer, the gap? Because often, and this happened recently with a client of ours, they wanted to identify the why behind a change in their sales trajectory. And it's easy to feel a sense that we're being comprehensive if we go and do all of the data stops, if we do research from the ground up, interviews and surveys and post-purchase surveys and polls, the full gamut. But is that really effective? Is that really answering the question we want answered?

So I think it requires really getting clear on what that is. We might think research questions are these general ones, like, well, why do people buy, in the case of marketing, or what makes them convert? And yes, those are the bigger, ultimate, golden questions every company wants answered: who is my target audience, why are people making the decisions they do, to buy or not buy? But there really are a lot more granular questions when you sit down and look at the data and try to figure out: what do I need to know to be able to change those numbers?

So typically that's where we start. Then often that question has sub-questions, and you start to fill in, well, what do we actually already know, so I can dive a little deeper there. And then that will start to spell out what data sources you need. So the data sources end up being step two or three on that bigger spectrum. Do you know what I mean?

JH: [00:12:08] Hearing you explain this, and you mentioned it a little earlier, it used to be referred to more as evidence, and that kind of feels academic, or almost even from the legal field. But when you just gave that example, it almost does feel like that's better terminology.

Evidence-based is: we want to answer this question, and what is the supporting evidence that can inform us? Whereas, like Erin was saying at the very beginning, data just kind of feels like numbers, and it doesn't even necessarily imply that it's relevant to what you're talking about or what you're looking at.

And I really wonder how we got off on the terminology, because evidence feels cleaner to me, having just listened to what you shared.

Hannah: [00:12:47] Yeah, it's tricky, because evidence-based can also have a similar leaning to data-driven, at least the way it was used in the academic space. It's like, I'm not going to question X because it's evidence-based, as opposed to evidence-informed. But yeah, data does have, like you both said, that quant meaning. And then research has a very different spin to it too, right? At least with our clients, some are all for it, and others aren't really clear on what it is. It's this kind of nebulous, what exactly are they doing? So evidence does have a bit more sturdiness to it, I would say. But it does beg the question.

I wonder if there's room to play with coming up with some framing that's inclusive, and not intimidating, but also clear. Because the difference between driven and informed is something I wouldn't necessarily bump into in my first little while in the research space, you know? It's a nuance that you pick up on after having to navigate and explain the terms, or the field.

So yeah, it would be cool to have a better term that's more accessible and more effective, for sure.

JH: [00:14:06] Within Copyhackers, do you refer to it as being data-informed? Is that terminology that's common within the team?

Hannah: [00:14:16] No. The way the distinction between data-informed and data-driven is seen in the team is: research, and that's all-encompassing, including data, with data specifically referring to the quantitative side. And fortunately for me, I have room to play and see how clients respond to the different framings that we offer, so I can experiment with how we position it.

We definitely get different reactions, and so it's been a bit of experimenting with understanding how clients view it, and also being really careful. Often, when we present data to validate a particular hypothesis, we'll present kind of a data narrative.

So it will include multiple assets, our interpretation, or actually the summarized findings, and then how the qualitative mixes in with the quantitative. And what's interesting is, of course, when we share it with them, they tend to gravitate towards their own bias, right? Some of them won't let go of the qualitative, even if it contradicts the performance data, and others will be really, really stoked about the quant, even when the qualitative data is saying something different.

So it's our job then to make sure that we add in those caveats, which we do of course anyway. But it's interesting to really focus on: as much as it is the data, it's the story that you tell and how you tell it. Even if, as researchers, we bring the curious, data-informed mind rather than the data-driven one, can we make sure that's also translated to the client, or whoever's using the data, to keep that consistent mindset with respect to the way they're using it?

So yeah, it's definitely not a linear answer, and it's a work in progress, but it's cool to be able to try things out, see how different people respond to data depending on their needs, and figure out the right way to position it. I'm still trying to work out what that sweet spot is, for sure.

JH: [00:16:34] Yeah, totally. When you try the different positionings and present it more as a narrative, you mentioned people still have their implicit bias of favoring one or the other, but does it make them a little less entrenched? Do they become a little more open to considering the other side than if it was presented in a different framing?

Hannah: [00:16:51] Yeah, I think so. An example: we presented some data to a client, and I believe it was Google Analytics. We had done an audit, and we were looking at visitors versus customers, and the breakdowns and comparisons and differences there.

And the data was interesting. It did tell a particular story. But once you paired it with the qualitative, you realized you couldn't just look at the quantitative, and this tends to happen a lot, right? One slice of data says something, and then you see, oh wow, it's actually part of this giant pie. Had I only looked at that, I would have made a wrong hypothesis.

What we found is that when we communicated that, they got really excited about the clarity of the quant data, because it wasn't as fuzzy as qualitative data can sometimes feel. In qualitative data, you have to create the buckets.

Whereas with quant, it's like, oh, numbers are nice and proper and buttoned up; there's less argument there. And so for us, it was a matter of explaining this bigger picture to them, and where and how this fit. What we've come to find is that it's about explaining both sides.

Like, hey, this is a really cool finding, so they can feel interested and change their frame and perspective on research, especially the ones who are less keen on it. But also adding that disclaimer, that warning: if we only lean on this, it's a mistake, and here's why, and here are the other elements.

So it's become this kind of coaching through the data, which is fun, and there's a lot of on-the-spot learning, for sure. But yeah, it's definitely an evolving relationship that clients have with research and with data, and where and how they see that value and how it fits, you know?

Erin: [00:19:09] I guess, you know, my question for you is just around. You know, when you choose what stories to tell with whatever data set you're looking at, you know, how do you make sure you're telling good stories, accurate stories, truthful stories. and I guess he's the language of our episode, right?

Like data informed versus data driven stories. You know, part of it is looking at multiple data sources as you talked about, but how else do you make sure those. Stories are meeting the goals that a good data story should meet.

Hannah: [00:19:45] Yeah, it's an important question. One of the projects we work on that's very research-heavy is value proposition development for a client or business, and in that avenue, I typically have a flow for the way I walk through assets.

At each stage, I know what types of sources I'm looking at, and almost in what order. So while we have these overarching questions, there are specific pieces of data, and types of answers, that I want from each particular source. For example, it'll start with a Google Analytics audit.

Our Google Analytics expert will take a first pass and try to keep from going down rabbit holes. Whether it's acquisition channels, or exploring keywords or search terms, that'll all come in the second pass. We really want to keep the quantitative data as more of an initial cut that we can then use to layer in the qualitative. So initially, the buckets and categories that I'll use to code qualitative data will be more general, and if I want to drill down further, that will typically happen in the second pass.

So I start at the surface level and see what I find. I notice what piques my interest, but I don't pursue any of those rabbit holes, across any of the data, until I do a second pass.

Another way that I stitch all of the data together is by putting everything into one giant, and typically quite large, spreadsheet. We use Google Sheets, and we try to code data as consistently as possible. So if I'm doing user tests, and I'm also doing surveys and interviews and polls, for example, what are some ways I can identify overlapping themes? Even though they might be data from different points in the customer or user journey, I may be able to start pulling out themes or topics that overlap, or tones, et cetera.

So the way we arrive at a story that's data-informed is really these multiple passes: you're noticing your curiosity, but you're not following it until you've reviewed all of the assets. Then you can keep iterating, going a little deeper, and exploring what you're finding.

It's very iterative. And the more immersed you are in the data, at least for me, the better. For example, if I'm doing interviews, I'll try to do all of them, anywhere from eight to 12, let's say, for a particular client, over the span of two days. Or if I'm coding a bunch of qualitative data, I'll try to do it over the span of a couple of days. I think you build that muscle a little bit, to figure out how to navigate and what the story is, the more in the data you are. So those are some of the techniques I use to dive in and tease out a story that I can present to the client.
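The single-spreadsheet workflow Hannah describes, coding excerpts from different sources against one shared set of theme labels and then scanning for themes that recur across sources, can be sketched in a few lines. Everything here (the sources, excerpts, and theme names) is invented for illustration; it's not Copyhackers' actual tooling.

```python
from collections import defaultdict

# Each tuple mimics one row of the shared coding sheet:
# (source, excerpt, theme). Themes come from one shared codebook,
# so the same label means the same thing across all sources.
coded_rows = [
    ("interview", "I wasn't sure what the monthly cost would be", "price"),
    ("survey",    "Too expensive compared to alternatives",       "price"),
    ("user_test", "Couldn't find the pricing page",               "navigation"),
    ("poll",      "Signed up because a colleague recommended it", "social_proof"),
    ("interview", "My manager had to approve the budget",         "price"),
]

# Group the sources each theme appears in.
sources_by_theme = defaultdict(set)
for source, _excerpt, theme in coded_rows:
    sources_by_theme[theme].add(source)

# Themes seen in more than one source are candidates for the
# cross-source story; single-source themes wait for a second pass.
overlapping = {t: s for t, s in sources_by_theme.items() if len(s) > 1}
print(overlapping)  # "price" surfaces in both interviews and surveys
```

The point of the sketch is the shared codebook: if each source were coded with its own ad-hoc labels, the overlap step would be impossible.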

Erin: [00:23:21] The language you're using is so reminiscent of meditation to me. When a distraction comes into your head, you notice it and then gently push it aside. And you're also talking about this focus, of being in the data, being in the meditation zone, whatever it is.

So I just thought that was interesting, as a way to be present, to be engaged, to notice things, to try to garner some insight without getting overwhelmed and going down a wormhole. So maybe also in this quiz of how data-driven versus data-informed you are: if you're data-driven, you're more likely to be overwhelmed by the data, and to let it push you, versus asking what the data can do for you. And obviously, in the world of big data, there's infinite data available to us, so it can very quickly distract us. Or: that's interesting, but does it have anything to do with what I'm trying to learn right now, or with what I can action right now?

So I think that's a really good tip: notice it and push it aside. And that other benefit, and we've seen a ton of benefit from doing this, of having all of your interviews in a condensed period of time, so you absorb and internalize the trends, in addition to documenting and coding them.

Hannah: [00:24:50] Yeah, it's funny you mention the meditative piece. I'm pretty sure if my husband described what he saw, it would just be crazy eyes on the computer, this ridiculous do-not-disturb face and zone. Because it is a bit of that labor of coding and reading and coding.

And then your category of, say, price maybe gets further chiseled down into, oh, well, there's a difference between price and cost and budget. Those nuances emerge as you go through the data. But yeah, it's funny, maybe it sounds a lot more romantic than it is in the moment. You're really just eyeballs-deep in trying to tease apart all of these differences.

Erin: [00:25:44] Yeah. Let's see, what else did we want to cover?

JH: [00:25:53] I mean, should we do a quick detour? There's a lot of data in the news, and you have a background in public health.

Erin: [00:25:58] Yeah, let's do it. So one of the things, Hannah, that you were talking about before, and we joke about this internally: I lead a growth team, so what we're trying to do is produce exponential outcomes. Obviously in 2020, today is, what is it, April 24th or something, exponential growth is something the whole world is thinking about in a negative sense.

And internally we're always sharing the different graphs, and things get real mathy, probably more math than the general person tends to think about on a day-to-day basis. Logarithmic scales are suddenly appearing on the nightly news, and so on.

So anyway, in that environment, there has never been a more important time, A, from a general consumer perspective, but B, for anyone trying to make smart business decisions within that macro environment of this crazy world we now live in. How do you be data-informed? How do you figure out what's true, what's signal, what's noise, how to make decisions and tell good stories?

Hannah: [00:27:07] Yeah, data's getting a bit of a spotlight right now, for sure. In the beginning, when friends I know in completely unrelated fields were posting graphs, even on Instagram, pointing to things and highlighting them with giant red circles on bar graphs, it could be really, really alarming.

It definitely trickled down. We have a family chat on WhatsApp, with my parents and aunts and uncles sharing a lot, a lot, a lot of forwards, and forwards that are really misinformed. So what we've been communicating to them, and encouraging, is: question your source and validate your source.

Because it can even happen with qualitative too, especially now with what you should and shouldn't do, and the masks, and who's saying what and why they're saying it, and the public health recommendation being slightly different from what an individual might need to consider.

So, yeah, it's just so important to question your source. That might not be something everybody's equipped to do, but find somebody, whether it's a friend or, you know, people that you can actually trust.

Because whether there's data or not, that data is changing a mile a minute. So I would just be really, really wary. More than your data hat, put on your scientist's hat, and question the source before you jump to your hypothesis and conclusion. Ask someone else to fact-check as best you can, because there's just a lot flying around, and it's really easy to spiral up your own panic unnecessarily, or to miss the useful bits of data you could sift out. Yes, it is an alarming situation, and it is really important to take certain measures, and there's a lot of seriousness around it.

JH: [00:29:22] it's also, for me, it's been a really good reminder of like how important trends are like snapshots are useful. Right. And great. But like it's hard without the context of, you know, there was 2000 deaths yesterday or whatever, which is obviously horrible in a really large number. But it's really important to know.

Well, like did the day before have 2,500 or 1500 the day before and like, so I think like every day you hear new numbers and everything's flying around, but if you can zoom out a tiny bit and get some of the trends, which I think people are starting, it seems like there's more and more of that in the coverage now.

But, I think that applies to business as well. Like if you just take it one snapshot out of context, it's hard to know what it means. 
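The snapshot-versus-trend point JH is making is essentially a rolling average: read each day's number against the days around it, not in isolation. A minimal sketch, using made-up daily counts rather than any real case data:

```python
def rolling_mean(values, window=7):
    """Average each value with the values immediately before it,
    so a single day's spike or dip is read against its recent trend."""
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# Hypothetical daily counts: the last snapshot (2000) looks like an
# improvement over yesterday's 2500, but the smoothed trend is still up.
daily = [1200, 1500, 1400, 1800, 2100, 2500, 2000]
trend = rolling_mean(daily, window=3)
print(trend[-1])  # 2200.0, the highest point in the smoothed series
```

The same idea applies to the business case: a dip in one day's sales means little until it's compared against the smoothed series around it.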

Hannah: [00:29:57] Yeah. And there's a big chunk missing too, right? Like how many people have it and aren't tested, which is a really critical reference point to add to that data. And on the business side, I know with our clients, they definitely want an explanation for the change they're seeing in sales, whether good or bad.

But there are so many other factors too. Hey, this could be seasonal. Are people at home, and because they're home, do they have more time to research a particular product, or to buy more of that product? Which means we're not necessarily converting more people; we're just making more space where they were already going to convert.

There are just a lot of factors in play to consider when we look at the data, and it's easy to look at the big shiny one, in this case not necessarily shiny, but certainly big. It's really critical to pay attention to all the data points before we react, so we can react from that informed place, staying clear on what data you want, given the goal you set before you even looked at the data. That can be your North Star in the midst of it all. Yeah.

Erin: [00:31:14] I think that's a great place to leave it.


Carrie Boyd

Content Creator

Carrie Boyd is a Content Creator at User Interviews. She loves writing, traveling, and learning new things. You can typically find her hunched over her computer with a cup of coffee the size of her face.
