How research can help improve conversion rates, and why they're not as simple as you think.
Erin: [00:00:45] We are here today with Jon MacDonald. He is the founder and president at The Good, a conversion rate optimization firm. Today we're going to talk about user research and how you can use it to drive website conversion — specifically more conversions and better ones, better rates, and all that great stuff.
It's something I think about a lot on the marketing side, and maybe some of you listening are super familiar with conversion rate optimization and not that familiar with user research, or vice versa. So we'll be talking about the nexus of these two great things. Thank you so much for joining us, Jon.
Jon: [00:01:23] Thanks for having me today.
JH: [00:01:25] We've
Erin: [00:01:26] got JH here too.
JH: [00:01:27] I am here. Yeah. I'm excited. I feel like you must have so many stories of some little thing that was uncovered that led to a huge breakthrough, like in conversion rates and stuff. And I'm excited to dig in and hear some examples.
Erin: [00:01:38] No pressure.
JH: [00:01:40] Just put you on the spot. You've been doing this for a while.
I feel good about that.
Jon: [00:01:43] Yeah, it's interesting, actually, that's where your head goes. I think that's where a lot of brands I speak to on a daily basis go first too: they want the quick win. They want that one thing, or that one app, or one hack that's just going to double their conversion rates overnight.
And a lot of my job, I feel, is being an evangelist for the fact that that just doesn't happen. There is no quick win. It's really all about iteration and compounding growth, and that's really where the sustainable gains are. Especially if you're on Shopify, there are so many Shopify apps pitching "we'll double your conversions overnight." If you search "conversion" in the Shopify app store, the number of apps in there that all promise to do the same thing is just ridiculous. So a lot of brands just don't know where to start.
JH: [00:02:35] I can't believe I fell right into the basic trap of my intro.
Erin: [00:02:39] Perfect.
JH: [00:02:40] But a good one, yeah. I'll play the straight man on this one.
Erin: [00:02:44] I was going to say, we found something easy to fix that made a big difference, but it was one of those really-slow-and-then-all-at-once kind of things — it was fast, but it was also two years of user research in the making. So I'm excited to talk with you about how you can get to know your users more over time and how that compounds.
Awesome. So let's start from the beginning: what is conversion rate optimization? I'm sure some folks listening know a lot about it and others less, but as we start to see product, marketing, growth, and other teams try to break down silos, it's a good concept for everyone to know something about.
So let's start with that. What is conversion rate optimization all about?
Jon: [00:03:31] Most folks, when they think about conversion rate optimization, are thinking about the one metric, and that's the metric of conversions, which means different things to different brands. But it's really more holistic than that. What I mean by that is you really need to be thinking of it as a bunch of different metrics instead — I realize conversion rate is in the name — but it's really about looking at how to get people into the next step of your funnel. And that doesn't always mean the final step where somebody actually turns into revenue. You really want to look at the micro conversions along the way that are great indicators somebody is going to be a customer eventually, and those are the things you might want to influence. Now, obviously you want to make that customer journey as simple as you can, and make the process for conversion as simple as possible. But the reality here is that people are only at your website for two reasons. We've been doing this for 11 years now, and there are only a couple of sites, like Facebook, where this isn't true, because people go there just to hang out — but they're not coming to hang out on your site.
They're looking, one, to do research and understand if your product or service can help solve their pain or their need. And two, once they do that research and determine that it can, they want to convert as quickly and easily as possible. So your goal of getting a visitor to convert into a customer, and that customer's goal of doing their research and converting to solve their pain or need, are really well aligned — and I think most brands don't consider that.
Erin: [00:05:17] You mentioned the funnel ending at revenue, or that being one of the last steps. I'm wondering, do some of these same principles apply to product teams, or to driving ongoing engagement within the product once someone has become a customer?
Jon: [00:05:33] Of course. You really want to understand what the goals are of the consumer and help grease the wheels to make that happen. So it's much more of a consumer- or user-oriented obligation, really, is how I look at it. Our mission at The Good is to remove all of the bad online experiences until only the good ones remain.
So if you're thinking about this, our mission really aligns with consumers more than it does with brands, but it has the outcome of helping consumers accomplish their tasks. So let's think about this for a second in terms of once somebody is already converting in your app — for instance, if you're more of a service or a SaaS product.
What are you going to do for retention? How are you going to get people to accomplish their goals on a daily basis so that they stick around? I think that's another metric a lot of people don't think about when they think about conversion optimization. It's really a holistic, user-centered approach.
I say this quite often, but it's really hard to read the label from inside the jar. What I mean by that is that, as a brand, you have a hard time understanding what a new customer or new user's experience is of even the marketing side of your website — or, once they convert and become a customer, what that experience is. As a brand, you know everything about your products or your service, you know how to navigate your website, you know what your typical objections are and how you want to position yourself against those.
But a lot of brands focus too much on what they want to communicate, as opposed to what a consumer actually wants to accomplish with their product or service.
JH: [00:07:22] When you were talking earlier about all the steps, and not just the revenue or checkout step or whatever — my mind first goes to, all right, how many people make it from the landing page to the gallery, how many make it from the gallery to a product page, or whatever the funnel might be.
So like A to B to C, C to D, and eventually you get to the point where they spend money. And it seems like there's an idea that the more people get through each of those steps, probably the better. But what you're saying about it being hard to read the label from inside the jar raises the question: are those steps even the right steps?
Do you need step B or C? Or is the user thinking about it very differently, and you've just fallen into that pattern? Is that the mentality you're getting at when you talk about being holistic and viewing it from the user perspective?
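JH's A-to-B-to-C framing is easy to make concrete. Here is a minimal sketch — the step names and visitor counts are hypothetical, not from the episode — showing how each hand-off in the funnel has its own micro-conversion rate, and how those rates multiply into the overall conversion rate:

```python
def step_rates(funnel):
    """Conversion rate between each pair of consecutive funnel steps."""
    return [
        (step, next_step, next_n / n)
        for (step, n), (next_step, next_n) in zip(funnel, funnel[1:])
    ]

# Hypothetical visitor counts at each step of an e-commerce funnel.
funnel = [
    ("landing_page", 10_000),
    ("gallery", 4_200),
    ("product_page", 1_900),
    ("purchase", 310),
]

for a, b, rate in step_rates(funnel):
    print(f"{a} -> {b}: {rate:.1%}")

# The overall conversion rate is the product of the step rates.
print(f"overall: {funnel[-1][1] / funnel[0][1]:.1%}")
```

Looking at the per-step rates, rather than only the overall 3.1%, is what surfaces which hand-off (or which unnecessary step) is losing people.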
Jon: [00:08:04] Exactly. I think a lot of brands continue to market at visitors once they get to the site. My point of view is that once somebody has reached your site, your marketing has won — it's done its job. It's attracted somebody to your site. Now it's time to serve their needs. And if you don't take that approach, then you're likely going to repel people instead of helping them through that process.
Erin: [00:08:31] And so that's the user research bit, right? How do you serve somebody's needs if you don't understand them? For someone just starting out with conversion rate optimization or trying to understand it — or, on the other side, starting out on the user research piece of it — how do you recommend diving into understanding the needs of your customers, to ultimately drive those micro and macro conversions you need to drive as a business?
Jon: [00:08:57] I think there are two aspects to this. We always want to be making data-backed decisions, but you look at qualitative and quantitative data, and the vast majority of brands, when they think about conversion optimization, immediately jump to the quantitative data — the numbers, the metrics. And they forget about the qualitative side: having a conversation with customers and understanding what their challenges are. User research is a large part of that. Most brands immediately go to things, again, where they can get more quantitative data, so they do onsite surveys.
Unfortunately, the first thing onsite surveys do is disrupt the user flow, so they really create a negative experience. And the only people who are filling out those surveys are the ones who have big problems. So we run very few onsite surveys, because onsite surveying usually turns into tech support — it turns into problems, challenges, and support requests. That's really not what it's for — we want people to go through the right channels — and it just muddies up the data for us. So what we often move to is user interviews, of course, but we do a lot of user testing as well, where we send people who match the ideal customer profiles to the site.
We ask them to talk out loud about their experience while we give them simple tasks to do, and we record their screen and their audio. That works extremely well for uncovering where people are having challenges. And I think, under the umbrella of all user research, it's really just gaining empathy.
So many brands are missing that empathy angle of what a consumer is going through on their site. I remember being in a meeting with a very large printer manufacturer — we were helping them optimize their site, which mainly sells toner and ink. What we were finding was that a lot of people were bouncing off the site because they could not find the right type of toner for their printer; it was just way too complicated.
Which really blew my mind, thinking about how large this corporation is and how hard they made it to find replacement toner for their printers. But once we got in there, we suggested we run some A/B tests around that, and we redesigned the ink-finding tool, if you will — the toner-finding tool.
We did some of our own user testing on it, and it tested way better. So we came into our monthly meeting with that client and made a proposal around what we thought they should change, and why. And one of the executives stood up and said, "I'm sorry, this is the way it's always been. It works. We're a very large corporation — this must work well. I don't think we really should change this little thing. It's always been like this."
And I said, okay, I disagree wholeheartedly — this is not what the data is showing us — but give me a week and let me come back, and I'll show you more about why you should care about this.
They said, okay, that's fine. So we did a whole bunch of user interviews and user testing and put a highlight reel together of people actually using the site and telling us what their experiences were. And we kept a little running tally in the bottom corner of how much revenue they were losing every time somebody bounced out of that process because they were really frustrated and couldn't find what they were looking for.
We made about 15 minutes of people basically complaining about the experience, sharing their frustrations and how much money they were leaving in the cart as they left, saying, "Hey, I need this type of toner," and not being able to find it. And we kept running that tally and brought it back.
We showed about three minutes of that video to the executive the next week, and he said, "You can turn this off. Just make the change."
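The running revenue-loss tally Jon describes is simple arithmetic; here is a sketch with invented cart values (the client's real figures aren't shared in the episode):

```python
# Hypothetical cart values of shoppers seen abandoning the toner finder.
bounced_carts = [54.99, 89.99, 54.99, 120.00, 89.99]

def running_tally(cart_values):
    """Cumulative revenue lost as each frustrated visitor leaves."""
    total = 0.0
    tally = []
    for value in cart_values:
        total += value
        tally.append(round(total, 2))
    return tally

# The last entry is the figure the highlight reel ends on.
print(running_tally(bounced_carts))
```

Pairing each on-screen frustration with a concrete, growing dollar figure is what made the qualitative footage land with the executive.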
Erin: [00:13:02] Yeah, I love how you combined that qualitative, emotional frustration with the dollars and cents adding up. I wonder which one made the bigger impact, but you can imagine the two together did the job. You mentioned you were doing quite a bit of both user interviews and user testing.
I'm curious if you could walk us through a little more of what that looked like. By the time you got to user testing, you had a good idea of what the problem was with the current experience and what a better alternative might've been. So how did you get to insights on what the problems with the status quo were, and what a better alternative might be?
Jon: [00:13:44] The quantitative data really can give us clues about where to start looking. Where people are dropping off in the funnel is a big indicator that there's a challenge, that some roadblock is being put up. So we usually do start with the quantitative, but it's really brief — we're just looking for the scent trails. Where do we want to start first?
Once we do that, we immediately go into the more qualitative side and start doing user testing on those areas. We do that because, quite honestly, it doesn't matter what we think, and it doesn't matter what the client thinks. It really matters what the end customer is thinking as they're on the site.
We want to go in that direction immediately so that we're not biased. We have some hints based on the data, and that's really where we want to focus the user testing. But I'm really shocked at how many people start doing optimization and either wait to do user testing until it's way too late in the process — when they've already muddied the results — or only ask leading questions to get to where they want to be, where they think they should optimize. And I think that's a skill in itself: asking the right questions, of course. But we really push having this qualitative data upfront, and as early as possible, on our team, and we do quite a bit of that. Then, in terms of actually interviewing customers, we start broad and want to understand generally what their challenges are, what their pain points are, why they're looking at the site — and that really can be helpful.
We worked with a baseball company called Easton Baseball. If you don't have a kid in little league, then you probably don't know much about Easton. They make aluminum baseball bats, and they account for about 99% of the swings in college baseball and in little league.
So they're a major sponsor — literally, the World Series every year. We came in and helped them optimize their site, and the first thing we noticed was that you got to the site and it was a wall of bats. E-commerce trying to sell a bat: they all look the same, they're all tiny little pictures, maybe different colors, but you can't tell much about a bat from a grid of 40 or 50 bats on a page.
And parents were getting really frustrated with the e-commerce experience, because they would show up to this site and try to buy a bat. The problem they were having was that they didn't know what bat they should buy their child, so they would just take a guess based on price point.
Maybe the kid liked the colors or something, so they would buy it and spend a few hundred dollars on this bat. It would show up, and then they would get to the game and the umpire wouldn't let the kid swing with that bat, because it wasn't certified for the league. We found this by actually interviewing customer service.
We said, okay, what are the top calls that you get? And the top calls were things like, "I don't know what bat to get my kid from looking at your site." Pretty general, so we would start there. Then we would dive a little deeper and say, okay, what's next? And they said a lot of people were returning bats that had been used once.
And it's really hard to return a bat — they can't resell it, because imagine you start swinging with a bat and use it in batting practice: it dings up the paint and things like that, and that makes it hard to resell without putting a lot of effort into it. So they really wanted to reduce returns.
Talking to customer service led us to interviewing consumers about those two points. It's really finding, again, those scent trails and then diving a little deeper on them. That allowed us to formulate a bat finder for the site, where a parent would go in and answer a couple of questions.
What type of hitter is your child — are they swinging for the fences, or are they looking to just get on base? Are they just bunting every time and really quick, or are they a slugger? What leagues do they play in? How old are they?
What's their weight, so that we can help determine the right weight of bat and where that weight should be positioned in the bat? And what league, so that we know which certifications apply? By asking just a few really relevant questions, we were able to get it down to four, maybe five bats at different price points that would all qualify.
A parent could then say, okay, I want to spend $200 — this is the right bat based on my budget. And that was immediately one of the things that doubled their conversion rates. But it wasn't overnight. It took a lot of effort and a lot of user research to really make that insight come through.
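The bat finder Jon describes is essentially a filter over the catalog, keyed on the answers to those few questions. A minimal sketch — the bats, fields, certifications, and numbers here are all invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Bat:
    name: str
    certifications: set   # leagues the bat is approved for
    weight_oz: int
    price: float

# Invented mini-catalog standing in for the wall of 40-50 bats.
CATALOG = [
    Bat("Contact-28", {"USA Baseball"}, 18, 129.99),
    Bat("Slugger-30", {"USSSA"}, 20, 249.99),
    Bat("AllRound-29", {"USA Baseball", "USSSA"}, 19, 199.99),
]

def find_bats(league, max_weight_oz, budget):
    """Keep only bats certified for the league, light enough, and in budget."""
    return [
        bat for bat in CATALOG
        if league in bat.certifications
        and bat.weight_oz <= max_weight_oz
        and bat.price <= budget
    ]

for bat in find_bats("USA Baseball", 19, 200.0):
    print(bat.name, bat.price)
```

The point of the design is that three or four relevant questions collapse an overwhelming grid into a short, league-legal shortlist a parent can choose from by budget.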
JH: [00:19:12] You mentioned following the scent trails of the initial data to inform the user research strategy that comes out of it. Will you take that all the way? If you notice that people who come through a certain channel convert at a much higher level than people from another channel, will you then go out of your way to make sure you're predominantly talking to people from the lower-converting channel?
Or do you run the risk, if you go down that path, of over-constraining the profile of the person you decide to speak to, or have record their screen, or whatever it may be?
Jon: [00:19:41] I think we always want to have multiple personas that we're going after. We don't want to rely on just what one person says. And this is where A/B testing, or multivariate testing, really comes in. We're able to form hypotheses based on these user interviews. So we start with the scent trails, as you mentioned, then go to user interviewing and user testing, and take that data to say, okay, these are the challenges we think are being caused by the user interface, and maybe by messaging or content that is missing or isn't being engaged with.
Then we run some testing to help combat those challenges and make it easier for the consumer. That's where we can really start diving into the quantitative side and getting data back that's going to tell us what should be done on more of a statistical level.
JH: [00:20:35] And do you have a rule of thumb when you go into the user testing piece, like how many people you want to see? Or is it dependent on how quickly trends emerge, and that determines the number of people you want to run through the process?
Jon: [00:20:50] It's really interesting — I get this question from brands quite a bit: how many user tests are you going to do if we work together? And the number is super low. Really, we start with five. We really don't need to do much more than five to start. There's been a lot of research on this — I'm sure you two are aware of it with user interviews — but what you find is that anything over five initially is really not going to tell you anything that eye-opening. We would rather go deep with those five, and then, if we start finding some areas of the site that need a lot more research, we'll focus in and run another batch of five on just that one area.
But what we find is more important than the quantity — or quality, I should say — of the tests is how well you have done your pre-questioning to make sure participants are a good fit. If you find somebody who fits that ideal customer profile or ideal user, and you have done your homework in how you question them to profile them and get them into the test or the interview, that is going to be way more valuable than just allowing anyone to take it by putting it up on Twitter.
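The "start with five" rule Jon cites has a well-known probabilistic backbone: if each tester independently hits a given usability problem with probability p, then n testers surface it with probability 1 − (1 − p)^n. The p = 0.31 figure is the oft-cited value from Nielsen and Landauer's problem-discovery research; a quick sketch:

```python
def share_of_problems_found(n_testers, p=0.31):
    """Expected share of usability problems surfaced by n independent testers."""
    return 1 - (1 - p) ** n_testers

# Diminishing returns: the sixth-plus tester adds little to the first batch.
for n in (1, 3, 5, 10):
    print(f"{n:>2} testers: {share_of_problems_found(n):.0%}")
```

Five testers already surface roughly 84% of the problems under this model, which is why running a fresh batch of five on a newly suspect area beats piling more testers onto the first round.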
Erin: [00:22:10] It's almost like I hear you saying recruiting is one of the most important parts of user research.
Jon: [00:22:16] I think it's the most important part, quite honestly. So many people do recruiting haphazardly, and then they get bad data back — or less helpful data. It all starts with who you're allowing to do these tests. In the baseball company's case, when we started doing those tests, we went in and did interviews with actual users who had returned their bat, to understand why.
If we didn't have that data — imagine we had just put out on social media, "Hey, we want to interview and talk to some people" — we wouldn't have found those pain points as easily.
JH: [00:22:51] Will you take advantage of any of the screen replay software, like FullStory or Hotjar, as part of this toolkit? Or do you need the voiceover of somebody being like, "this is so frustrating — which bat? Is this one longer or shorter? I can't tell." Is there a place for those, or do you always go to the user testing approach?
Jon: [00:23:07] The user testing approach is going to come first. Then, once we have some idea and understanding — is this a larger-scale problem? — and this goes back to your question about how many people we're interviewing: we're only doing interviews with a handful of folks upfront. But once we have those scent trails and we dive deeper, we start looking at Hotjar. We love Hotjar; we use it with almost every customer. That really allows us to see whether this is a larger-scale problem, because then we can collect that data at scale.
Erin: [00:23:38] I wanted to ask — you were talking about how most brands have more than one target persona, a few different flavors of ideal customer profile. And we're all on the same page, I think, about how important it is, especially when you're doing qualitative research, to talk to the people who can give you the most meaningful insight.
I'm curious, from your perspective working with different clients — sure, there's a variety, but do you find that people tend to be in tune with who their ideal customer is? Do they not really think about that? Do they have an outdated idea of who they are? How do those interactions go? And are you able to help businesses actually understand, at a higher level, who they're serving?
Jon: [00:24:21] Yeah, that's very true. It's a great question, because we end up having to run workshops at times to help brands determine who their ideal customer profiles are. Or maybe they have ten ideal customer profiles, and it's like, we're not selling to everybody here — let's whittle it down to three and start there.
What's really interesting is, we've worked with, say, bike helmet companies who say, "well, everyone who rides a bike is our customer." And that's just not accurate. Let's dive down a little more, and we can truly start aligning the messaging with exactly who you are selling to.
That's going to perform a lot better. Niching down is almost always the way to go in terms of being more relevant in the customer journey and helping people understand whether you're the right fit. But I think most brands do know who they're selling to by the time they get to optimization — by the time they're working with a team like ours at The Good — because they've already tried all the little hacks and tricks they read online that didn't really move the needle, or at least they've worked through some best practices over a handful of years.
Now they have a lot of traffic, they have several million in sales at least, and they're really looking to take it to that next level. That's when they come and reach out to us and say, we know who we're selling to now; we've found product-market fit. And I think that's the key here.
If you know who you're selling to, you have found product-market fit, for the most part. And without product-market fit, there's no reason to do optimization, because you really just need to better understand who you're selling to and align that messaging. While we could probably help with that, I don't know that the return on investment is there, because you're likely not seeing as much revenue or profit as you should be before you start working with somebody on optimization.
JH: [00:26:18] How do you untangle existing conversion issues and how they influence the way you look at different segments? Like in the baseball example: baseball-obsessed parents probably aren't returning the bats at a high rate, because they know which ones they need. Whereas baseball-agnostic parents look like bad customers — not because of anything about them, but because the site's not helping them. I don't know if I'm going too far down the rabbit hole here, but it seems like those things are tangled together a little bit.
Jon: [00:26:44] Oh, it sure is. And really, this is why we have data scientists on our staff — we really want to help dive into that data. This is where, going back to brands often being too close to their own problems: it's hard to read the label, because they're inside the jar.
That is exactly why they would work with us. We're coming in with a fresh perspective, and this is where we want to make sure we're letting those initial scent trails lead us, as opposed to looking at the site and just saying, "we think this is the problem." So being data-driven is really key here — letting the data tell us where the problems are, as opposed to coming to the conclusion ourselves.
Erin: [00:27:28] Awesome. So you mentioned you do a lot of user interviews and user tests, and you use Hotjar. What else is important in your toolkit when you think about getting insights to run experiments that are going to lead to the best conversions possible?
Jon: [00:27:45] We want to track every click and movement that's happening on a site, and we want to do that in an aggregate fashion so there are no privacy concerns — of course, GDPR and the California regulations and all of that. We really want to make sure we're maintaining privacy, but you still want to understand how people are engaging with the site.
When we look at both qualitative and quantitative data sets, we're definitely using things like Hotjar and Google Analytics — everybody uses it, and it's really helpful. There are other data platforms out there, like Glew — G-L-E-W — and a whole bunch of those types of tools that will help tell you what's happening on the site.
But really, what it comes down to is helping to understand where people are dropping off in the funnel and what content they're engaging with or not engaging with. That's really what the more quantitative data is for. We use some of that type of data, but we keep a relatively small tool set. The number of brands we inherit that have a massive amount of code running on their site — they come to us and say, "why is our site so slow?", and they have code snippets from stuff they haven't used for years. So one of the first things we always do is audit that and help them clean it up, because the site shouldn't be slowed down by anything more than a couple of necessary items.
Once we get through that and have formed an A/B testing plan and roadmap — which is all about what should be tested on the site, where those tests should be run, and in what priority order — then we start looking at testing tool sets, and there are some really great ones out there.
The market has come a long way over ten years, as you can imagine. When we first started doing this, the only real tool out there was Optimizely. It was a great tool, but they moved, a handful of years ago now, into the enterprise market, so it's quite expensive if you're a mid-market brand.
It's still a great tool, but what has come along in the last 18 months or so, and really is a great tool, is Google Optimize. It integrates so well with the rest of the tool set — Analytics, et cetera — that it really just can't be beat. I think they've really come in and eaten the lunch of a couple of competitors we had used really frequently before.
Erin: [00:30:13] We've been using Google Optimize a lot in the last year and a half ourselves, and you do run up against limitations eventually, but for the price of free, it's pretty incredible. Okay, so you've got your testing toolkit. I've never worked in an agency, but to me it sounds like a cool aspect of being a consultant or an agency: you really develop and iterate on a playbook over time, where you get really good at helping people with similar problems, each with their own flavor of it.
So you go in and you're like, let's get rid of this crap on your site that's not doing anything and is just slowing things down — it makes me want to take a look at ours right now. And then you have this list of possible experiments you would run. Is that standardized? I know you work with a lot of e-commerce sites, but do you have a kind of punch list of different areas of the funnel you might look at, or different product areas you might want to dig into?
Are you just looking for where things look the worst and what you hypothesize might be solutions? How do you prioritize the millions of things you might do, and which ones might make a big impact?
Jon: [00:31:23] Yeah. So we tend to take a much more customized approach, as opposed to just going through a checklist. A checklist is really a great way to start, but most of the brands that get to us have already done that — they've found those checklists online, worked through them, and implemented a lot of those little best practices, if you will. By the time they get to us, they're looking for something a little more nuanced and more customized. So every plan we put together for our customers is fairly unique. Now, we do have a program — we call it our conversion growth program.
That is a repeatable methodology we've used for a decade now for how to get conversion rates to grow month over month, and how to improve all the different metrics we could be looking at, like average order value, or even just add-to-carts, or how to get people to the next step down the funnel.
And, as I said at the start, find out what those small conversions might be, because they can be great indicators that somebody is going to eventually turn into a customer. So we want to really focus on the unique metrics for this brand that are going to help them achieve their goals.
And then, in terms of testing, we have different areas that we know are usually an issue, like wayfinding and navigation, but how we test that, and what is going to be relevant to the consumer, really differs per brand. So that's generally the approach: much more customized, within a program of how we do these things. So for instance, you asked about how we prioritize these tests. We have a formula for how we prioritize tests, and that formula has been tweaked and refined over a decade of seeing where the most impactful things are and why. What I'll tell you about this formula, without nerding out too much on it, is that the biggest factor is return on investment opportunity.
When brands come to us, of course they're looking at single metrics like conversion rate, et cetera, but overall, how we measure our success is return on investment. We average about a nine-to-one return on investment: for every dollar that our brands invest in optimizing with us, we're able to show them about $9 in additional revenue. Out of all marketing activities, that's a really high return on investment.
Really, if we can get that return on investment high, then we've done our job, regardless of whether it's increasing conversion rate, increasing average order value, increasing customer lifetime value, or retention rate. Those are all things we're looking to optimize based on the unique situation. How we do that, through the conversion growth program, follows a methodology that we've seen produce great results for a decade.
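Jon doesn't share The Good's actual formula, but the idea of ranking tests by return-on-investment opportunity can be sketched with a generic ICE-style score: projected revenue lift, discounted by confidence, divided by the cost of building the test. Everything here, the field names, weights, and numbers, is an illustrative assumption, not their methodology:

```python
# Illustrative test-prioritization sketch (NOT The Good's real formula).
# Scores each test idea by ROI opportunity: projected revenue gain,
# discounted by confidence, per dollar of implementation effort.

def roi_priority(monthly_revenue, expected_lift, confidence, effort_cost):
    """Estimated monthly gain per dollar of effort (all inputs are guesses)."""
    expected_gain = monthly_revenue * expected_lift * confidence
    return expected_gain / effort_cost

# Hypothetical backlog of test ideas with made-up estimates.
ideas = [
    {"name": "simplify checkout", "revenue": 200_000, "lift": 0.05, "confidence": 0.6, "cost": 4_000},
    {"name": "new hero banner",   "revenue": 200_000, "lift": 0.01, "confidence": 0.3, "cost": 1_000},
]

# Run the highest-ROI-opportunity idea first.
ranked = sorted(
    ideas,
    key=lambda i: roi_priority(i["revenue"], i["lift"], i["confidence"], i["cost"]),
    reverse=True,
)
print([i["name"] for i in ranked])  # → ['simplify checkout', 'new hero banner']
```

The useful property of a formula like this, however it's weighted, is that a cheap, low-impact tweak can no longer outrank an expensive but high-leverage fix just because it's easy.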
JH: [00:34:28] That's awesome. A hypothesis I have, and I'm curious if it's true, is that where teams maybe get stuck when they try to do this themselves is: you put your detective hat on, you start looking through all the data you have, and people just start getting obsessed with slicing and cutting the data more and more, without a specific purpose for doing so.
So it's: cut it by device, look at it by channel, look at it by location, by lifecycle, whatever. And you can get so much noise that it's hard to figure out which cuts, which signals, which scents, as you were saying earlier, are worth following, worth digging into.
Is that where people lose their way when they try to do it on their own? That they just get too overwhelmed with how data-informed or data-driven they try to be?
Jon: [00:35:11] For sure. We see this all the time, where somebody has a thousand annotations in their Google Analytics account, to the point where they just can't keep up anymore. I think that data overload is a huge challenge. And the bigger issue here, I think, is that they start with that data-focused approach, as opposed to just looking at the data, getting it cleaned up, and having a great baseline before they start really optimizing, which is important for tracking success.
But you really need to get quickly into the more qualitative side of things, actually talking to consumers. The quicker you can get to that step, the sooner you'll get out of the data weeds. Because, quite honestly, yeah, it matters what device somebody is on, and when you start running A/B tests you really want to segment down in those ways.
But that doesn't matter up front, and if you focus on those things up front, you're going to miss the big-picture items. I think that's part of being too close to their own site: they think the best thing they can do is dig into the data, because they have a hard time understanding things from a consumer's perspective.
If that's their first instinct, I would argue: take two steps back. Look for the wider picture, talk to some people, customers and consumers, and I think you'll really see the broader issues. Don't focus on how to solve them just yet; just focus on what the broader issues are. Then you can work on potential hypotheses and how to solve those later.
Erin: [00:36:52] I think that's a great point. Like the device question, there are lots of examples like that. I can remember back in 2011, everything was "we need a mobile app." And of course, eventually, yes, the world went mobile, but to your point, start with "how do I buy a bat?", or whatever it is they're actually trying to do.
If you assume your customers are on desktop and a hundred percent of them are on mobile, then yeah, maybe that is your biggest issue. But start from the beginning: what are they trying to do, and are we doing a good job of that, regardless of all these more nuanced cuts? I think that's huge.
Jon: [00:37:27] Yeah,
Erin: [00:37:27] Anything to add? Yeah,
Jon: [00:37:28] I was just gonna say, real quickly, it's interesting: most brands come to us and say, "Oh, we want you to optimize our mobile site." And I say, okay, what type of products are you selling? Have you talked to consumers about how they're doing their research for your products?
Most e-commerce brands have way more traffic on mobile than on desktop, but their conversion rates, and the money generated from the site, are much higher on desktop. So if you focus on mobile, where all of your traffic is, and you're optimizing for conversions on mobile, you're really not going to move the needle as much. You're not going to get as high a return on investment.
Whereas maybe what you do is focus on messaging on mobile, and then focus on actual conversion on desktop. So there's nuance there that I think a lot of brands don't really consider.
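The check Jon describes is easy to run yourself: compare traffic share against revenue per session for each device. The numbers below are invented purely to illustrate the pattern he mentions, where mobile dominates traffic but desktop earns far more per visit:

```python
# Hypothetical device-level numbers illustrating Jon's point: mobile can
# carry most of the traffic while desktop drives most of the value.
sessions = {"mobile": 80_000, "desktop": 20_000}
revenue = {"mobile": 60_000.0, "desktop": 90_000.0}

total_sessions = sum(sessions.values())
for device in sessions:
    share = sessions[device] / total_sessions       # traffic share
    rps = revenue[device] / sessions[device]        # revenue per session
    print(f"{device}: {share:.0%} of traffic, ${rps:.2f} per session")
# With these numbers: mobile is 80% of traffic at $0.75/session,
# desktop is 20% of traffic at $4.50/session.
```

When the split looks like this, optimizing mobile for checkout conversion chases the traffic, not the money, which is why Jon suggests messaging on mobile and conversion on desktop.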
JH: [00:38:16] Yeah, that journey is so complicated these days. You know what I mean? Just for me personally: sitting on my couch with my phone, researching a pan I might want to buy or something, with no real intent to buy it. But then later, when I hop on my computer the next day or something, I might pull the trigger. That interplay is so tough to pin down.
Jon: [00:38:32] Yeah, attribution modeling alone can be a full-time job. And even then, you're pretty much just guessing which way the wind's blowing.
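The guesswork Jon mentions comes from having to pick a model at all. A toy comparison of two common ones, last-touch versus linear credit, applied to JH's phone-then-desktop journey, shows how much the answer depends on that choice. Channel names and the order value are made up:

```python
# Toy attribution comparison over a hypothetical multi-device journey.
journey = ["mobile_research", "email_click", "desktop_purchase"]
order_value = 120.0

# Last-touch: all credit goes to the final touchpoint before purchase.
last_touch = {t: 0.0 for t in journey}
last_touch[journey[-1]] = order_value

# Linear: credit is split evenly across every touchpoint.
linear = {t: order_value / len(journey) for t in journey}

print(last_touch)  # desktop gets all $120
print(linear)      # each touch gets $40
```

Same journey, same order, yet mobile's contribution is either $0 or $40 depending on the model, which is exactly why attribution feels like reading the wind.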
JH: [00:38:40] Yeah. It's so complex these days.
Erin: [00:38:41] John, any parting words of wisdom for listeners doing the best they can with conversion rate optimization, and taking advantage of that sweet qualitative user insight at the same time?
JH: [00:38:53] One cool trick, perhaps? No.
Jon: [00:38:58] I'd start with this: getting into that more qualitative side early is really helpful. And then don't forget to check back in on it every couple of months, every few testing cycles. We do testing cycles on a monthly basis; we find that's generally a good pace. The reality is you need to be checking back in every few months, because e-commerce trends are always changing.
Just look at 2020, the past six months alone. I saw a stat the other day that e-commerce has grown more in the past two quarters than it had in the 10 years prior. That's a lot of growth. Trends are changing, and who's buying from you online is changing rapidly.
So you want to pay attention to that data as often as you can, because it is going to keep changing, now more than ever. Going to the qualitative side early, and then often, is really important.
Erin: [00:39:58] Fantastic. I think we'll leave it at that. Thanks for joining us, John.
Jon: [00:40:03] Thanks for having me. I really appreciate it.
Carrie Boyd is a Content Creator at User Interviews. She loves writing, traveling, and learning new things. You can typically find her hunched over her computer with a cup of coffee the size of her face.
January 19, 2021