February 15, 2023
Carlos Tellez explains the power of growth research. Learn about what growth research is, what it isn't, and how it works.
Erin May, John Henry Forster, and Carlos Tellez, Growth Research Manager at Nubank, get to the bottom of growth research. This episode is focused on two of Erin's favorite things – optimization and research. They discuss how research informs growth teams as Carlos describes the moving parts involved in the research process. Listen to hear what Carlos loves the most about his job, the scope of growth research, research design methods, tools, and tactics.
Carlos Tellez is a digital strategist, UX researcher, service designer, and education enthusiast. Currently, he is the UX Research Manager at Nubank. Much of his work is focused on creating effective research teams and advancing the field of UX research. Carlos has a Master's degree in Architecture from Harvard University Graduate School of Design. He also has a Bachelor of Arts in Urban Studies from Vassar College.
Carlos - 00:00:00: I think research gets called in those instances as a more kind of inspirational figure. It's like, okay, what ideation can we get out of a round of research? What new angle, what new strategy can we derive from the research? And so, I think that's really powerful.
Erin - 00:00:21: This is Erin May.
JH - 00:00:22: I'm John-Henry Forster. And this is Awkward Silences.
Erin - 00:00:28: Silences.
Erin - 00:00:35: Hello, everybody, and welcome back to Awkward Silences. Today we are here with Carlos Hernández Tellez, and we are very excited to be talking about growth research, which is one of my very favorite topics to talk about. So, thanks for joining us, Carlos.
Carlos - 00:00:52: Yeah, thank you so much for having me. I'm very excited to talk about growth and all things research with you today.
Erin - 00:00:58: Fantastic. We've got JH, too.
JH - 00:01:01: Yeah. Erin, I feel like you're our resident growth person, so we'll see if I can get a few words in. Take it away.
Erin - 00:01:08: Yeah. Two of my favorite things, growth and research. So, this should be a lot of fun. Really enjoyed your presentation at Learners as well. So, it's like our own personal Q&A follow-up in a way. So very exciting. Well, thanks again for joining. So, we'll just go ahead and jump right in. I think a good first question often is, what is this thing that we're talking about? So, what is growth research, and how did you come to be focusing on this?
Carlos - 00:01:35: Yeah, so there's a funny story associated with that, because I had never really personally had any direct experience with growth research prior to joining Nubank three years ago. So, I think the definition of growth research for me is very tied to this specific place, where I've been able to really get creative with all things research applied to growth work. But the way I would define it is exactly that. It's, I think, using and applying all our different techniques, tools and even mindsets of UX research, the things we know how to do really well as UX researchers, and then applying them to helping companies, helping startups in their growth journey, helping them attract, onboard and properly set up each and every one of their clients for a successful product experience. So, I would say that defining growth research is not necessarily about product-oriented research, but more about a platform that enables everything that happens in the customer journey after that client, that person, that user has joined the product or signed up for it. So, it's applying all that we know how to do for products but thinking about it in a platform way, which I think can be even more elastic, even more exciting in many ways.
Erin - 00:02:58: Great. Yeah. I imagine a lot of people listening are more familiar with the research part than the growth part. So, you were talking about growth in terms of onboarding and customer journeys and really setting them up for having a good product experience. And you also mentioned the word platform a couple of times, so I wondered if we could dig into that a little bit. What do you mean when you talk about sort of platform research and platform work?
Carlos - 00:03:22: Sure. So actually, maybe even to double-click a little bit on your question about growth, I think it's something that I've personally seen a lot of excitement around in the last couple of years. Growth means supporting the company's expansion, growing the customer base, and making sure that ties to a lot of the marketing efforts at the very, very beginning of the journey. But it also focuses a lot on conversion and activation efforts, as well as onboarding and a proper handover to all the product teams that are within the product itself. I think growth is sort of this sexy term that's been thrown around a lot. In fact, we at Nubank changed it a little bit. We even used to be called 'focused on acquisition', but then we understood that our work has to do with more than that, everything from awareness to onboarding, and I think growth is a good word to tackle that. And then going back to your question about platforms, even that took us a little while to get used to, what platform thinking really is. When you as a researcher are allocated within a specific product, that product has certain set definitions, typically a roadmap that's as well defined as it can get, and so on and so forth. When you are a platform team, I think you not only get to be more creative and have a more wide-reaching impact in what you do, but that also comes with a lot of ambiguity. Working as a platform team means dealing with so many different variables, so many different interdependencies with other teams, but understanding that you are a cross-functional and even a very foundational team for other product action that in so many ways happens above you. Platform is a lot more about all that horizontal, foundational work, whereas product efforts, I think, are a lot more vertical and focused in that regard.
Erin - 00:05:38: So, horizontal, like literally covering more surface area. And so, when you think about growth multipliers, the work you're doing has far reach, right? Because it can potentially impact the entire product experience, customer experience.
Carlos - 00:05:54: I think as platform teams, our mandate and really our premise is: you can go work on anything as long as it drives growth for the company.
Erin - 00:06:04: It’s good to make friends.
JH - 00:06:08: We've talked about this a lot because, with me on the product side, you're thinking about growth, and there's obviously a lot of overlap. As we've tried to come up with a working definition for ourselves, I think something we've kind of landed on is that growth is a little bit more about connecting users to the value that we already have, whether that's improving adoption rates, or success through funnels, or setting up different loops, whereas the product team, the core team, is a little bit more responsible for creating new solutions that can add some value. So, we don't have a way of solving this problem for people today, and now we do. That's not a perfect definition, but in broad strokes. I'm curious, given the overlap, but also the uniqueness, where does growth research differ from product research? I'd imagine there's a handful of things in common, but it's a unique thing, and you're speaking to it in a unique way. How is it different?
Carlos - 00:06:52: Yeah, I think one of the key differences, and again, it really speaks to our placement within the customer journey, is that when you do growth research, your sample, by definition, is everyone who's not yet using your product, everyone who's not yet onboarded or part of your business. So I think that that is very exciting. It gives me great pleasure to be working in such an environment. But I think it also comes with a lot of different challenges, particularly around data collection, around reaching that customer, around understanding that future customer, even. So, while a lot of our product-allocated teams can dive in and get to know that customer, analyze a lot of the analytics and really get to know their segment and their behaviors once they're using our product, I don't think that's necessarily the case for growth. So, constantly we're working with brand managers and product managers who are interested in entering new markets or opening up new segments for the company. And that means that we're operating in this sort of open sea environment. And so, that requires a lot of elasticity, a lot of comfort with ambiguity, a lot of detective work to then understand and piece things together. Right, there's this open sea, but we're going to cast our net here first, and that's going to take us a little bit forward, and we're going to get a little bit more clarity. And so, it's a little bit of an adventure or a quest kind of way of working, which I really enjoy. I would say that's the main difference, in that we're operating from the door out. And so, as a result, we have to be a lot more creative in the ways we go about gathering that data and learning about our users. I would also say that the second bit is that obviously growth teams, for those of you who know, are very focused around a ton of metrics.
So, this means that we need to be able to be creative in the way we apply our methods and the way we connect with this research and design mindset. But we also need to be very sharp in our interactions with these business teams that are very metric oriented, that are very obviously in one way or another, pressing for that business impact and monitoring our success quite closely. And so I would say that it requires a certain elasticity and creativity to operate in this open environment as well as a certain sort of business sharpness in order to create good relationships with the product teams.
Erin - 00:09:24: Have you had to do a lot of selling of the value of qualitative user research to the quants you work with? Or are you coming into a scenario where it's like, we need user research, we know that already, we believe in it, or maybe somewhere in between?
Carlos - 00:09:39: No, this is a great question. About three and a half years ago, when I first joined Nubank, I was the first and only researcher allocated in the growth business unit. Back then, I was lucky to have support and already a good amount of buy-in from the general managers and others in leadership in the business unit. But I do think that, more than selling, there was a need for demonstrating how we were going to do this and how research was going to operate within the growth context. So again, the teams were already good to go with tons of dashboards, tons of data, tons of analytics. They were constantly wondering about the why. Why is it that that's happening? Or why are people dropping there, and why are people not following through? And so they understood the need, and they had some questions formed pretty well when I joined. But yeah, I think it was more like, okay, well, come along on this journey, we'll do this together, and then you'll really see the impact. And ever since, the team has grown; it's tripled at this point. We have a very, very busy research pipeline right now. And so I think we've really tried to work together to show that really good impact and juicy goodness that I think qualitative research can bring into a very quantitative, again, metric-oriented environment.
JH - 00:11:02: I really enjoyed the open sea casting net adventure frame.
Erin - 00:11:06: I was going to go, yeah, don't boil the ocean, though.
JH - 00:11:08: Yeah. The thing I was wondering, if maybe you have an example here, is I think sometimes when you have people who are so fluent in the data and metrics, you inevitably find something that just makes you scratch your head. You're like, this trend or this data point makes no sense. And then you can probably plug in and help figure out the why, or answer that for them in those moments. Because that's one of the things I always think about: when people are really good on the data side, you inevitably find something that you just can't explain, and then the research comes in. I'd imagine that dynamic is really powerful.
Carlos - 00:11:38: Yeah, actually yes, we do have several instances of that happening. For example, one of the, I mean, I don't know. No, I think I can go into a fair amount of detail about this, actually.
Erin - 00:11:53: You decide. We’ll take whatever you got.
Carlos - 00:11:55: But there was an instance within our flow, right before the customer was going to get released and get their new bank account open and ready to go. We used to show a nudge, right, encouraging them to invite family or friends to join Nubank as well. And that screen did really, really well, typically, when you just looked at the dashboard. Tons of invites came from that, tons of releases, and that was all good to go, except that during one of our funnel deep dives last year, we realized that, well, the copy was still a little bit weird. We also were not giving the user the amount of choice that they would like in that scenario. So, while they were waiting to be released, they were encouraged to shoot an invite out to someone they knew, and that was good. But many users related or told us that that was not necessarily the best moment for that, since they had very little context about the product yet. They hadn't even entered their account. And we also didn't necessarily offer a good amount of pathways for them to choose what they wanted to do going forward. So, the business analysts as well as the designers got together to create a much better nudge and to place it in a more appropriate part of the journey. And that did even better. So, on the dashboard and in the numbers it seemed to be doing okay, but we understood that in the UX and in the qualitative experience, it was missing the mark in so many ways. With a quick nudge and a quick amount of adjustments, we were able to create an even better solution that is still up and running today and doing much better than the original one. So, I think that was one of the instances in which we were like, okay, how can we bring value, and how can we also inspire new ways of thinking and problem solving with the team?
Erin - 00:13:48: So, it sounds like you had a funnel or a loop, a flow that was working well, and only upon digging deeper did you realize maybe it could be working a whole lot better than it already was. Because I think what happens a lot is you see a problem, a drop-off or something, and you want to go in and optimize and fix it, but sometimes you can get your most leverage from improving something that, in your mind, in folks' minds, is already working great.
Carlos - 00:14:19: Yeah. And I think we find ourselves a lot in that spot. I do think that, as a whole at Nubank, we pride ourselves on our user experience. We've given it great care and great attention. And so, all things considered, our flow at this point has gone through several rounds of iteration and optimization. And so, it gets even harder every time to get that next percentage point, that next improvement. So, I think we're constantly working in that way, obviously optimizing whatever needs to be optimized, but also understanding what things can be even better that we don't even yet know about.
JH - 00:15:00: Yeah. You were talking a little bit about some of the differences in mentality and mindset and proficiency with the metrics and the business impact. Does that translate to different research methodologies or approaches that you're using that maybe people in different research roles lean on less or approach in a different way?
Carlos - 00:15:18: I think triangulation is something that I love a lot and that we care a lot about here at growth. The reason why it's important is because, again, we're constantly needing to collect our own data in so many ways, and also to understand the ways in which we need to crisscross it in order to paint a fuller picture of the funnel. You have a quantitative view, you can also get an entirely qualitative view, but how do those two things talk to each other? So, I think we use a fairly familiar toolkit: everything from interviews and quantitative surveys to usability tests, diary studies, and analytics dashboards. We use the whole spectrum, I think, in order to inform this triangulation. It just depends on where in the funnel you apply each method. Because we've also learned a lot about, like, okay, you know what, applying that survey or doing an interview at this stage doesn't make sense; it doesn't really give us any new insights, or not as many as we expected. So, it's understanding where to place these methods at what stage of the funnel. I think that gives you the true richness and, in so many ways, articulates the triangulation properly, because people will tell you something in the initial interview, show something else entirely when you're running a usability study, then say something different in the diary, and then when you go to the dashboards. And so, I feel like it's more about articulating the narrative that each of those methods provides. We've used a lot of interesting applications of the Kano model and of concept testing applied to a growth context of a sign-up, which is sort of an interesting way, but I wouldn't say that there's any particular method that's new or unknown.
I would say it's more of a curatorship exercise: understanding when to place them and how to use them so that they give you a fuller picture of the funnel.
Erin - 00:17:15: And that's what you see a lot of in modern research too, right? Applying old methods that have existed in academic research for a long time in new ways and new combinations to new kinds of problems. We talked with someone from Spotify about digital ethnography, applying, again, existing methods to new contexts and ways of doing it.
Carlos - 00:17:40: Yeah, we're huge fans of digital ethnography here. A couple of years ago, we did a big one right when COVID was in full force, taking advantage of a lot of that remote research shift that took place. In order to navigate this open sea environment that I was talking about earlier, that shift has brought us so much value in being able to reach so many more customers and potential customers all throughout Brazil and Latin America, which is a huge territory. And applying these methods digitally, and finding new creative ways to take them further as methods and as tools, has been really exciting.
JH - 00:18:22: When you were saying triangulation, what does that actually mean in practice? I get the three part, you're trying to get different signals and use them to kind of figure out where to focus. How does that come to life?
Carlos - 00:18:34: Yeah, so here at growth, we actually team up a lot, really closely, with our business analysts. We call them CIA, which is really hilarious, but it's the central insights and analytics squad here within the growth business unit, and we rely on them a lot to help us in this effort. So, okay, we have aligned with a certain PM or a certain designer that we're going to run a survey or a usability study or an in-depth interview. And then, as we deploy the method, we also try to understand, as much as possible, what other alternative avenues or parallel ways there are to collect data while that experiment or that fieldwork is running, so that we can then more clinically and accurately assess with CIA what's happening in real time. So, I would say it's letting the user researcher and the designer do their thing, run their show, but also not forgetting that there is a really big, important ally here on the insights and analytics side that can help triangulate the data that's coming in qualitatively, through that survey or that usability study, and then pair it up with the efforts that we get from CIA. But there is one approach in which I think this is exemplified even more successfully, which is a yearly state-of-the-funnel research project. We're just about getting ready to run it in Q1 next year. Basically, that is a whole month-long research effort that is also really helpful and really useful for all of the teams as they're working on their roadmaps for the new year. And basically, what happens is that we run the full gamut. We run interviews, diary studies, as well as specifically created dashboards for the participants.
And in that way, we're better able to monitor reported qualitative data in a live interview, reported qualitative data via a diary, as well as the tracking of actual behavior that's taking place in the app once they navigate through our funnel and become our clients. So, I would say that there are ways to do it within the same research project, and we try to do this in this very intensive way. But there are also other ways in which even just cultivating strong relationships with more quantitative teams can help you triangulate your data, better inform your work, and build more robust insights.
Erin - 00:21:19: So, if you're doing this every year, are you benchmarking? You talked about where your funnel starts and ends. What are the key points? Is it when you become a customer, or when you've done certain activation actions?
Carlos - 00:21:33: Yeah, so our funnel actually extends a little bit earlier and later than you would originally think. We tackle everything from awareness and consideration, so even before the user starts to fill out their application for the product, and it also ends after they've become an active client. So, we do consider onboarding, activation and recommendation, the MGM, 'member-get-member', part of it as well. We're even, right now, very strategically using research to try to talk to some of those other teams that we hand over to, to see if they can let us play with them a little longer. So, the funnel covers all of that. And I think it becomes an articulation and even kind of a descriptive exercise: how you define your funnel, what happens within each of those stages, and then how to act and how.
Erin - 00:22:31: To really, and so you have a structured way to do that, it sounds like, and you're able to compare it over time. Is that part of the idea? We can say, last year this part of the funnel was looking, I imagine you're doing this continuously as well, but this is more of, again, the comparison, a deeper dive, and you're looking at the quant and then pulling in the qual. And I've seen some of your Mural boards. They're Mural boards, right?
Carlos - 00:22:55: Yes.
Erin - 00:22:55: Is that how you kind of bring all this data together? Yeah, I imagine figuring out how to do that the first time was quite an endeavor, and maybe it's a little easier in subsequent go-rounds at it?
Carlos - 00:23:10: Yeah. Oh, my God. Even just reflecting on that first time brings all the memories back, because, again, that was right at the very beginning. I was there alone. I worked with a lot of designers and PMs in putting that together, but it was basically, you know, Carlos against the world at that point. It started from this crazy idea, from the desire to start paying down a lot of this research debt. There was obviously lots going on, and I knew that a big research effort like this was going to be helpful. But yes, ever since that first edition, we have kept track of our own internal evaluation tools, and there are product excellence frameworks and other customer-centric metrics that we use to benchmark. Every year we run this as a way to, okay, yeah, keep a benchmark: are we improving? Is the funnel getting better? Is our own self-evaluation of the funnel improving? But then we also try to become more ambitious every year. We knew that some things were happening in a very rudimentary way in that first edition, so we went in and improved them for the second, and now we're getting ready for the third edition with a much more mature team, a much better understanding of the nooks and crannies that we can get into, and even double-clicks. I think we're getting more specific in our inquiry every year, and we're always benchmarking and comparing to where we were a year ago.
JH - 00:24:43: All right, a quick awkward interruption here. It's fun to talk about user research, but you know what's really fun is doing user research, and we want to help you with that.
Erin - 00:24:51: We want to help you so much that we have created a special place. It's called userinterviews.com/Awkward for you to get your first three participants free.
JH - 00:25:03: We all know we should be talking to users more, so we've gone ahead and removed as many barriers as possible. It's going to be easy, it's going to be quick, you're going to love it. So, get over there and check it out.
Erin - 00:25:12: And then when you're done with that, go on over to your favorite podcasting app and leave us a review, please.
JH - 00:25:20: I'm going to speak in some broad generalizations, so we'll see how this goes. One thing I think about, when I think about a growth team versus maybe a core product team, is their pace: maybe a little bit faster experimentation or quicker ideation and stuff. Not strictly speaking, but I think if you looked at the whole portfolio, probably a little bit of a trend that way. And the thing we talk a lot about with researchers is that you need your insights available at the time when they can actually influence the decision. So, I'm curious, does that translate to the research pace or the formality? Is there anything you have to do as a growth researcher to make sure that you're getting stuff that can inform decisions at the right time and all that?
Carlos - 00:25:57: Yeah, definitely. I would say that when we compare our pace to that of some of our colleagues allocated in the products, we do tend to face a little bit more of that pressure, of that need to collect data as fast as possible, to inform a certain decision or to inform the rollout of a feature or not. And so, I do think that that is a little bit intense for sure when compared to product teams, which, depending on what the scope is, can take a little bit more of a, I wouldn't necessarily say relaxed, because it never is, but a little bit more parsed-out, maybe even more extended timeline for the research, particularly when it's new product ideation and so on and so forth. So, we are operating in the funnel, and, to throw in another metaphor, it's not only an open sea, it's a bit of a vortex. You always feel like it's spinning and spinning and spinning, and it's like, okay, there we go, there we go, there we go. So, I would say that there is that pressure from the business, particularly in a startup with hyper-growth numbers such as Nubank's, to keep that pace. But I would say that that can be really good too. And I think we as researchers, as part of the growth research team, have learned a lot around that, because of the need to bring product teams into the insights even though they're not fully cooked, or even though there's still some data to roll in and to be analyzed.
I think our product peers have also really helped us see that there's already value in that, and that the pie doesn't need to come out of the oven perfectly cooked, that we can actually all be in the kitchen together, ship and give insight, and be allies in that decision making. That's been beneficial for us, and also for the business, to be a little bit less attached to the perfection or the rigor with which we have to complete a full synthesis process or a full analysis. And so, I think we've worked a little bit more in a scrappy mode, in a workshop mode, in a co-creative way with them, and I think we've learned a lot in the process. So, yeah, there's the pressure of speed, but I think that's also helped us see our value more quickly and also ship insights faster.
Erin - 00:28:19: One of the big themes we hear from researchers these days is around how to identify the impact of the work they're doing. There's a lot of different ways to do that, but do you find that that's easier in a growth context or different? And maybe you could talk a little bit about some of the impact you've been able to have in your work doing growth research?
Carlos - 00:28:41: Yeah, I would say initially it was equally hard to begin articulating this impact. I myself have learned quite a bit from colleagues and from conferences on this topic of articulating and showing the impact. In the beginning, again, it was really easy to find that really juicy, amazing insight where you could just tell, as soon as it was implemented, boom, there was an impact, like double-digit percentage growth. And I think that showed us, okay, cool, that way we can literally correlate our work with what happened in the funnel and what happened with the business. Obviously, as some of those big areas receive more attention and more focus, it just becomes harder to get to that jewel, to that little pearl of an insight. But I don't think that means the impact will diminish; quite the contrary. We have learned within the growth context to articulate our research impact in that way. And paired with the experimentation that the designers are running constantly, I think it's been a good way, perhaps even easier, of showing and articulating the impact literally related to the insight, the experiments, the design experimentation that happened, and then the business result that came. I don't know if it's because our context is so multidisciplinary already, with researchers, designers, and product and business analysts all working together, but I would say it hasn't been hard to show and literally see in real time the impact of our work. In the talk, my colleague Kakao and I talked a little bit about the unexpected impacts that we've also found in doing research in a growth context, such as helping with onboarding new arrivals in the company.
So obviously we invite them to a research session, so they understand how users open an account and how the funnel works. So, it's inspired a lot of co-creation and letting teams solve other teams’ problems. Also, there's been a lot of that hard impact, that's easy to articulate on, that you'll always get a percentage and a conversion rate and total active user base increase. But I think that those softer impacts, I think, have also become really integral in our research culture at growth.
JH - 00:31:09: Tactical question on that. So, you know you're having impact, you can see it in these ways. Is it that you're summarizing out to the organization, like 'here are some big things that research helped drive'? Or is it that when the other growth stakeholders are talking about the wins and the impact they're driving, as they do that storytelling, it's just really obvious that research was a part of it, in the sense of 'we moved this nudge because we found out people didn't like it here, and it's led to this huge increase in usage'? How does it actually get translated to the organization that the research is really helping here?
Carlos - 00:31:39: So, I have examples of both. We do shoot out our research updates on Slack, and obviously communicate at a BU forum, 'hey, research helped drive this,' and obviously design and any other team involved. I think it's the role of research leadership to articulate, remind, and show the business the impact of any given research activity. So for sure there's an active way of showing the impact, and we've done that a lot, and I understand that's valuable, that's needed in so many ways to unlock headcount or open up access to more strategic forums. But surprisingly, and to be honest, this is the bit I enjoy the most of all of this: there's been a lot of passive impact articulation, if I can say it that way. Whenever there's a semesterly hackathon at growth, research just gets spontaneously quoted, even within teams that we didn't even know were that close to the results, or to the content even. Even extending as far as the farthest-reaching teams out there answering customer questions or solving any sort of funnel pain point, research is there, and research just gets quoted. So there's a lot of that passive articulation that's not even done by researchers in so many instances, and that gets us really giddy, really happy about the way in which there's a whole impact, a whole way of talking about research, that we didn't necessarily articulate or weren't so careful about. We love it when it gets out there and people name products or name initiatives based on insights, and say it was like that because a user said something or because this happened in a session. So, I think there's both: that active way, where you go show the impact, pair it with the numbers, and cite the initiatives.
But then there's also that really beautiful way in which it all happens kind of on its own and spreads around the teams organically. So, I think we've seen instances of...
Erin - 00:34:02: Both of those, kind of a triangulation, in a way.
JH - 00:34:06: Yeah, there you go.
Carlos - 00:34:09: Yeah, impact triangulation particularly. And let me just say something, maybe my last point on impact, which is that growth teams, I think, get really good at optimization, really good at following and diagnosing what's going on. But research, I think, can bring a really big impact particularly when teams are looking for new ways of doing things. It's like, 'listen, we've tried every which way, we've rolled out like six experiments on this, and we still can't get the metric to go up or down or whichever way we want it to go.' And I think research gets called in those instances as a more kind of inspirational figure. It's like, okay, what ideation can we get out of a round of research? What new angle, what new sort of strategy can we derive from the research? And so I think that's really powerful.
Erin - 00:35:03: Yeah, I do imagine a lot of growth research can skew evaluative. But yes, to your point, when you hit a ceiling in optimization, you need a new loop, you need a new funnel, you need a new segment, you need something new. And there's always something new; it's just figuring out what the right new thing is. So obviously research can be really helpful there. I think research and learning are maybe not synonyms, but they're damn close. So, what have been some of your biggest learnings and mistakes doing growth research, however you've gone about learning your lessons? What are some of the biggest ones that would be useful to folks, so they can learn the easy way?
Carlos - 00:35:48: Yeah, no, absolutely. So many, and again, especially after three years at a place like Nubank, it really feels like three decades' worth of learning. But there have been several. I would say the biggest one has been about collaboration with other functions. When we were setting up the research operations in a growth context, we arrived, obviously, with this mindset of, 'okay, let's roll it out, let's be specialists, let's be very clinical and very rigorous in what we do so that we show our value.' And I think that was great in helping establish that foundation. But it became very obvious very quickly that research in a growth context is very complex, and it requires, I think by definition, that it become more democratic, more accessible: to product folks, to business analyst folks, who might not be so used to the way we work or the way we like to do things. So bringing others along has been extremely important, something that I wish we had learned or implemented right from the get-go. It really made a big difference, not only in sourcing data for triangulation when we need it, but also in helping think through the analysis and the ideation, because they'll always bring more technical and more implementation knowledge than we can provide on our own. That was a big, big learning for us in the beginning. A mistake that also happened at the beginning: when facing growth research, it's really easy to reach for your quantitative tools first, because that's the way everyone else in the BU is working. That's sort of the big driver. And it was really powerful when we realized, like, listen, actually, let's take a dose of real good qual here. Let's see where that takes us.
And then we saw that, honestly, applying a dose of pure qual, dropping a little drop of pure qual into a bucket of quant, was just chemistry. It was just chemistry in so many ways. So I think understanding the pure value that a little bit of qualitative insight and finding can bring to a place that's already so quantitatively driven, that was a big lesson, and I think we keep doing that more and more. I know that might be intimidating. Lots of researchers struggle with this: okay, but my audience is so quantitative, or the environment I'm operating in is so like this, how can I just show up with just that? And a big learning was to own that, and to understand that the way in which you go out and do that detective work, and then the way in which you articulate that point of view, argue that point of view, and show how it is relevant for the decision being taken, can just be so mind-blowing. This happened even with an approach we were trying around fast sign-up, the idea that we could optimize it and make it as easy as possible for people to open a bank account. That project became very qualitative in nature, with tons of usability tests and tons of concept testing, and it made it all the way to the very top leadership forums because, again, we had collected tons of footage and tons of great takes from the users on it. It was entirely qualitative, with a relatively small sample size, but so much goodness came from it that it really unlocked the value that qualitative data can bring, even in a very quantitatively driven environment. I would say I also have learnings around recruitment. That has been something we have really had to constantly focus on.
Again, going back to the open sea metaphor, there are tons of things you can do to guarantee diverse user bases that will reflect the way people will go through your funnel, that will really show you how users will behave out there. So: learnings around qual versus quant, learnings about diversity in your recruitment to really reflect how the funnel will behave once it's out there in the real world, and then learnings about owning and being confident in what you bring as a researcher for growth and as a strategic partner in driving higher growth, I would say.
JH - 00:40:22: How do you get somebody comfortable doing that? So, in the qual versus quant example there: you have this group of people who are very quantitative, they're looking at all these charts, they're talking about statistical significance, confidence intervals, all these things, and you want to walk in there and be like, here are some clips, or I have some insights. I'd imagine that is intimidating for folks. Is it just go in there and do it and see what happens? Or is there a way of approaching it and sharing those insights in a way that the audience will be most receptive to? How does somebody break that barrier?
Carlos - 00:40:52: Yeah, I think through stories. I think through stories. Storytelling has been such a huge unlocker for us in delivering that data, and it's also allowed us to be more confident in that approach, in entering those rooms and delivering it. I think it has to do with multidisciplinary thinking and the composition of your team. Obviously, you need left-brain thinkers and right-brain thinkers, ideally hybrids whenever possible. But confidence comes from understanding both sides of that equation and then actually developing and crafting the narrative. We can do all we want collecting the data, setting up the slides, setting up the Miro boards, all of that stuff, but what matters is the crafting of the narrative that ties it all together, that takes the audience with you along the way, and then very clearly stating the point of view at the end. And again, I love the take that UX researchers should act more like counsel to our businesses. So articulating that POV, whatever it is based on, whether an extensive quantitative study or just a little usability study with a smaller qualitative sample, I think that's all great. Not one of them is more valuable than the other, if at the end of the day you can stitch together that POV in a convincing way, in a way that is logical but also respects the sensibilities of the data. I don't know how you become more confident. I think it's just doing it over and over and over again, trusting in the partners, trusting that everyone's here to do their best work and try to paint the picture as best as we can. So really, going back to the development of those relationships, the cultivation of those stakeholders is what also makes it easier and makes you more confident in the process.
Erin - 00:42:53: I think that's a great place to call it. Thank you so much, Carlos, for joining us today.
Carlos - 00:42:59: Thank you so much for having me, yeah.
JH - 00:43:01: It was a lot of fun. I appreciate all the metaphors and imagery that made it easy to follow.
Erin - 00:43:05: I've been in the ocean this whole time. We love a metaphor.
JH - 00:43:11: We do. Storytelling helps. It makes it memorable.
Erin - 00:43:17: Thanks for listening to Awkward Silences, brought to you by User Interviews.
JH - 00:43:22: Theme music by Fragile Gang.