Getting beyond gimmicky design thinking tutorials + Beavis and Butthead.
I can't recall a time when a participant created a warzone during a lab session. But I have been in situations where working with facilitators was a total failure. Once (not recently), a graphic designer I was working with kept asking very leading questions about a registration workflow we were testing for potential friction points and dead ends. Eventually, this designer became frustrated with the participant's confusion/defiance and muttered, audibly, something to the effect of "just use the *expletive* internet like a normal person." AWKWARD.
Design thinking needs to mature from some of the gimmicks that have been around for years in order to gain more respect and participation from executive-level stakeholders. Trying to get CMOs and Tech VPs to sketch cartoon personas and create paper prototypes is tricky. Unfortunately, I don't think I have a good answer as to how to make exercises more inclusive and efficient. But as the culture and attitudes within upper management evolve, the "Design Thinking AND YOU" exercises will be less necessary.
Sometimes you can’t find 18-20 people from a specific demographic to participate in sessions. In those cases, I may use just a handful of responses to validate what is already my gut feeling.
Not letting go of biases toward one idea or design.
The biggest lesson I've learned is how to determine when to use quantitative analysis vs. user testing to prove a hypothesis. It's easy for anyone to suggest that an idea warrants testing, but it takes some critical thinking to determine which test approach is appropriate for what you're trying to learn. Know the right scenarios in which an A/B or limited/beta release will provide you the results you're looking for.
Typically, anything that I consider impactful to the fundamental intentions or goals of a power user OR has the potential to damage your company's core business model is worth bringing into the lab. If the change you're considering would mostly affect the larger, more commoditized audience, a multivariate test will often collect enough data to make the decision for you.
I'll review my scripts and tasks over and over again, typically fine-tuning things right up until the start of the session. Make sure all of your visual tools are functioning properly.
Working prototypes tend to yield better results than static images. I've also found that writing notes tends to make participants feel nervous about the responses they give, so I lean toward using audio recordings.
Well, I really appreciated the ice breakers you threw at me at the beginning of the interview. They're a very effective warm-up exercise.
I typically use the standard email digest and slides for the wider audience, but for key partners and stakeholders I like to keep the format more of a live discussion or conversation. This helps prevent a situation where feedback comes in the form of email bombs going back and forth.
If you'd like to be interviewed for Other Side of the Table, reach out to email@example.com.
VP, Growth & Marketing
Left brained, right brained. Customer and user advocate. Writer and editor. Lifelong learner. Strong opinions, weakly held.