September 26, 2019
A detailed look at the earliest days of a startup in transition.
We were simultaneously testing the viability of 8–10 different product concepts, and our riskiest assumption often boiled down to:
Is [insert product] something people actually want?
To answer that question, we talked to a lot of people. We started with friends and family, but we exhausted our personal networks pretty quickly. So, then we had to find strangers who were willing to give us feedback. We approached random people in coffee shops, went to numerous hotel lobbies, and got kicked off Airbnb after we messaged every host in Boston to see if they would talk to us about our “Airbnb for storage” idea. We even bought refundable airline tickets and approached people at their gates while they were waiting for their flights.
That’s when we realized that finding people to participate in user research studies sucks. We would have gladly paid for a tool to simplify the process of finding and scheduling participants. If we — three broke startup founders (without a startup) — were willing to open up our wallets, maybe some other people would too…
We had two overarching goals for our user research: confirm that recruiting and scheduling participants was a real pain point, and confirm that people would actually pay us to solve it. We used generative user interviews for the first goal and built an MVP for the second.
Our first step was determining what would count as confirmation. One lesson we learned from our first failed startup was that it’s impossible to validate (or invalidate) an assumption unless you set clear success and failure criteria up front. What if we found one person who said, “yeah, I guess that’s annoying”? Would that be enough to “validate” the assumption? What if we found 10 people who said the same thing? The conclusion isn’t immediately obvious, which is exactly why the criteria have to be set in advance.
We determined that if more than 50% of the people we interviewed brought up “participant recruitment” or “scheduling” unprompted, we’d call it a success. This was the best way to keep our internal biases out of the analysis. (If we hadn’t done this, I’d be building the Airbnb for storage, Dennis would be protecting people from Craigslist scammers, and Bob would be building digital aquariums.)
User interviews are valuable tools, so we wanted to take full advantage of the time people were spending with us. In addition to confirming the viability of our idea, we wanted to find out how researchers were currently solving this problem, where we could find them when it came time to sell, and who the target user would be.
There are a ton of great tips for developing user research questions, such as in our guide to generative interviews. But, for the sake of brevity, the main one for us was to not let interviewees know what we were thinking of building.
One of the universe’s rules is that if you tell someone your startup idea, they’ll tell you they like it, which invalidates everything they say afterward. So it was important for us not to ask leading questions or introduce any bias. Only once an interviewee brought up recruiting on their own did we feel comfortable asking about their process, tools, and so on.
We then talked to 10 researchers, and 7 of them naturally brought up recruitment or scheduling as one of their top 3 pain points. 70% > 50% = success!
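As a minimal sketch of the pass/fail logic above (the function name and structure are ours for illustration, not a tool we actually used), the pre-registered check amounts to:

```python
def criterion_met(mentions: int, total_interviews: int, threshold: float = 0.5) -> bool:
    """Return True if the share of interviewees who raised the pain point
    unprompted exceeds the success threshold set before the interviews."""
    return mentions / total_interviews > threshold

# Our result: 7 of 10 researchers brought up recruitment or scheduling.
print(criterion_met(7, 10))  # True (0.7 > 0.5)
```

The important part isn’t the arithmetic, it’s that the threshold is fixed before you see any results, so you can’t talk yourself into a pass afterward.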
We also had a bunch of qualitative insights.
First of all, companies were hacking together solutions using tools such as Google Forms or SurveyMonkey for screener surveys, Calendly or YouCanBookMe for scheduling, and Amazon or Visa gift cards for payment, while managing all the emails themselves. Companies gluing together other tools is always a good sign that there is a need in the market.
Second, huge companies such as Zipcar were primarily using Craigslist to find participants. Craigslist is not optimized for specific use cases at all, so if companies are using Craigslist, it is another sign that there’s a large need in the market. (We’re not the only ones with this insight).
After our user interviews, we felt confident that we should build an MVP for this idea and try to sell it. Luckily, the insights from those interviews gave us the instructions on how to do it.
Just like with our user interviews, we needed a metric for success. We decided we’d count it as a success if, within the first 3 weeks, we could get 4 people to pay us to manage the process of recruiting and scheduling participants.
This was the easy part: we built a basic Google Form asking researchers who they were looking for and other key information. Then we followed their own existing workflow and glued together our MVP, using Google Forms for the screener survey, YouCanBookMe to schedule participants, and Amazon gift cards to pay them.
Finally, we were at the moment of truth. Luckily, from our user interviews we knew that companies were posting on forums like Reddit and Craigslist to find participants. So for 3 weeks we scoured the “Volunteers” and “Gigs” sections of Craigslist and emailed people who were looking for participants, offering to handle it for them.
We were able to find 4 paying clients!
After validating our assumptions, we knew it was time to start working on User Interviews full-time. Our CTO (Bob) cleaned up the website & replaced the Google form with one that made us seem slightly more legit. Meanwhile, we continued selling our recruitment / scheduling service to market researchers and UX researchers. As our customer base grew, we began automating parts of the process and productizing our offering. And over the next 2 years, we were able to convince researchers at some pretty cool companies—Lyft, Spotify, Pinterest, Groupon, maybe your company?—to start working with us.
We’re just getting started, and we’ll keep making user interviews and user research part of our process for building User Interviews as our product grows and matures. You can read about our continued adventures here on our blog!
Want to contribute to User Interviews content? Here’s how.
Basel Fakhoury is a co-founder and the COO of User Interviews. Sometimes he writes things, too.