Gathering customer insights, part three: learning without validating
Or, how to avoid building a bloated product.
Welcome to the third article of this series about gathering customer insights.
Part one explored the pitfalls of talking AT your customers. You visit customers, talk at them incessantly, never pause to listen, rarely ask questions, fear their feedback, and are blindsided when you get it.
Part two delved into the hubris of expertise: why we sometimes listen to our customers, and why we sometimes ignore them.
This installment, let’s call it 3-a, is about dedicating time to learning what customers need by asking good questions and paying close attention to their responses. You learn a lot about the problems your customers are trying to solve. A clear picture forms in your mind, solutions take shape, and you immediately act on the feedback and build like hell. Your product plan becomes a cornucopia of features fit to make a dinner table groan.
Except you forgot a crucial step: validation.
When you talk to a customer, how do you know any other customer wants what they are asking for? Figuring out feature applicability matures your product from customer-centric to market-driven. We’ll explore this in more detail this week.
What if your customers only tell you what they don’t like about your product? When all you know is what the customer doesn’t want, replacing one bad thing with another is easy. You have to learn what will delight them. But we are wired to take action on negative sentiment. We will explore that asymmetry and negativity bias in the next article.
Building bloat into your product
Product review season descended on our team like a skier falling off a mountain. I flew my team in quarterly to check in on their products, make plans for the coming quarter, and celebrate our successes. Six products in two days. It was going to be a whirlwind.
I worried about one product in particular. Sales were lagging, customer support requests were up, and the team regularly missed delivery deadlines. My strongest product manager helmed this product, so I was flummoxed. What was going on?
She presented her data, and nothing stood out as an obvious explanation, but there were some clues. The sales data showed proofs of concept were taking longer and converting less reliably. Customer support requests ambled across topics and features in no organized pattern. Technical debt consumed an increasing proportion of the backlog.
My product manager ran an ambitious customer development program in the previous quarter. She was thinking about two significant new initiatives for her product that would likely impact many, if not all, customers. She flew the world, campaigning tirelessly to get the data to test her hypotheses. She asked smart questions, and customers gave her indispensable feedback.
We reached the roadmap portion of her presentation, a now-next-later summary of the important work the team would do over the next 90-180 days. The problem smirked at us from the conference room’s monitor.
Her previous roadmap focused on only two themes. After her roadshow, two themes had blossomed into eight, each featuring up to a dozen near-term priorities.
“Help me understand,” I asked. “Most of these features seem orthogonal. What is the organizing principle across your themes?”
She blinked at me.
“This is what I learned on the roadshow. This is what customers want.”
“You’ve delivered a bunch of features for each of these eight themes already?”
“Yes.”
“So, that explains the tech debt.”
The problem came into focus. The PM had collected a lot of information from customers, but she’d missed a crucial step. Her questions were open-ended and optimized for discovery, which made sense at the outset of her journey. As the roadshow progressed, though, she kept asking the same open-ended questions without validating whether the things she heard from one customer were also important to other customers.
Validation and refinement tell you whether the customer in Istanbul running point of sale terminals in uncontrolled environments has the same needs as a telecommunications company in London trying to manage a fleet of service vehicles or a bank in Pittsburgh migrating their transaction fraud prevention system to a new cloud platform.
My PM learned a hard lesson. Each customer asked for a new feature that was, to them, business-critical. It’s hard to say no in those meetings. She ate meals with those people, listened to stories about their hardships, and imagined being their superhero. She convinced herself that each learning was equally important, and she was equally certain her team must deliver those features—all of them.
And that, my friends, is how you introduce bloat into your product.
Luckily, bloat is relatively easy to detect and remediate. Unless you’ve already said yes to a panoply of features, in which case you are in for some difficult conversations.
Worrying signs and a remedy
The POCs took longer because there was suddenly so much more product to show, and customers weren’t interested in most of it. The sales engineers and account executives were frustrated by the lack of traction, not helped by our intrepid product manager’s indefatigable feature enablement programs.
Our customer success counterparts openly theorized to the product manager that the support tickets were so diverse because the customer reporting an issue was often the only one using the feature.
What at first seemed a positive sign (a contained blast radius, with bugs affecting only one customer) turned out to be the harbinger of an entirely different flavor of doom.
With teams spread too thinly across features, engineering managers fretted about tech debt and the sudden emergence of single-person dependencies. Often, only one developer understood a part of the code base well enough to make changes.
A razor-sharp product had morphed into a bloated mess, a mile wide and an inch deep. We were solving many of our customers' pressing problems, none of them well.
How to remedy this situation? My PM needed to run the missing validation step of her roadshow.
We set her a goal: validate that a problem described by one customer is shared by others. My benchmark for this is ten. If you can find ten customers with the same problem, they represent a broader market need. If you find ten, there are likely fifty, but what you learn from the other forty only marginally improves your chances of success, and most of us don’t have that money to burn. Stop once you’re comfortable you’ve matched a pattern.
Achieving market-driven status
When we reviewed the data later in the year, things looked better. POC success rates had increased, customer success inquiries had stabilized, revenue numbers were back on track, tech debt was under control, and in-product adoption and utilization improved.
Our PM took her data and created a smartly phrased survey designed to pinpoint the likelihood of customers adopting certain features. Then, she culled her list back down to three themes. Not deprioritized, mind you: she outright rejected those other features as interesting but not relevant to her product.
The product’s backlog and roadmap recovered their razor-sharp focus.
The relationship between features and workstreams was apparent. She could easily relate the roadmap to our broader product strategy and the portfolio’s long-term vision. During 1:1s, we discussed how she’d reconnected her day job to the mission. Like all product managers, she was still juggling knives, but the number she kept in the air was manageable.
Then, we executed the last step of the discover-validate-build cycle. We divided the list of unhappy customers, and I helped deliver the bad news.
Up next
Up next, we tackle negativity bias. What happens when you fall prey to the asymmetry of negative thinking?