How (I failed) to validate a product
A reflection on the dumbest thing I did on the job and what I did to turn things around
I wasted six months of my team’s time building something that worked.
At the start of the project, the customer pain felt real. I believed our engineering was solid. The improvement shipped.
But nobody really cared.
That was the painful lesson: solving a real problem is not the same as solving the right problem.
We had listened to the loudest complaint, but we had not understood the full user journey. And that difference cost us six months.
How is that possible? Didn’t you do any due diligence?
The squeaky wheel gets the grease
A complaining customer. "Perfect!" I thought. "We have a clear-cut problem and a person to provide us with requirements and feedback throughout the implementation."
"The tests are running super slow because of the slowness in playing back the test data. This bug is super annoying - it adds friction to our work."
Let me explain why testing was crucial in my project.
When you design hardware in a discipline like Augmented Reality, where the device aims to "understand" the real world, you need to test it with data that resembles the environments it will be used in. For instance, the data could be a first-person video of someone looking at an object in an office room.
That data is piped through the system to produce a "PASS" or "FAIL" verdict. Unfortunately for our client, it was packaged in a way that made it really difficult for the testing framework to process efficiently.
And you need a lot of diverse data to frequently "probe" the product for defects and keep its quality high.
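To make that concrete, here is a minimal, self-contained sketch of what such a playback test could look like, written in the spirit of the C++ framework mentioned later in this story. Every name in it (Frame, PerceptionSystem, runPlaybackTest) is hypothetical - the real framework's API never appears in this article.

```cpp
#include <iostream>
#include <string>
#include <vector>

// Toy stand-in for one decoded video frame.
struct Frame {
    double timestamp;      // seconds since recording start
    bool containsObject;   // placeholder for real pixel data
};

enum class Verdict { PASS, FAIL };

// Pretend to load a recorded first-person video from disk.
// In the story, this load/playback step is the slow part.
std::vector<Frame> loadDataset(const std::string& path) {
    std::cout << "loading " << path << "...\n";
    return {{0.0, false}, {0.5, true}, {1.0, true}};
}

// Toy perception system under test: it "recognizes" the object
// once enough frames contain it.
class PerceptionSystem {
public:
    void consume(const Frame& f) { if (f.containsObject) ++hits_; }
    bool recognizedObject() const { return hits_ >= 2; }
private:
    int hits_ = 0;
};

Verdict runPlaybackTest(const std::string& datasetPath) {
    PerceptionSystem system;
    for (const Frame& frame : loadDataset(datasetPath)) {
        system.consume(frame);  // pipe the data through the system
    }
    return system.recognizedObject() ? Verdict::PASS : Verdict::FAIL;
}

int main() {
    Verdict v = runPlaybackTest("office_room_walkthrough.rec");
    std::cout << (v == Verdict::PASS ? "PASS" : "FAIL") << "\n";
}
```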
“We’ll solve it for you”, I joyfully told the client.
The request satisfied me: here was a very clear pain point that I could cash in on. My thought process was simple: we have a (loudly) complaining client, so we analyze the issue, solve it, deliver the fix, and then sip champagne.
Lesson #1: Working on a misunderstood problem means doing ‘busy work’
Hold on - but isn’t the problem already defined?
The customer has a problem with slow tests; fixing it sounds like obvious added value. But you can't understand the problem from the customer's perspective alone - you need to put in the rigor and work to paint the challenge from the system's point of view, too.
The test framework was used in two scenarios - off-device and on-device.
My team principally focused on the "off-device" use case, which offered a quick and easy way to schedule a test run - as opposed to the "on-device" case, which required the engineer to find a physical device, boot and configure it, upload the feature they were working on, and only then run the test.
We should’ve stopped there.
The logical decision at the time was "off-device", because it seemed easier to address. "On-device" support was punted because hardware limitations barred it from being incorporated.
We understood only a limited part of the problem, not the entire user journey - and the journey was critical to uncovering the pain points our customers were experiencing.
In reality, the "on-device" variant was the actual problem. The users had to go through all the hassle of preparing the hardware for a test, and the slowness of the testing framework piled on the frustration. It was experienced; it was visceral.
Misinterpreting the actual issue had consequences:
The pain point persisted - the "off-device" tests were executed automatically with no human in the loop, meaning there was no slowness to "experience";
Six months of work yielded no visible user satisfaction, which demotivated us.
It was just busy work that scratched my “another feature delivered” itch.
Lesson #2: It’s bad business to serve just one customer out of many
If you're in the business of selling apples, will you make more money selling one apple to a single client, or one apple to each of two clients?
The answer is obvious - but I took another path.
The test framework we set out to improve was used in different ways by different clients - they liked to write and execute their own custom scenarios.
"The datasets that I'm running are pretty short, so I'm not impacted that much by this problem", we heard from another customer when we offered to roll out the improvement to his team.
“We’re mainly running the off-device tests and we’re seeing 16% speed improvement - not bad by any means”, said a developer representing yet another client.
A dangerous pattern was emerging - client after client was, politely, voicing indifference.
We had worked on and delivered a fix that not only failed to serve the critical use case of the one loudly complaining client (because it didn't work on-device), but that the other clients simply didn't care about.
It didn’t bring them any value.
What was missing: thorough due diligence on our entire audience, not just the subset represented by a single client. A cardinal mistake for a business offering its services.
Lesson #3: Inspecting your own results helps you release a better product
The build-inspect-learn loop is the flywheel of self-improvement, and it helps the product team identify where to improve next.
"Let's try to improve the off-device test execution for as many teams as possible", we decided - and we concluded the project shortly after that.
The six-month period took a lot of energy out of us. We tackled complexities as they arose, managed relationships with our clients, and aggressively cut scope wherever possible to avoid going down a rabbit hole.
But it was a time of learning, too - learning that needed to be revisited and distilled into concrete, specific action items that would propel us toward more fruitful challenges.
We reviewed everything we did: the added support for parallelism in data ingestion (sketched below), the optimizations for streaming compressed data more quickly, and the many small improvements to the housekeeping code.
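For flavor, here is a rough illustration of the parallel-ingestion idea - not the actual framework code, and every name and number below is invented. The point is simply that independent chunks of recorded data can be decoded on separate threads instead of one.

```cpp
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

// Toy stand-in for decoding one compressed chunk of test data.
// Returns the number of frames "decoded".
int decodeChunk(const std::vector<int>& chunk) {
    return std::accumulate(chunk.begin(), chunk.end(), 0);
}

int main() {
    // Pretend each inner vector is one compressed chunk on disk.
    std::vector<std::vector<int>> chunks = {{1, 2}, {3, 4}, {5, 6}, {7, 8}};

    std::vector<int> framesPerChunk(chunks.size(), 0);
    std::vector<std::thread> workers;

    // One worker per chunk: each writes only to its own slot,
    // so no locking is needed.
    for (size_t i = 0; i < chunks.size(); ++i) {
        workers.emplace_back([&, i] { framesPerChunk[i] = decodeChunk(chunks[i]); });
    }
    for (auto& w : workers) w.join();

    int total = std::accumulate(framesPerChunk.begin(), framesPerChunk.end(), 0);
    std::cout << "decoded " << total << " frames across "
              << chunks.size() << " parallel workers\n";
}
```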
That wasn't yet the era of agentic AI, mind you - most of the software engineers on the team had no professional experience with C++, so it was a huge jump in terms of tooling.
There was a tremendous boost in self-confidence and a belief that we could now take on any problem and overcome it quickly.
It also made everyone aware of how critical it is to think outside the "software" box, given that the biggest performance improvements in our testing framework would have come from understanding the entire user journey.
Build, inspect, but foremost - learn to succeed
In the ensuing months, we went on to apply our newly discovered strengths.
We went on to deliver a feature that saved thousands of hours for the users of our testing frameworks. This time we went the distance: we compiled a list of thoroughly investigated problems and force-ranked each one by the number of customers it impacted (see the sketch after this list).
By compounding the time saved across the multiple clients we served, we obtained savings that would have been unattainable without:
fully understanding the problem;
identifying the customers who would benefit from the fix;
keeping the learning feedback loop short and always closed.
Ultimately, time saved is money saved for the organization.
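As promised above, here is a toy version of that force-ranking step. The Problem struct and the sample backlog are invented for illustration; the real list was, of course, much longer.

```cpp
#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

struct Problem {
    std::string description;
    int impactedCustomers;  // how many client teams hit this pain point
};

int main() {
    std::vector<Problem> backlog = {
        {"slow on-device data playback", 7},
        {"flaky off-device scheduling", 2},
        {"verbose test logs", 4},
    };

    // Force-rank: the most widely felt problem comes first.
    std::sort(backlog.begin(), backlog.end(),
              [](const Problem& a, const Problem& b) {
                  return a.impactedCustomers > b.impactedCustomers;
              });

    for (const Problem& p : backlog) {
        std::cout << p.impactedCustomers << " customers: "
                  << p.description << "\n";
    }
}
```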
It was the experience that taught me that even as a manager in a large company, you have to think like a small-business owner - operating as if your livelihood depended not only on providing a good level of service, but on knowing your clients inside out, along with the problems they face every day.
Before you build, learn where to look
The story above is one version of a mistake many builders make: we fall in love with the solution before we fully understand the problem. That is expensive inside a company.
It is even more expensive when you are building something of your own.
Barbara and I created USER-DRIVEN BUSINESS IDEAS to help you avoid that trap. It is a free six-lesson email mini-course for busy professionals, creators, consultants, coaches, and aspiring founders who want to start a business, but feel stuck at the idea stage.
The course teaches a simple process for finding real customer signals, spotting repeated frustrations, asking better questions, and shaping what you learn into a business idea worth testing.
The goal is not to find the perfect idea - the goal is to stop guessing.
Join the mini-course here: https://yourbookhub.aweb.page/discover-idea-building



