Knowing what done looks like.
In January 2024, OpenAI unleashed the GPT Store on the world. The store is supposed to be a great way to find customized GPTs for various tasks. At least, that’s the hypothesis. In reality, the oft-delayed product is quite obviously not done.
The team focused on building a community and publishing as many custom-developed GPTs as possible. But there’s something terribly wrong with the experience. There is no way to understand the reputation of the GPT’s developer; in some cases, it’s not even possible to learn the developer’s identity. Making matters worse, you can copy anyone’s GPT and publish it as your own. Yikes.
The store has a “Writing” category, and the first listing is a GPT called Write For Me, created by a developer known only as puzzle.today. Follow the link provided on the GPT’s page and you’re greeted with a lovely error message.
Who is this developer? How trustworthy are they? How much confidence should I have in using this GPT?
Launching a store with no way to discover the identity or reputation of a community developer is bad on a fundamental level, suggesting the OpenAI product team was either forced to release something they knew wasn’t ready (most likely) or that OpenAI hasn’t hired any experienced product managers to help them build this Store product (also likely, if slightly less so).
The saga prompted me to reflect on what it means to be “done” when shipping software.
Have we had enough of terrible software?
I’m not fond of MVPs. I read The Lean Startup, the book that popularized the idea that shipping unfinished software is necessary to build a successful startup. I listened to the software makers who encouraged entrepreneurs to “move fast and break things”. I learned agile development practices and embraced iterative delivery. I came to love small batch sizes. This product management journey is not unique. Unfortunately, what happened next was that a generation of software companies blindly followed this advice and consequently shipped a lot of objectively terrible software.
During recent travels, I used an airline’s website to purchase a carry-on bag option, as my bag’s weight was well over their 8kg allowance. I selected my desired option and tapped the partially hidden (on my mobile phone, anyway) “Continue” button, only for the site to repeatedly redirect me to confirm the addition of an unaccompanied minor to my itinerary. As best I could tell, there was no way to purchase the carry-on option. I tried three times before giving up.
A past employer forced me to use a conference room scheduling app that only displayed each room’s available times in Coordinated Universal Time, aka UTC. There was no indication in the app that the times displayed were UTC. Pacific Standard Time is eight hours behind UTC, and my conference room, safely ensconced in Pacific Time, appeared unbooked.
I scheduled a meeting for a Monday at 9 am. On the appointed day, I arrived at a conference room full of someone else’s meeting, blissfully unaware I’d reserved the room for 1 am PT. I pleaded my case with the room’s interloper for an embarrassingly long time until I ultimately moved my meeting to the building’s lobby. You see, all the rooms were booked.
I’ve never built a conference room scheduling app. Maybe I should; most are terrible. But I suspect that one of the VERY FIRST THINGS you’d build is a feature to match the conference room’s displayed timezone to its physical location. I struggle to understand how any team could declare the product “done” and ship it in that state.
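That fix is small. A minimal sketch of the idea, using Python’s standard zoneinfo module (the room identifiers and the room-to-timezone mapping here are hypothetical):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Hypothetical mapping from a room's identifier to the IANA timezone
# of the building it physically sits in.
ROOM_TIMEZONES = {
    "sf-3-elm": "America/Los_Angeles",
    "nyc-7-hudson": "America/New_York",
}

def local_booking_time(room_id: str, utc_time: datetime) -> datetime:
    """Convert a UTC booking slot to the room's local wall-clock time."""
    tz = ZoneInfo(ROOM_TIMEZONES[room_id])
    return utc_time.astimezone(tz)

# A slot stored as 17:00 UTC should display as 9 am in a Pacific Time room,
# not as an unbooked-looking 17:00.
slot = datetime(2024, 1, 8, 17, 0, tzinfo=timezone.utc)
print(local_booking_time("sf-3-elm", slot).strftime("%H:%M %Z"))  # → 09:00 PST
```

Storing slots in UTC is fine; the point is that the display layer must translate to the room’s local time before anyone books a 1 am meeting by accident.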
A generation of software users, constantly frustrated by such bumbling, believe themselves technologically inferior and blame themselves when software doesn’t work.
Building software iteratively.
Building software using iterative techniques is arguably the best way to ship quality software. Small batch sizes protect your product from bad designs or poorly implemented features and let you quickly correct the course without wasting too much time, energy, and money. I love the approach and have successfully built software using the technique.
Building iteratively does require deliberate thinking about the definition of done. Most teams consider what done looks like for an individual story. I encourage product managers to include a “we will know we are done when…” statement in the user story template in whatever backlog management tool we use.
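A hypothetical version of that template (the story and acceptance criteria here are invented for illustration) might look like:

```text
Title: Book a conference room from the room’s door display

As an employee,
I want to reserve an open room directly from its door display,
so that I can grab a space for an impromptu meeting.

We will know we are done when:
- A user can book the room in three taps or fewer.
- The booking appears on the room’s calendar within five seconds.
- All times are displayed in the room’s local timezone.
```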
Establishing a definition of “done” is less common at the product level.
We sometimes call a collection of stories or a series of releases a roadmap. The product manager draws an arbitrary line below a collection of stories and declares that once the story just above the line is complete, that phase of the product is done. Drawing the line often requires lengthy negotiations between the product manager and the engineering lead. Typically, the line represents the team’s capacity, not the product’s completion.
User journey mapping
So how do you know what done looks like? User journey mapping is an excellent way for a team to visualize the answer. A user journey map imagines the first touchpoint with your user and carries her across an experience until she realizes the value of your product. The journey often shows you precisely what your product is great at doing.
When I do this exercise, I imagine discovering the company and product for the first time. Perhaps I see a banner ad on LinkedIn. I find my way to the company website, where I am encouraged to sign up for a product trial. Then, I am guided through various product tasks, the elegance and simplicity leaving me with the epiphany that I’ve just used the most remarkable enterprise software product ever developed. I am amazed and overwhelmed by the joy of knowing my life is about to change fundamentally.
User journey mapping forces you to think in a granular, systematic way about the time it takes for your customer to experience value. You take each step of the journey you hope your customers will someday take. When you see the steps through the user’s eyes, you suddenly have the power to build, sequence, and prioritize your product’s backlog. Fluff becomes obvious, allowing you to cut unnecessary items from the team’s workload. The efficiency this process creates is staggering the first time you experience it.
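One way to make that concrete is to treat the journey as an ordered list of steps, each tied to the features a user needs to complete it. This sketch is purely illustrative (the steps and feature names are invented), but it shows how the map itself becomes the backlog’s sequence:

```python
# A hypothetical user journey: each step pairs what the user experiences
# with the backlog features that must exist for that step to work.
journey = [
    ("See the banner ad and visit the site", ["landing page"]),
    ("Sign up for a product trial", ["signup form", "trial provisioning"]),
    ("Complete the first core task", ["task editor", "onboarding hints"]),
    ("Realize the product's value", ["results dashboard"]),
]

# "Done" for this phase is every feature the journey touches, in journey
# order. Anything in the backlog that isn't listed here is candidate fluff.
required = [feature for _, features in journey for feature in features]
print(required)
```

Walking the steps in order gives the build sequence for free, and diffing `required` against the full backlog surfaces the fluff worth cutting.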
Visualizing done
When you do the user journey mapping exercise, you don’t have to guess what “done” looks like for your product. Once you’ve implemented the features required to complete the experience, you are done, at least with that phase of your product’s lifecycle. You can plan for whatever milestone you use–I like releases even though that term often sparks vigorous debate about what it means to be Agile.
Unfortunately, it took me a long time to find user journey mapping, refine it, and incorporate it into my toolset. Before that, I wasted effort guessing what “done” looked like and inventing arbitrary milestones. Now, I use the technique all the time.