The Limiting Factor

In high school chemistry, my excellent instructor, Mr. Gleich, drilled into us the concept of the limiting factor. It was so counterintuitive. Why wouldn’t speeding up any part of the reaction speed up the overall reaction? It made no sense to me and many others, but he drilled it in. He used analogies, exercises, and demonstrations… eventually, even though it’s counterintuitive, it was clear: only the limiting factor matters.

As a newbie PM, I instantly recognized that the same math applied in many situations I faced. I had a product or milestone, a chain of reactions turning raw materials into a final result over time. I wanted to know how long the reaction would take, and where to apply the catalyst. And I still think this is the key to optimizing any process. Reducing pain, playing politics, and playing favorites have their place in deciding where to put your energy, but the only thing that will really make a process faster is ruthless focus on the limiting factor, enabled by a clear understanding of the rate at each step. You need to understand what the slowest, least efficient step is, but also watch that optimizations to that step are not offset elsewhere in the process. Although a full model is helpful, action anywhere outside the limiting factor is not. Know the whole system, but act on one factor at a time.

To give a concrete example, say you have an enterprise product with a per-client implementation process, and you want to increase the number of customers you can onboard per month. If you can make the install process faster and the IT training smoother, but the real blocker is legal sign-off on the contracts, your revenue will not really budge. If legal can only approve 4 deals a month, the number of trainings you can do is irrelevant to your goal. Of course, after you get legal to pick up the pace, you might find those trainings are limiting you after all, but that is what iteration is for.
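
To make the math concrete, here’s a minimal sketch of that onboarding pipeline as a chain of rates. The stage names and monthly capacities are hypothetical, chosen only to show the shape of the analysis: throughput is the minimum stage rate, and only raising that minimum moves the number.

```python
def throughput(stages: dict[str, float]) -> float:
    """Deals per month the whole pipeline can complete: the minimum stage rate."""
    return min(stages.values())

def limiting_factor(stages: dict[str, float]) -> str:
    """The stage whose rate caps the whole pipeline."""
    return min(stages, key=stages.get)

# Hypothetical capacities, in deals per month.
stages = {"install": 12, "it_training": 8, "legal_signoff": 4}

print(limiting_factor(stages), throughput(stages))        # legal_signoff 4

# Doubling a non-limiting stage changes nothing:
print(throughput({**stages, "it_training": 16}))          # still 4

# Acting on the limiting factor moves it -- and reveals the next one:
stages["legal_signoff"] = 10
print(limiting_factor(stages), throughput(stages))        # it_training 8
```

Trivial as the model is, writing the rates down forces the right question: which step actually caps the output, and does your proposed optimization touch it?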

That’s a simplistic example, but the approach applies to things you might not normally think of as “processes,” yet can be modeled and attacked the same way. For example, revenue is actually the residue of a reaction: your product, marketing, and sales meeting the market atmosphere. If the reaction is slow, the result is low revenue per unit of time. If revenue is slow, or inert, limiting-factor analysis can identify what part of your hoped-for, theoretical market reaction is not happening. Marketing message not landing? Missing a key feature? Pricing wrong? Lay out your factors and their rates, then act ruthlessly to improve the limiting one.
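
In a multiplicative funnel, the “slowest step” is the rate with the most headroom rather than a simple minimum, but laying out the factors is the same exercise. Here’s one way to do it; the factor names, current rates, and benchmark rates are all hypothetical.

```python
# Hypothetical funnel factors: measured rate vs. a rate you believe is achievable.
factors = {
    "visit_to_signup": {"current": 0.040, "plausible": 0.050},
    "signup_to_trial": {"current": 0.60,  "plausible": 0.65},
    "trial_to_paid":   {"current": 0.05,  "plausible": 0.20},  # message not landing?
}

def headroom(name: str) -> float:
    """Revenue multiplier if this factor reached its plausible rate."""
    f = factors[name]
    return f["plausible"] / f["current"]

for name in sorted(factors, key=headroom, reverse=True):
    print(f"{name}: {headroom(name):.1f}x")
# trial_to_paid: 4.0x  <- the limiting factor; the others offer roughly 1.1-1.2x
```

Because the rates multiply, the 4x of headroom in trial_to_paid dwarfs marginal gains anywhere else; that’s where the ruthless focus goes.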


Data Games

One of my least favorite kinds of PM compatriot is the one who insists they are purely data driven. Like all zealots, they’re often tiresome bores, and while there is a lot of truth in what they advocate, their smugness and certainty are off-putting. But, of course, numeric measurement is useful! So here are a few downsides of shallow quantitative methods, as a dose of skepticism for all you quant advocates.

I think the greatest downfall of data-driven approaches is picking the wrong number to build a goal around. If you work with smart people, even the best intentioned will game a simple numerical target selected without care and consultation. In one of my favorite stories about this, a company decided to set sales quotas by the weight of items shipped from the warehouse. After months of record-setting sales, accounting realized the weights shipped didn’t match the actual revenue on the invoices. The salespeople had paid off the warehouse guys to add bricks to the shipments.

Another pitfall is not actually understanding whether the KPI you are tracking really means what you think it does. If a number is a “proxy” or “simplification” of some more complex behavior, beware! Always understand what the numbers really represent, not what they have come to represent to the team, especially when working with a mature product. What events trigger them? Don’t assume something labeled “conversion” is actually when revenue is generated. Look past the numbers to the underlying activity.
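
As a sketch of what “look past the numbers” can mean in practice, here’s a toy event log where a metric labeled “conversion” fires at trial start rather than at payment. The event names and data are invented for illustration.

```python
from collections import Counter

# Hypothetical raw event log; every name and field here is made up.
events = [
    {"user": "a", "event": "conversion"},         # fires when a trial starts
    {"user": "b", "event": "conversion"},
    {"user": "b", "event": "payment_completed"},  # the actual revenue moment
    {"user": "c", "event": "conversion"},
]

counts = Counter(e["event"] for e in events)
print(counts["conversion"], counts["payment_completed"])  # 3 vs 1
# Anyone goaling on "conversion" here is tracking trial starts, not revenue.
```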

The next step is to move beyond description to analysis. Always be looking for the story: why a number is what it is, not just what it is. What does it mean? Being data driven doesn’t mean putting graphs in your decks and having dashboards. It means using quantitative data to make decisions and to understand how people use your product. Explore your data. Don’t just refute or prove theories with it; discover hidden failure points and identify new opportunities. If you’re lucky enough to have a good analytics system, get immersed.

And woe, beware the data-driven emergency! Many, many times I have seen an urgent alarm raised based on a single chart or number. Something is down, something is up, this looks way off. I would estimate that 90% of these end in one of two ways: a bug in the data, or a misunderstanding or change of definition. Analytics are tricky systems, and events and calls can easily move or change implementations without everyone knowing. Before you freak out, eliminate these options.

Numerical data is not inherently superior or more honest. It’s as easy to lie with numbers as with stories, and as easy to have bad analysis as bad taste. Always be scientific with data when you need to know the truth: take measurements without prejudice and be clear-eyed in your analysis. But be lawyerly with data when you use it with others: pick and choose the evidence that best supports you, not lying, but treating the data as secondary to the narrative. In the end, I believe, data is there to build and support the narrative, not t’other way round.