UPDATE – The proof is flawed. I have found an example that breaks the model. See update section at the bottom of the post.
I love applying Real Option thinking to things to see what pops out. This is an example of how Real Options makes proving Theory of Constraints really simple.
A few weeks ago Marc Burgauer (@Somesheep) introduced us to Klaus Leopold’s excellent boat game to demonstrate the benefits of limiting WIP and its impact on lead time. The game is elegantly simple: a line of people take turns making origami folds to turn a sheet of paper into a paper boat. Each person does the same one or two folds each time. The last person records when each boat arrives, measured from the start of the exercise. We had a bit of a challenge getting the point across, i.e. that the rate of completing things remains the same, but the lead time becomes stable and predictable. After some discussion, Marc, Nick Poulton and I decided to modify the exercise to hit people over the head with the results. We modified the game to record the time at which each boat was started as well as the time at which it was finished. Then we plotted a Cumulative Flow Diagram of the start and finish times…
When you push work into the system, you build up work in progress (inventory) and the lead time increases.
Next you introduce single piece flow and plot the start and finish times…
This results in a fixed amount of work in progress (inventory) and a fixed lead time.
IT DOES NOT INCREASE THE RATE AT WHICH VALUE IS DELIVERED! THAT IS FIXED.
Any additional work (investment) entered into the system above the finish rate is waste. It simply builds inventory. The area in the triangle formed by the two start lines (red and blue) below is pure waste. It is investment that is trapped in the system that is not generating a return.
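The push-versus-pull behaviour is easy to reproduce in a few lines of code. This is a minimal sketch, not the boat game itself: it assumes a linear line of steps with unlimited queues between them, and the step times and arrival intervals are hypothetical numbers chosen to make the effect visible.

```python
def lead_times(step_times, arrival_interval, n_items=50):
    """Items queue in front of each step; each step processes one item
    at a time. Returns the lead time (finish minus entry) of each item."""
    finish = [0.0] * len(step_times)   # when each step last becomes free
    result = []
    for k in range(n_items):
        entry = k * arrival_interval   # item k is pushed in at this time
        ready = entry
        for i, t in enumerate(step_times):
            begin = max(ready, finish[i])  # wait until the step is free
            finish[i] = begin + t
            ready = finish[i]
        result.append(ready - entry)   # lead time for item k
    return result

# Hypothetical step times in seconds; the constraint is the 5-second step.
push = lead_times([2, 5, 3], arrival_interval=2)  # pushing faster than the constraint
pull = lead_times([2, 5, 3], arrival_interval=5)  # starting at the constraint's rate
```

With the 2-second arrival interval the lead time grows without bound as inventory piles up in front of the constraint; with the 5-second interval it stays fixed. In both cases a boat finishes every 5 seconds, which is exactly the point: pushing harder changes the lead time, not the rate of delivery.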
I plotted the time to complete each item of work as it moved through the process. To do this I considered that there are two ways to express the rate of a process:
- The number of widgets in a fixed time.
- The amount of time a widget takes to process.
We tend to use the number of widgets in a fixed time: it is easier to discard the time element and easier to compare one rate to another. In software development we refer to velocity, which is the number of widgets in a fixed time. Plotting the graph of each work item is easier when you use the amount of time a widget takes to process.
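The two expressions are simply reciprocals of each other, which is why switching between them is painless (the numbers here are hypothetical):

```python
# Hypothetical rate: one boat every 20 seconds.
seconds_per_widget = 20
widgets_per_minute = 60 / seconds_per_widget   # 3.0 – the same rate, per fixed time
```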
One thing that jumps out at you is that the rate for finishing a widget is the same as the rate of the slowest process step. This is not intuitive, especially when the process is a network rather than a linear set of steps. But why was I surprised? This is surely just the Theory of Constraints.
And then I realised. Theory of Constraints was a belief for me. It contained uncertainty. I did not know for certain that it always worked, in all situations. I had doubt and because I had doubt, I did not always apply it.
So I created a reasonably complicated network process:
<Warning – Possible bad maths ahead. I’ve not checked this with an expert yet>
And I created a spreadsheet to see how long it would take things to complete. This is where Real Options thinking came in (or value stream mapping if you prefer): we worked backward from the end of the process to the start. For each process step, the time at which piece of work k leaves process step Pn, written Pn(k), is:
- The time to complete the process step T(Pn), plus
- The earliest time the process step can start.
Pn(k) = T(Pn) + ET(Pn(k))
The earliest time the process step can start is the latest of:
- The time the previous piece of work finished at Pn ( Pn(k-1) )
- The latest time at which any of the x preceding process steps finished piece k ( Pn-1(k) … Pn-x(k) )
Therefore the earliest time the process step can start is:
ET(Pn(k)) = MAX ( Pn(k-1), Pn-1(k) … Pn-x(k) )
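The spreadsheet calculation can be sketched in code directly from these two formulas. The network shape and step times below are hypothetical stand-ins (the real spreadsheet may differ), with steps numbered so that every predecessor has a lower number:

```python
def finish_times(T, preds, n_items):
    """P[n][k] = time at which piece of work k leaves process step n.
    Implements Pn(k) = T(Pn) + MAX( Pn(k-1), Pn-1(k) ... Pn-x(k) )."""
    P = {n: [0.0] * n_items for n in T}
    for k in range(n_items):
        for n in sorted(T):                 # steps numbered in topological order
            prev_item = P[n][k - 1] if k > 0 else 0.0
            upstream = [P[m][k] for m in preds[n]]
            P[n][k] = T[n] + max([prev_item] + upstream)
    return P

# Hypothetical network: P1 and P2 feed P3, P2 also feeds P4, P3 and P4 feed P5.
T = {1: 2, 2: 4, 3: 3, 4: 1, 5: 2}
preds = {1: [], 2: [], 3: [1, 2], 4: [2], 5: [3, 4]}
P = finish_times(T, preds, 20)
```

Once the system settles, the finish times at the final step (P5) advance by 4 seconds per item: the processing time of the slowest step (P2), exactly as the recursion predicts.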
The rate for Pn (expressed in seconds per widget) is
Rate(Pn) = MAX ( T(Pn), Rate(Pn-1) …. Rate(Pn-x) )
Expanding this recursively, the rate for Pn (expressed in seconds per widget) is
Rate(Pn) = MAX ( T(Pn), MAX( T(Pn-1), Rate(Pn-y), … ), MAX( T(Pn-x), Rate(Pn-z) ) )
As the MAX function is associative, we can write
Rate(Pn) = MAX ( T(Pn), T(Pn-1), T(Pn-x), T(Pn-y), T(Pn-z) )
i.e. The rate for any process step (expressed in seconds per widget) is the maximum of its own processing time and that of every preceding process step in the graph.
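This final identity is short enough to check directly in code. Here is a sketch of the recursion over a small hypothetical network (not the actual process from the spreadsheet):

```python
def rate(n, T, preds):
    """Rate(Pn) = MAX( T(Pn), Rate of every preceding step ):
    the slowest processing time anywhere upstream of, or at, Pn."""
    return max([T[n]] + [rate(m, T, preds) for m in preds[n]])

# Hypothetical network: P1 and P2 feed P3, P2 also feeds P4, P3 and P4 feed P5.
T = {1: 2, 2: 4, 3: 3, 4: 1, 5: 2}
preds = {1: [], 2: [], 3: [1, 2], 4: [2], 5: [3, 4]}
rate(5, T, preds)   # 4 – the constraint (P2) sets the rate of the whole network
```

Note that P4 itself only takes 1 second, yet its rate is also 4 seconds per widget, because everything downstream of the constraint inherits its rate.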
<Math note – I’m not up to snuff on how you annotate graphs. Hence it’s a bit of a mess.>
Conclusion – Theory of Constraints is not a theory. It should be easy to prove mathematically by someone with better maths and more rigour than me.
It also means that pushing more work into a system beyond the rate of the constraint is quite simply a waste of money. IT Executives should be judged on the rate at which money is invested into the IT department and the rate at which that investment delivers value. In effect, a CFD based on money in and money out which is just another way of representing lead time.
If you want a copy of the spreadsheet, leave a comment here, or ping me on twitter.
UPDATE – The following example has a rate faster than the constraint. This is due to inventory building up behind the constraint before work items travelling through the other path reach the convergence point (P4).
It looks like it still holds over the long run in the steady state. Need to think more about the start up phase and the implications of that.
More details to follow.