Failure: friend or foe?
It’s a cliché to say that we learn more from our mistakes than from our successes, but clichés usually become clichés because they are true.
I think the archetypal exponent of using failure as a force for good is a guy called Matthew Syed. I had the pleasure of seeing Matthew present at a pre-Covid conference and he was brilliant. I’ve since read a number of his books. He’s an ex-Olympian, a table tennis player, now a sports reporter and writer who consults on how to improve systems. Black Box Thinking, one of Matthew’s books, contains several brilliant examples of how failure can be leveraged into success.
The black box referred to in the title of Matthew’s book is the recording device found in aircraft, and he makes insightful comparisons with the way the aviation sector works, where each mistake, and even each potential mistake, is reported and seized upon as a means to learn and improve, in an open and non-judgemental manner. This is, sometimes scathingly, contrasted with the culture in some sectors of healthcare, where surgeons and consultants have massive egos and are treated like gods, meaning they absolutely cannot be questioned or challenged. This, coupled with a litigious environment, results in a situation where medical error is now reported as the third biggest killer in Western countries!
Conversely, following any air accident investigation, reports and data are made freely available to every pilot in the world, and airlines have a legal responsibility to implement the recommendations. This open and completely non-judgemental approach allows everyone to learn from the issue; a process that turbocharges the power of learning.
In short, any failure, or potential failure, is used as a welcome signpost, revealing hidden aspects of systems and processes that had not previously been fully understood. Vital clues are teased out of the situation in order to update strategies, systems, processes and even behaviours.
In contrast, the NHS reportedly sets aside £25 to £30bn a year to cover outstanding negligence liabilities. Learning from mistakes is not a drain on resources; it is the most effective way of safeguarding resources (and lives).
The question asked in the white heat of an investigation is usually, “Can we afford the time to investigate failure?” Given the scenarios above, perhaps a more telling question is, “Can we afford not to?”
The point is, if we are threatened by mistakes, we become prickly and defensive when people mention them, and there’s no way we can learn from them. Syed suggests we need to create a culture, like aviation’s, that consigns the idea that smart people don’t make mistakes to the metaphorical bin.
As an ex-Olympian, Syed naturally draws on sporting examples, where people like Dave Brailsford and Clive Woodward (another brilliant and insightful presenter) are heralded for their ability to see problems and weaknesses in a different and more positive light. Brailsford and Woodward created cultures where every error, every flaw, every failure, however small, is regarded as a marginal gain in disguise. Information and feedback are not a threat but an opportunity, a precious resource, a chance to avert an issue before it occurs. In the world of NLP (neuro-linguistic programming) there is a great one-liner: “There is no failure, only feedback.”
What stops us?
There’s another guy, a professor called Jan Hagen, who has researched this area and can confirm what we all intuitively know: people are uncomfortable raising the mistakes of others in an open forum, preferring instead to deal with them at a later stage in private, if at all. As might be imagined, this appears to be particularly the case if the mistake has been made by somebody senior.
This reticence obviously creates problems, especially if, like many of us, you work and make decisions in teams. It can be very hard to row back and make course corrections, or reverse decisions, later. It might take hours, if not days or even weeks, before you get the chance to highlight the issue in private, and by that time wheels have been set in motion and consequences may already be being felt.
The $200m Google blue
Obviously, it’s beyond the reach of most organisations, but did you know that when Google was changing its hyperlink colours a few years back, it set up an experiment in which 40 different shades of blue were trialled and the click-through rates on each were tracked? This generated a wealth of deliberate “mistakes” to learn from. According to Google’s UK boss at the time, the results from the “best blue” in the experiment would translate into an additional $200m of revenue per year.
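Conceptually, Google’s test was a multi-variant A/B experiment: show each shade to a slice of traffic, then rank the shades by observed click-through rate. A minimal sketch in Python, with the shade codes and numbers invented purely for illustration:

```python
# Hypothetical data: impressions and clicks for a handful of blue shades.
# The hex codes and counts below are made up for illustration only.
variants = {
    "#0044CC": {"impressions": 10_000, "clicks": 385},
    "#0055D4": {"impressions": 10_000, "clicks": 402},
    "#1A0DAB": {"impressions": 10_000, "clicks": 431},
    "#2200AA": {"impressions": 10_000, "clicks": 377},
}

def ctr(stats):
    """Observed click-through rate for one variant."""
    return stats["clicks"] / stats["impressions"]

# Rank shades from best to worst by click-through rate.
for shade, stats in sorted(variants.items(), key=lambda kv: -ctr(kv[1])):
    print(f"{shade}: CTR {ctr(stats):.2%}")

best_shade = max(variants, key=lambda shade: ctr(variants[shade]))
print("Best-performing shade:", best_shade)
```

A real experiment would also test whether the differences are statistically significant before declaring a winner, but the principle is the same: deliberately generate many small “failures” so the data can reveal the best option.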
Deliberately seeking problems
The Toyota Production System, or TPS, is one of the most successful techniques in industrial history, and in large part this is due to a culture of deliberately seeking problems. If anybody on a Toyota production line has a problem, or observes an error, they can stop the entire production line. Senior staff then rush to see what has gone wrong, and everyone works together to resolve the issue there and then, once and for all. It focuses the mind when a production line worth billions is at a standstill. Problems and errors are assessed and addressed, lessons are learnt, and the system is adapted. So, if Toyota can do it with all that is at stake for them, why can’t we?
So, what can we do?
When it comes to learning, Syed suggests implementing what he calls the “precision-guided model”. The “ballistic model” is based on the idea of ready, aim, fire, whereas the precision-guided model involves ongoing tweaks and nudges once the bullet is in the air. In many ways this is similar to a concept described by Julian Russell in Alpha Leadership as “ready, fire, aim”.
How does this fit with my management system?
Applied properly, learning from mistakes is at the heart of all ISO systems; but sadly, it’s the part of the system that’s most often sidelined.
It rears its head as a core component of the Plan, Do, Study, Act process. And I say “Study” deliberately, not “Check”: study means thinking long and hard about what went on; checking, as a concept, is far more cursory. Learning from mistakes is also about that horribly worded component of any ISO standard, “nonconformance”. Nonconformance identification, tracking and reporting is a core requirement of any ISO standard, but who wants to be “non-conforming”? That language is completely alien to the idea of an open learning culture, and it does us no favours; at Statius we’d encourage people to use a different terminology to that of the standard. The one we use most often is “ChIP reporting”, where Ch = Change, I = Improvement and P = Problem. It moves away from the idea of nonconformance, which is backward-looking and problem-centred, and instead captures issues that have not yet caused a problem, as well as things that could simply be better or improved.
Looking further forward, all ISO standards require a company to have objectives and targets. These objectives and targets should reflect a desire for a different future, but none of us (as far as I’m aware) is omniscient. The future will always differ in some way from how we expect, or indeed want, it to be. So, again, Syed’s “precision-guided model” works well: objectives are set, targets are defined, lessons are learnt and adjustments are made along the way.
Progress in the majority of human endeavours has depended on a willingness to learn from failure. Edison tested over 3,000 different designs for light bulbs; Dyson made over 5,000 prototypes for his eponymous vacuum cleaner. Both showed a thirst to engage with failure. If we cower from it, we are effectively destroying some of our most precious learning opportunities.
Failure has much to teach us if we see it in the same light as Syed, Edison and Dyson. Unless we see failure as a friend, rather than foe, it will remain woefully underexploited.
The paradox of success is that it is built on failure.