Author: edwellspm

Telling Stories

You may not have noticed it, but there is a battle quietly raging in the offices and boardrooms of the business world. On one side of the battlefield stands evidence-based analysis, empirical research and a solid grasp of cause and effect. Massed across the valley are the forces of storytelling, narrative and opinion. If only a ceasefire could be negotiated and peace were to break out, decision making, planning and problem solving would be far easier and vastly more effective. But don't hold your breath. For the time being, the two sides seem hell-bent on destruction.

Photo by Tirachard Kumtanom on Pexels.com

If you're not certain about this, just turn on your TV or listen to a radio show on current affairs. Be it a show on politics, the economy, society or sports, listen long enough and you'll hear something a little like the following exchange:

Host: "This is an interesting development. Now over to Chris for some analysis."

Guest/Commentator: "Thanks Martin. I think it's pretty clear that this means [insert compelling speculative statement and some further exciting narrative]."

In some respects this can be seen as no big deal, but in fact it offers an insight into how the two competing worlds of analysis and narrative are locked in something of a turf war, scrapping over their own very different definitions of the truth. And make no mistake, it's a zero-sum game and the stakes are always high. What we're encouraged to accept as analysis is opinion and guesswork, and it is highly subjective. It is also, most of the time, extremely convincing.

When opinion, narrative and storytelling (all vital elements of human interaction) take over, they have a dramatic impact on the effectiveness of our decision making. Storytelling helps us paper over the gaps in our knowledge, encourages us to apply faulty timelines, and makes us feel comfortable about jumping to conclusions about what's happening and what we should be doing about it. In very simple terms, narrative is a hothouse for producing the illusion of certainty.

Furthermore, if we're the sort of team or organisation that focusses on people and is quick to point the finger and blame, narrative will give us all the comfort we need to carry on in this damaging direction. When 'facts' are fluid or completely absent, virtually any solution can be made to fit virtually any problem. But it doesn't have to be this way.

As a firm that has worked with hundreds of organisations over the past decade, some small, others enormous, we have steadily recognised a fascinating pattern in this area.

Namely, the overwhelming majority of companies operating without an established problem solving method (one that demands objective analysis) tend to focus most of their problem solving activity on the behaviours of their workforce. In practice, most of their solutions fit neatly into what we now describe as The 3 R's:

  • Re-Train
  • Re-Write
  • Re-Communicate

All three are eagerly implemented at the expense of better, longer-lasting systemic improvements.

The main issue here is not that training, documentation and communication aren't vital to running a good business; they are, they really are! It's that these 'Re' solutions usually involve a substantial amount of repeating an action whilst hoping for a different outcome, and that the systemic improvements, which are nearly always relegated in this scenario, would offer far stronger solutions that benefit an organisation in the longer term without overloading the workforce.

Some of this probably sounds familiar. It is very common for teams and businesses, particularly those under pressure, to drift away from a structured approach to problem solving (ironically when it is needed the most), leaving opinion and narrative to fill the vacuum. When a structured problem solving method is adopted, particularly one that includes a fact-based visual analysis process, it is virtually impossible to fall into this trap. The behaviour of people is seen in context and solutions tend to be re-focussed on systems, offering vastly better value and far stronger returns on investment.


Man, I don’t envy THAT guy!

Homer Simpson

In an episode of The Simpsons dating back almost a decade, Homer is so overwhelmed by his adult responsibilities that he decides to shirk them by getting drunk. He justifies his choice by stating:

“That’s a problem for future Homer…
Man, I don’t envy that guy!”

Homer's is a clear choice between dealing with the tough decisions of now and kicking them into the long grass for another day, all the while knowing it will be more problematic for him (and others) later. It's this kind of decision making that many of us display or encounter day after day in our working lives. Problems come along and we do our best to ignore them, deny they're really a problem at all, patch them with a quick fix, or, if we possibly can, bump them upstream or downstream for as long as possible.

Many of us just don’t feel confident that we have the time, resources or support to manage big problems in an effective or structured way. We’re used to hearing objections like ‘we just don’t have time for this’ or ‘let’s just get it up and running for now’ or ‘we are already way too busy for this’.  But deep down we know that this is faulty logic and that these problems are always going to catch up with us in the end.

Arguably, most of us do this because our decision making is based on a major oversight: namely, that we ARE already dealing with these problems. BUT we're dealing with them in the least efficient, least predictable and least effective way that our organisations could possibly tolerate. It's only because many problems are so drawn out, so thinly and widely spread across an organisation, that we can compartmentalise them and delude ourselves that they are anything but massive.

In fact, it’s not uncommon for us to become so conditioned to the problems around us that even though we’re fighting exactly the same fires over and over again, we cannot see them for what they really are.

The tale of the fish and the frog springs to mind:

Two young fish are swimming across a pond and a wise old frog calls down to them from his lily pad. “Hey boys, how’s the water?” he calls.  They look at each other, embarrassed, and swim on by.  Once they are a safe distance away, one fish turns to the other and asks, “So, what’s water?”

You see, when you’re in it, you don’t know what ‘it’ is.

Once we recognise that our problems share root causes that can be addressed, we discover that putting time into solving these problems is NOT more work. On the contrary, it's the biggest resource saver available. Through a dedicated, effective program of uncovering and tackling root causes we can start to save huge chunks of time and money and use the outcomes for value-added problem solving: for design, for creativity, for improvement, for efficiency and for planning.

THE ‘BIG SEVEN’ PITFALLS WHEN PROBLEM SOLVING (AND HOW TO AVOID THEM)

All organisations have problems and some of those problems are certainly worth solving. Clearly some teams and some individuals are better at solving problems than others, so what pitfalls can you avoid in order to improve your problem solving skills and outcomes?

Here are the ‘Big Seven’ pitfalls that we know contribute to weaker problem solving:

1) You don’t really know what problem it is that you’re solving.
Question: Have you clearly defined the problem you want to solve?

Being clear about the problem you want to solve is essential. If the problem definition is not clear in your own mind, and has not been coherently stated and shared with your team, how will you or others set about understanding and solving it? Experience demonstrates that, when it comes to major issues, individuals rarely have the shared perspective we assume they have.

2) You’re not in a problem solving state of mind.
Question: Have you got your inquiring mind set in place?

All too often, problem solvers are judged on speed rather than effectiveness – perceptions of a role or profession can imply that good problem solvers should be able to come up with solutions immediately. Expert problem solvers will always put aside any assumption that they know what caused a problem or that they already know what the solution is. This process prioritises effectiveness over speed.

Photo by rawpixel.com on Pexels.com

3) You’re telling stories.
Question: Have you broken the causes of the problem down into their constituent parts?

Many of us rely too heavily on narrative (aka storytelling), which comes with inherent issues such as artificial start and finish dates, truncated analysis, a focus on activity (usually the interesting bit), simplified timelines and reduced detail. An effective analysis drills backwards in time from the problem, methodically picking apart the cause and effect relationships at play. Only patient analysis will push us beyond the superficial 'symptom level' to the root causes.

4) Your focus is skewed.
Question: During your analysis have you paid attention to the systems and circumstances that have allowed change to take place?

Actions aka ‘points of change’ are usually the most obvious causes, but unless we consider systems and circumstances we will only have part of the picture, at best. Although systems and circumstances are often subtle and are sometimes harder to uncover, they are no less important when it comes to effective problem solving.

5) You’re blaming people.
Question: When ‘Human Failure’ is apparent, have you ‘drilled back’ to really understand what made the person behave in that way?

People are often the aforementioned 'points of change', and in that sense their role in a problem is usually the most obvious. For many of us it's easy to become focussed on the actions of individuals, and this quickly slips into a blame culture. This usually results in less information being shared and a reduced appetite to assess tools, practices and the working environment. In this scenario problems will never be satisfactorily solved. Equally, avoiding accountability altogether by dismissing causes as simply 'Human Error' gets us no closer to applying effective solutions.

6) You’re searching for THE root cause.
Question: Have you taken into account that your problem will have multiple interdependent causes?

If only problems had just a single root cause! All problems, especially complex ones, have multiple causes. Fixation on a single cause leads to a similar fixation on a single solution. Avoid convincing yourself that a solution applied to just one cause, even a major cause, will completely solve your problem. This is rarely the case. In fact, this pitfall, above all, explains why the majority of problems are frustratingly stubborn.

7) You’re choosing the wrong solutions.
Question: Have you worked methodically through your analysis to select the best solutions?

It's all too easy to select solutions on criteria that don't stand up to rigorous scrutiny, or to apply solutions that cluster in the part of the problem we are familiar or comfortable with. A systematic evaluation of all possible solutions should help us decide which will offer maximum effectiveness, provide a strong return on investment and won't trip us up badly further down the track.

Lessons Learnt or Lessons Lost

Over 400 years ago, Captain James Lancaster, an English sailor, performed a benchmark experiment in his pursuit of a way to prevent a disease called scurvy. Scurvy was one of the biggest problems at sea at that time, killing or debilitating many individual sailors and diminishing the operational capability of a crew so severely that the remaining sailors struggled to man their ships safely.

On just one of the four ships in a flotilla bound for India, Capt. Lancaster prescribed three teaspoons of lemon juice a day for the entire crew. By the halfway point of the journey, everyone on that ship was alive and well. On the other three craft, 110 men out of 278 (40%) had died and others were becoming increasingly weak and sick.

Photo by Pixabay on Pexels.com

This was an incredible finding that directly linked scurvy, a killer disease, to diet, and specifically to what we now know was a chronic lack of Vitamin C. It revealed that relatively modest consumption of lemon juice was a way of avoiding hundreds of needless deaths on future journeys, as well as the loss of millions of pounds (in today's money) in ships and naval hardware.

Despite this discovery, it was to take another 200 years (and thousands more unnecessary deaths) for the British Royal Navy to enact firm dietary guidelines as routine on its ships. This is an adoption rate that can only be described as 'glacial'.

One would hope that things have changed and moved on since then; however, many large organisations still struggle to analyse their mistakes and learn from them. And for some, even when learning is identified, these learning opportunities don't easily flow through the system to the 'front line'. Adoption rates in global healthcare, in particular, are known for being 'low and slow' and have been described as universally sluggish for many years. One recent study examined the outcomes of nine major medical discoveries made at the end of the 20th century. It revealed that it took an average of 17 years before the new treatments were fully adopted by the majority of doctors.

But even in this 'information age', many different organisations and sectors continue to struggle with implementing important 'lessons learnt'. Multiple studies clearly show that adoption rates are directly linked to the way that important learning is formatted and distributed to relevant parties, revealing that necessary knowledge has often not been translated into a simple, usable and systematic format.

Now, although direct comparison between different sectors should be handled with extreme caution, the aviation industry is regularly presented as a sector that has worked assiduously on this issue of sharing learning. Examples include:

  1. In the aftermath of an investigation, the report is made available to everyone.
  2. Airlines have a legal responsibility to implement the recommendations.
  3. Every pilot in the world has free access to the data.
  4. Aviation has protocols that enable every airline, pilot and regulator to access every new piece of information in almost real time.

In this sector, data resulting from investigations is universally accessible and rapidly distributed across the world. This enables everyone to learn from the mistake or error, rather than just a single crew, airline or nation. Crucially, learning derived from investigations is immediately filtered and refined into targeted guidance. This accelerates the speed of learning and, as a result, the adoption rate in these scenarios is almost instantaneous.

Atul Gawande, surgeon, researcher and author, highlights this challenge of presenting complex information across other organisations and sectors:

"If the only thing people did in aviation was issue dense, page-long bulletins…it would be like subjecting pilots to the same deluge of almost 700,000 medical journal articles per year that clinicians must contend with. The information would be unmanageable. Instead…crash investigators distil the information into its practical essence."

The crucial lesson to take away is that, if an organisation is to generate the maximum dividend from their problem solving and lessons learnt programs, it is imperative that they create a culture that promotes the recognition of mistakes. They must implement a process that investigates mistakes openly and effectively, and devise a system that enables the key learning to be distilled, distributed and assimilated as efficiently as possible.

Don’t learn a lesson the hard way, only to lose it.