
Book Review

The Logic of Failure

The Logic of Failure

By: Dietrich Dorner

Paperback - 222 pages
Published by: Perseus Publishing
Publication Date: January 1996
ISBN: 0201479486

Contents

  1. Introduction
  2. Some Examples
  3. The Demands
  4. Setting Goals
  5. Information and Models
  6. Time Sequences
  7. Planning
  8. So Now What Do We Do?

Our Review

Subtitled "Recognizing and Avoiding Error in Complex Situations", and featuring a graphic photograph of a steam train lying on its side next to a railway track, this is a book I was looking forward to reading. I was especially keen to read it because it received high praise from James Reason, whose book "Managing the Risks of Organizational Accidents" I highly recommend to all engineers - read my review at www.plant-maintenance.com/books/1840141050.shtml. Reason wrote of this book: "An especially important book that deals with the nature and origins of mistakes in a way that has no precedent". And it is indeed an important book.

Dorner is a behavioural psychologist, and has spent his life researching how humans make decisions relating to complex systems and situations. In particular, he has applied computer simulation techniques to create models of complex societies and technologies, and has tested human responses in trying to control these systems. His conclusion is that, more often than not, we fail to manage complex systems effectively, and that this failure is largely due to weaknesses in our own cognitive abilities and our own behaviours.

Starting with "Tanaland", a simulated, fictional African society in which the average participant, acting as its dictator, managed to make decisions leading to irreversible, catastrophic famine, Dorner leads us through a range of simulations and experiments, discussing the results of each and the reasons why some (or most) participants failed to achieve the objectives of the experiment. His conclusions are that, in general, we humans:

  • Are limited by the slowness of our thinking - although our subconscious can process information at an enormous rate (just imagine the amount of information being processed by a driver in heavy traffic), when it comes to conscious thought we are extremely slow. This leads us to take shortcuts in decision making, typified by leaping into action before we have clearly defined our goal or collected the information necessary to make effective decisions.
  • Tend to oversimplify our mental models of complex systems - focusing only on one or two "key" variables and underestimating the importance of other factors.
  • Are especially poor at analysing, and forecasting from, sequences of data in time. We tend to assume linear extrapolation of trends, and do not cope well with accelerating or decelerating change, let alone with the possibility of a change in trend direction (see the short numerical sketch after this list).
  • Tend to see new situations as simply extensions of old, established situations, and therefore apply old, established actions, which may not be appropriate. This may be a self-protective behaviour to allow us to feel secure that we can cope with the situation.
  • Tend to ignore the possibility that actions we take now may have unintended side-effects, and cause problems that currently do not exist.
  • Make "ballistic" decisions, where we do not monitor the outcomes of those decisions after we have made them.
  • Only act where we feel at least minimally competent to do what is asked of us. Without some expectation of success, we are likely not to act at all, and simply let fate take its course.
  • Form simple hypotheses and limit the search for information in order to preserve our own self-perception of competence.
  • On occasion, pursue planning, information gathering and structuring processes that go on interminably, as a defence against the possibility that we are incompetent. Excessive planning and information gathering may keep us from confronting the reality that our actions are not working, or are wrong.
  • Equally, on occasion, for self-protection, we may solve only those problems that we know we can solve - even though these may not be the most important or pressing problems to be addressed.
  • Are not particularly effective at recalling past information and events, which can lead to us repeating past, inappropriate decisions.
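
The point about time-series data is worth a concrete illustration. The sketch below is mine rather than Dorner's, and the 10% growth rate and ten-period forecast horizon are purely hypothetical numbers chosen for clarity; it simply shows how extrapolating the most recent step in a straight line underestimates an accelerating process:

    # Hypothetical example (not from the book): a series growing at 10% per period,
    # forecast by linear extrapolation of the most recent observed step.
    observations = [100 * 1.1 ** t for t in range(5)]    # 100, 110, 121, 133.1, 146.4

    step = observations[-1] - observations[-2]            # latest increment, about 13.3
    linear_forecast = observations[-1] + step * 10        # extend that step 10 periods ahead

    actual = 100 * 1.1 ** 14                              # true value 10 periods later

    print(f"linear forecast: {linear_forecast:.0f}")      # about 280
    print(f"actual value:    {actual:.0f}")               # about 380

Even over this modest horizon, the straight-line estimate comes out roughly a quarter too low, and the gap widens the further ahead we look - which is precisely the blind spot Dorner's participants displayed.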

With all of these weaknesses, is there any hope for us? Dorner's experiments revealed that leaders from business and industry tended to make more effective decisions in complex situations than other groups did. His conclusion was that these people possessed "operative intelligence" - they understood the behaviours most appropriate to a given complex decision-making situation, and they were adept at applying them. He believes, therefore, that these behaviours can be acquired, and that more effective decision making in complex situations can be learnt.

How does this apply to us in Maintenance? All of the concepts and conclusions outlined by Dorner apply to any Failure Investigation of a large, complex system. In particular, the chapter on our understanding of time-series data is highly relevant here. Once we recognise these behaviours, both in others and in ourselves, we have the opportunity to design equipment, and decision-making processes, that allow for these very human characteristics.

If you are actively involved in Root Cause Analysis facilitation, or are a manager of a complex system, I recommend this book.

