
November 12, 2007

Application of Systems Thinking

Deep Ocean Search Planning: A Case Study Of Problem Solving
Johan Strümpfer

Introduction
The Accident
In late 1987 a Boeing 747 of South African Airways crashed into the Indian Ocean after an on-board fire. The crash location was 250 km northwest of Mauritius, an island east of Madagascar. Despite modern technology, the location of the wreckage was as uncertain as the location of the Titanic. A massive search and recovery operation was immediately launched. This effort covered various search phases, namely a surface search and recovery, an underwater sound beacon search, an underwater sonar search, and an underwater photographic survey and recovery. Each search phase faced unique technical challenges, from providing accurate navigation aids to coping with the sheer depth of the ocean, which was in places over 5 km deep and very mountainous. This was significantly deeper than the Titanic search, the deepest deep ocean search up to that time.
This situation resulted in the mobilization of resources and personnel from more than a dozen nationalities. Eventually there were six basic sources of information on where to conduct the search. Some of the nationality groups favored one or more of these information sources over others, resulting in conflicting and widely dispersed opinions on where the search should be conducted. The stakes were raised by the perception that the groups whose information sources were seen to prevail would be more likely to obtain the lucrative search and recovery contract.
To read this web post, click on: Deep Ocean Search Planning: A Case Study Of Problem Solving

Posted by ACASA on November 12, 2007 at 09:51 AM in blog post | Permalink

Comments

Thanks for the response. I have read through the paper, though I did fail to understand some things about it. Your comments do help, thanks again.

I suppose I would love to hear more about the story (and yes, I agree it is more a story than a method) of how you got from the "hard" OR data to the prediction that was made. Storytelling is very powerful, and I think more could be made of your story, even if just as a "diary".

I am a strong supporter of Systems Engineering, but also of Operations Research, and I really don't think the two have to be that far apart. I should probably clarify that I come from military OR, which appears to be a little more practice-based than civilian OR programs. Yes, I agree with Dr. Ackoff that there have been some OR approaches that were too academic and not practical, but I don't think that dooms OR as a profession or a tool.

By the way, in at least the first space shuttle disaster, there is an OR piece. I was working on my master's thesis in OR during the Challenger investigation, and was using logistic regression as part of my thesis. My thesis advisor showed me the work going on with the Challenger, which showed that, using only information available prior to the launch (and yes, logistic regression), you could predict a greater-than-10% chance of complete burnthrough of the two O-rings, which is what actually happened. That information was overlooked and went unanalyzed, and although subjective and emotional information was supplied by Thiokol, it was not acted upon by NASA. If the subjective and emotional arguments could have been backed up with the "hard" OR data, perhaps the launch would have been scrubbed.
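
[Editor's note: the kind of analysis Steve describes can be sketched in a few lines of code. The flight data below is made up for illustration only, it is NOT the actual pre-launch O-ring record, and the fitting routine is a plain gradient-ascent logistic regression, not the specific model used in the Challenger work.]

```python
import math

# Hypothetical (temperature_F, o_ring_distress) pairs, for illustration only.
FLIGHTS = [(53, 1), (57, 1), (58, 1), (63, 1), (66, 0), (67, 0), (67, 0),
           (68, 0), (69, 0), (70, 1), (70, 0), (72, 0), (73, 0), (75, 0),
           (75, 1), (76, 0), (78, 0), (79, 0), (81, 0)]

MEAN_TEMP = sum(t for t, _ in FLIGHTS) / len(FLIGHTS)

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def fit_logistic(data, lr=0.005, epochs=20000):
    """Fit P(distress) = sigmoid(b0 + b1*(temp - MEAN_TEMP)) by plain
    gradient ascent on the log-likelihood (centering keeps it stable)."""
    b0 = b1 = 0.0
    for _ in range(epochs):
        g0 = g1 = 0.0
        for temp, y in data:
            x = temp - MEAN_TEMP
            err = y - sigmoid(b0 + b1 * x)  # log-likelihood gradient term
            g0 += err
            g1 += err * x
        b0 += lr * g0
        b1 += lr * g1
    return b0, b1

b0, b1 = fit_logistic(FLIGHTS)

def p_distress(temp: float) -> float:
    return sigmoid(b0 + b1 * (temp - MEAN_TEMP))

# The launch-morning forecast was around 31 F, far colder than any
# prior flight, so the model extrapolates well outside the data.
p_cold = p_distress(31.0)
p_warm = p_distress(75.0)
print(f"P(distress) at 31F: {p_cold:.2f}, at 75F: {p_warm:.2f}")
```

The point of the sketch is the shape of the argument, not the numbers: fitted on data where distress clusters at low temperatures, the model assigns a much higher distress probability to a cold launch than to a warm one, using only information available before launch.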

I suppose my hope is for a story that says: we started with this hard data, used some data analysis tools, integrated in experience and theories using systems thinking, and came up with a valid prediction. A very powerful story.

Thanks again,

Steve

Posted by: Steve Prevette at Nov 15, 2007 12:42:04 PM

Hi Steve
Your comment above refers. Quoting from the section outlining the structure of the paper:
"This paper relates a story of how the above quantitative and technical approach to the search problem led to an almost intractable problem. This is covered in the following sections, giving the background to the search situation and the results of the search. The planning problem was dealt with in the end by switching to a different paradigm of problem solving, which is dealt with in the last sections."
The paper is an interplay between the different philosophical frameworks underlying operations research and a systems approach. The context is technical, and the moral of the story, for me, was that in a context where apparently all the factors favor the application of an OR approach, a systems approach was the one that helped me in the end. The exact "method" of systems approach followed was not a method at all, but a series of systemic insights, as listed in the last part of the paper, that struck me. These insights gave rise to a different behavior (way of interacting with the system) - as listed in the paper - that resulted in a remarkable unfolding of what was a very technical "problem".

As to where the aircraft was found, refer to:

"The aircraft wreckage was found at the dot in area 1 within two days after the sonar search was started. (If you have trouble seeing it, imagine our problem of finding it in the ocean.)" (from the section on Planning Results)

Therefore, looking at your questions and criticism, it is not clear to me that you have read the full article in all its detail. This makes me wonder why not, and if there are constructive suggestions as to what presentation structure would retain the reader's interest better, I would welcome them.

Some background subsequent to the work described in the article: The associated accident investigation gave me an interest in the causes of accidents, as seen from a systemic perspective. I did quite a lot of additional work on that, and worked with some major companies to improve safety, e.g. several of the oil refineries in South Africa. One of the principles that emerges out of a systemic view of the causes of accidents is that most catastrophic failures are the result of various factors working together over time and space to produce (in the Singer/Churchman/Ackoff sense of produce) the accident. In virtually all cases of failure in complex systems, one of these co-producers of a failure is a progressive breakdown in the discipline of adhering to basic safety, engineering and management principles. As this degradation in standards progresses, the underlying system starts to send signals. These signals take the form of near misses, close shaves, small failures - events that are not catastrophic, little problems that do not have deep impact per se. They are, what I call, footsteps on the ground leading ultimately to the catastrophe. These footsteps are often very clear when one looks back from the perspective of the catastrophe itself. The different space shuttle accidents are very clear examples of this.

Currently, in my view, we are in the midst of such a series of footsteps in the airline industry in South Africa. There has been a spate of "little" problems: near misses (near hits is the more accurate description), aircraft handling problems (running off the runway), delays, baggage delays, losses, theft of luggage, an engine lost - as in falling off - on take-off, and accidents involving aircraft on the ground. As a frequent member of the flying public, this systemic diagnosis does not sit well with me. Which leads to an even deeper systemic mystery: how does one use this insight to change a system?

My current interest is changing systems, and trying to find systemic methods for achieving development while avoiding the fate of Giordano Bruno (burned at the stake in 1600 for insisting that the earth turns around the sun).

Johan Strumpfer

Posted by: Johan P Strumpfer at Nov 15, 2007 2:38:55 AM

I started reading the paper with great expectations, but found myself rather left out in the cold at the end. Three quarters of the paper covers in great detail the "operations research" data for the problem. Then the theory is put forth that standard OR methods couldn't work for the problem, and a description of the systems thinking model is provided. But I would be interested in how the different inputs were considered, and what was done. Three quarters of the paper is about what not to do - so what should we do? What worked? What was the path followed?

And most important - we have a fine mystery here, but the final result was never shown (the butler did it!) - just where was the wreckage?

Posted by: Steve Prevette at Nov 12, 2007 5:23:54 PM
