Curated by: Luigi Canali De Rossi
 


Thursday, June 21, 2007

Complex Systems Management: When Linear Approaches Don't Work


Wikipedia says that:

"There are many definitions of complexity, therefore many natural, artificial and abstract objects or networks can be considered to be complex systems, and their study (complexity science) is highly interdisciplinary. Examples of complex systems include ant-hills, ants themselves, human economies, climate, nervous systems, cells and living things, including human beings, as well as modern energy or telecommunication infrastructures."

Photo credit: tasosk

A complex system is any system based on close collaboration and interaction among its parts: a network, essentially. But what is the right way to manage a complex system and make it work?

In health care, researchers and consultants who take up complexity often fail to follow through on its implications in their work, and are unable to fully estimate the degree of change necessary to make it work in practice. The theory of complexity often loses its meaning when applied to real contexts.

The article we are republishing today was written by Dave Snowden, an expert in cognitive science who, starting from a discussion of complexity in health care systems, lays out a list of the linear approaches that don't work in non-linear complex situations, thus introducing a different outlook on the subject.




Understandable "hypocrisy"

by Dave Snowden


My presentation earlier today was on the use of complexity in health care. I went through some of the basics, in particular the links with cognitive science and the role of narrative. At the end of the session I put together a summary of the sorts of things that go wrong.

By wrong, I meant structured linear methods being applied to non-linear complex situations, but I was also targeting researchers and consultants who take up complexity and/or narrative, but then fail to follow through the implications in their work.

The title of this post is not meant as an attack per se. Hypocrisy is a serious charge, so I have placed it in quotation marks and prefixed it with understandable. It is very difficult to change if you have been trained for years in a particular approach.

Even if you understand complexity in theory, the practice of research and consultancy was developed before that knowledge. There is thus a danger of failing to fully appreciate the degree of change necessary. This is particularly the case with demanding clients who want to know in advance what they are going to get.

So, here is the list, with some explanation ...

  1. Believing that their expertise is more valuable than the knowledge of their subjects, and that they can avoid bias.

    Too many experts (researchers and consultants are equally guilty) continue to insist that they have to interview subjects in the study and draw conclusions. They will say things like "We draw out their stories" or "We allow them to speak in different voices".

    They talk about the need to interpret the material, to deconstruct it and allow the authentic voice to be heard. Sometimes they argue that senior people need to be interviewed by someone senior if they are to take a project seriously. For reasons that I do not fully understand, they actively resist any idea that stories can be told in context without their presence, or that people tagging and interpreting their own stories provides richer and more authentic research data than can be provided by the expert.



  2. Seeking to mandate ideal behaviour or organisational structure.

    I have heard many people talk about emergence, the need to create an ecology of agent interaction, and the fact that there is no such thing as best practice in a complex system. They then go on to talk about ideal models of behaviour (thus implying such a thing as best practice) for employees and leaders alike.

    They describe the ideal form of an organisation in the usual platitudes of organisational consultancy: open to new ideas, creating a no-blame culture, etc. All of their descriptions imply that these things can be mandated, when in practice they evolve; they cannot be defined or designed, they have to emerge through multiple interactions over time.



  3. Assuming an experiment will scale, or replicate in a different context.

    The fact that something works in one context (for example a particular hospital) does not mean that the outcome can be replicated in another place, even if it is similar. Each specific context is not fully knowable, and the interaction of agents in each context will be different in each case.

    We can replicate starting conditions and monitor for emergent patterns, damping and amplifying according to their efficacy, but replication of outcome is not possible. I can understand this. A common senior management need seems to be for a recipe with a known outcome.

    The researcher or consultant working with complexity does their client no favours by pandering to this need. Better to be honest up front and set the expectations. Complexity interventions create unique, contextually appropriate solutions. They do not replicate, neither do they necessarily scale (a minimal numerical sketch after this list illustrates why).



  4. Focusing on efficiency, not effectiveness; thinking of stability, rather than resilience.

    This is basic. Efficiency is all about stripping away superfluous functionality, stabilizing the system to an equilibrium state so that its performance is optimized. That is fine as long as the context does not change. If it does, then the reaction time may be too long. Resilient systems, on the other hand, have a degree of redundancy so that they can adapt to changing contexts, and as a result are effective under dynamic conditions.



  5. Using outcome-based targets for anything other than ordered systems in equilibrium states.

    Outcome-based targets assume causality and repeatability. In a complex system this is not possible. Any attempt to create an outcome will be subject to the law of unintended consequences. You may get what you targeted, but the system adjusts in ways that may not be beneficial.

    For example, setting a target to reduce time spent in an A&E unit achieved the target, but at the cost of occupying ward beds (patients were shipped out just before the target deadline), as a result of which operations had to be cancelled. The classic response is to set a target for operations, but then there is another unintended consequence. The cycle of more and more targets produces a perverted system with so many measures that contextualisation and innovation are stifled in a blame culture.



  6. Using the wrong model of science for evidence.

    Evidence-based policy is all well and good, but if your model of evidence requires provable outcomes in advance, then your science model is locked into the last century. The critical switch in a complex system is from fail-safe design to safe-fail experimentation. In a complex system, evidence needs to be linked to experimentation, not to outcome.



  7. Using hindsight rather than foresight.

    Complex systems are retrospectively coherent. Anyone who works in complexity knows this, but many suggest future interventions and policy on the basis that you can repeat past success. The future has potential; we can sense some of that, and we can influence its evolution. We can learn from what happened in the past. However, we cannot use the past to predict the future.
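

As a minimal numerical sketch of points 3 and 7, consider the logistic map, a textbook non-linear system (my illustration, not an example Snowden himself invokes). In the short Python sketch below, the same rule with the same parameter is applied to two starting contexts that differ by only 0.1%, yet within a few dozen steps the trajectories have nothing in common: replicating the intervention does not replicate the outcome, and neither past trajectory predicts the other's future.

    # Illustrative sketch only: the logistic map x -> r*x*(1-x) is chaotic
    # at r = 3.9, so near-identical starting conditions soon diverge.

    def logistic_step(x, r=3.9):
        """One iteration of the logistic map."""
        return r * x * (1 - x)

    def run(x0, steps=40):
        """Iterate the map from x0 and return the full trajectory."""
        trajectory = [x0]
        for _ in range(steps):
            trajectory.append(logistic_step(trajectory[-1]))
        return trajectory

    a = run(0.5000)   # "hospital A"
    b = run(0.5005)   # "hospital B": a context differing by 0.1%

    for t in (0, 10, 20, 30, 40):
        print(f"step {t:2d}: A={a[t]:.4f}  B={b[t]:.4f}  gap={abs(a[t] - b[t]):.4f}")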



Originally written by Dave Snowden as "Understandable 'hypocrisy'" and published on June 1, 2007 on Cognitive Edge.



About the author


Dave Snowden is a major figure in the movement towards integration of humanistic approaches to knowledge management and sensemaking. His original degree is in Philosophy from the University of Lancaster and he also has an MBA from Middlesex University. He is the Founder and Chief Scientific Officer of Cognitive Edge, which focuses on the development of the theory and practice of sensemaking.



Photo credits
Network Boxes: Peter Galbraith

Dave Snowden -
Reference: Cognitive Edge
 
 
 