Complexity

Complexity is one of the most important and difficult to define system concepts. Is a system's complexity in the eye of the beholder, or is there inherent complexity in the way systems are organized? Is there a single definitive definition of complexity and, if so, how can it be assessed and measured?

This article gives the background to the ways in which complexity has been described, how these descriptions relate to all types of system, be they [[Natural System (glossary)|Natural Systems (glossary)]], [[Social System (glossary)|Social Systems (glossary)]] or [[Engineered System (glossary)|Engineered Systems (glossary)]], and an indication of current thinking on what complexity is and how it influences systems engineering practice.
  
==Origins and Characteristics of Complexity==
 
 
In this section some of the prevailing ideas on complexity are described. Different authors have used different language to express these ideas and, while a number of common threads can be seen, some of the ideas take different viewpoints or are even contradictory.
 
Weaver (Weaver 1948) gives one of the earliest definitions of complexity as the degree of difficulty in predicting the properties of a system if the properties of the system's parts are given. This is clearly related to the number of elements and the connections between them. However, complexity should not be confused with being merely complicated: complexity is a system property related to the kinds of elements and relationships, not simply to their number.

Ordered systems have fixed relationships between elements and are not adaptable. (Page 2009) cites a watch as an example of an ordered system. Such a system is complicated, with many elements working together; its components are based on similar technologies, with a clear mapping between form and function. If the operating environment changes outside prescribed limits, or if one key component is removed, the watch will cease to perform its function. A watch may therefore have many components and be complicated, but it is not complex.
  
In common usage, [[chaos (glossary)]] is a state of disorder or unpredictability characterized by elements which are not interconnected and behave randomly, with no adaptation or control. Chaos Theory (Kellert 1993) is applied to certain types of dynamic system (e.g. the weather) which, although they have structure and relationships, exhibit unpredictable [[behavior (glossary)]]. These systems may include aspects of randomness, but can be described by deterministic models from which their behavior can be predicted given a set of initial conditions. However, their structure is such that (un-measurably) small perturbations in inputs or environmental conditions may result in unpredictable changes in behavior. Such systems are referred to as deterministically chaotic, or simply chaotic, systems. Simulations of chaotic systems can be created and, with increases in computing power, reasonable predictions of behavior are possible at least some of the time.
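
The logistic map is a standard textbook illustration of deterministic chaos; it is used here purely as an illustrative sketch and is not an example taken from the sources cited above. The short Python fragment below iterates the same deterministic rule from two starting points that differ by one part in a hundred million; within a few dozen steps the two trajectories bear no resemblance to each other, which is why long-range prediction of such systems fails in practice even though no randomness is involved.

<pre>
def logistic_map(x0, r=4.0, steps=50):
    """Iterate x(n+1) = r * x(n) * (1 - x(n)) and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_map(0.20000000)   # nominal initial condition
b = logistic_map(0.20000001)   # perturbed by one part in 10**8

# The separation grows from 1e-8 to order 1 within a few dozen iterations,
# even though both trajectories follow exactly the same deterministic rule.
for n in (0, 10, 20, 30, 40, 50):
    print(f"n = {n:2d}   |a - b| = {abs(a[n] - b[n]):.8f}")
</pre>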
  
On a spectrum from complete order to complete disorder, complexity sits somewhere in the middle, with more flexibility and change than complete order and more stability than complete disorder (Sheard and Mostashari 2009). Complex systems combine elements of different types in relationships which provide more than one function, which can lead to multiple ways of achieving a given outcome and allows such systems to [[adaptability (glossary)|adapt (glossary)]] to environmental changes or to the loss of some elements. For example, if one element such as a doctor, a piece of equipment or part of the building infrastructure is removed from a hospital surgical unit, the remaining elements will continue to function as a unit, albeit with reduced effectiveness.
  
Complex systems may evolve “to the edge of chaos”, resulting in systems which can appear deterministic but which exhibit counter-intuitive behavior compared to that of more ordered systems. The statistics of chance events in a complex system are often characterized by a power-law distribution, the “signature of complexity” (Sheard 2005). The power-law distribution is found in a very wide variety of natural and man-made phenomena, and it means that the probability of a low-probability, large-impact event is much higher than Gaussian statistics would suggest. Such systems may also react in a non-linear way, exhibiting abrupt “phase changes” which may be reversible or irreversible. This has a major impact on engineered systems in terms of the occurrence, impact and public acceptance of risk and failure.
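
The contrast between Gaussian and power-law statistics can be made concrete with a small, purely illustrative calculation. In the sketch below the exponent and scale parameters are arbitrary choices rather than values from the cited sources, and the two distributions are not calibrated against one another; the point is only qualitative. The Gaussian tail collapses super-exponentially while the power-law tail shrinks only polynomially, so rare, large-impact events remain far more likely under the power law.

<pre>
import math

def gaussian_tail(x):
    """P(X > x) for a standard normal variable: a thin, rapidly decaying tail."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def power_law_tail(x, alpha=2.0, x_min=1.0):
    """P(X > x) for a Pareto (power-law) distribution with exponent alpha."""
    return (x_min / x) ** alpha if x >= x_min else 1.0

for x in (2, 5, 10, 20):
    print(f"x = {x:2d}   Gaussian: {gaussian_tail(x):.2e}   power law: {power_law_tail(x):.2e}")
</pre>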
  
It is useful to think in terms of two kinds of complexity. Subjective complexity exists in peoples’ minds and can be mitigated by consistent, clear communication and good stakeholder engagement. Objective complexity is a real attribute of complex systems, and is a measure of the extent to which future states of the system cannot be predicted with certainty and precision, however good our knowledge of its current state and history (Sillitto 2009).
  
A number of authors, including (Weaver 1948), (Flood and Carson 1993) and (Lawson 2010), describe similar complexity categorizations which relate to these two kinds of complexity:
#Organized simplicity occurs when there are a small number of essential factors and a large number of less significant or insignificant factors. Initially, a situation may seem complex, but on investigation the less significant and insignificant factors are taken out of the picture and hidden simplicity is found. This is also the basis for the process of [[abstraction (glossary)]]: creating systems of greater general applicability, but with a lower level of detail.
#Organized complexity is prevalent in physical and abstract systems whose structure is organized so as to be understood, and which is thus amenable to scientific description of complex behaviors. This also forms the basis for structuring the engineering and [[Life Cycle Management (glossary)|life cycle management (glossary)]] of complex systems (Braha et al. 2006).
#Disorganized complexity occurs when there are many variables that exhibit a high level of unpredictable behavior. It can also result from a heterogeneous complex system evolving without explicit architectural control during its life (complexity creep).
#People-related complexity, where perception fosters a feeling of complexity or human behavior adds to complexity. People may be viewed as “observing systems” or as system elements which contribute to the other types of complexity (Axelrod and Cohen 1999). The rational or irrational behavior of individuals in particular situations is a vital factor in respect of complexity (Kline 1995).
  
(Senge 1990) identifies two fundamental forms of complexity in engineered systems: detail complexity and dynamic complexity. Detail complexity arises from the number of system elements and relationships, and relates to systems as they are; their static existence. Dynamic complexity, on the other hand, relates to the expected, and even unexpected, behavior of systems during their use in different problem scenarios.

The literature has evolved towards a fairly consistent description of the characteristics of system elements and relationships associated with objective systems complexity; the following summary is given by (Page 2009), and a small illustrative sketch follows the list:
#Independence: autonomous system elements which are able to make their own decisions, influenced by information from other elements and by the adaptability algorithms they carry with them (Sheard and Mostashari 2009).
#Interconnectedness: system elements connect via a physical connection, shared data, or simply a visual awareness of where the other elements are and what they are doing, as in the case of a flock of geese or a squadron of aircraft.
#Diversity: system elements which differ in some way, either technologically or functionally. Elements may be carrying different adaptability algorithms, for example.
#Adaptability: self-organizing system elements which can change what they do in order to support themselves, or the system as a whole, in response to their environment (Sheard and Mostashari 2009). Adaptability is often achieved by human elements, but it can also be achieved with software; (Pollock and Hodgson 2004) describe how this can be done in a variety of complex system types, including power grids and enterprise systems.
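
The toy simulation below is a hypothetical sketch, not a model from Page or the other cited sources; the agent behavior, parameters and the "disagreement" measure are illustrative assumptions. Each agent keeps its own heading (independence), can see the headings of the others (interconnectedness), turns toward the group average at its own individual rate (diversity), and keeps adjusting as the group changes (adaptability). No element is in charge, yet a shared heading emerges, loosely analogous to the flock of geese mentioned above.

<pre>
import math
import random

random.seed(1)

class Agent:
    def __init__(self, heading, turn_rate):
        self.heading = heading        # direction of travel, in radians
        self.turn_rate = turn_rate    # each agent adapts at a different rate

    def adapt(self, others):
        """Turn part of the way toward the mean heading of the visible agents."""
        mean_x = sum(math.cos(a.heading) for a in others)
        mean_y = sum(math.sin(a.heading) for a in others)
        target = math.atan2(mean_y, mean_x)
        diff = (target - self.heading + math.pi) % (2 * math.pi) - math.pi
        self.heading += self.turn_rate * diff

def disagreement(group):
    """1.0 means headings point every which way; 0.0 means full alignment."""
    x = sum(math.cos(a.heading) for a in group) / len(group)
    y = sum(math.sin(a.heading) for a in group) / len(group)
    return 1.0 - math.hypot(x, y)

agents = [Agent(random.uniform(-math.pi, math.pi), random.uniform(0.05, 0.3))
          for _ in range(30)]

for step in range(101):
    if step % 20 == 0:
        print(f"step {step:3d}   disagreement = {disagreement(agents):.3f}")
    for agent in agents:
        agent.adapt(agents)           # awareness of the rest of the group
</pre>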
  
The inclusion of people in a system is often a factor in its complexity, due to the variability of human behavior as part of the system and to the perceptions of people outside the system. (Sheard and Mostashari 2011) sort the attributes of complexity into causes and effects. Attributes that cause complexity include: many pieces, nonlinear, emergent, chaotic, adaptive, tightly coupled, self-organized, decentralized, open, political (vs. scientific), and multi-scale. The effects of those attributes, which make a system seem complex as perceived by people, include: uncertain, difficult to understand, unclear cause and effect, unpredictable, uncontrollable, unstable, unrepairable and unmaintainable, costly, and takes too long to build.
  
(Warfield 2006) developed a powerful methodology for addressing complex issues, particularly in the socio-economic field, based on a relevant group of people developing an understanding of the issue in the form of a set of interacting problems - what he called the “problematique”. The complexity is then characterized by several measures, such as the number of significant problems, their interactions, and the degree of consensus about the nature of the problems. Thus, what becomes clear is that how, why, where and by whom a system is used may all contribute to its perceived complexity.
Some of this complexity can be reduced by education, training or familiarity with a system; some must be managed as part of a problem or solution. (Checkland 1999) argues that a group of stakeholders will have its own world views which lead them to form different, but equally valid, understandings of a system context. These differences cannot be explained away or analyzed out, but must be understood and considered in the formulation of problems or the creation of potential solutions.
  
==Defining System Complexity==
(Sheard and Mostashari 2011) synthesize many of the ideas described above to categorize complexity as follows:
#Structural complexity looks at the system elements and relationships. In particular, it considers how many different ways system elements can be combined, and thus the potential for the system to adapt to external needs. (A simple combinatorial illustration is given after this list.)
#Dynamic complexity considers the complexity which can be observed when systems are used to perform particular tasks in an environment. There is a time element to dynamic complexity: the ways in which systems interact in the short term relate directly to system [[behavior (glossary)]], while the longer-term effects of using systems in an environment relate to system evolution.
#Socio-political complexity considers the effect of individuals or groups of people on complexity. This people-related complexity has two aspects. One relates to the perception of a situation as complex or not, due to multiple stakeholder viewpoints within a system context and to social or cultural biases which add to the wider influences on that context. The other involves either the irrational behavior of an individual, or the swarm effects of many people behaving individually in ways that make sense locally but whose emergent behavior is unpredicted and perhaps counterproductive. This latter type arises from the interactions of people according to their various interrelationships and is often graphed using systems dynamics formalisms.
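
A crude piece of arithmetic, offered only as an illustration and not as a formal measure from Sheard and Mostashari or the other cited sources, shows why structural complexity grows so quickly. If each element can be one of a few types and each pair of elements can either be connected or not, the number of possible configurations explodes even for modest numbers of elements, which hints at both the adaptive potential and the analytical difficulty of structurally complex systems.

<pre>
def configuration_count(n_elements, n_types=1):
    """Ways to assign a type to every element, multiplied by the number of
    possible connect/do-not-connect choices over all element pairs."""
    pairs = n_elements * (n_elements - 1) // 2
    return (n_types ** n_elements) * (2 ** pairs)

for n in (3, 5, 8, 12):
    print(f"{n:2d} elements, 3 possible types: about {configuration_count(n, 3):.2e} configurations")
</pre>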
 
 
Complexity is also in many ways a human concept. Looking at a hospital surgical unit from the perspective of an experienced nurse, a member of the cleaning staff, a software engineer writing code for a piece of medical equipment, a typically educated patient, or a patient from an African village flown into the hospital after a natural disaster, it is clear that the education, experience and knowledge of each person may radically change their understanding of the same system. Factors such as human values and beliefs, interests and capabilities, as well as notions and perceptions of systems, are all determinants of perceived complexity.
 
  
Thus, complexity is a measure of how difficult it is to understand how a system will behave or to predict the consequences of changing it. It occurs when there is no simple relationship between what an individual element does and what the system as a whole will do, and when the system includes some element of adaptation or problem solving to achieve its goals in different situations. It can be affected by objective attributes of a system, such as the number, types and diversity of system elements and relationships, or by the subjective perceptions of system observers, whether due to their experience, knowledge and training or to other socio-political considerations.
  
Complex systems of this kind are very much the sort of system for which a [[Systems Approach (glossary)]] is essential.
  
==Complexity and Engineered Systems==
  
The views of complexity are not independent when considered across a [[System Context (glossary)]]. Problem situations and potential solutions may contain both subjective and objective complexity, and the structural complexity of a system-of-interest may be related to dynamic complexity when that system-of-interest is used as part of a wider system in different problem scenarios. People are involved in most system contexts, both as system elements and as part of the operating environment, and they remain involved with systems throughout the lifetimes of those systems.
  
(Sheard and Mostashari 2011) also show how the different views of complexity map onto [[Product System (glossary)|Product Systems (glossary)]], [[Service System (glossary)|Service Systems (glossary)]] and [[Enterprise System (glossary)|Enterprise Systems (glossary)]], and onto the associated development and sustainment systems and project organizations. Ordered systems occur as system components and are the subject of traditional engineering; it is important to understand the behaviors of such systems when using them in a complex system. One might also need to consider truly random or chaotic natural or social systems as part of the context of an engineered system, although such systems cannot themselves be engineered. The main focus for systems approaches is organized complexity: the ways we choose to structure system elements so as to manage and mitigate both objective and subjective complexity.
  
(Sillitto 2009) considers the link between the types of system complexity and system architecture. The ability to understand, manage and respond to both objective and subjective complexity, whether in the problem situation, in the systems we develop, or in the systems we use to develop and sustain them, is a key component of the [[Systems Approach Applied to Engineered Systems|Systems Approach]] and hence of the practice of systems engineering.
  
 
==References==
 


===Works Cited===

Axelrod, R. and M. Cohen. 1999. Harnessing Complexity: Organizational Implications of a Scientific Frontier. New York, NY, USA: Simon and Schuster.

Braha, D., A. Minai, and Y. Bar-Yam (eds.). 2006. Complex Engineered Systems: Science Meets Technology. New York, NY, USA: Springer.

Checkland, P. 1999. Systems Thinking, Systems Practice. New York, NY, USA: John Wiley & Sons.

Flood, R.L. and E.R. Carson. 1993. Dealing with Complexity: An Introduction to the Theory and Application of Systems Science, 2nd ed. New York, NY, USA: Plenum Press.

Kellert, S. 1993. In the Wake of Chaos: Unpredictable Order in Dynamical Systems. Chicago, IL, USA: University of Chicago Press. p. 32.

Lawson, H.W. 2010. A Journey Through the Systems Landscape. Kings College, UK: College Publications.

Kline, S. 1995. Foundations of Multidisciplinary Thinking. Stanford, CA, USA: Stanford University Press.

Page, Scott E. 2009. Understanding Complexity. Chantilly, VA, USA: The Teaching Company.

Pollock, J.T. and R. Hodgson. 2004. Adaptive Information. Hoboken, NJ, USA: John Wiley & Sons.

Senge, P.M. 1990. The Fifth Discipline: The Art & Practice of The Learning Organization. New York, NY, USA: Doubleday/Currency.

Sheard, S.A. and A. Mostashari. 2008. "Principles of Complex Systems for Systems Engineering." Systems Engineering, 12(4): 295-311.

Sheard, S.A. and A. Mostashari. 2011. "Complexity Types: From Science to Systems Engineering." Proceedings of the 21st Annual International Council on Systems Engineering (INCOSE) International Symposium, 20-23 June 2011, Denver, CO, USA.

Sillitto H.G. 2009. "On Systems Architects and Systems Architecting: Some Thoughts on Explaining The Art and Science of System Architecting." Proceedings of the 19th Annual International Council on Systems Engineering (INCOSE) International Symposium, 20-23 July 2009, Singapore.

Warfield, J.N. 2006. An Introduction to Systems Science. London, UK: World Scientific Publishing.

Weaver, W. 1948. "Science and Complexity." American Scientist 36: 536-544.

===Primary References===

Page, Scott E. 2009. Understanding Complexity. Chantilly, VA, USA: The Teaching Company.

Flood, R.L. and E.R. Carson. 1993. Dealing with Complexity: An Introduction to the Theory and Application of Systems Science, 2nd ed. New York, NY, USA: Plenum Press.

Sheard, S.A. and A. Mostashari. 2008. "Principles of Complex Systems for Systems Engineering". Systems Engineering, 12(4): 295-311.

===Additional References===

Ashby, W.R. 1956. An Introduction to Cybernetics. London, UK: Chapman and Hall.

Aslaksen, E.W. 2004. "System Thermodynamics: A Model Illustrating Complexity Emerging from Simplicity". Systems Engineering, 7(3). Hoboken, NJ, USA: Wiley.

Aslaksen, E.W. 2009. Engineering Complex Systems: Foundations of Design in the Functional Domain. Boca Raton, FL, USA: CRC Press.

Aslaksen, E.W. 2011. "Elements of a Systems Engineering Ontology". Proceedings of SETE 2011, Canberra, Australia.

Eisner, H. 2005. Managing Complex Systems: Thinking Outside the Box. Hoboken, NJ, USA: John Wiley & Sons.

Jackson, S., D. Hitchins, and H. Eisner. 2010. "What is the Systems Approach?" INCOSE Insight, 13(1) (April 2010): 41-43.

MITRE. 2011. "Systems Engineering Strategies for Uncertainty and Complexity." Systems Engineering Guide. Accessed 9 March 2011.

Ryan, A. 2007. "Emergence Is Coupled to Scope, Not Level, Complexity". A condensed version appeared in INCOSE Insight, 11(1) (January 2008): 23-24.

