Complexity

----
'''''Lead Author:''''' ''Rick Adcock'', '''''Contributing Authors:''''' ''Hillary Sillitto, Sarah Sheard''
----
This article is part of the [[Systems Science]] knowledge area (KA). It gives the background of, and an indication of current thinking on, {{Term|Complexity (glossary)|complexity}} and how it influences {{Term|Systems Engineering (glossary)|systems engineering}} (SE) practice.

Complexity is one of the most important and difficult to define {{Term|System (glossary)|system}} {{Term|Concept (glossary)|concepts}}. Is a system's complexity in the eye of the beholder, or is there inherent complexity in how systems are organized? Is there a single definitive definition of complexity and, if so, how can it be assessed and {{Term|Measure (glossary)|measured}}? This topic will discuss how these ideas relate to the general definitions of a system given in [[What is a System?]], and in particular to the different {{Term|Engineered System (glossary)|engineered system}} {{Term|Context (glossary)|contexts}}. This article is closely related to the [[Emergence|emergence]] topic that follows it.
==Defining System Complexity==
Complexity has been considered by a number of authors from various perspectives; some of the discussions of complexity relevant to systems are described in the final section of this article. Sheard and Mostashari (2011) synthesize many of these ideas to categorize complexity as follows:
#'''Structural Complexity''' looks at the system elements and relationships. In particular, structural complexity looks at how many different ways system elements can be combined. Thus, it is related to the potential for the system to adapt to external needs.
#'''Dynamic Complexity''' considers the complexity which can be observed when systems are used to perform particular tasks in an environment. There is a time element to dynamic complexity. The ways in which systems interact in the short term are directly related to system behavior; the longer-term effects of using systems in an environment are related to system evolution.
#'''Socio-Political Complexity''' considers the effect of individuals or groups of people on complexity. People-related complexity has two aspects. One is related to the perception of a situation as complex or not complex, due to multiple stakeholder {{Term|Viewpoint (glossary)|viewpoints}} within a system context and social or cultural biases which add to the wider influences on a system context. The other involves either the “irrational” behavior of an individual or the swarm behavior of many people behaving individually in ways that make sense; however, the {{Term|Emergence (glossary)|emergent}} behavior is unpredicted and perhaps counterproductive. This latter type is based on the interactions of the people according to their various interrelationships and is often graphed using systems dynamics formalisms.
Thus, complexity is a measure of how difficult it is to understand how a system will behave or to predict the consequences of changing it. It occurs when there is no simple relationship between what an individual element does and what the system as a whole will do, and when the system includes some element of adaptation or problem solving to achieve its goals in different situations. It can be affected by objective attributes of a system, such as the number, types and diversity of its elements and relationships, or by the subjective perceptions of system observers due to their experience, knowledge, training, or other sociopolitical considerations.
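
As a simple worked illustration of how structural complexity grows (a sketch, not drawn from the cited sources): a system of <math>n</math> elements has up to <math>n(n-1)/2</math> possible pairwise relationships, and therefore on the order of <math>2^{n(n-1)/2}</math> possible structures that can be formed from them. Even for <math>n = 10</math> this gives <math>2^{45}</math> (roughly <math>3.5 \times 10^{13}</math>) candidate structures, which indicates why the number and diversity of elements and relationships are treated as drivers of objective complexity, and why exhaustive analysis of structure quickly becomes impractical.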
  
This view of complex systems provides insight into the kind of system for which systems thinking and a {{Term|Systems Approach (glossary)|systems approach}} are essential.
  
==Complexity and Engineered Systems==
  
The different perspectives on complexity are not independent when considered across a {{Term|System Context (glossary)|systems context}}. The structural complexity of a {{Term|System-of-Interest (glossary)|system-of-interest}} (SoI) may be related to dynamic complexity when the SoI also functions as part of a wider system in different problem scenarios. People are involved in most system contexts, as part of the problem situation, as system elements, and as part of the operating environment. The human activity systems which we create to identify, design, build and support an {{Term|Engineered System (glossary)|engineered system}}, and the wider social and business systems in which they sit, are also likely to be complex and to affect the complexity of the systems they produce and use.
  
Sheard and Mostashari (2011) show the ways different views of complexity map onto {{Term|Product System (glossary)|product system}}, {{Term|Service System (glossary)|service system}} and {{Term|Enterprise System (glossary)|enterprise system}} contexts, as well as to associated development and sustainment systems and project {{Term|Organization (glossary)|organizations}}. Ordered systems occur as system {{Term|Component (glossary)|components}} and are the subject of traditional {{Term|Engineering (glossary)|engineering}}. It is important to understand the behaviors of such systems when using them in a complex system. One might also need to consider truly random or chaotic natural or social systems as part of the context of an engineered system. The main focus for systems approaches is '''organized complexity''' (see below). This kind of complexity cannot be dealt with by traditional analysis techniques, nor can it be totally removed by the way we design or use solutions. A systems approach must be able to recognize and deal with such complexity across the life of the systems with which it interacts.
  
Sillitto (2014) considers the link between the types of system complexity and system {{Term|Architecture (glossary)|architecture}}. The ability to understand, manage and respond to both objective and subjective complexity in the problem situation, in the systems we develop, and in the systems we use to develop and sustain them, is a key component of the [[Systems Approach Applied to Engineered Systems]] and hence of the practice of systems engineering.
 
 
  
 
==Origins and Characteristics of Complexity==
  
This section describes some of the prevailing ideas on complexity. Various authors have used different language to express these ideas. While a number of common threads can be seen, some of the ideas take different {{Term|Viewpoint (glossary)|viewpoints}} and may be contradictory in nature.
  
One of the most widely used definitions of complexity is the degree of difficulty in predicting the properties of a system if the properties of the system's parts are given (generally attributed to Weaver). This, in turn, is related to the number of {{Term|Element (glossary)|elements}} and connections between them. Weaver (1948) relates complexity to the types of elements and how they interact. He describes simplicity as {{Term|Problem (glossary)|problems}} with a finite number of variables and interactions, and identifies two kinds of complexity:
#'''Disorganized Complexity''' is found in a system with many loosely coupled, disorganized and equal elements, which possesses certain average properties such as temperature or pressure.  Such a system can be described by “19th Century” statistical analysis techniques.
#'''Organized Complexity''' can be found in a system with many strongly coupled, organized and different elements which possess certain {{Term|Emergence (glossary)|emergent}} properties and phenomena such as those exhibited by economic, political or social systems.  Such a system cannot be described well by traditional analysis techniques.
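
The practical force of Weaver's distinction can be shown with a short simulation; the sketch below is illustrative only (the population size and the Gaussian "energy" distribution are arbitrary assumptions, not from Weaver). For a disorganized collection of many independent elements, an aggregate property is stable and predictable even though every individual element behaves randomly; strong coupling between elements is exactly what breaks this kind of statistical description.

<syntaxhighlight lang="python">
import random

# Disorganized complexity: many loosely coupled, statistically similar
# elements. Individuals are random, but aggregate ("average") properties
# are stable -- the case Weaver says 19th-century statistics handles well.
random.seed(1)
N = 100_000  # number of independent elements (arbitrary assumption)

# Each element's "energy" is an independent random draw.
energies = [random.gauss(10.0, 3.0) for _ in range(N)]

mean_energy = sum(energies) / N  # analogous to temperature or pressure
print(f"aggregate property (mean energy): {mean_energy:.3f}")
# Re-running with another seed changes every individual value, but the
# mean stays close to 10.0. Organized complexity breaks this: strongly
# coupled elements cannot be summarized by independent statistics.
</syntaxhighlight>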
  
Weaver's ideas about this new kind of {{Term|Complex (glossary)|complex}} problem are some of the foundational ideas of {{Term|Systems Thinking (glossary)|systems thinking}}. (See also [[Systems Thinking]].)
  
Later authors, such as Flood and Carson (1993) and Lawson (2010), expand organized complexity to systems which have been organized into a {{Term|Structure (glossary)|structure}} intended to be understood and thus amenable to {{Term|Engineering (glossary)|engineering}} and {{Term|Life Cycle Management (glossary)|life cycle management}} (Braha et al. 2006). They also suggest that disorganized complexity could result from a heterogeneous {{Term|Complex (glossary)|complex}} system evolving without explicit architectural {{Term|Control (glossary)|control}} during its life (complexity creep). This is a different use of the terms “organized” and “disorganized” from that used by Weaver; care should be taken when mixing these ideas.
  
Complexity should not be confused with "complicated".  Many authors make a distinction between ordered and disordered collections of elements.
  
Ordered systems have fixed relationships between elements and are not adaptable. Page (2009) cites a watch as an example of something which can be considered an ordered system.  Such a system is complicated, with many elements working together. Its {{Term|Component (glossary)|components}} are based on similar technologies, with clear mapping between form and function. If the operating {{Term|Environment (glossary)|environment}} changes beyond prescribed limits, or one key component is removed, the watch will cease to perform its {{Term|Function (glossary)|function}}.  
  
In common usage, {{Term|Chaos (glossary)|chaos}} is a state of disorder or unpredictability characterized by elements which are not interconnected and behave randomly with no adaptation or control. Chaos Theory (Kellert 1993) is applied to certain dynamic systems (e.g., the weather) which, although they have structure and relationships, exhibit unpredictable {{Term|Behavior (glossary)|behavior}}. These systems may include aspects of randomness but can be described using deterministic models, from which their behavior can in principle be predicted given a set of initial conditions. However, their structure is such that (un-measurably) small perturbations in inputs or environmental conditions may result in unpredictable changes in behavior. Such systems are referred to as deterministically chaotic or, simply, chaotic systems. {{Term|Simulation (glossary)|Simulations}} of chaotic systems can be created and, with increases in computing power, reasonable predictions of behavior are possible at least some of the time.
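
Deterministic chaos is easy to demonstrate. The sketch below is illustrative only; the logistic map is a standard textbook example of a chaotic system, not one discussed in the cited sources. It iterates a fully deterministic equation from two almost identical initial conditions and shows the trajectories diverging until prediction becomes useless.

<syntaxhighlight lang="python">
# Deterministic chaos via the logistic map: x_{n+1} = r * x_n * (1 - x_n).
# There is no randomness: the same initial value always yields the same
# trajectory. Yet an unmeasurably small difference in initial conditions
# grows until the two trajectories are unrelated.

def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-10)  # perturbation below any measurement

for n in (0, 10, 20, 30, 40, 50):
    print(f"step {n:2d}: |a - b| = {abs(a[n] - b[n]):.6f}")
# The gap grows from 1e-10 to order 1 within a few dozen steps, which is
# why short-term prediction can work while long-term prediction fails.
</syntaxhighlight>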
  
On a spectrum of order to complete disorder, complexity is somewhere in the middle, with more flexibility and change than complete order and more stability than complete disorder (Sheard and Mostashari 2009).  
  
Complex systems may evolve “to the edge of chaos,” resulting in systems which can appear deterministic but which exhibit counterintuitive behavior compared to that of more ordered systems. The statistics of chance events in a complex system are often characterized by a power-law distribution, the “signature of complexity” (Sheard 2005). The power-law distribution is found in a very wide variety of natural and man-made phenomena, and it means that the probability of a low-probability, large-impact event is much higher than a Gaussian distribution would suggest. Such a system may react in a non-linear way, exhibiting abrupt phase changes. These phase changes can be either reversible or irreversible. This has a major impact on engineered systems in terms of the occurrence, impact and public acceptance of {{Term|Risk (glossary)|risk}} and failure.
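
The difference between power-law and Gaussian statistics can be made concrete with a small calculation. The sketch below is illustrative; the Pareto tail exponent and the thresholds are assumptions chosen for the example, not values from Sheard (2005).

<syntaxhighlight lang="python">
import math

# Compare tail probabilities P(deviation > k) for a standard Gaussian and
# for a heavy-tailed power law (Pareto with minimum 1, alpha = 2 assumed).
# The point: extreme events are vastly more likely under a power law.

def gaussian_tail(k):
    """P(Z > k) for a standard normal variable."""
    return 0.5 * math.erfc(k / math.sqrt(2.0))

def pareto_tail(k, alpha=2.0):
    """P(X > k) for a Pareto variable with minimum 1 and shape alpha."""
    return k ** -alpha

for k in (2, 4, 8, 16):
    g, p = gaussian_tail(k), pareto_tail(k)
    print(f"k={k:2d}: Gaussian {g:.2e}   power law {p:.2e}   ratio {p/g:.1e}")
# At k=8 the Gaussian tail is ~6e-16 while the power-law tail is ~1.6e-2:
# an event the Gaussian model calls impossible is merely uncommon here.
</syntaxhighlight>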
  
'''Objective complexity''' is an attribute of complex systems and is a measure of where a system sits on this spectrum. It is defined as the extent to which future {{Term|State (glossary)|states}} of the system cannot be predicted with certainty and precision, regardless of our knowledge of current state and history. '''Subjective complexity''' is a measure of how easy it is for an observer to understand a system or predict what it will do next. As such, it is a function of the perspective and comprehension of each individual. It is important to be prepared to mitigate subjective complexity with consistent, clear communication and strong {{Term|Stakeholder (glossary)|stakeholder}} engagement (Sillitto 2009).
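
One simple, admittedly reductive proxy for objective complexity in this sense (an illustrative sketch; neither the metric nor the example state sequences come from the cited sources) is the average Shannon entropy of a system's observed state transitions: zero when the next state is fully determined by the current one, and growing as knowledge of the current state tells an observer less and less about what happens next.

<syntaxhighlight lang="python">
import math
from collections import Counter, defaultdict

def transition_entropy(history):
    """Average Shannon entropy (bits) of next state given current state.
    0.0 means perfectly predictable; higher means harder to predict."""
    nexts = defaultdict(Counter)
    for current, nxt in zip(history, history[1:]):
        nexts[current][nxt] += 1
    total = len(history) - 1
    h = 0.0
    for counts in nexts.values():
        n = sum(counts.values())
        h_state = -sum((c / n) * math.log2(c / n) for c in counts.values())
        h += (n / total) * h_state  # weight by how often the state occurs
    return h

ordered = ["A", "B", "C"] * 40                       # watch-like fixed cycle
erratic = ["A", "B", "C", "B", "A", "C", "C", "A",
           "B", "A", "C", "B"] * 10                  # mixed transitions
print(f"ordered system: {transition_entropy(ordered):.2f} bits")
print(f"erratic system: {transition_entropy(erratic):.2f} bits")
</syntaxhighlight>
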
The literature has evolved to a fairly consistent definition of the characteristics of system elements and relationships for objective systems complexity. The following summary is given by Page (2009):
#'''Independence''': Autonomous system elements which are able to make their own decisions, influenced by information from other elements and the adaptability algorithms they carry with them (Sheard and Mostashari 2009).
#'''Interconnectedness''': System elements connect via a physical connection, shared data or simply a visual awareness of where the other elements are and what they are doing, as in the case of a flock of geese or a squadron of aircraft.
#'''Diversity''': System elements which are either technologically or functionally different in some way. For example, elements may be carrying different {{Term|Adaptability (glossary)|adaptability}} algorithms.
#'''Adaptability''': Self-organizing system elements which can do what they want to do to support themselves or the entire system in response to their environment (Sheard and Mostashari 2009). Adaptability is often achieved by human elements but can be achieved with software. Pollock and Hodgson (2004) describe how this can be done in a variety of complex system types, including power grids and {{Term|Enterprise (glossary)|enterprise}} systems.  
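
These four characteristics can be seen working together in a minimal agent-based sketch; the movement rules below are arbitrary assumptions invented for illustration, not a model taken from Page or from Sheard and Mostashari. Each agent decides for itself (independence), reacts to the positions of the others (interconnectedness), carries a different adaptation weight (diversity), and adjusts its state in response to the group (adaptability); the clustering that results is not coded into any single agent.

<syntaxhighlight lang="python">
import random

# A toy one-dimensional "flock" illustrating Page's four characteristics.
random.seed(0)

class Agent:
    def __init__(self, x, weight):
        self.x = x            # position on a line, for simplicity
        self.weight = weight  # diversity: each agent adapts differently

    def step(self, others):
        center = sum(o.x for o in others) / len(others)    # interconnectedness
        drift = random.uniform(-1.0, 1.0)                  # independence
        self.x += self.weight * (center - self.x) + drift  # adaptability

agents = [Agent(random.uniform(0.0, 100.0), random.uniform(0.05, 0.3))
          for _ in range(20)]

for _ in range(50):
    for a in agents:
        a.step([o for o in agents if o is not a])

spread = max(a.x for a in agents) - min(a.x for a in agents)
print(f"spread after 50 steps: {spread:.1f}")  # agents cluster: emergent order
</syntaxhighlight>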
  
Due to the variability of human behavior as part of a system and the perceptions of people outside the system, the inclusion of people in a system is often a factor in its complexity. People may be viewed as observing systems or as system elements which contribute to the other types of complexity (Axelrod and Cohen 1999). The rational or irrational behavior of individuals in particular situations is a vital factor with respect to complexity (Kline 1995). Some of this complexity can be reduced through education, training and familiarity with a system. Some is irreducible and must be managed as part of a problem or solution. Checkland (1999) argues that the members of a group of stakeholders will each have their own world views, which lead them to form different, but equally valid, understandings of a system context. These differences cannot be explained away or analyzed out, and must be understood and considered in the formulation of problems and the creation of potential solutions.
  
Warfield (2006) developed a powerful methodology for addressing complex issues, particularly in the socio-economic field, based on a relevant group of people developing an understanding of the issue in the form of a set of interacting problems, which he called the “problematique”. The complexity is then characterized via several {{Term|Measure (glossary)|measures}}, such as the number of significant problems, their interactions and the degree of consensus about the nature of the problems. What becomes clear is that how, why, where and by whom a system is used may all contribute to its perceived complexity.
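
Warfield's characterization lends itself to simple graph measures. In the sketch below, the problem names, interaction pairs and consensus scores are hypothetical values invented for illustration; only the choice of measures follows the description above.

<syntaxhighlight lang="python">
# A toy "problematique": significant problems, their interactions, and the
# degree of stakeholder consensus about each (all values hypothetical).
problems = ["funding", "staffing", "regulation", "legacy IT"]
interactions = {("funding", "staffing"), ("funding", "legacy IT"),
                ("staffing", "legacy IT"), ("regulation", "legacy IT")}
consensus = {"funding": 0.9, "staffing": 0.6,    # 1.0 = full agreement
             "regulation": 0.3, "legacy IT": 0.5}

n = len(problems)
k = len(interactions)
density = k / (n * (n - 1) / 2)   # share of possible interactions present
avg_consensus = sum(consensus.values()) / n

print(f"significant problems: {n}")
print(f"interactions: {k} (density {density:.2f})")
print(f"mean consensus: {avg_consensus:.2f}")
# More problems, denser interaction and lower consensus all push the
# situation toward the complex end of Warfield's scale.
</syntaxhighlight>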
  
Sheard and Mostashari (2011) sort the attributes of complexity into causes and effects. Attributes that cause complexity include being non-linear; emergent; chaotic; adaptive; tightly coupled; self-organized; decentralized; open; political (as opposed to scientific); and multi-scale; as well as having many pieces. The effects of these attributes, which cause a system to be perceived as complex, include being uncertain; difficult to understand; unpredictable; uncontrollable; unstable; unrepairable; unmaintainable and costly; having unclear cause and effect; and taking too long to build.
  
==Complexity and its Relationship with Difficulty==
  
The Collins Dictionary (2024) defines difficulty as “a task, problem, etc., that is hard to deal with”, while the Oxford English Dictionary (2024) defines difficult as “needing much effort or skill to accomplish, deal with, or understand”.
  
There are many types of systems that fall into the difficult category: complex, complicated and constrained systems are all examples of difficult systems from the perspective of systems engineering. Complicated or intricate systems can be hard to understand, and hence difficult. Complex systems are hard to predict, either because they are not fully understood or because simple cause-and-effect reasoning breaks down within them, and hence are also difficult. Achieving an outcome when heavily constrained by cost, resource or time can also be very difficult.

A richer understanding of complex systems, and proposed definitions for complex, complicated and simple systems, are available in ''A Complexity Primer for Systems Engineers'' (INCOSE 2021).
  
 
==References==
 
   
 
   
===Works Cited===
  
Axelrod, R. and M. Cohen. 1999. ''Harnessing Complexity: Organizational Implications of a Scientific Frontier''. New York, NY, USA: Simon and Schuster.
  
Braha, D., A. Minai, and Y. Bar-Yam (eds.). 2006. ''Complex Engineered Systems: Science Meets Technology''. New York, NY, USA: Springer.
  
Checkland, P. 1999. ''Systems Thinking, Systems Practice''. New York, NY, USA: John Wiley & Sons.
  
Flood, R. L., and E.R. Carson. 1993. ''Dealing with Complexity: An Introduction to The Theory and Application of Systems Science'', 2nd ed. New York, NY, USA: Plenum Press.
  
Kellert, S. 1993. ''In the Wake of Chaos: Unpredictable Order in Dynamical Systems''. Chicago, IL, USA: University of Chicago Press.
  
Kline, S. 1995. ''Foundations of Multidisciplinary Thinking''. Stanford, CA, USA: Stanford University Press.
  
Lawson, H. W. 2010. ''A Journey Through the Systems Landscape''.  Kings College, UK: College Publications.
  
Page, Scott E. 2009. ''Understanding Complexity''. Chantilly, VA, USA: The Teaching Company.
  
Pollock, J.T. and R. Hodgson. 2004. ''Adaptive Information''. Hoboken, NJ, USA: John Wiley & Sons.
  
Senge, P.M. 1990. ''The Fifth Discipline: The Art & Practice of The Learning Organization''. New York, NY, USA: Doubleday/Currency.
  
Sheard, S.A. 2005. "Practical applications of complexity theory for systems engineers." ''Proceedings of the Fifteenth Annual International Council on Systems Engineering (INCOSE) International Symposium'', vol. 15, no. 1.
  
Sheard, S.A. and A. Mostashari. 2009. "Principles of complex systems for systems engineering." ''Systems Engineering'', vol. 12, no. 4, pp. 295-311.
  
Sheard, S.A. and A. Mostashari. 2011. "Complexity types: From science to systems engineering." Proceedings of the 21st Annual International Council on Systems Engineering (INCOSE) International Symposium, Denver, Colorado, USA, 20-23 June 2011.
  
Sillitto, H. 2014. ''Architecting Systems: Concepts, Principles and Practice''. London, UK: College Publications.
  
Warfield, J.N. 2006. ''An Introduction to Systems Science''.  London, UK: World Scientific Publishing.

Weaver, W. 1948. "Science and complexity." ''American Scientist'', vol. 36, pp. 536-544.
  
 
===Primary References===
Flood, R.L. and E.R. Carson. 1993. ''[[Dealing with Complexity]]: An Introduction to The Theory and Application of Systems Science'', 2nd ed. New York, NY, USA: Plenum Press.
  
INCOSE. 2021. ''A Complexity Primer for Systems Engineers'', Revision 1.
  
Page, Scott E. 2009. ''[[Understanding Complexity]]''. Chantilly, VA, USA: The Teaching Company.

Sheard, S.A. and A. Mostashari. 2009. "[[Principles of Complex Systems for Systems Engineering|Principles of complex systems for systems engineering]]". ''Systems Engineering'', vol. 12, no. 4, pp. 295-311.
  
 
===Additional References===
  
Ashby, W.R. 1956. ''An Introduction to Cybernetics''. London, UK: Chapman and Hall.
  
Aslaksen, E.W. 2004. "System thermodynamics: A model illustrating complexity emerging from simplicity". ''Systems Engineering'', vol. 7, no. 3. Hoboken, NJ, USA: Wiley.
  
Aslaksen, E.W. 2009. ''Engineering Complex Systems: Foundations of Design in the Functional Domain''.  Boca Raton, FL, USA: CRC Press.
  
Aslaksen, E.W. 2011. "Elements of a systems engineering ontology". Proceedings of SETE 2011, Canberra, Australia.
  
Eisner, H. 2005. ''Managing Complex Systems: Thinking Outside the Box''. Hoboken, NJ, USA: John Wiley & Sons.
  
Jackson, S., D. Hitchins, and H. Eisner. 2010. “What is the Systems Approach?” INCOSE ''Insight'', vol. 13, no. 1 (April 2010), pp. 41-43.
  
MITRE. 2011. "Systems engineering strategies for uncertainty and complexity."  ''Systems Engineering Guide.''  Accessed 9 March 2011. Available at: http://www.mitre.org/work/systems_engineering/guide/enterprise_engineering/comprehensive_viewpoint/sys_engineering_strategies_uncertainty_complexity.html.
  
Ryan, A. 2007. "Emergence is coupled to scope, not level." ''Complexity''. A condensed version appeared in INCOSE ''Insight'', vol. 11, no. 1 (January 2008), pp. 23-24.

Sillitto, H.G. 2009. "On systems architects and systems architecting: Some thoughts on explaining the art and science of system architecting." Proceedings of the 19th Annual International Council on Systems Engineering (INCOSE) International Symposium, Singapore, 20-23 July 2009.
----
 
<center>[[Systems Approaches|< Previous Article]] | [[Systems Science|Parent Article]] | [[Emergence|Next Article >]]</center>
<center>'''SEBoK v. 2.10, released 06 May 2024'''</center>
[[Category:Part 2]][[Category:Topic]]
[[Category:Systems Science]]
