Complexity
This article forms part of the Systems Fundamentals Knowledge Area. It gives the background and an indication of current thinking on complexity and how it influences Systems Engineering Practice.
Complexity is one of the most important and difficult to define system concepts. Is a system's complexity in the eye of the beholder, or is there inherent complexity in how systems are organized? Is there a single definitive definition of complexity and, if so, how can it be assessed and measured? This article discusses how these ideas relate to the general definitions of system given in What is a System?, and in particular to the different engineered system contexts. This article is closely related to the Emergence article which follows it.
Origins and Characteristics of Complexity
In this section some of the prevailing ideas on complexity are described. Various authors have used different language to express these ideas and, while a number of common threads can be seen, some of the ideas take different viewpoints or are even contradictory.
One of the most widely used definitions of complexity is the degree of difficulty in predicting the properties of a system when the properties of the system's parts are given (generally attributed to Weaver). This, in turn, is related to the number of elements in a system and the connections between them.
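As a simple illustration of this scale effect (a standard combinatorial observation, not a result from the cited sources), the number of potential pairwise connections grows much faster than the number of elements:

    possible pairwise connections = n(n - 1) / 2

so a system of 10 elements has up to 45 potential pairwise interactions, while a system of 100 elements has up to 4,950, before any higher-order interactions are considered.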
Warren Weaver (Weaver 1948) relates complexity to the types of elements in a system and how they interact. He describes Simplicity as problems involving a small number of variables and interactions, and identifies two kinds of complexity:
- Disorganized Complexity occurs in a system with many loosely coupled, disorganized and equal elements, which possesses certain average properties such as temperature or pressure (a minimal numerical sketch of this averaging effect follows this list). Such a system can be described by “19th Century” statistical analysis techniques.
- Organized Complexity can be found in a system with many strongly coupled, organized and different elements, which possesses certain emergent properties and phenomena such as those exhibited by economic, political or social systems. Such a system cannot be described well by traditional analysis techniques.
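The following sketch is an assumed illustration (not an example from Weaver) of the statistical point above: although each element behaves randomly, the average over a large population of loosely coupled, similar elements is a stable, predictable property.

```python
import random

# Illustrative sketch of disorganized complexity: each element behaves randomly,
# yet the average over many loosely coupled, similar elements is a stable,
# predictable property, analogous to temperature or pressure.
random.seed(0)

for population in (10, 1_000, 100_000):
    # per-element "energy" drawn at random; individual values are unpredictable
    energies = [random.uniform(0.0, 2.0) for _ in range(population)]
    mean = sum(energies) / population
    print(f"{population:>7} elements: average energy = {mean:.3f} (expected 1.000)")
```

The larger the population, the closer the observed average stays to the expected value, which is why statistical techniques describe this kind of complexity well.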
Weaver's identification of this new kind of complex problem is one of the foundational ideas of systems thinking.
Later authors, such as (Flood and Carson 1993) and (Lawson 2010), expand the idea of Organized Complexity to include systems which have been organized into a structure intended to be understood, and which are thus amenable to engineering and Life Cycle Management (Braha et al. 2006). Disorganized Complexity can also result from a heterogeneous complex system evolving without explicit architectural control during its life (complexity creep).
However, complexity should not be confused with “complicated”. Complexity is a system property related to the kinds of elements and their relationships, not simply to their number.
Ordered systems have fixed relations between elements and are not adaptable. (Page 2009) cites a watch as an example of something which can be considered an ordered system. Such a system is complicated, with many elements working together. Its components are based on similar technologies with a clear mapping between form and function. If the operating environment changes outside prescribed limits or one key component is removed, the watch will cease to perform its function.
In common usage, chaos is a state of disorder or unpredictability, characterized by elements which are not interconnected and which behave randomly with no adaptation or control. Chaos Theory (Kellert 1993) is applied to certain dynamic systems (e.g., the weather) which, although they have structure and relationships, exhibit unpredictable behavior. These systems may include aspects of randomness but can be modeled deterministically, so that their behavior can be predicted from a given set of initial conditions. However, their structure is such that (un-measurably) small perturbations in inputs or environmental conditions may result in unpredictable changes in behavior. Such systems are referred to as Deterministically Chaotic, or simply Chaotic Systems. Simulations of chaotic systems can be created and, with increases in computing power, reasonable predictions of behavior are possible at least some of the time.
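A minimal sketch of deterministic chaos (an illustration using the well-known logistic map, not an example taken from Kellert) shows how two trajectories that start almost identically, governed by exactly the same deterministic rule, rapidly diverge:

```python
# The logistic map x -> r*x*(1-x) is a classic deterministically chaotic system.
# Two runs differing by one part in a million in their initial condition soon
# produce completely different trajectories.

def logistic_trajectory(x0, r=3.9, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000)
b = logistic_trajectory(0.200001)  # perturbed initial condition

for step in (0, 10, 20, 30, 40, 50):
    print(f"step {step:2d}: {a[step]:.6f} vs {b[step]:.6f}  (difference {abs(a[step] - b[step]):.6f})")
```

Short-term prediction remains possible, but the unmeasurably small difference in starting conditions makes long-term prediction unreliable.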
On a spectrum of order to complete disorder, complexity is somewhere in the middle, with more flexibility and change than complete order, and more stability than complete disorder (Sheard and Mostashari 2009).
Complex systems may evolve “to the edge of chaos”, resulting in systems which can appear deterministic but which exhibit counterintuitive behavior compared to that of more ordered systems. The statistics of chance events in a complex system are often characterized by a Power Law distribution, the “signature of complexity” (Sheard 2005). The Power Law distribution is found in a very wide variety of natural and man-made phenomena, and it means that the probability of a low-probability, high-impact event is much higher than a Gaussian distribution would suggest. Such a system may react in a non-linear way, exhibiting abrupt “phase changes” which may be reversible or irreversible. This has a major impact on engineered systems in terms of the occurrence, impact and public acceptance of risk and failure.
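The practical effect of heavy tails can be shown with a small, purely illustrative comparison (the distributions and parameters below are assumptions for illustration, not taken from Sheard): a power-law (Pareto) distribution and a Gaussian matched to the same mean and standard deviation assign very different probabilities to extreme events.

```python
import math

# Compare tail probabilities P(X > x) for a Pareto (power-law) distribution and a
# Gaussian matched to the same mean and standard deviation.  Parameters are assumed
# for illustration only.
alpha, xm = 3.0, 1.0                                              # Pareto shape and scale
mean = alpha * xm / (alpha - 1)                                   # = 1.5
std = math.sqrt(alpha * xm**2 / ((alpha - 1)**2 * (alpha - 2)))   # ~0.87

def pareto_tail(x):
    return (xm / x) ** alpha if x >= xm else 1.0

def gaussian_tail(x):
    z = (x - mean) / std
    return 0.5 * math.erfc(z / math.sqrt(2))

for x in (3.0, 5.0, 8.0):
    print(f"P(X > {x}): power law = {pareto_tail(x):.2e}   Gaussian = {gaussian_tail(x):.2e}")
```

The power-law tail decays polynomially rather than exponentially, so the most extreme (highest-impact) events are orders of magnitude more likely than a Gaussian model would suggest.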
Objective complexity is an attribute of complex systems and is a measure of where a system sits on this spectrum. It is defined as the extent to which future states of the system cannot be predicted with certainty and precision, however good our knowledge of its current state and history. Subjective complexity is a measure of how easy it is for an observer to understand a system or predict what it will do next; as such, it is a function of the perspective and comprehension of each individual. It is important to be prepared to mitigate subjective complexity through consistent, clear communication and good stakeholder engagement (Sillitto 2009).
The literature has converged on a fairly consistent description of the characteristics of system elements and relationships associated with objective system complexity; the following summary is given by (Page 2009):
- Independence: Autonomous system elements which are able to make their own decisions, influenced by information from other elements and by the adaptability algorithms they carry with them (Sheard and Mostashari 2009).
- Interconnectedness: System elements connect via a physical connection, shared data, or simply a visual awareness of where the other elements are and what they are doing, as in the case of a flock of geese or a squadron of aircraft.
- Diversity: System elements which are different in some way, either technologically or functionally. Elements may be carrying different adaptability algorithms, for example.
- Adaptability: Self-organizing system elements which can adapt what they do, to support themselves or the entire system, in response to their environment (Sheard and Mostashari 2009). Adaptability is often achieved by human elements, but can also be achieved with software. (Pollock and Hodgson 2004) describe how this can be done in a variety of complex system types, including power grids and enterprise systems. A minimal sketch of how such local adaptation can produce organized group behavior follows this list.
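The sketch below is illustrative only (the rule and numbers are assumptions, not taken from the cited sources). Each element repeatedly adjusts its own heading toward the average heading of the few elements it can “see”; despite there being no central controller, the group as a whole becomes aligned, showing how independence, interconnectedness and adaptability combine to produce organized behavior.

```python
import random

# Toy self-organization model: each element adapts its heading toward the average
# of its local neighbours.  Group-level alignment emerges from purely local rules.
random.seed(1)
headings = [random.uniform(0.0, 360.0) for _ in range(20)]   # 20 elements, random headings

def spread(values):
    return max(values) - min(values)

print(f"initial spread of headings: {spread(headings):6.1f} degrees")

for step in range(1, 31):
    updated = []
    for i, h in enumerate(headings):
        neighbours = headings[max(0, i - 2): i + 3]          # local visibility only
        target = sum(neighbours) / len(neighbours)
        updated.append(h + 0.5 * (target - h))               # adapt part-way toward neighbours
    headings = updated
    if step % 10 == 0:
        print(f"after {step:2d} steps: spread = {spread(headings):6.1f} degrees")
```

The emergent alignment is not specified anywhere in an individual element's rule; it is a property of the interacting set.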
The inclusion of people in a system is often a factor in its complexity, due to the variability of human behavior as part of a system and the perceptions of people outside the system. People may be viewed as “observing systems” or as system elements which contribute to the other types of complexity (Axelrod and Cohen 1999). The rational or irrational behavior of individuals in particular situations is, of course, a vital factor with respect to complexity (Kline 1995). Some of this complexity can be reduced by education, training or familiarity with a system; some is irreducible and must be managed as part of a problem or solution. (Checkland 1999) argues that a group of stakeholders will have its own world views, which lead its members to form different, but equally valid, understandings of a system context. These differences cannot be explained away or analyzed out, but must be understood and considered in the formulation of problems or the creation of potential solutions.
(Warfield 2006) developed a powerful methodology for addressing complex issues, particularly in the socio-economic field, based on a relevant group of people developing an understanding of the issue in the form of a set of interacting problems - what he called the “problematique”. The complexity is then characterized by several measures, such as the number of significant problems, their interactions, and the degree of consensus about the nature of the problems. Thus, what becomes clear is that how, why, where and by whom a system is used may all contribute to its perceived complexity.
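As a purely hypothetical illustration (the problem names, data, and measures below are invented for this sketch and are not Warfield's published method), such measures can be derived from a simple record of which problems each stakeholder believes interact:

```python
# Hypothetical sketch: count problems, distinct problem interactions, and a crude
# consensus score (the fraction of claimed interactions all stakeholders agree on).
problems = ["funding", "staff turnover", "unclear requirements", "schedule pressure"]

# Each stakeholder's view: the set of (cause, effect) pairs they consider significant.
stakeholder_views = [
    {("funding", "staff turnover"), ("unclear requirements", "schedule pressure")},
    {("funding", "staff turnover"), ("schedule pressure", "staff turnover")},
    {("funding", "staff turnover"), ("unclear requirements", "schedule pressure"),
     ("schedule pressure", "staff turnover")},
]

all_interactions = set().union(*stakeholder_views)
agreed = set.intersection(*stakeholder_views)

print(f"significant problems : {len(problems)}")
print(f"distinct interactions: {len(all_interactions)}")
print(f"full-consensus share : {len(agreed)} of {len(all_interactions)} interactions")
```

The more problems, the more interactions, and the less consensus present, the more complex the problematique.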
(Sheard and Mostashari 2011) sort the attributes of complexity into causes and effects. Attributes that cause complexity include: many pieces, nonlinear, emergent, chaotic, adaptive, tightly coupled, self-organized, decentralized, open, political (vs. scientific), and multi-scale. The effects of these attributes, which make a system seem complex (often as perceived by people), include: uncertain, difficult to understand, unclear cause and effect, unpredictable, uncontrollable, unstable, unrepairable and unmaintainable, costly, and takes too long to build.
Defining System Complexity
(Sheard and Mostashari 2011) synthesize many of the ideas described above to categorize complexity as follows:
- Structural Complexity looks at the system elements and relationships. In particular, structural complexity considers how many different ways system elements can be combined, and is thus related to the potential for the system to adapt to external needs (a simple counting sketch follows this list).
- Dynamic Complexity considers the complexity which can be observed when systems are used to perform particular tasks in an environment. There is a time element to dynamic complexity: the ways in which systems interact in the short term relate directly to system behavior, while the longer-term effects of using systems in an environment relate to system evolution.
- Finally, Socio-political Complexity considers the effect of individuals or groups of people on complexity. People-related complexity has two aspects. One is related to the perception of a situation as complex or not, due to multiple stakeholder viewpoints within a system context and social or cultural biases which add to the wider influences on a system context. The other involves either the “irrational” behavior of an individual, or the swarm effects of many people each behaving in ways that make sense individually but whose emergent collective behavior is unpredicted and perhaps counterproductive. This latter type arises from the interactions of the people according to their various interrelationships and is often graphed using system dynamics formalisms.
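The sketch below is an illustrative example only (the system elements and the metric are assumptions for this article, not a measure defined by Sheard and Mostashari); it shows the kind of simple counting often used as a first proxy for structural complexity.

```python
# Simple structural-complexity proxy: count elements, interfaces and element types
# in a (hypothetical) system model.
elements = {                    # element name -> element type
    "sensor A": "sensor",
    "sensor B": "sensor",
    "controller": "software",
    "actuator": "mechanical",
    "operator": "human",
}

interfaces = {                  # undirected connections between elements
    ("sensor A", "controller"),
    ("sensor B", "controller"),
    ("controller", "actuator"),
    ("operator", "controller"),
}

n = len(elements)
max_interfaces = n * (n - 1) // 2                 # every element connected to every other
diversity = len(set(elements.values()))           # number of distinct element types

print(f"elements              : {n}")
print(f"interfaces            : {len(interfaces)} of a possible {max_interfaces}")
print(f"element type diversity: {diversity}")
print(f"connectivity density  : {len(interfaces) / max_interfaces:.2f}")
```

Counts like these say nothing about dynamic or socio-political complexity, which is precisely why the three categories above are considered together.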
Thus, complexity is a measure of how difficult it is to understand how a system will behave or to predict the consequences of changing it. It occurs when there is no simple relationship between what an individual element does and what the system as a whole will do, and when the system includes some element of adaptation or problem solving to achieve its goals in different situations. It can be affected by objective attributes of a system, such as the number, types and diversity of system elements and relationships, or by the subjective perceptions of system observers, whether due to their experience, knowledge or training, or to other socio-political considerations.
This view of complex systems provides insight into the kind of system for which Systems Thinking and a systems approach are essential.
Complexity and Engineered Systems
The views of complexity described above are not independent when considered across a system context. Problem situations and potential solutions may contain both subjective and objective complexity, and the structural complexity of a system-of-interest may be related to dynamic complexity when the system-of-interest is used as part of a wider system in different problem scenarios. People are involved in most system contexts, both as system elements and as part of the operating environment. People are also involved with systems throughout the lifetimes of those systems.
(Sheard and Mostashari 2011) also show how the different views of complexity map onto product system, service system and enterprise system contexts, and onto the associated Development and Sustainment systems and Project organizations. Ordered systems occur as system components and are the subject of traditional engineering; it is important to understand the behaviors of such systems when using them in a complex system. One might also need to consider truly random or chaotic natural or social systems as part of the context of an engineered system. The main focus for systems approaches is Organized Complexity: the ways we choose to structure system elements to help manage and mitigate both objective and subjective complexity.
(Sillitto 2009) considers the link between the types of system complexity and system architecture. The ability to understand, manage and respond to both objective and subjective complexity, whether in the problem situation, the systems we develop, or the systems we use to develop and sustain them, is a key component of the Systems Approach Applied to Engineered Systems and hence of the practice of Systems Engineering.
References
Works Cited
Axelrod, R. and M. Cohen. 1999. Harnessing Complexity: Organizational Implications of a Scientific Frontier. New York, NY, USA: Simon and Schuster.
Braha, D., A. Minai, and Y. Bar-Yam (eds.). 2006. Complex Engineered Systems: Science Meets Technology. New York, NY, USA: Springer.
Checkland, P. 1999. Systems Thinking, Systems Practice. New York, NY, USA: John Wiley & Sons.
Flood, R.L. and E.R. Carson. 1993. Dealing with Complexity: An Introduction to The Theory and Application of Systems Science, 2nd ed. New York, NY, USA: Plenum Press.
Lawson, H. W. 2010. A Journey Through the Systems Landscape. Kings College, UK: College Publications.
Kellert, S. 1993. In the Wake of Chaos: Unpredictable Order in Dynamical Systems. Chicago, IL, USA: University of Chicago Press. p. 32.
Kline, S. 1995. Foundations of Multidisciplinary Thinking. Stanford, CA, USA: Stanford University Press.
Page, S.E. 2009. Understanding Complexity. Chantilly, VA, USA: The Teaching Company.
Pollock, J.T. and R. Hodgson. 2004. Adaptive Information. Hoboken, NJ, USA: John Wiley & Sons.
Senge, P.M. 1990. The Fifth Discipline: The Art & Practice of The Learning Organization. New York, NY, USA: Doubleday/Currency.
Sheard, S.A. and A. Mostashari. 2009. "Principles of Complex Systems for Systems Engineering." Systems Engineering, 12(4): 295-311.
Sheard, S.A. and A. Mostashari. 2011. "Complexity Types: From Science to Systems Engineering." Proceedings of the 21st Annual International Council on Systems Engineering (INCOSE) International Symposium, 20-23 June 2011, Denver, Colorado, USA.
Sillitto, H.G. 2009. "On Systems Architects and Systems Architecting: Some Thoughts on Explaining The Art and Science of System Architecting." Proceedings of the 19th Annual International Council on Systems Engineering (INCOSE) International Symposium, 20-23 July 2009, Singapore.
Weaver, W. 1948. "Science and Complexity." American Scientist, 36: 536-544.
Warfield, J.N. 2006. An Introduction to Systems Science. London, UK: World Scientific Publishing.
Primary References
Page, S.E. 2009. Understanding Complexity. Chantilly, VA, USA: The Teaching Company.
Flood, R.L. and E.R. Carson. 1993. Dealing with Complexity: An Introduction to The Theory and Application of Systems Science, 2nd ed. New York, NY, USA: Plenum Press.
Sheard, S.A. and A. Mostashari. 2009. "Principles of Complex Systems for Systems Engineering." Systems Engineering, 12(4): 295-311.
Additional References
Ashby, W.R. 1956. An Introduction to Cybernetics. London, UK: Chapman and Hall.
Aslaksen, E.W. 2004. "System Thermodynamics: A Model Illustrating Complexity Emerging from Simplicity". Systems Engineering, 7(3). Hoboken, NJ, USA: Wiley.
Aslaksen, E.W. 2009. Engineering Complex Systems: Foundations of Design in the Functional Domain. Boca Raton, FL, USA: CRC Press.
Aslaksen, E.W. 2011. "Elements of a Systems Engineering Ontology". Proceedings of SETE 2011, Canberra, Australia.
Eisner, H. 2005. Managing Complex Systems: Thinking Outside the Box. Hoboken, NJ, USA: John Wiley & Sons.
Jackson, S., D. Hitchins, and H. Eisner. 2010. "What is the Systems Approach?" INCOSE Insight, 13(1) (April 2010): 41-43.
MITRE. 2011. "Systems Engineering Strategies for Uncertainty and Complexity." Systems Engineering Guide. Accessed 9 March 2011.
Ryan, A. 2007. "Emergence Is Coupled to Scope, Not Level." Complexity. A condensed version appeared in INCOSE Insight, 11(1) (January 2008): 23-24.