Concepts of Systems Thinking
This article forms part of the Systems Thinking Knowledge Area.
This article defines a system of systems concepts, which describe specific knowledge that can be used to better understand any system, whatever its domain or technology, as an extension of the basic ideas of Systems Thinking.
These concepts have been synthesized from the work of a number of systems science researchers. Ackoff (1971) proposes a system of "system concepts" to bring together the wide variety of concepts which had been proposed; Ackoff's concepts are written from a systems research perspective and can be somewhat abstract and hard to relate to practice. Skyttner (2001) describes the main general systems theory (GST) concepts proposed by a number of authors; Flood and Carson (1993) give a description of concepts as an overview of systems thinking; Hitchins (2007) relates the concepts to systems engineering practice.
Wholeness
The definition of system includes the fundamental concepts of a set of elements which exhibit sufficient cohesion (Hitchins 2007) or "togetherness" (Boardman and Sauser 2008) to form a "bounded" whole.
A system exists in an environment which contains related systems and conditions:
- A closed system has no relationships with the environment.
- An open system shares inputs and outputs with its environment across the boundary.
System elements may be either conceptual organizations of ideas in symbolic form or real objects, e.g., people, data, physical artifacts, etc.
- Abstract system elements are conceptual.
- Concrete systems contain at least two elements which are objects.
Unless otherwise stated, the remaining concepts below apply to open, concrete systems.
Behavior
State
Any quality or property of a system element is called an attribute. The state of a system is the set of its attributes at a given time. A "system event" describes any change to the attributes of a system (or of its environment), and hence to its state:
- Static - a single state exists with no events.
- Dynamic - multiple possible stable states exist. A stable state is one in which a system will remain until another event occurs.
- Homeostatic - the system as a whole is static, but its elements are dynamic; the system maintains its state by internal adjustments.
State can be monitored using "state variables"; i.e., values of attributes which indicate the system state. The set of possible values of state variables over time is called the "state space". State variables are generally continuous, but can be modeled using a "finite state model" (or "state machine").
- Deterministic systems have a one-to-one mapping of state variables to state space, allowing future states to be predicted from past states.
- Non-deterministic systems have a many-to-many mapping of state variables; future states cannot be reliably predicted. This may be because states change at random, or because the system, while deterministic in structure, is sufficiently complex that it can take up very different states from differences in starting state too small to measure.
The latter case is one definition of a chaotic system; e.g., the stock market and the weather are systems whose past states can be explained using deterministic reasoning, but whose future states cannot be predicted with certainty.
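To illustrate, a finite state model of a deterministic system can be written down directly; the sketch below assumes a hypothetical two-state lamp, whose states and events are illustrative only and not drawn from the cited sources.

```python
# A minimal finite state model (state machine) sketch; the two-state lamp,
# its states, and its events are hypothetical examples.

# Deterministic: each (state, event) pair maps to exactly one next state,
# so future states can be predicted from past states.
TRANSITIONS = {
    ("off", "switch_on"): "on",
    ("on", "switch_off"): "off",
}

def next_state(state: str, event: str) -> str:
    # An event with no transition entry leaves the system in its
    # current state, which is therefore stable until the next event.
    return TRANSITIONS.get((state, event), state)

state = "off"
for event in ("switch_on", "switch_on", "switch_off"):
    state = next_state(state, event)

print(state)  # -> "off"
```

A non-deterministic system could be sketched the same way by mapping each (state, event) pair to a set of possible next states, so that the future state cannot be predicted from the transition table alone.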
System Events
Ackoff (1971) considers "change" to be how a system is affected by events, and "behavior" to be the effect a system has upon its environment. Three kinds of system change are described: a system can "react" to a request by turning on a light; "respond" to darkness by deciding to turn on the light; or "act" to turn on the lights at a fixed time, randomly, or with discernible reasoning.
System behavior is a change which leads to events in itself or in other systems; thus an action, reaction, or response may each constitute behavior in some cases. Systems exhibit varying levels of behavior, as the sketch below illustrates.
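The distinction can be made concrete with a small sketch; the light-control example below and all of its names and thresholds are illustrative assumptions, not taken from Ackoff.

```python
# Hedged sketch of Ackoff's react / respond / act as increasing degrees
# of autonomy; the light-control system here is a hypothetical example.

class LightSystem:
    def __init__(self) -> None:
        self.light_on = False

    def react(self, switch_pressed: bool) -> None:
        # React: the external event fully determines the change;
        # the system has no choice in the matter.
        if switch_pressed:
            self.light_on = True

    def respond(self, ambient_light: float, threshold: float = 0.3) -> None:
        # Respond: an external event (darkness) triggers an internal
        # decision about whether to change state.
        if ambient_light < threshold:
            self.light_on = True

    def act(self, hour: int, scheduled_hour: int = 19) -> None:
        # Act: the system initiates the change itself (here, on a
        # schedule), with no triggering external event required.
        if hour == scheduled_hour:
            self.light_on = True
```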
Survival Behavior
Systems often act to continue to exist, behaving so as to sustain themselves in one or more viable states. Many natural or social systems have this goal, either consciously or as "self-organizing" systems, in which the behavior arises from the interaction between elements.
Entropy is the tendency of systems to move towards disorder or disorganization. In physics, entropy describes how "organized" heat energy is "lost" into the "random" background energy of the surrounding environment (the second law of thermodynamics).
A similar effect can be seen in engineered systems; consider what happens to a building or garden left unused for any length of time. Entropy can be used as a metaphor for aging, skill fade, obsolescence, misuse, boredom, etc.
"Negentropy" describes the forces working in a system to hold off entropy. homeostasis is the biological equivalent of this, describing behavior which maintains a "steady state" or "dynamic equilibrium". Examples of this process in nature include human cells, which maintain the same function while replacing their physical content at regular intervals. Again this can be used as a metaphor for the fight against entropy, e.g. training, discipline, maintenance, etc.
Hitchins (2007) describes the relationship between the viability of a system and the number of connections between its elements. In cybernetics, variety describes the number of different ways elements can be controlled, dependent on the different ways in which they can be combined. Hitchins' concept of "connected variety" states that the stability of a system increases with its connectivity (both internally and with its environment).
Goal Seeking Behavior
Goals and Objectives
Engineered systems generally have reasons for existence beyond simple survival.
- A "goal" is a specific outcome which a system can achieve in a specified time
- An "objective" is a longer term outcome which can be achieved through a series of goals.
- An "ideal" is an objective which cannot be achieved with any certainty, but for which progress towards the objective has value.
Systems may be single-goal seeking (they perform set tasks), multi-goal seeking (they perform related tasks), or reflective (they set goals to tackle objectives or ideals). There are two types of goal-seeking systems:
- Purposive systems have multiple goals with some shared outcome. Such a system can be used to provide predetermined outcomes within an agreed time period, and may have some freedom to choose how to achieve each goal. If it has memory, it may develop processes describing the behaviors needed for defined goals. Most machines or software systems are purposive.
- Purposeful systems are free to determine the goals needed to achieve an outcome. Such a system can be tasked to pursue objectives or ideals over a longer time through a series of goals. Humans and sufficiently complex machines are purposeful.
Function
Ackoff defines function as an outcome which contributes to goals or objectives. To have a function, a system must be able to provide the outcome in two or more different ways (this is called "equifinality").
This view of function and behavior is common in systems science. In this paradigm all system elements have behavior of some kind, but to be capable of functioning in certain ways requires a certain richness of behaviors.
In most hard systems approaches (Flood and Carson 1993), a set of functions is described from the problem statement and then associated with one or more alternative element structures. This process may be repeated until a system component (an implementable combination of function and structure) has been defined (Martin 1997). Here "function" is defined as a task or activity that must be performed to achieve a desired outcome, or as a "transformation" of inputs to outputs. As the sketch following this list illustrates, the transformation may be:
- Synchronous, a regular interaction with a closely related system.
- Asynchronous, an irregular response to a demand from another system, often triggering a set response.
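As noted above, the transformation view can be sketched directly; the averaging function, the queue-based demand, and all names below are hypothetical examples rather than anything prescribed by the cited sources.

```python
# Hedged sketch of "function" as a transformation of inputs to outputs,
# invoked synchronously and asynchronously; all names are illustrative.

import asyncio

def transform(inputs: list[float]) -> float:
    # The function itself: a transformation of inputs to an output.
    return sum(inputs) / len(inputs)

# Synchronous: a regular, direct interaction with a closely related system.
print(transform([1.0, 2.0, 3.0]))  # -> 2.0

# Asynchronous: an irregular demand from another system triggers a set response.
async def serve(queue: asyncio.Queue) -> None:
    while True:
        inputs = await queue.get()  # wait for a demand to arrive
        print(transform(inputs))    # the set response
        queue.task_done()

async def demo() -> None:
    queue: asyncio.Queue = asyncio.Queue()
    server = asyncio.create_task(serve(queue))
    await queue.put([4.0, 6.0])     # an irregular demand
    await queue.join()              # wait for the response to complete
    server.cancel()

asyncio.run(demo())
```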
The behavior of the resulting system is then assessed. In this case behavior is seen as an external property of the system as a whole, and often described as analogous to human or organic behavior (Hitchins 2009).
Control
Cybernetics, the science of control, defines two basic control mechanisms:
- "Negative feedback", maintaining the system state against set objectives or levels.
- "Positive feedback", forcing growth or contraction to new levels.
One of the main concerns of cybernetics is the balance between stability and speed of response. Cybernetics considers systems in three ways:
- A "black-box" view looks at the whole system. Control can only be achieved by carefully balancing inputs with outputs, which reduces speed of response.
- A "white-box" view considers the system elements and their relationships; control mechanisms can be embedded into this structure, giving more responsive control but with associated risks to stability.
- A "grey-box" view sits between these two, with control exerted at the major sub-system level.
Another useful control concept is that of a "meta-system", which sits over the system and is responsible for controlling its functions, either as a black-box or a white-box; in this case, behavior arises from the combination of system and meta-system.
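The two basic mechanisms can be sketched numerically; the gains, setpoint, and iteration count below are illustrative assumptions, not values from the cited sources.

```python
# Hedged sketch of negative vs. positive feedback; the gain of 0.5 and
# setpoint of 20.0 are illustrative assumptions.

def negative_feedback(state: float, setpoint: float, gain: float = 0.5) -> float:
    # Correct any deviation back towards the set objective (maintains state).
    return state + gain * (setpoint - state)

def positive_feedback(state: float, gain: float = 0.5) -> float:
    # Reinforce deviations, forcing growth (or contraction) to new levels.
    return state + gain * state

maintained, forced = 10.0, 10.0
for _ in range(5):
    maintained = negative_feedback(maintained, setpoint=20.0)
    forced = positive_feedback(forced)

print(round(maintained, 2))  # -> 19.69, converging on the setpoint
print(round(forced, 2))      # -> 75.94, growing to new levels
```

The gain illustrates the stability/speed trade-off noted above: a larger gain responds faster, but in this sketch a negative-feedback gain above 2.0 overshoots the setpoint further on every step and the system becomes unstable.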
System behavior is influenced by variety, in particular in its control functions (Hitchins 2009). The law of requisite variety (Ashby 1956) states that a control system must have at least as much variety as the system it is controlling (see the sketch after this list). The effect of variety on system behavior can often be seen in the relationship between:
- Specialization, the focus of system behavior to exploit particular features of its environment.
- Flexibility, the ability of a system to adapt quickly to environmental change.
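A hedged illustration of requisite variety follows; the disturbance and response sets are hypothetical examples.

```python
# Hedged illustration of the law of requisite variety (Ashby 1956): a
# controller needs at least one distinct response per distinct disturbance
# it must regulate. The sets below are hypothetical examples.

disturbances = {"too_hot", "too_cold", "too_humid"}  # variety to be controlled
responses = {"heat", "cool"}                         # variety of the controller

if len(responses) < len(disturbances):
    # With fewer responses than disturbances, at least one disturbance
    # cannot be driven to a distinct desired outcome.
    shortfall = len(disturbances) - len(responses)
    print(f"Controller lacks requisite variety: {shortfall} disturbance(s) uncontrolled")
```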
Effectiveness, Adaptation and Learning
System effectiveness is a measure of the system's ability to perform the functions necessary to achieve its goals or objectives. Ackoff (1971) defines this as the product of the number of combinations of behavior available to reach a function and the efficiency of each combination.
Hitchins (2007) describes effectiveness as a combination of performance (how well a function is done in ideal conditions), availability (how often the function is there when needed), and survivability (how likely it is that the system will be able to use the function fully).
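As a hedged worked example, the sketch below combines Hitchins' three factors multiplicatively; both the multiplicative rule and the numbers are assumptions made for illustration, not a formula given in the source.

```python
# Hedged worked example of effectiveness per Hitchins (2007), assuming a
# simple multiplicative combination; the rule and the numbers are illustrative.

performance = 0.90    # how well the function is done in ideal conditions
availability = 0.95   # how often the function is there when needed
survivability = 0.80  # how likely the system can use the function fully

effectiveness = performance * availability * survivability
print(f"effectiveness = {effectiveness:.3f}")  # -> 0.684
```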
An adaptive system is one that is able to change itself or its environment if its effectiveness is insufficient to achieve its current or future goals or objectives. Ackoff defines four types of adaptation, covering changes to the environment or to the system, in response to internal or external factors.
A system may also "learn", improving its effectiveness over time, without any change in state or goal.
Hierarchy, Emergence and Complexity
System behavior is related to combinations of element behaviors. Most systems exhibit "increasing variety"; i.e., the combination of element behaviors produces behavior which the elements alone cannot. The term "synergy", or weak emergence, describes the idea that "the whole is greater than the sum of the parts". While this is generally true, it is also possible to get reducing variety, in which the function of the whole is less than the sum of the functions of the parts.
Open systems tend to form hierarchies of coherent system elements, or sub-systems. A natural system hierarchy is a consequence of wholeness, with strongly cohesive elements grouping together to form structures which reduce complexity and increase robustness (Simon 1962). Socio-technical systems form "control hierarchies", with systems at a higher level having some ownership of control over those at lower levels. Hitchins (2009) describes how systems form "preferred patterns" which can be used to enhance the stability of interacting system hierarchies.
Systems can be viewed using "systemic resolution". A system is characterized by its behavior in a wider system or environment, and considered in detail as a set of sub-system structures and functions. This description is focused at a particular level of resolution, which is changed by shifting focus to the wider system or to one of the sub-systems. While this allows a focus on a given system-of-interest, the holistic view of the wider system and environment must not be lost.
Looking across a hierarchy of systems generally reveals increasing complexity at the higher level, relating to both the structure of the system and how it is used. The terms emergence and emergent properties are generally used to describe behaviors emerging across a complex system hierarchy. These last two ideas are fundamental to engineered systems and the Systems Approach.
Practical Considerations
How do people know how to apply system concepts to an Engineered System?
All SE texts, for example (INCOSE 2011), describe processes and activities based upon the application of systems thinking to an engineered system context. Often the link between SE and systems thinking is embedded in the details and is not clear to those applying the processes. The Systems Approach and its links to the rest of the SEBoK provide a guide to this linkage.
Hitchins (2007) proposes a set of necessary and sufficient questions to help ensure all systemic issues have been considered when assessing an existing or proposed system description. These questions attempt to relate system concepts to the high-level concerns more relevant to systems engineers.
Hitchins' "Generic Reference Model" asks questions under six headings based on these concepts, related to the system's function (what it does) and form (what it is). Each question is expanded into a number of more detailed questions related to system concepts:
- Function: Mission Management. How does the system deal with setting of objectives and plans; control and behavior; relationships with other systems?
- Function: Viability Management. How does the system deal with state, survival, maintenance and repair?
- Function: Resource Management. How does the system deal with exchange of information, energy, people, material, finance, etc. with its environment?
- Form: Structure. Are system boundaries, sub-elements, connections and relationships understood?
- Form: Influence. Are the system's wider relationships and influences with its environment understood?
- Form: Potential. Has the structure needed to achieve all objectives been considered, including how often and how well those objectives must be achieved, and how faults or failures are addressed?
References
Works Cited
Ackoff, R.L. 1971. "Towards a System of Systems Concepts." Management Science. 17(11).
Ashby, W.R. 1956. "Chapter 11." Introduction to Cybernetics. London, UK: Wiley.
Bertalanffy, L. von. 1968. General System Theory: Foundations, Development, Applications, rev. ed. New York, NY, USA: Braziller.
Flood, R.L. and E.R. Carson. 1993. Dealing with Complexity: An Introduction to the Theory and Application of Systems Science. New York, NY, USA: Plenum Press.
Hitchins, D. 2007. Systems Engineering: A 21st Century Systems Methodology. Hoboken, NJ, USA: John Wiley & Sons.
Hitchins, D. 2009. "What are the General Principles Applicable to Systems?" INCOSE Insight. 12(4): 59-63.
Martin, J.N. 1997. Systems Engineering Guidebook. Boca Raton, FL, USA: CRC Press.
Simon, H.A. 1962. "The Architecture of Complexity." Proceedings of the American Philosophical Society. 106(6) (Dec. 12, 1962): 467-482.
Skyttner, L. 2001. General Systems Theory: Ideas and Applications. Singapore: World Scientific Publishing Co. pp. 53-69.
Primary References
Ackoff, R.L. 1971. "Towards a System of Systems Concepts." Management Science. 17(11).
Hitchins, D. 2009. "What are the General Principles Applicable to Systems?" INCOSE Insight. 12(4): 59-63.
Additional References
Edson, Robert. 2008. Systems Thinking. Applied. A Primer. Arlington, VA, USA: Applied Systems Thinking Institute (ASysT), Analytic Services Inc.
Hitchins, D. 2007. Systems Engineering: A 21st Century Systems Methodology. Hoboken, NJ, USA: John Wiley & Sons.
Jackson, S., D. Hitchins, and H. Eisner. 2010. "What is the Systems Approach?" INCOSE Insight. 13(1) (April 2010): 41-43.
Lawson, H. 2010. A Journey Through the Systems Landscape. London, UK: College Publications, Kings College.
Waring, A. 1996. "Chapter 1." Practical Systems Thinking. London, UK: International Thomson Business Press.