Concepts of Systems Thinking
Lead Author: Rick Adcock, Contributing Authors: Scott Jackson, Janet Singer, Duane Hybertson
This article forms part of the Systems Thinking knowledge area (KA). It describes systems concepts, knowledge that can be used to understand problems and solutions to support systems thinking.
The concepts below have been synthesized from a number of sources, which are themselves summaries of concepts from other authors. Ackoff (1971) proposed a system of system concepts as part of general system theory (GST); Skyttner (2001) describes the main GST concepts from a number of systems science authors; Flood and Carson (1993) give a description of concepts as an overview of systems thinking; Hitchins (2007) relates the concepts to systems engineering practice; and Lawson (2010) describes a system of system concepts where systems are categorized according to fundamental concepts, types, topologies, focus, complexity, and roles.
Wholeness and Interaction
A system is defined by a set of elements which exhibit sufficient cohesion, or "togetherness," to form a bounded whole (Hitchins 2007; Boardman and Sauser 2008).
According to Hitchins, interaction between elements is the "key" system concept (Hitchins 2009, 60). The focus on interactions and holism is a push-back against the perceived reductionist focus on parts and provides recognition that in complex systems, the interactions among parts is at least as important as the parts themselves.
An open system is defined by the interactions between system elements within a system boundary and by the interaction between system elements and other systems within an environment (see What is a System?). The remaining concepts below apply to open systems.
Regularity
Regularity is a uniformity or similarity that exists in multiple entities or at multiple times (Bertalanffy 1968). Regularities make science possible and engineering efficient and effective. Without regularities, we would be forced to consider every natural and artificial system problem and solution as unique. We would have no scientific laws, no categories or taxonomies, and each engineering effort would start from a clean slate.
Similarities and differences exist in any set or population. Every system problem or solution can be regarded as unique, but no problem/solution is in fact entirely unique. The nomothetic approach assumes regularities among entities and investigates what the regularities are. The idiographic approach assumes each entity is unique and investigates the unique qualities of entities (Bertalanffy 1975).
A very large amount of regularity exists in both natural systems and engineered systems. Patterns of systems thinking capture and exploit that regularity.
State and Behavior
Any quality or property of a system element is called an attribute. The state of a system is a set of system attributes at a given time. A system event describes any change to the environment of a system, and hence its state:
- Static - A single state exists with no events.
- Dynamic - Multiple possible stable states exist.
- Homeostatic - System is static but its elements are dynamic. The system maintains its state by internal adjustments.
A stable state is one in which a system will remain until another event occurs.
State can be monitored using state variables, values of attributes which indicate the system state. The set of possible values of state variables over time is called the "state space". State variables are generally continuous but can be modeled using a finite state model (or "state machine").
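As an illustration (not drawn from the cited sources), a finite state model can be sketched in a few lines of Python. The lamp states, events, and transition table below are hypothetical examples:

```python
# Minimal finite state model (state machine) sketch.
# States, events, and transitions are hypothetical examples.
TRANSITIONS = {
    ("off", "switch_pressed"): "on",
    ("on", "switch_pressed"): "off",
    ("on", "bulb_failure"): "failed",
}

def next_state(state: str, event: str) -> str:
    """Return the state after an event; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)

state = "off"                                   # initial state
for event in ("switch_pressed", "bulb_failure"):
    state = next_state(state, event)
print(state)  # -> "failed"
```

Here the keys of the transition table enumerate the portion of the state space reachable through events.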
Ackoff (1971) considers "change" to be how a system is affected by events, and system behavior as the effect a system has upon its environment. A system can:
- react to a request by turning on a light,
- respond to darkness by deciding to turn on the light, or
- act to turn on the lights at a fixed time, randomly, or with discernible reasoning (see the sketch below).
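These three kinds of behavior map naturally onto code. The sketch below is a hypothetical rendering of Ackoff's distinction using the light example from the list above; the class and method contents are assumptions, not an established model:

```python
import random

class LightController:
    """Hypothetical sketch of Ackoff's react/respond/act distinction."""

    def __init__(self) -> None:
        self.light_on = False

    def react(self, request: str) -> None:
        # React: an external request directly determines the behavior.
        if request == "turn_on":
            self.light_on = True

    def respond(self, ambient_light: float) -> None:
        # Respond: an event (darkness) prompts an internal decision.
        if ambient_light < 0.2:
            self.light_on = True

    def act(self, hour: int) -> None:
        # Act: self-initiated behavior -- at a fixed time, at random,
        # or with discernible reasoning.
        if hour == 18 or random.random() < 0.01:
            self.light_on = True
```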
A stable system is one which has one or more stable states within an environment for a range of possible events:
- Deterministic systems have a one-to-one mapping of state variables to state space, allowing future states to be predicted from past states.
- Non-Deterministic systems have a many-to-many mapping of state variables; future states cannot be reliably predicted (the sketch below contrasts the two).
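The difference can be sketched as two transition mappings; the states and events here are hypothetical:

```python
# Deterministic: each (state, event) pair maps to exactly one next state,
# so future states can be predicted from past states.
deterministic = {("idle", "start"): "running",
                 ("running", "stop"): "idle"}

# Non-deterministic: a (state, event) pair may lead to several possible
# next states, so the future cannot be reliably predicted.
non_deterministic = {("running", "fault"): {"degraded", "halted", "restarting"}}

print(deterministic[("idle", "start")])         # always "running"
print(non_deterministic[("running", "fault")])  # one of several outcomes
```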
The relationship between determinism and system complexity, including the idea of chaotic systems, is further discussed in the Complexity article.
Survival Behavior
Systems often behave in a manner that allows them to sustain themselves in one or more alternative viable states. Many natural or social systems have this goal, either consciously or as a "self-organizing" system, arising from the interaction between elements.
Entropy is the tendency of systems to move towards disorder or disorganization. In physics, entropy is used to describe how organized heat energy is “lost” into the random background energy of the surrounding environment (the 2nd Law of Thermodynamics). A similar effect can be seen in engineered systems. What happens to a building or garden left unused for any time? Entropy can be used as a metaphor for aging, skill fade, obsolescence, misuse, boredom, etc.
"Negentropy" describes the forces working in a system to hold off entropy. Homeostasis is the biological equivalent of this, describing behavior which maintains a "steady state" or "dynamic equilibrium." Examples in nature include human cells, which maintain the same function while replacing their physical content at regular intervals. Again, this can be used as a metaphor for the fight against entropy, e.g. training, discipline, maintenance, etc.
Hitchins (2007) describes the relationship between the viability of a system and the number of connections between its elements. Hitchins's concept of connected variety states that stability of a system increases with its connectivity (both internally and with its environment). (See variety.)
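As a point of reference for "number of connections": the maximum number of distinct pairwise connections among n elements is n(n-1)/2, a standard combinatorial result rather than a formula from Hitchins. A short illustration:

```python
def max_pairwise_connections(n: int) -> int:
    # Each of n elements can link to n - 1 others; halve the count to
    # avoid counting each connection twice.
    return n * (n - 1) // 2

for n in (3, 10, 100):
    print(n, "->", max_pairwise_connections(n))  # 3, 45, 4950
```

The quadratic growth of potential connections is one reason connectivity, and hence variety, rises so quickly with system size.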
Goal Seeking Behavior
Some systems have reasons for existence beyond simple survival. Goal seeking is one of the defining characteristics of engineered systems:
- A goal is a specific outcome which a system can achieve in a specified time.
- An objective is a longer-term outcome which can be achieved through a series of goals.
- An ideal is an objective which cannot be achieved with any certainty, but for which progress towards the objective has value.
Systems may be single-goal seeking (perform set tasks), multi-goal seeking (perform related tasks), or reflective (set goals to tackle objectives or ideas). There are two types of goal seeking systems:
- Purposive systems have multiple goals with some shared outcome. Such a system can be used to provide pre-determined outcomes within an agreed time period. This system may have some freedom to choose how to achieve the goal. If it has memory, it may develop processes describing the behaviors needed for defined goals. Most machines or software systems are purposive.
- Purposeful systems are free to determine the goals needed to achieve an outcome. Such a system can be tasked to pursue objectives or ideals over a longer time through a series of goals. Humans and sufficiently complex machines are purposeful.
Control Behavior
Cybernetics, the science of control, defines two basic control mechanisms:
- Negative feedback, maintaining system state against set objectives or levels.
- Positive feedback, forced growth or contraction to new levels.
One of the main concerns of cybernetics is the balance between stability and speed of response. A black-box system view looks at the whole system; control can only be achieved by carefully balancing inputs with outputs, which reduces speed of response. A white-box system view considers the system elements and their relationships; control mechanisms can be embedded into this structure to provide more responsive control, with associated risks to stability.
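A negative feedback loop is simple to sketch. The toy thermostat below is a hypothetical illustration; the gain parameter shows the stability versus speed-of-response trade discussed above:

```python
# Toy negative-feedback loop: a hypothetical heater nudges temperature
# toward a set point by counteracting the deviation at each step.
set_point = 20.0
temperature = 15.0
gain = 0.5  # higher gain responds faster but erodes the stability margin

for step in range(8):
    error = set_point - temperature   # deviation from the objective
    temperature += gain * error       # corrective (negative) feedback
    print(f"step {step}: {temperature:.2f} C")
# The temperature converges on 20.0. A positive-feedback loop would
# instead amplify the deviation, forcing growth to a new level.
```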
Another useful control concept is that of a "meta-system", which sits over the system and is responsible for controlling its functions, either as a black-box or white-box. In this case, behavior arises from the combination of system and meta-system.
Control behavior is a trade between:
- Specialization, the focus of system behavior to exploit particular features of its environment, and
- Flexibility, the ability of a system to adapt quickly to environmental change.
While some system elements may be optimized for specialization (e.g., a temperature-sensitive switch) or for flexibility (e.g., an autonomous human controller), complex systems must strike a balance between the two for best results. This is an example of the concept of dualism, discussed in more detail in Principles of Systems Thinking.
Variety describes the number of different ways elements can be controlled and is dependent on the different ways in which they can then be combined. The Law of Requisite Variety states that a control system must have at least as much variety as the system it is controlling (Ashby 1956).
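A toy rendering of the Law of Requisite Variety, under the illustrative assumption that variety can be counted as a number of distinguishable states:

```python
# Hypothetical example: a controller can hold the outcome steady only if
# it has at least one distinct response per disturbance it must counter.
disturbances = {"cold", "mild", "hot"}         # variety presented by the system
responses = {"heat_high", "heat_low", "cool"}  # variety available to the controller

# Ashby's law in toy form: control requires
# len(responses) >= len(disturbances).
assert len(responses) >= len(disturbances), "insufficient variety to control"
```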
Function
Ackoff (1971) defines functions as outcomes which contribute to goals or objectives. To have a function, a system must be able to provide the outcome in two or more different ways. (This is called equifinality.)
This view of function and behavior is common in systems science. In this paradigm, all system elements have behavior of some kind; however, to be capable of functioning in certain ways requires a certain richness of behaviors.
In most hard systems approaches, a set of functions is described from the problem statement and then associated with one or more alternative element structures (Flood and Carson 1993). This process may be repeated until a system component (an implementable combination of function and structure) has been defined (Martin 1997). Here, function is defined either as a task or activity that must be performed to achieve a desired outcome or as a transformation of inputs to outputs. This transformation may be:
- Synchronous, a regular interaction with a closely related system, or
- Asynchronous, an irregular response to a demand from another system that often triggers a set response (see the sketch below).
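The two kinds of transformation can be sketched as a direct call versus an event-driven handler. All names and numbers below are hypothetical:

```python
# Synchronous: a regular, direct transformation of inputs to outputs.
def filter_water(litres: float) -> float:
    return litres * 0.95  # hypothetical 5% processing loss

# Asynchronous: the transformation runs only when another system raises
# a demand, which triggers a pre-registered (set) response.
handlers = {}

def on_demand(event: str, handler) -> None:
    handlers[event] = handler

def raise_demand(event: str, payload):
    return handlers[event](payload)

on_demand("low_pressure", lambda psi: f"boost pump to {psi + 10} psi")
print(filter_water(100.0))               # 95.0, synchronous
print(raise_demand("low_pressure", 35))  # "boost pump to 45 psi", asynchronous
```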
The behavior of the resulting system is then assessed as a combination of function and effectiveness. In this case, behavior is seen as an external property of the system as a whole and is often described as analogous to human or organic behavior (Hitchins 2009).
Hierarchy, Emergence and Complexity
System behavior is related to combinations of element behaviors. Most systems exhibit increasing variety; i.e., they have behavior resulting from the combination of element behaviors. The term "synergy," or weak emergence, is used to describe the idea that the whole is greater than the sum of the parts. This is generally true; however, it is also possible to get reducing variety, in which the whole function is less than the sum of the parts (Hitchins 2007).
Complexity frequently takes the form of hierarchies. Hierarchic systems have some common properties independent of their specific content, and they will evolve far more quickly than non-hierarchic systems of comparable size (Simon 1996). A natural system hierarchy is a consequence of wholeness, with strongly cohesive elements grouping together forming structures which reduce complexity and increase robustness (Simon 1962).
Encapsulation is the enclosing of one thing within another. It may also be described as the degree to which it is enclosed. System encapsulation encloses system elements and their interactions from the external environment, and usually involves a system boundary that hides the internal from the external; for example, the internal organs of the human body can be optimized to work effectively within tightly defined conditions because they are protected from extremes of environmental change.
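Encapsulation is also a familiar software concept, which makes the body analogy easy to sketch in code; the class below is a hypothetical illustration, not a model from the cited sources:

```python
class Cell:
    """Hypothetical sketch of encapsulation: internal state sits behind
    a boundary and is reachable only through a controlled interface."""

    def __init__(self) -> None:
        self._internal_ph = 7.2  # hidden; not meant to be set externally

    def exchange(self, nutrients: float) -> float:
        # All interaction crosses the boundary through this interface,
        # so internal conditions stay within tight limits.
        self._internal_ph = min(7.4, max(7.0, self._internal_ph))
        return nutrients * 0.8  # hypothetical output to the environment

cell = Cell()
print(cell.exchange(1.0))  # 0.8 -- interaction only via the interface
```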
Socio-technical systems form what are known as control hierarchies, with systems at a higher level having some ownership of control over those at lower levels. Hitchins (2009) describes how systems form "preferred patterns" which can be used to enhance the stability of interacting systems hierarchies.
Looking across a hierarchy of systems generally reveals increasing complexity at the higher level, relating to both the structure of the system and how it is used. The term emergence describes behaviors emerging across a complex system hierarchy.
Effectiveness, Adaptation and Learning
Systems effectiveness is a measure of the system's ability to perform the functions necessary to achieve goals or objectives. Ackoff (1971) defines this as the product of the number of combinations of behavior to reach a function and the efficiency of each combination.
Hitchins (2007) describes effectiveness as a combination of performance (how well a function is done in ideal conditions), availability (how often the function is there when needed), and survivability (how likely is it that the system will be able to use the function fully).
System elements and their environments change in a positive, neutral or negative way in individual situations. An adaptive system is one that is able to change itself or its environment if its effectiveness is insufficient to achieve its current or future objectives. Ackoff (1971) defines four types of adaptation, changing the environment or the system in response to internal or external factors.
A system may also learn, improving its effectiveness over time without any change in state or goal.
References
Works Cited
Ackoff, R.L. 1971. "Towards a system of systems concepts," Management Science, vol. 17, no. 11.
Ackoff, R. 1979. "The future of operational research is past," Journal of the Operational Research Society, vol. 30, no. 2, pp. 93–104, Pergamon Press.
Ashby, W.R. 1956. "Chapter 11," in An Introduction to Cybernetics. London, UK: Wiley.
Bertalanffy, L. von. 1968. General System Theory: Foundations, Development, Applications, Revised ed. New York, NY, USA: Braziller.
Bertalanffy, L. von. 1975. Perspectives on General System Theory. E. Taschdjian, ed. New York, NY, USA: George Braziller.
Boardman, J. and B. Sauser. 2008. Systems Thinking: Coping with 21st Century Problems. Boca Raton, FL, USA: Taylor & Francis.
Flood, R.L. and E.R. Carson. 1993. Dealing with Complexity: An Introduction to the Theory and Application of Systems Science. New York, NY, USA: Plenum Press.
Hitchins, D. 2007. Systems Engineering: A 21st Century Systems Methodology. Hoboken, NJ, USA: John Wiley and Sons.
Hitchins, D. 2009. "What are the general principles applicable to systems?" INCOSE Insight, vol. 12, no. 4, pp. 59-63.
Lawson, H. 2010. A Journey Through the Systems Landscape. London, UK: College Publications, Kings College.
Martin, J.N. 1997. Systems Engineering Guidebook. Boca Raton, FL, USA: CRC Press.
Skyttner, L. 2001. General Systems Theory: Ideas and Applications. Singapore: World Scientific Publishing Co., pp. 53-69.
Simon, H.A. 1962. "The architecture of complexity," Proceedings of the American Philosophical Society, vol. 106, no. 6, December 12, pp. 467-482.
Simon, H. 1996. The Sciences of the Artificial, 3rd ed. Cambridge, MA, USA: MIT Press.
Primary References
Ackoff, R.L. 1971. "Towards a system of systems concepts," Management Science, vol. 17, no. 11.
Hitchins, D. 2009. "What are the general principles applicable to systems?" INCOSE Insight, vol. 12, no. 4, pp. 59-63.
Additional References
Edson, R. 2008. Systems Thinking. Applied. A Primer. Arlington, VA, USA: Applied Systems Thinking Institute (ASysT), Analytic Services Inc.
Jackson, S., D. Hitchins, and H. Eisner. 2010. "What is the systems approach?" INCOSE Insight, vol. 13, no. 1, April, pp. 41-43.
Waring, A. 1996. "Chapter 1," in Practical Systems Thinking. London, UK: International Thomson Business Press.