Concepts of Systems Thinking


Introduction

In this Topic we have organized the key System Concepts around the 8 principles defined in the System Concepts Knowledge Area Introduction. A number of key sources have been used.

Ackoff (Ackoff, 1971) proposes a System of Systems Concepts to bring together the wide variety of concepts which have been proposed. Ackoff’s concepts are written from a systems research perspective and can be a little abstract and hard to relate to practice. (Skyttner, 2001) describes the main general systems theory (GST) concepts proposed by a number of authors; (Flood and Carson, 1993) give a description of concepts as an overview of systems thinking; (Hitchins, 2007) relates the concepts to systems engineering practice.

Wholeness

The definition of system includes the fundamental concepts of a set of Elements which exhibit sufficient Cohesion (Hitchins, 2007) or Togetherness (Boardman and Sauser 2008) to form a Bounded whole.

A system exists in an environment which contains related systems and conditions:

  • Closed system: has no relationships with the environment.
  • Open system: shares Inputs and Outputs with its environment across the boundary.

System elements may be conceptual organizations of ideas in symbolic form, or real objects, e.g. people, data, physical artifacts, etc.

  • Abstract system: all elements are conceptual.
  • Concrete system: contains at least two elements which are objects.

Unless otherwise stated, the remaining concepts below apply to open, concrete systems.

Behavior

State

Any quality or property of a system element is called an Attribute. The State of a system is the set of its attributes at a given time. A System Event describes any change to the attributes of a system (or its environment), and hence to its state:

  • Static, a single state exists with no events.
  • Dynamic, multiple possible Stable states exist. A stable state is one in which a system will remain until another event occurs.
  • Homeostatic, the system is static but its elements are dynamic. The system maintains its state by internal adjustments.

State can be monitored using State Variables, attributes which indicate the current system state. The set of possible combinations of state variable values is called the State Space. State is generally continuous, but can be modeled using a Finite State Model (or State Machine).

  • Deterministic systems have a one-to-one mapping of state variables to state space, allowing future states to be predicted from past states.
  • Non-deterministic systems have a many-to-many mapping of state variables to state space; future states cannot be predicted. This may be due to random changes in state, or because the system's structure is sufficiently complex that, although deterministic, it may take up different states due to very small (below our ability to measure) differences in its starting state.

The latter is one definition of a chaotic system; the stock market and the weather are examples whose past states can be explained using deterministic reasoning, but whose future states cannot be predicted with any certainty.
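As an illustration, a deterministic finite state model can be sketched directly in code. The states, events, and transition table below are hypothetical, chosen only to show how the next state follows completely from the current state and an event.

    # A minimal sketch of a deterministic finite state model (state machine).
    # States, events, and transitions are hypothetical, for illustration only.
    TRANSITIONS = {
        ("off", "switch_on"): "on",
        ("on", "switch_off"): "off",
        ("on", "overload"): "failed",
    }

    def next_state(state, event):
        # Deterministic: each (state, event) pair maps to exactly one next state.
        # Events with no entry in the table leave the state unchanged.
        return TRANSITIONS.get((state, event), state)

    state = "off"
    for event in ["switch_on", "overload", "switch_off"]:
        state = next_state(state, event)
        print(event, "->", state)   # on, failed, failed

A non-deterministic system would map the same (state, event) pair to more than one possible next state, so a table like the one above could no longer predict its future behavior.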

System Events

(Ackoff, 1971) considers Change to be how a system is affected by events, and Behavior as the effect a system has upon its environment.

Three kinds of system change are described: we React to a request by turning on a light; we Respond to darkness by deciding to turn on the light; or we Act to turn on the lights at a fixed time, randomly, or with discernible reasoning.

System behavior is a change which leads to events in itself or other systems. Thus, action, reaction or response may constitute behavior in some cases. Systems have varying levels of behavior.

Survival Behavior

All systems seek to continue to exist, behaving to sustain themselves in one or more alternative viable states. Many natural or social systems have this goal, either consciously or as a self-organizing property arising from the interaction between their elements.

Entropy (glossary) is the tendency of systems to move towards disorder or disorganization. In physics, entropy is used to describe how “organized” heat energy is “lost” into the “random” background energy of the surrounding environment (the second law of thermodynamics).

A similar effect can be seen in engineered systems. What happens to a building or garden which is left unused for any length of time? Entropy can be used as a metaphor for aging, skill fade, obsolescence, misuse, boredom, etc.

Negentropy describes the forces working in a system to hold off entropy. Homeostasis (glossary) is the biological equivalent of this, describing behavior which maintains a Steady State or Dynamic Equilibrium. Examples of this process in nature include human cells, which maintain the same function while replacing their physical content at regular intervals.

Again this can be used as a metaphor for the fight against entropy, e.g. training, discipline, maintenance, etc.

Goal Seeking Behavior

Goals and Objectives

Some systems have other reasons for existence, beyond simple survival.

  • A Goal is a specific outcome which a system can achieve in a specified time.
  • An Objective is a longer term outcome which can be achieved through a series of goals.
  • An Ideal is an objective which cannot be achieved with any certainty, but for which progress towards the objective has value.

Systems may be single goal seeking (perform set tasks), multi-goal seeking (perform related tasks), or reflective (set goals to tackle objectives or ideals). We define two types of goal-seeking system (a code sketch follows the list below):

  • Purposive (glossary) systems have multiple goals with some shared outcome. Such a system can be asked/used to provide pre-determined outcomes within an agreed time period. It may have some freedom to choose how to achieve each goal. If it has memory, it may develop Processes describing the behaviors needed for defined goals. Most machines or software systems are purposive.
  • Purposeful (glossary) systems are free to determine the goals needed to achieve an outcome. Such a system can be tasked to pursue Objectives or Ideals over a longer time through a series of goals. Humans, and sufficiently complex machines, are purposeful.
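The contrast between the two can be sketched in code; the classes, goals, and processes below are hypothetical and only illustrate the distinction: a purposive system is handed its goals and chooses only how to achieve them, while a purposeful system also selects which goals to pursue in service of an objective.

    # Hypothetical sketch contrasting purposive and purposeful goal seeking.
    class PurposiveSystem:
        # Goals are set from outside; the system only holds processes for achieving them.
        def __init__(self, processes):
            self.processes = processes              # goal -> stored process (behavior)

        def pursue(self, goal):
            return self.processes[goal]()           # pre-determined outcome for a given goal

    class PurposefulSystem(PurposiveSystem):
        # Additionally free to decide which goals to pursue towards a longer-term objective.
        def pursue_objective(self, serves_objective, candidate_goals):
            chosen = [g for g in candidate_goals if serves_objective(g)]
            return [self.pursue(g) for g in chosen]

    machine = PurposiveSystem({"heat": lambda: "heater on", "cool": lambda: "fan on"})
    print(machine.pursue("heat"))                                            # 'heater on'

    planner = PurposefulSystem({"heat": lambda: "heater on", "cool": lambda: "fan on"})
    print(planner.pursue_objective(lambda g: g == "cool", ["heat", "cool"]))  # ['fan on']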

Function

Ackoff defines Function (glossary) as outcomes which contribute to goals or objectives. To have a function, a system must be able to provide the outcome in two or more different ways (this is called Equifinality).

This view of function and behavior is common in system science. In this paradigm all system elements have behavior of some kind, but to be capable of functioning in certain ways requires a certain richness of behaviors.

In most hard systems approaches (Flood and Carson, 1993), a set of functions is described from the problem statement and then associated with one or more alternative element structures. This process may be repeated until system Components (implementable combinations of function and structure) have been defined (Martin, 1997). Here Function is defined as a task or activity that must be performed to achieve a desired outcome, or as a Transformation of Inputs to Outputs, which may be (see the sketch after this list):

  • Synchronous, a regular interaction with a closely related system.
  • Asynchronous, an irregular response to a demand from another system, often triggering a set response.
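A minimal sketch, assuming a software reading of these terms (all names below are invented): the function is modelled as a transformation of inputs to outputs, invoked either synchronously on a regular cycle or asynchronously in response to a demand from another system.

    import asyncio

    def transform(inputs):
        # Hypothetical transformation of inputs to outputs.
        return sum(inputs)

    def synchronous_cycle(batches):
        # Synchronous: a regular, clocked interaction with a closely related system.
        return [transform(batch) for batch in batches]

    async def on_demand(request_queue):
        # Asynchronous: an irregular response to a demand from another system.
        inputs = await request_queue.get()
        return transform(inputs)

    async def demo():
        queue = asyncio.Queue()
        await queue.put([1, 2, 3])
        print(synchronous_cycle([[1, 2], [3, 4]]))   # [3, 7]
        print(await on_demand(queue))                # 6

    asyncio.run(demo())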

The behavior of the resulting system is then assessed. In this case, behavior is seen as an external property of the system as a whole, and is often described as analogous to human or organic behavior (Hitchins, 2009).

Control

Cybernetics, the science of control, defines two basic control mechanisms:

  • Negative feedback, maintaining system state against set objectives or levels.
  • Positive feedback, forced growth or contraction to new levels.

One of the main concerns of cybernetics is the balance between stability and speed of response. Cybernetics considers systems in three ways. A Black-Box view looks at the whole system; control can only be achieved by carefully balancing inputs with outputs, which reduces speed of response. A White-Box view considers the system elements and their relationships; here control mechanisms can be embedded into this structure, giving more responsive control but with associated risks to stability. A Grey-Box view sits between these two, with control exerted at the major sub-system level. Another useful control concept is that of a Meta-System, which sits over the system and is responsible for controlling its functions, either as a black box or a white box. In this case behavior arises from the combination of system and meta-system.
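As a rough sketch of the two mechanisms (the setpoint, gain, and starting value below are invented): negative feedback pulls the system state back towards a set level, while positive feedback amplifies any departure from it.

    # Hypothetical illustration of the two basic control mechanisms.
    def negative_feedback(state, setpoint, gain=0.5):
        # Correct a fraction of the error each step, pulling the state back to the setpoint.
        return state + gain * (setpoint - state)

    def positive_feedback(state, setpoint, gain=0.5):
        # Amplify the departure from the setpoint, forcing growth (or collapse) to new levels.
        return state + gain * (state - setpoint)

    stable = runaway = 12.0
    for _ in range(5):
        stable = negative_feedback(stable, setpoint=10.0)
        runaway = positive_feedback(runaway, setpoint=10.0)

    print(round(stable, 3), round(runaway, 3))   # approx. 10.062 and 25.188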

System behavior is influenced by variety, in particular in its control functions (Hitchins, 2009). The law of requisite variety (Ashby, 1956) states that a control system must have variety greater than or equal to that of the system it is controlling. The effect of variety on system behavior can often be seen in the relationship between the following (a toy numerical sketch follows the list):

  • Specialization, the focus of system behavior to exploit particular features of its environment.
  • Flexibility, the ability of a system to adapt quickly to environmental change.
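A toy numerical reading of Ashby's law, with invented categories and counts: if variety is measured simply as the number of distinct disturbances or responses available, regulation is only possible when the controller's variety matches or exceeds that of the system (or environment) it must control.

    # Toy illustration of the law of requisite variety; the sets are hypothetical.
    disturbances = {"gust", "lull", "crosswind", "turbulence"}   # variety to be controlled
    responses = {"pitch", "roll", "yaw"}                         # variety available to the controller

    def has_requisite_variety(controller_variety, disturbance_variety):
        # Ashby's law: the controller needs at least as much variety as what it controls.
        return len(controller_variety) >= len(disturbance_variety)

    print(has_requisite_variety(responses, disturbances))   # False: 3 responses < 4 disturbances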

Effectiveness, Adaptation and Learning

A system's effectiveness is a measure of its ability to perform the functions necessary to achieve its goals or objectives. (Ackoff, 1971) defines this as the product of the number of combinations of behavior available to reach a function and the efficiency of each combination.

(Hitchins, 2007) describes effectiveness as a combination of performance (how well a function is done in ideal conditions), availability (how often the function is there when needed), and survivability (how likely it is that the system will be able to use the function fully).
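A minimal numerical sketch of Hitchins' reading follows; the figures are made up, and treating the combination as a simple product is an assumption made here for illustration rather than a formula taken from the source.

    # Hypothetical figures illustrating effectiveness as a combination of
    # performance, availability and survivability (combined here as a product).
    performance = 0.90     # how well the function is done in ideal conditions
    availability = 0.95    # how often the function is there when needed
    survivability = 0.85   # how likely the system is to be able to use the function fully

    effectiveness = performance * availability * survivability
    print(round(effectiveness, 3))   # 0.727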

An Adaptive System is one able to change itself or its environment if its effectiveness is insufficient to achieve its current or future goals or objectives. Ackoff defines four types of adaptation, changing the environment or the system in response to internal or external factors.

A system may also Learn, improving its effectiveness over time, without any change in state or goal.

Hierarchy, Emergence and Complexity

System behavior is related to the combination of element behaviors. Most systems exhibit increasing variety: they have behavior resulting from the combination of element behaviors. The term Synergy, or weak emergence, is used to describe the idea of “the whole being greater than the sum of the parts”. While this is generally true, it is also possible to get reducing variety, in which the whole functions at less than the sum of the parts.

Open Systems tend to form Hierarchies of coherent System Elements, or Sub-Systems. In natural systems, hierarchy is a consequence of wholeness, with strongly cohesive elements grouping together, forming structures which reduce complexity and increase robustness (Simon, 1962). Socio-technical systems form Control Hierarchies, with systems at a higher level having some ownership of, or control over, those at lower levels. (Hitchins, 2009) describes how systems form Preferred Patterns which can be used to enhance the stability of interacting system hierarchies.


We take advantage of this by viewing systems using Systemic Resolution. A system is characterized by its behavior in a wider system or environment, and considered in detail as a set of sub-system Structures and Functions. This system description is focused at a particular level of resolution. We can change this level of resolution by focusing upon the wider system, or upon one of the sub-systems. While this allows us to focus on a given system-of-interest, we must take care to continue to take a holistic view of the wider system and environment.
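A rough sketch of systemic resolution, using an invented hierarchy: the same system description can be viewed at different levels of detail by choosing how far down the sub-system structure we look, treating anything deeper as a black box.

    # Hypothetical system hierarchy; names are invented for illustration.
    vehicle = {
        "powertrain": {"engine": {}, "transmission": {}},
        "chassis": {"suspension": {}, "brakes": {}},
    }

    def view(system, level):
        # Systemic resolution: describe the system only down to a chosen level of detail.
        if level == 0 or not system:
            return "..."   # anything deeper is treated as a black box
        return {name: view(sub, level - 1) for name, sub in system.items()}

    print(view(vehicle, 1))   # {'powertrain': '...', 'chassis': '...'}
    print(view(vehicle, 2))   # full two-level breakdown of sub-systems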

When we look across a hierarchy of systems, we generally see increasing complexity at the higher levels, relating to both the structure of the system and how it is used. The terms Emergence and Emergent Properties (glossary) are generally used to describe behaviors emerging across a complex system hierarchy. These last two ideas are fundamental to Engineered Systems and the Systems Approach, and are discussed in more detail in the related topics.

Completeness

How do we apply system concepts to an Engineered System? (Hitchins, 2007) proposes a set of necessary and sufficient questions to help ensure all systemic issues have been considered when assessing an existing or proposed system description.

Hitchins' Generic Reference Model asks questions under six headings based on these concepts, related to a system's Function (what it does) and Form (what it is).


References

Citations

von Bertalanffy, L. 1968. General System Theory: Foundations, Development, Applications, revised ed. New York, NY: Braziller.

Ackoff, R.L. 1971. "Towards a System of Systems Concepts." Management Science 17 (11).

Flood, R.L. and E.R. Carson. 1993. Dealing with Complexity: An Introduction to the Theory and Application of Systems Science. New York, NY: Plenum Press.

Hitchins, D. 2007. Systems Engineering: A 21st Century Systems Methodology. Hoboken, NJ: John Wiley & Sons.

Hitchins, D. 2009. "What are the General Principles Applicable to Systems?" INCOSE Insight: 59-63.

Skyttner, L. 2001. General Systems Theory: Ideas and Applications. Singapore: World Scientific Publishing Co., pp. 53-69.

Ashby, W.R. 1956. Introduction to Cybernetics. London: Wiley, chapter 11.

Simon, H.A. 1962. "The Architecture of Complexity." Proceedings of the American Philosophical Society 106 (6), December 12, 1962: 467-482.

Martin, J.N. 1997. Systems Engineering Guidebook. Boca Raton, FL: CRC Press. ISBN 0849378370.

Primary References

Checkland, Peter. 1999. Systems Thinking, Systems Practice. New York: John Wiley & Sons.

Hitchins, Derek. 2009. What are the General Principles Applicable to Systems? Insight, 59-63.

Additional References

Waring, A. 1996. Practical Systems Thinking. London: International Thomson Business Press, chapter 1.

Edson, Robert. 2008. Systems Thinking. Applied. A Primer. Edited by the AsysT Institute. Arlington, VA: Analytic Services.

Hitchins, Derek K. 2007. Systems Engineering: A 21st Century Systems Methodology Edited by A. P. Sage, Wiley Series in Systems Engineering and Management. Hoboken, NJ: John Wiley & Sons.

Jackson, Scott, Derek Hitchins, and Howard Eisner. 2010. What is the Systems Approach? INCOSE Insight, April, 41-43.

Lawson, Harold. 2010. A Journey Through the Systems Landscape. London: College Publications, Kings College.

Article Discussion

Peter Checkland is an INCOSE Pioneer and one of the most respected authorities on systems theory.

Derek Hitchins is an INCOSE Fellow and an author of several books on systems theory and systems engineering. He is a respected authority on both subjects.
