Difference between revisions of "Principles of Systems Thinking"

From SEBoK
Jump to navigation Jump to search
m (Text replacement - "SEBoK v. 2.9, released 20 November 2023" to "SEBoK v. 2.10, released 06 May 2024")
 
(63 intermediate revisions by 12 users not shown)
Line 1: Line 1:
This article forms part of the [[Systems Thinking]] Knowledge Area. It identifies systems [[Principle (glossary)|principles]] as part of the basic ideas of Systems Thinking.  
+
----
 +
'''''Lead Author:''''' ''Rick Adcock'', '''''Contributing Authors:''''' ''Scott Jackson, Janet Singer, Duane Hybertson''
 +
----
 +
This topic forms part of the [[Systems Thinking]] knowledge area (KA). It identifies systems {{Term|Principle (glossary)|principles}} as part of the basic ideas of {{Term|Systems Thinking (glossary)|systems thinking}}.  
  
Some additional concepts more directly associated with Engineered Systems are described and a summary of system principles associated with the concepts aready defiend, is provided.  A number of additional “laws” and heuristics are also discussed.  
+
Some additional {{Term|Concept (glossary)|concepts}} more directly associated with {{Term|Engineered System (glossary)|engineered systems}} are described, and a summary of systems {{Term|Principle (glossary)|principles}} associated with the concepts already defined is provided.  A number of additional “laws” and {{Term|Heuristic (glossary)|heuristics}} are also discussed.  
  
 
==Systems Principles, Laws, and Heuristics==
 
==Systems Principles, Laws, and Heuristics==
A principle is a general rule of conduct or behavior (Lawson and Martin 2008), or a basic generalization that is accepted as true and that can be used as a basis for reasoning or conduct (WordWeb 2012c). Thus, systems principles can be used as a basis for reasoning about systems (systems thinking) or associated conduct (systems approaches).
+
A principle is a general rule of conduct or behavior (Lawson and Martin 2008). It can also be defined as a basic generalization that is accepted as true and that can be used as a basis for reasoning or conduct (WordWeb 2012c). Thus, systems principles can be used as a basis for reasoning about {{Term|System (glossary)|systems}} thinking or associated conduct ({{Term|Systems Approach (glossary)|systems approaches}}).
 
 
  
 
==Separation of Concerns==
 
==Separation of Concerns==
  
A Systems Approach is focused on a [[System of Interest (SoI) (glossary)]] defined as an open system that interacts with and adapts to other systems in an environment; contains open, interacting subsystems, and forms part of some wider or greater whole. The systems approach then considers an SOI to be open and dynamic, and to be comprised of open, dynamic, interacting subsystems. It also understands the SOI to exist in an environment; to interact with, and adapt to, other systems in that environment; and to form part of a larger, wider, or containing system, (Hitchins 2009).  
+
A systems approach is focused on a {{Term|System-of-Interest (glossary)|systems-of-interest}} (SoI) of an {{Term|Open System (glossary)|open system}}. This SoI consists of open, interacting subsystems that as a whole interact with and adapt to other systems in an {{Term|Environment (glossary)|environment}}. The systems approach also considers the SoI in its environment to be part of a larger, wider, or containing system (Hitchins 2009).  
  
In the [[What is Systems Thinking?]] article a “systems thinking paradox” is discussed.  How can we take a holistic system view while still being able to focus on changing or creating systems?   
+
In the [[What is Systems Thinking?]] topic, a “systems thinking paradox” is discussed.  How is it possible to take a holistic system view while still being able to focus on changing or creating systems?   
  
Separation of Concerns is a term used to describe a balance between considering parts of a system problem or solution while not loosing site of the whole (Greer 2008). [[Abstraction (glossary)]] is the process of taking away characteristics from something in order to reduce it to a set of essential characteristics (SearchCIO 2012). In attempting to understand complex situations it is easier to focus on bounded problems, whose solutions “still remaining agnostic to the greater problem.” (Erl 2012).   This sounds reductionist, but is applied effectively in natural systems and engineered systems. The key is that one of the selected problems needs to be the concerns of the system as a whole. This idea of a balance between using abstraction to focus on specific concerns while ensuring we continue to consider the whole is at the centre of [[Systems Approach (glossary)| Systems Approaches]].  
+
Separation of concerns describes a balance between considering parts of a system {{Term|Problem (glossary)|problem}} or {{Term|Solution (glossary)|solution}} while not losing sight of the whole (Greer 2008). {{Term|Abstraction (glossary)|Abstraction}} is the process of taking away characteristics from something in order to reduce it to a set of base characteristics (SearchCIO 2012). In attempting to understand {{Term|Complex (glossary)|complex}} situations, it is easier to focus on {{Term|Boundary (glossary)|bounded}} problems, whose {{Term|Solution (glossary)|solutions}} still remain agnostic to the greater problem (Erl 2012). This process sounds {{Term|Reductionism (glossary)|reductionist}}, but it can be applied effectively to systems. The key to the success of this approach is ensuring that one of the selected problems is the concern of the system as a whole. Finding balance between using abstraction to focus on specific concerns while ensuring the whole is continually considered is at the center of {{Term|Systems Approach (glossary)|systems approaches}}.
 
 
A [[View (glossary)]] is a subset of information observed of one or more entities, such as systems. The physical or conceptual point from which a view is observed is the [[Viewpoint (glossary)]], which can be motivated by one or more observer concerns. Different views of the same target must be both separated, to reflects separation of concerns, and integrated such that all views of a given target are consistent and form a coherent whole (Hybertson 2009). Sample views of a system: internal (what does it consist of?); external (what are its properties and behavior as a whole?); static (parts, structures); dynamic (interactions).
+
A {{Term|View (glossary)|view}} is a subset of information observed of one or more entities, such as systems. The physical or conceptual point from which a view is observed is the {{Term|Viewpoint (glossary)|viewpoint}}, which can be motivated by one or more observer concerns. Different views of the same target must be both separate, to reflect separation of concerns, and integrated such that all views of a given target are consistent and form a coherent whole (Hybertson 2009). Some sample views of a system are internal (Of what does it consist?), external (What are its properties and {{Term|Behavior (glossary)|behavior}} as a whole?), static (What are its parts or structures?); and dynamic (interactions).
  
[[Encapsulation (glossary)]], encloses system elements and their interactions from the external environment is discussed in [[Concepts of Systems Thinking]]. Encapsulation is associated with [[Modularity (glossary)]] the degree to which a system's components may be separated and recombined (Griswold 1995). Modularity applies to systems in many domains, natural, social and engineered. In engineering, encapsulation is the isolation of a system function within a module and providing precise specifications for the module (IEEE Std. 610.12-1990).
+
{{Term|Encapsulation (glossary)}}, which encloses {{Term|System Element (glossary)|system elements}} and their interactions from the external environment, is discussed in [[Concepts of Systems Thinking]]. Encapsulation is associated with {{Term|Modularity (glossary)|modularity}}, the degree to which a system's {{Term|Component (glossary)|components}} may be separated and recombined (Griswold 1995). Modularity applies to systems in natural, social, and engineered domains. In {{Term|Engineering (glossary)|engineering}}, encapsulation is the isolation of a system {{Term|Function (glossary)|function}} within a module; it provides precise specifications for the module (IEEE Std. 610.12-1990).
  
[[Dualism (glossary)]] is a characteristic of systems in which they exhibit seemingly contradictory characteristics that are important for the system (Hybertson 2009).  The yin yang concept in Chinese philosophy emphasizes the interaction between dual elements and their harmonization, ensuring a constant dynamic balance often through a cyclic dominance of one element and then the other, such as day and night (IEP 2006).  
+
{{Term|Dualism (glossary)|Dualism}} is a characteristic of systems in which they exhibit seemingly contradictory characteristics that are important for the system (Hybertson 2009).  The yin yang concept in Chinese philosophy emphasizes the interaction between dual {{Term|Element (glossary)|elements}} and their harmonization, ensuring a constant dynamic balance through a cyclic dominance of one element and then the other, such as day and night (IEP 2006).  
  
From a systems perspective the interaction, harmonization, and balance between system properties is important. (Hybertson 2009) defines '''Leverage''' as the duality between:  
+
From a systems perspective, the interaction, harmonization, and balance between system properties is important. Hybertson (2009) defines '''leverage''' as the duality between:  
*Power the extent to which a system solves a specific problem  
+
*'''Power''', the extent to which a system solves a specific problem, and
*Generality the extent to which a system solves a whole class of problems.  
+
*'''Generality''', the extent to which a system solves a whole class of problems.  
  
While some systems or elements may be optimised for one extreme of such dualities a dynamic balance is needed to be effective in solving complex problems.
+
While some systems or elements may be optimized for one extreme of such dualities, a dynamic balance is needed to be effective in solving complex problems.
  
 
===Summary of Systems Principles===
 
===Summary of Systems Principles===
A set of systems principles is given in Table 1 below.  
+
A set of systems principles is given in Table 1 below. The "Names" segment points to concepts underlying the principle. (See [[Concepts of Systems Thinking]]). Following the table, two additional sets of items related to systems principles are noted and briefly discussed: prerequisite laws for {{Term|Design (glossary)|design science}}, and {{Term|Heuristic (glossary)|heuristics}} and pragmatic principles.
 
 
The names points to concepts underlying the principle (see article on [[Concepts of Systems Thinking]]). Following the table, two additional sets of items related to systems principles are noted and briefly discussed: Prerequisite laws for design science, and heuristics and pragmatic principles.
 
  
<center>'''Table 1. A Set of Systems Principles.'''  SEBoK Original.</center>
+
<center>'''Table 1. A Set of Systems Principles.'''  (SEBoK Original)</center>
 
{| align="center"
 
{| align="center"
 
! Name
 
! Name
 
! Statement of Principle
 
! Statement of Principle
 
|-
 
|-
|[[Regularity (glossary)]]
+
|{{Term|Abstraction (glossary)|Abstraction}}
|Systems science should find and capture regularities in systems, because those regularities promote systems understanding and facilitate systems practice. (Bertalanffy 1968)
+
|A focus on essential characteristics is important in problem solving because it allows problem solvers to ignore the nonessential, thus simplifying the problem (Sci-Tech Encyclopedia 2009; SearchCIO 2012; Pearce 2012).
 
|-
 
|-
|[[Holism (glossary)]]
+
|{{Term|Boundary (glossary)|Boundary}}
|A system should be considered as a single entity, a whole, not just as a set of parts. (Ackoff 1979; Klir 2001)
+
|A boundary or membrane separates the system from the external world. It serves to concentrate interactions inside the system while allowing exchange with external systems (Hoagland, Dodson, and Mauck 2001).
 
|-
 
|-
|'''Interaction'''
+
|'''Change'''
|The properties, capabilities, and behavior of a system derive from its parts, from interactions between those parts, and from interactions with other systems. (Hitchins 2009 p. 60)
+
|Change is necessary for growth and adaptation, and should be accepted and planned for as part of the natural order of things rather than something to be ignored, avoided, or prohibited (Bertalanffy 1968; Hybertson 2009).
 
|-
 
|-
|'''Relations'''
+
|{{Term|Dualism (glossary)|Dualism}}
|A system is characterized by its relations: the interconnections between the elements. Feedback is a type of relation. The set of relations defines the network of the system. (Odum 1994)
+
| Recognize dualities and consider how they are, or can be, harmonized in the {{Term|Context (glossary)|context}} of a larger whole (Hybertson 2009).
 
|-
 
|-
|[[Boundary (glossary)]]
+
|'''Encapsulation'''
|A boundary or membrane separates the system from the external world. It serves to concentrate interactions inside the system while allowing exchange with external systems. (Hoagland, Dodson, and Mauck 2001)
+
|Hide internal parts and their interactions from the external environment (Klerer 1993; IEEE 1990).
 
|-
 
|-
|[[Synthesis (glossary)]]
+
|'''Equifinality'''
|Systems can be created by choosing (conceiving, designing, selecting) the right parts, bringing them together to interact in the right way, and in orchestrating those interactions to create requisite properties of the whole, such that it performs with optimum efectiveness in its operational environment, so solving the problem that prompted its creation” (Hitchins 2008: 120).
+
|In open systems, the same final state may be reached from different initial conditions and in different ways (Bertalanffy 1968). This principle can be exploited, especially in systems of purposeful agents.
 
|-
 
|-
|[[Abstraction (glossary)]]
+
|{{Term|Holism (glossary)|Holism}}
|A focus on essential characteristics is important in problem solving because it allows problem solvers to ignore the nonessential, thus simplifying the problem. (Sci-Tech Encyclopedia 2009; SearchCIO 2012; Pearce 2012)
+
|A system should be considered as a single entity, a whole, not just as a set of parts (Ackoff 1979; Klir 2001).
 
|-
 
|-
|'''Separation of Concerns'''
+
|'''Interaction'''
|A larger problem is more effectively solved when decomposed into a set of smaller problems or concerns. (Erl 2012; Greer 2008)
+
|The properties, {{Term|Capability (glossary)|capabilities}}, and behavior of a system are derived from its parts, from interactions between those parts, and from interactions with other systems (Hitchins 2009 p. 60).
 
|-
 
|-
|[[View (glossary)]]
+
|'''Layer Hierarchy'''
|Multiple views, each based on a system aspect or concern, are essential to understand a complex system or problem situation. One critical view being how concern relates to properties of the whole. (Edson 2008; Hybertson 2009)
+
|The evolution of complex systems is facilitated by their hierarchical structure (including stable intermediate forms) and the understanding of complex systems is facilitated by their hierarchical description (Pattee 1973; Bertalanffy 1968; Simon 1996).
 
|-
 
|-
|[[Modularity (glossary)]]
+
|{{Term|Leverage (glossary)|Leverage}}
|Unrelated parts of the system should be separated, and related parts of the system should be grouped together. (Griswold 1995; Wikipedia 2012a)
+
|Achieve maximum leverage (Hybertson 2009). Because of the power versus generality tradeoff, leverage can be achieved by a complete solution (power) for a narrow class of problems, or by a partial solution for a broad class of problems (generality).
 
|-
 
|-
|'''Encapsulation'''
+
|{{Term|Modularity (glossary)|Modularity}}
|Hide internal parts and their interactions from the external environment. (Klerer 1993; IEEE 1990)
+
|Unrelated parts of the system should be separated, and related parts of the system should be grouped together (Griswold 1995; Wikipedia 2012a).
 
|-
 
|-
|'''Similarity/ Difference'''
+
|{{Term|Network (glossary)|Network}}
|Both the similarities and differences in systems should be recognized and accepted for what they are. (Bertalanffy 1975 p. 75; Hybertson 2009). Avoid forcing one size fits all, and avoid treating everything as entirely unique.
+
|The network is a fundamental topology for systems that forms the basis of togetherness, connection, and dynamic interaction of parts that yield the behavior of complex systems (Lawson 2010; Martin et al. 2004; Sillitto 2010).
 
|-
 
|-
|[[Dualism (glossary)]]
+
|'''Parsimony'''
| Recognize dualities and consider how they are, or can be, harmonized in the context of a larger whole (Hybertson 2009)
+
|One should choose the simplest explanation of a phenomenon, the one that requires the fewest assumptions (Cybernetics 2012). This applies not only to choosing a design, but also to operations and {{Term|Requirement (glossary)|requirements}}.
 
|-
 
|-
|[[Leverage (glossary)]]
+
|{{Term|Regularity (glossary)|Regularity}}
|Achieve maximum leverage (Hybertson 2009). Because of the power versus generality tradeoff, leverage can be achieved by a complete solution (power) for a narrow class of problems, or by a partial solution for a broad class of problems (generality.
+
|{{Term|Systems Science (glossary)|Systems science}} should find and capture regularities in systems, because those regularities promote systems understanding and facilitate systems practice (Bertalanffy 1968).  
 
|-
 
|-
|'''Change'''
+
|'''Relations'''
|Change is necessary for growth and adaptation, and should be accepted and planned for as part of the natural order of things, rather than something to be ignored, avoided, or prohibited. (Bertalanffy 1968; Hybertson 2009)
+
|A system is characterized by its relations: the interconnections between the elements. Feedback is a type of relation. The set of relations defines the {{Term|Network (glossary)|network}} of the system (Odum 1994).
 
|-
 
|-
|'''Stability/ Change'''
+
|'''Separation of Concerns'''
|Things change at different rates, and entities or concepts at the stable end of the spectrum can and should be used to provide a guiding context for rapidly changing entities at the volatile end of the spectrum (Hybertson 2009). The study of complex adaptive systems can give guidance to system behavior and design in changing environments (Holland 1992).
+
|A larger problem is more effectively solved when decomposed into a set of smaller problems or concerns (Erl 2012; Greer 2008).  
 
|-
 
|-
|'''Equifinality'''
+
|'''Similarity/Difference'''
|In open systems, the same final state may be reached from different initial conditions and in different ways. (Bertalanffy 1968). This principle can be exploited especially in systems of purposeful agents.
+
|Both the similarities and differences in systems should be recognized and accepted for what they are (Bertalanffy 1975 p. 75; Hybertson 2009). Avoid forcing one size fits all, and avoid treating everything as entirely unique.
 
|-
 
|-
|'''Parsimony'''
+
|'''Stability/Change'''
|One should choose the simplest explanation of a phenomenon, the one that requires the fewest assumptions. (Cybernetics 2012). This applies not only to choosing a design, but also operations and requirements.
+
|Things change at different rates, and entities or concepts at the stable end of the spectrum can and should be used to provide a guiding context for rapidly changing entities at the volatile end of the spectrum (Hybertson 2009). The study of complex adaptive systems can give guidance to system behavior and design in changing environments (Holland 1992).
 
|-
 
|-
|'''Layer, ''' [[Hierarchy (glossary)]]
+
|{{Term|Synthesis (glossary)|Synthesis}}
|The evolution of complex systems is facilitated by their hierarchical structure (including stable intermediate forms), and the understanding of complex systems is facilitated by their hierarchical description. (Pattee 1973; Bertalanffy 1968; Simon 1996)
+
|Systems can be created by “choosing (conceiving, designing, selecting) the right parts, bringing them together to interact in the right way, and in orchestrating those interactions to create requisite properties of the whole, such that it performs with optimum effectiveness in its operational {{Term|Environment (glossary)|environment}}, so solving the problem that prompted its creation” (Hitchins 2009: 120).
 
|-
 
|-
|[[Network (glossary)]]
+
|{{Term|View (glossary)|View}}
|The network is a fundamental topology for systems that forms the basis of togetherness, connection, and dynamic interaction of parts that yield the behavior of complex systems (Lawson 2010; Martin et al. 2004; Sillitto 2010)
+
|Multiple views, each based on a system aspect or concern, are essential to understand a complex system or problem situation. One critical view is how concern relates to properties of the whole (Edson 2008; Hybertson 2009).  
 
|}
 
|}
  
The principles are not independent. They have synergies and tradeoffs. Lipson (2007), for example, argued that “Scalability of open-ended evolutionary processes depends on their ability to exploit functional modularity, structural regularity and hierarchy.” He proposed a formal model for examining the properties, dependencies, and tradeoffs among these principles. Edson (2008) related many of the above principles in a structure called the conceptagon, which he modified from (Boardman and Sauser 2008), and also provided guidance on how to apply the principles. Not all principles apply to every system or engineering decision. Judgment, experience, and heuristics (see below) help understand which principles apply in a given situation.
+
The principles are not independent. They have synergies and tradeoffs. Lipson (2007), for example, argued that “{{Term|Scalability (glossary)|scalability}} of open-ended evolutionary processes depends on their ability to exploit functional modularity, structural regularity and hierarchy.” He proposed a formal {{Term|Model (glossary)|model}} for examining the properties, dependencies, and tradeoffs among these principles. Edson (2008) related many of the above principles in a structure called the conceptagon, which he modified from the work of Boardman and Sauser (2008). Edson also provided guidance on how to apply these principles. Not all principles apply to every system or engineering decision. Judgment, experience, and heuristics (see below) provide understanding into which principles apply in a given situation.
  
Several principles illustrate the relation of view with the dualism and yin yang principle. An important example is the Holism and Separation of Concerns pair of principles. These look contradictory, but they are dual ways of dealing with complexity. Holism deals with complexity by focusing on the whole system, and Separation of Concerns deals with complexity by dividing a problem or system into smaller more manageable elements that focus on particular concerns. They are reconciled by the fact that both views are needed to understand systems and to engineer systems; focusing on only one or the other does not give sufficient understanding or a good overall solution. This dualism is closely related to the Systems Thinking Paradox described in [[What is Systems Thinking?]]. Rosen (1979) discussed “false dualisms” of systems paradigms that are considered incompatible but are in fact different aspects or views of reality. In the present context, they are thus reconcilable through yin yang harmonization. Edson (2008) emphasized viewpoints as an essential principle of systems thinking and specifically as a way to understand opposing concepts.
+
Several principles illustrate the relation of view with the dualism and yin yang principle, for example, holism and separation of concerns. These principles appear to be contradictory but are in fact dual ways of dealing with {{Term|Complexity (glossary)|complexity}}. Holism deals with complexity by focusing on the whole system, while separation of concerns divides a problem or system into smaller, more manageable elements that focus on particular concerns. They are reconciled by the fact that both views are needed to understand systems and to engineer systems; focusing on only one or the other does not give sufficient understanding or a good overall solution. This dualism is closely related to the systems thinking paradox described in [[What is Systems Thinking?]].  
  
Guidance on how to apply many of these principles to engineered systems is given in the article [[Synthesizing Possible Solutions]] as well as in [[System Definition]] and other knowledge areas in Part 3 of this SEBoK.
+
Rosen (1979) discussed “false dualisms” of systems paradigms that are considered incompatible but are in fact different aspects or views of reality. In the present context, they are thus reconcilable through yin yang harmonization. Edson (2008) emphasized viewpoints as an essential principle of systems thinking; specifically, as a way to understand opposing concepts.
  
===Design Principles===
+
Derick Hitchins (2003) produced a systems life cycle theory described by a set of seven principles forming an integrated set. This theory describes the creation, manipulation and demise of engineered systems. These principles consider the factors which contribute to the stability and survival of man made systems in an environment.  Stability is associated with the principle of '''connected variety''', in which stability is increased by variety, plus the '''cohesion''' and '''adaptability''' of that variety. Stability is limited by allowable relations, resistance to change, and patterns of interaction.  Hitchins describes how interconnected systems tend toward a '''cyclic progression''', in which variety is generated, dominance emerges to suppress variety, dominant modes decay and collapse and survivors emerge to generate new variety.
John Warfield (Warfield 1994) identified a set of laws of generic design science that are related to systems principles. Three of these laws are stated here.
 
  
#‘’Law of Requisite Variety’’: A design situation embodies a variety that must be matched by the specifications. The variety includes the diversity of stakeholders. This law is an application to design science of the Ashby (1956) Law of Requisite Variety, which was defined in the context of cybernetics and states that to successfully regulate a system, the variety of the regulator must be at least as large as the variety of the regulated system.
+
Guidance on how to apply many of these principles to engineered systems is given in the topic [[Synthesizing Possible Solutions]], as well as in System Definition and other knowledge areas in Part 3 of the SEBoK.
#‘’Law of Requisite Parsimony’’: Information must be organized and presented in a way that prevents human information overload. This law derives from Miller’s (1956) findings on the limits of human information processing capacity. Warfield’s structured dialog method is one possible way to help achieve the requisite parsimony.
 
#‘’Law of Gradation’’: Any conceptual body of knowledge can be graded in stages or varying degrees of complexity and scale, ranging from simplest to most comprehensive, and the degree of knowledge applied to any design situation should match the complexity and scale of the situation. A corollary, called the Law of Diminishing Returns, is that a body of knowledge should be applied to a design situation to the stage at which the point of diminishing returns is reached.
 
  
Derick Hitchins (Hitchins 2003), produced a “systems-lifecycle-theory” described by a set of 7 principles forming an integrated set which describe the creation, manipulation and demise of Engineered Systems.
+
===Prerequisite Laws of Design Science===
 +
John Warfield (1994) identified a set of laws of generic design science that are related to systems principles. Three of these laws are stated here:
  
These principles consider the factors which contribute to the stability and survival of man made systems in an environment. Stability is associated with the principle of '''Connected Variety''' in which stability is increased by variety plus the '''cohesion''' and '''adaptability''' of that variety; and stability is limited by allowable relations, resistance to change and patterns of interaction. Hitchins describes how interconnected systems tend to a '''cyclic progression''' in which variety is generated, dominance emerges to suppress variety, dominant modes decays and collapse and survivors emerge to generate new variety.
+
#‘’Law of Requisite Variety’’: A design situation embodies a variety that must be matched by the specifications. The variety includes the diversity of {{Term|Stakeholder (glossary)|stakeholders}}. This law is an application of the design science of the Ashby (1956) Law of Requisite Variety, which was defined in the context of {{Term|Cybernetics (glossary)|cybernetics}} and states that to successfully regulate a system, the variety of the regulator must be at least as large as the variety of the regulated system.
 +
#‘’Law of Requisite Parsimony’’: Information must be organized and presented in a way that prevents human information overload. This law derives from Miller’s findings on the limits of human information processing capacity (Miller 1956). Warfield’s structured dialog method is one possible way to help achieve the requisite parsimony.
 +
#‘’Law of Gradation’’: Any conceptual body of knowledge can be graded in stages or varying degrees of complexity and scale, ranging from simplest to most comprehensive, and the degree of knowledge applied to any design situation should match the complexity and scale of the situation. A corollary, called the Law of Diminishing Returns, states that a body of knowledge should be applied to a design situation at the stage at which the point of diminishing returns is reached.
  
 
===Heuristics and Pragmatic Principles===
 
===Heuristics and Pragmatic Principles===
A heuristic is a common sense rule intended to increase the probability of solving some problem (WordWeb 2012b). In the present context it may be regarded as an informal or pragmatic principle. Maier and Rechtin (2000) identified an extensive set of heuristics that are related to systems principles. A few of these heuristics are stated here, and each is related to principles described above.
+
A heuristic is a common sense rule intended to increase the probability of solving some problem (WordWeb 2012b). In the present context, it may be regarded as an informal or pragmatic principle. Maier and Rechtin (2000) identified an extensive set of heuristics that are related to systems principles. A few of these heuristics are stated here:
  
 
*Relationships among the elements are what give systems their added value. This is related to the ‘’Interaction’’ principle.
 
*Relationships among the elements are what give systems their added value. This is related to the ‘’Interaction’’ principle.
Line 122: Line 122:
 
*The first line of defense against complexity is simplicity of design. This is related to the ‘’Parsimony’’ principle.
 
*The first line of defense against complexity is simplicity of design. This is related to the ‘’Parsimony’’ principle.
 
*In order to understand anything, you must not try to understand everything (attributed to Aristotle). This is related to the ‘’Abstraction’’ principle.
 
*In order to understand anything, you must not try to understand everything (attributed to Aristotle). This is related to the ‘’Abstraction’’ principle.
An INCOSE working group (INCOSE 1993) defined a set of “pragmatic principles” for Systems Engineering. They are essentially best practice heuristics for engineering a system. A large number of heuristics are given. Three examples:
+
An International Council on Systems Engineering (INCOSE) working group (INCOSE 1993) defined a set of “pragmatic principles” for systems engineering (SE). They are essentially best practice heuristics for engineering a system. For example:
  
 
*Know the problem, the customer, and the consumer  
 
*Know the problem, the customer, and the consumer  
*Identify and assess alternatives so as to converge on a solution  
+
*Identify and assess alternatives to converge on a solution  
 
*Maintain the integrity of the system
 
*Maintain the integrity of the system
  
Hitchins defines a similar set of principles which also consider some of the issues of hierarchy and complexity of particular relevance to a system approach (Hitchins 2009).
+
Hitchins defines a set of SE principles which include principles of holism and synthesis as discussed above, as well as principles describing how systems problems that are of particular relevance to a [[Systems Approach Applied to Engineered Systems]] should be resolved (Hitchins 2009).
  
 
==References==
 
==References==
 
===Works Cited===
 
===Works Cited===
Ackoff, R. 1979. The Future of Operational Research is Past, ''J. Opl. Res. Soc.'', 30(2): 93–104, Pergamon Press.
+
Ackoff, R. 1979. "The future of operational research is past," ''Journal of the  Operational Research Society,'' vol. 30, no. 2, pp. 93–104, Pergamon Press.
  
Ashby, W.R. 1956. Requisite variety and its implications for the control of complex systems, ''Cybernetica'', 1(2):1–17.
+
Ashby, W.R. 1956. "Requisite variety and its implications for the control of complex systems," ''Cybernetica,'' vol. 1, no. 2, pp. 1–17.
  
Bertalanffy, L. von. 1968. ''[[General System Theory: Foundations, Development, Applications]]''. Revised ed. New York, NY: Braziller.
+
Bertalanffy, L. von. 1968. ''[[General System Theory: Foundations, Development, Applications]]''. Revised ed. New York, NY, USA: Braziller.
  
Bertalanffy, L. von. 1975. ''Perspectives on General System Theory''. E. Taschdjian, ed. New York: George Braziller.
+
Bertalanffy, L. von. 1975. ''Perspectives on General System Theory''. E. Taschdjian, ed. New York, NY, USA: George Braziller.
  
Boardman, J. and B. Sauser. 2008. ''Systems Thinking: Coping with 21st Century Problems''. Boca Raton, FL: Taylor & Francis.
+
Boardman, J. and B. Sauser. 2008. ''Systems Thinking: Coping with 21st Century Problems''. Boca Raton, FL, USA: Taylor & Francis.
  
Cybernetics (Web Dictionary of Cybernetics and Systems). 2012. Principle of Parsimony or Principle of Simplicity. http://pespmc1.vub.ac.be/ASC/PRINCI_SIMPL.html  
+
Cybernetics (Web Dictionary of Cybernetics and Systems). 2012. "Principle of Parsimony or Principle of Simplicity." Available at: Web Dictionary of Cybernetics and Systems http://pespmc1.vub.ac.be/ASC/PRINCI_SIMPL.html. Accessed December 3, 2014.
  
 
Edson, R. 2008. ''Systems Thinking. Applied. A Primer''. Arlington, VA, USA: Applied Systems Thinking (ASysT) Institute, Analytic Services Inc.
 
Edson, R. 2008. ''Systems Thinking. Applied. A Primer''. Arlington, VA, USA: Applied Systems Thinking (ASysT) Institute, Analytic Services Inc.
  
Erl, T. 2012. SOA Principles: An Introduction to the Service Orientation Paradigm. http://www.soaprinciples.com/p3.php  
+
Erl, T. 2012. "SOA Principles: An Introduction to the Service Orientation Paradigm." Available at: Arcitura http://www.soaprinciples.com/p3.php. Accessed December 3 2014.
  
Greer, D. 2008. The Art of Separation of Concerns. http://aspiringcraftsman.com/tag/separation-of-concerns/  
+
Greer, D. 2008. "The Art of Separation of Concerns." Available at: Aspiring Craftsman http://aspiringcraftsman.com/tag/separation-of-concerns/. Accessed December 3 2014 
  
Griswold, W. 1995. Modularity Principle. http://cseweb.ucsd.edu/users/wgg/CSE131B/Design/node1.html
+
Griswold, W. 1995. "Modularity Principle."  Available at: William Griswold  http://cseweb.ucsd.edu/users/wgg/CSE131B/Design/node1.html. Accessed December 3 2014.
  
Hitchins D. K. 2003. Advanced systems thinking engineering and management. Boston MA, Artech House  
+
Hitchins D. K. 2003. ''Advanced Systems Thinking Engineering and Management.'' Boston, MA, USA: Artech House.
  
Hitchins, D. 2009. "What are the General Principles Applicable to Systems?" INCOSE ''Insight''. 12(4): 59-63.  
+
Hitchins, D. 2009. "What are the general principles applicable to systems?" INCOSE ''Insight'',  vol. 12, no. 4, pp. 59-63.  
  
Hoagland, M., B. Dodson, and J. Mauck. 2001. ''Exploring the Way Life Works''. Jones and Bartlett Publishers, Inc.
+
Hoagland, M., B. Dodson, and J. Mauck. 2001. ''Exploring the Way Life Works''. Burlington, MA, USA: Jones and Bartlett Publishers, Inc.
  
Holland, J. 1992. ''Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence''. Cambridge, MA: MIT Press.
+
Holland, J. 1992. ''Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence''. Cambridge, MA, USA: MIT Press.
  
Hybertson, D. 2009. ''[[Model-Oriented Systems Engineering Science]]: A Unifying Framework for Traditional and Complex Systems''. Auerbach/CRC Press, Boca Raton, FL.
+
Hybertson, D. 2009. ''[[Model-Oriented Systems Engineering Science]]: A Unifying Framework for Traditional and Complex Systems''. Boca Raton, FL, USA: Auerbach/CRC Press.
  
IEEE. 1990. ''IEEE Standard Glossary of Software Engineering Terminology''. IEEE Std 610.12-1990, IEEE, September 1990.
+
IEEE. 1990. ''IEEE Standard Glossary of Software Engineering Terminology''. Geneva, Switzerland: Institute of Electrical and Electronics Engineers. IEEE Std 610.12-1990.
  
IEP (Internet Encyclopedia of Philosophy). 2006. Yinyang (Yin-yang). http://www.iep.utm.edu/yinyang/
+
IEP (Internet Encyclopedia of Philosophy). 2006. "Yinyang (Yin-yang)." Available at: Internet Encyclopedia of Philosophy http://www.iep.utm.edu/yinyang/. Accessed December 3, 2014.
  
INCOSE 1993. ''An Identification of Pragmatic Principles -Final Report''. SE Principles Working Group, January 21, 1993. http://www.incose.org/productspubs/pdf/techdata/pitc/principlespragmaticdefoe_1993-0123_prinwg.pdf
+
INCOSE. 1993. ''An Identification of Pragmatic Principles - Final Report''. SE Principles Working Group, January 21, 1993.  
  
Klerer, S. “System Management Information Modeling,” ''IEEE Comm'', Vol 31:No 5, May 1993, pp 38-44.
+
Klerer, S. “System management information modeling,” ''IEEE Communications'', vol. 31, no. 5 May 1993, pp. 38-44.
  
Klir, G. 2001. ''[[Facets of Systems Science]], 2nd ed.'' New York: Kluwer Academic/Plenum Publishers.  
+
Klir, G. 2001. ''[[Facets of Systems Science]]'', 2nd ed. New York, NY, USA: Kluwer Academic/Plenum Publishers.  
  
 
Lawson, H. 2010. ''A Journey Through the Systems Landscape''. London, UK: College Publications, Kings College, UK.
 
Lawson, H. 2010. ''A Journey Through the Systems Landscape''. London, UK: College Publications, Kings College, UK.
  
Lawson, H. and J. Martin. 2008. On the Use of Concepts and Principles for Improving Systems Engineering Practice. INCOSE International Symposium 2008, The Netherlands.
+
Lawson, H. and J. Martin. 2008. "On the use of concepts and principles for improving systems engineering practice." INCOSE International Symposium 2008, The Netherlands, 15-19 June 2008.
 
 
Lipson, H. 2007. Principles of modularity, regularity, and hierarchy for scalable systems. ''Journal of Biological Physics and Chemistry'' 7, 125–128.
 
 
 
Maier, M. and E. Rechtin. 2000. ''The Art of Systems Architecting, 2nd ed''. Boca Raton, FL: CRC Press.
 
  
Martin, R., E. Robertson, and J. Springer. 2004. ''Architectural Principles for Enterprise Frameworks''. Technical Report No. 594, Indiana University, April 2004. http://www.cs.indiana.edu/cgi-bin/techreports/TRNNN.cgi?trnum=TR594.
+
Lipson, H. 2007. "Principles of modularity, regularity, and hierarchy for scalable systems," ''Journal of Biological Physics and Chemistry,'' vol. 7 pp. 125–128.
  
Miller, G. 1956. The magical number seven, plus or minus two: some limits on our capacity for processing information. ''The Psychological Review'', 63, 81–97.
+
Maier, M. and E. Rechtin. 2000. ''The Art of Systems Architecting,'' 2nd ed. Boca Raton, FL, USA: CRC Press.
  
Odum, H.1994. Ecological and General Systems: An Introduction to Systems Ecology (Revised Edition). University Press of Colorado.
+
Miller, G. 1956. "The magical number seven, plus or minus two: Some limits on our capacity for processing information," ''The Psychological Review,''  vol. 63, pp. 81–97.
  
Pattee, H. (ed.) 1973. ''Hierarchy Theory: The Challenge of Complex Systems''. New York: George Braziller.
+
Odum, H. 1994. ''Ecological and General Systems: An Introduction to Systems Ecology (Revised Edition).'' Boulder, CO, USA: University Press of Colorado.
  
Pearce, J. 2012. The Abstraction Principle. http://www.cs.sjsu.edu/~pearce/modules/lectures/ood/principles/Abstraction.htm [Posting date unknown; accessed June 2012.]
+
Pattee, H., Ed.1973. ''Hierarchy Theory: The Challenge of Complex Systems''. New York, NY, USA: George Braziller.
  
Rosen, R. 1979. Old trends and new trends in general systems research. ''Int. J. of General Systems'' 5(3): 173-184. [Reprinted in Klir 2001]
+
Pearce, J. 2012. "The Abstraction Principle."  Available at: Jon Pearce, San Jose State University http://www.cs.sjsu.edu/~pearce/modules/lectures/ood/principles/Abstraction.htm. Accessed December 3 2014.
  
Sci-Tech Encyclopedia. 2009. Abstract Data Type. ''McGraw-Hill Concise Encyclopedia of Science and Technology, Sixth Edition'', The McGraw-Hill Companies, Inc. http://www.answers.com/topic/abstract-data-type.  
+
Rosen, R. 1979. "Old trends and new trends in general systems research," ''International Journal of General Systems,'' vol. 5, no. 3, pp. 173-184.
  
SearchCIO. 2012. Abstraction. http://searchcio-midmarket.techtarget.com/definition/abstraction
+
Sci-Tech Encyclopedia. 2009. "Abstract data type," in ''McGraw-Hill Concise Encyclopedia of Science and Technology, Sixth Edition'', New York, NY, USA: The McGraw-Hill Companies, Inc.  
  
Sillitto, H. 2010. Design principles for Ultra-Large-Scale (ULS) Systems. ''Proceedings of INCOSE International Symposium 2010'', Chicago, Ill.
+
SearchCIO. 2012. "Abstraction." Available at: SearchCIO http://searchcio-midmarket.techtarget.com/definition/abstraction. Accessed December 3 2014.
  
Simon, H. 1996. ''The Sciences of the Artificial, 3rd ed''. Cambridge, MA: MIT Press.
+
Sillitto, H. 2010. "Design principles for ultra-large-scale (ULS) systems," ''Proceedings of INCOSE International Symposium 2010,'', Chicago, IL,12-15 July 2010.
  
Volk, T., & Bloom, J. W. (2007). The use of metapatterns for research into complex systems of teaching, learning, and schooling. Part I: Metapatterns in nature and culture. ''Complicity: An International Journal of Complexity and Education'', 4(1), 25—43 (http://www.complexityandeducation.ualberta.ca/COMPLICITY4/documents/Complicity_41d_Volk_Bloom.pdf).
+
Simon, H. 1996. ''The Sciences of the Artificial,'' 3rd ed. Cambridge, MA, USA: MIT Press.
  
Warfield, J.N. 1994. ''A Science of Generic Design''. Ames, IA: Iowa State University Press.
+
Warfield, J.N. 1994. ''A Science of Generic Design''. Ames, IA, USA: Iowa State University Press.
  
Wikipedia. 2012a. Modularity. http://en.wikipedia.org/wiki/Modularity  
+
Wikipedia. 2012a. "Modularity." Available at: Wikipedia http://en.wikipedia.org/wiki/Modularity. Accessed December 3 2014.
  
WordWeb. 2012a. Dualism. http://www.wordwebonline.com/en/DUALISM.
+
WordWeb. 2012b. "Dualism." Available at: WordWeb http://www.wordwebonline.com/en/DUALISM. Accessed December 3 2014.
  
WordWeb. 2012b. Heuristic. http://www.wordwebonline.com/en/HEURISTIC.  
+
WordWeb. 2012c. "Heuristic." Available at: WordWeb http://www.wordwebonline.com/en/HEURISTIC. Accessed December 3 2014.
  
WordWeb. 2012c. Principle. http://www.wordwebonline.com/en/PRINCIPLE.
+
WordWeb. 2012d. "Principle." Available at: WordWeb http://www.wordwebonline.com/en/PRINCIPLE. Accessed December 3 2014.
  
 
===Primary References===
 
===Primary References===
Bertalanffy, L. von. 1968. ''[[General System Theory: Foundations, Development, Applications]]''. Revised ed. New York, NY: Braziller.
+
Bertalanffy, L. von. 1968. ''[[General System Theory: Foundations, Development, Applications]]''. Revised ed. New York, NY, USA: Braziller.
  
Hybertson, D. 2009. ''[[Model-Oriented Systems Engineering Science]]: A Unifying Framework for Traditional and Complex Systems''. Auerbach/CRC Press, Boca Raton, FL.
+
Hybertson, D. 2009. ''[[Model-Oriented Systems Engineering Science]]: A Unifying Framework for Traditional and Complex Systems''. Boca Raton, FL, USA: Auerbach/CRC Press.
  
Klir, G. 2001. ''[[Facets of Systems Science]], 2nd ed.'' New York: Kluwer Academic/Plenum Publishers.
+
Klir, G. 2001. ''[[Facets of Systems Science]],'' 2nd ed. New York, NY, USA: Kluwer Academic/Plenum Publishers.
  
 
===Additional References===
 
===Additional References===
Francois, F. (ed.). 2004. ''International Encyclopedia of Systems and Cybernetics, 2nd ed''. K. G. Saur.
+
Francois, F. Ed. 2004. ''International Encyclopedia of Systems and Cybernetics,'' 2nd ed. Munich, Germany: K. G. Saur Verlag.
  
Meyers, R. (ed.). 2009. ''Encyclopedia of Complexity and Systems Science'' (10 vol. set). Springer.
+
Meyers, R. Ed. 2009. ''Encyclopedia of Complexity and Systems Science.'' New York, NY, USA: Springer.
  
Midgley, G. (ed.). 2003. ''Systems Thinking'' (4 Vol. Set). Sage Publications Ltd.
+
Midgley, G. Ed. 2003. ''Systems Thinking.'' Thousand Oaks, CA, USA: Sage Publications Ltd.
 +
 
 +
Volk, T., and J.W. Bloom. 2007. "The use of metapatterns for research into complex systems of teaching, learning, and schooling. Part I: Metapatterns in nature and culture," ''Complicity: An International Journal of Complexity and Education,'' vol. 4, no. 1, pp. 25—43.
  
 
----
 
----
 
<center> [[Concepts of Systems Thinking|< Previous Article]]  |  [[Systems Thinking|Parent Article]]  |  [[Patterns of Systems Thinking|Next Article >]] </center>
 
<center> [[Concepts of Systems Thinking|< Previous Article]]  |  [[Systems Thinking|Parent Article]]  |  [[Patterns of Systems Thinking|Next Article >]] </center>
  
 
+
<center>'''SEBoK v. 2.10, released 06 May 2024'''</center>
  
 
[[Category:Part 2]][[Category:Topic]][[Category:Systems Thinking]]
 
[[Category:Part 2]][[Category:Topic]][[Category:Systems Thinking]]
{{DISQUS}}
 

Latest revision as of 22:24, 2 May 2024


Lead Author: Rick Adcock, Contributing Authors: Scott Jackson, Janet Singer, Duane Hybertson


This topic forms part of the Systems Thinking knowledge area (KA). It identifies systems principlesprinciples as part of the basic ideas of systems thinkingsystems thinking.

Some additional conceptsconcepts more directly associated with engineered systemsengineered systems are described, and a summary of systems principlesprinciples associated with the concepts already defined is provided. A number of additional “laws” and heuristicsheuristics are also discussed.

Systems Principles, Laws, and Heuristics

A principle is a general rule of conduct or behavior (Lawson and Martin 2008). It can also be defined as a basic generalization that is accepted as true and that can be used as a basis for reasoning or conduct (WordWeb 2012c). Thus, systems principles can be used as a basis for reasoning about systemssystems thinking or associated conduct (systems approachessystems approaches).

Separation of Concerns

A systems approach is focused on a systems-of-interestsystems-of-interest (SoI) of an open systemopen system. This SoI consists of open, interacting subsystems that as a whole interact with and adapt to other systems in an environmentenvironment. The systems approach also considers the SoI in its environment to be part of a larger, wider, or containing system (Hitchins 2009).

In the What is Systems Thinking? topic, a “systems thinking paradox” is discussed. How is it possible to take a holistic system view while still being able to focus on changing or creating systems?

Separation of concerns describes a balance between considering parts of a system problemproblem or solutionsolution while not losing sight of the whole (Greer 2008). AbstractionAbstraction is the process of taking away characteristics from something in order to reduce it to a set of base characteristics (SearchCIO 2012). In attempting to understand complexcomplex situations, it is easier to focus on boundedbounded problems, whose solutionssolutions still remain agnostic to the greater problem (Erl 2012). This process sounds reductionistreductionist, but it can be applied effectively to systems. The key to the success of this approach is ensuring that one of the selected problems is the concern of the system as a whole. Finding balance between using abstraction to focus on specific concerns while ensuring the whole is continually considered is at the center of systems approachessystems approaches.

A viewview is a subset of information observed of one or more entities, such as systems. The physical or conceptual point from which a view is observed is the viewpointviewpoint, which can be motivated by one or more observer concerns. Different views of the same target must be both separate, to reflect separation of concerns, and integrated such that all views of a given target are consistent and form a coherent whole (Hybertson 2009). Some sample views of a system are internal (Of what does it consist?), external (What are its properties and behaviorbehavior as a whole?), static (What are its parts or structures?); and dynamic (interactions).

encapsulationencapsulation, which encloses system elementssystem elements and their interactions from the external environment, is discussed in Concepts of Systems Thinking. Encapsulation is associated with modularitymodularity, the degree to which a system's componentscomponents may be separated and recombined (Griswold 1995). Modularity applies to systems in natural, social, and engineered domains. In engineeringengineering, encapsulation is the isolation of a system functionfunction within a module; it provides precise specifications for the module (IEEE Std. 610.12-1990).

DualismDualism is a characteristic of systems in which they exhibit seemingly contradictory characteristics that are important for the system (Hybertson 2009). The yin yang concept in Chinese philosophy emphasizes the interaction between dual elementselements and their harmonization, ensuring a constant dynamic balance through a cyclic dominance of one element and then the other, such as day and night (IEP 2006).

From a systems perspective, the interaction, harmonization, and balance between system properties is important. Hybertson (2009) defines leverage as the duality between:

  • Power, the extent to which a system solves a specific problem, and
  • Generality, the extent to which a system solves a whole class of problems.

While some systems or elements may be optimized for one extreme of such dualities, a dynamic balance is needed to be effective in solving complex problems.

Summary of Systems Principles

A set of systems principles is given in Table 1 below. The "Names" segment points to concepts underlying the principle. (See Concepts of Systems Thinking). Following the table, two additional sets of items related to systems principles are noted and briefly discussed: prerequisite laws for design sciencedesign science, and heuristicsheuristics and pragmatic principles.

Table 1. A Set of Systems Principles. (SEBoK Original)
Name Statement of Principle
AbstractionAbstraction A focus on essential characteristics is important in problem solving because it allows problem solvers to ignore the nonessential, thus simplifying the problem (Sci-Tech Encyclopedia 2009; SearchCIO 2012; Pearce 2012).
BoundaryBoundary A boundary or membrane separates the system from the external world. It serves to concentrate interactions inside the system while allowing exchange with external systems (Hoagland, Dodson, and Mauck 2001).
Change Change is necessary for growth and adaptation, and should be accepted and planned for as part of the natural order of things rather than something to be ignored, avoided, or prohibited (Bertalanffy 1968; Hybertson 2009).
DualismDualism Recognize dualities and consider how they are, or can be, harmonized in the contextcontext of a larger whole (Hybertson 2009).
Encapsulation Hide internal parts and their interactions from the external environment (Klerer 1993; IEEE 1990).
Equifinality In open systems, the same final state may be reached from different initial conditions and in different ways (Bertalanffy 1968). This principle can be exploited, especially in systems of purposeful agents.
HolismHolism A system should be considered as a single entity, a whole, not just as a set of parts (Ackoff 1979; Klir 2001).
Interaction The properties, capabilitiescapabilities, and behavior of a system are derived from its parts, from interactions between those parts, and from interactions with other systems (Hitchins 2009 p. 60).
Layer Hierarchy The evolution of complex systems is facilitated by their hierarchical structure (including stable intermediate forms) and the understanding of complex systems is facilitated by their hierarchical description (Pattee 1973; Bertalanffy 1968; Simon 1996).
LeverageLeverage Achieve maximum leverage (Hybertson 2009). Because of the power versus generality tradeoff, leverage can be achieved by a complete solution (power) for a narrow class of problems, or by a partial solution for a broad class of problems (generality).
ModularityModularity Unrelated parts of the system should be separated, and related parts of the system should be grouped together (Griswold 1995; Wikipedia 2012a).
NetworkNetwork The network is a fundamental topology for systems that forms the basis of togetherness, connection, and dynamic interaction of parts that yield the behavior of complex systems (Lawson 2010; Martin et al. 2004; Sillitto 2010).
Parsimony One should choose the simplest explanation of a phenomenon, the one that requires the fewest assumptions (Cybernetics 2012). This applies not only to choosing a design, but also to operations and requirementsrequirements.
RegularityRegularity Systems scienceSystems science should find and capture regularities in systems, because those regularities promote systems understanding and facilitate systems practice (Bertalanffy 1968).
Relations A system is characterized by its relations: the interconnections between the elements. Feedback is a type of relation. The set of relations defines the networknetwork of the system (Odum 1994).
Separation of Concerns A larger problem is more effectively solved when decomposed into a set of smaller problems or concerns (Erl 2012; Greer 2008).
Similarity/Difference Both the similarities and differences in systems should be recognized and accepted for what they are (Bertalanffy 1975 p. 75; Hybertson 2009). Avoid forcing one size fits all, and avoid treating everything as entirely unique.
Stability/Change Things change at different rates, and entities or concepts at the stable end of the spectrum can and should be used to provide a guiding context for rapidly changing entities at the volatile end of the spectrum (Hybertson 2009). The study of complex adaptive systems can give guidance to system behavior and design in changing environments (Holland 1992).
SynthesisSynthesis Systems can be created by “choosing (conceiving, designing, selecting) the right parts, bringing them together to interact in the right way, and in orchestrating those interactions to create requisite properties of the whole, such that it performs with optimum effectiveness in its operational environmentenvironment, so solving the problem that prompted its creation” (Hitchins 2009: 120).
ViewView Multiple views, each based on a system aspect or concern, are essential to understand a complex system or problem situation. One critical view is how concern relates to properties of the whole (Edson 2008; Hybertson 2009).

The principles are not independent. They have synergies and tradeoffs. Lipson (2007), for example, argued that “scalabilityscalability of open-ended evolutionary processes depends on their ability to exploit functional modularity, structural regularity and hierarchy.” He proposed a formal modelmodel for examining the properties, dependencies, and tradeoffs among these principles. Edson (2008) related many of the above principles in a structure called the conceptagon, which he modified from the work of Boardman and Sauser (2008). Edson also provided guidance on how to apply these principles. Not all principles apply to every system or engineering decision. Judgment, experience, and heuristics (see below) provide understanding into which principles apply in a given situation.

Several principles illustrate the relation of the view principle to dualism and the yin yang principle; holism and separation of concerns are one example. These two principles appear to be contradictory but are in fact dual ways of dealing with complexity. Holism deals with complexity by focusing on the whole system, while separation of concerns divides a problem or system into smaller, more manageable elements that focus on particular concerns. They are reconciled by the fact that both views are needed to understand systems and to engineer systems; focusing on only one or the other does not give sufficient understanding or a good overall solution. This dualism is closely related to the systems thinking paradox described in What is Systems Thinking?.
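
As a hedged illustration of this reconciliation (not an example taken from the cited authors), the sketch below separates a small data-handling problem into independent concerns (parsing, validation, and storage), while an end-to-end property is checked only at the level of the whole pipeline. The function names and the chosen whole-system property are assumptions made only for this example.

<syntaxhighlight lang="python">
# Illustrative sketch: separation of concerns (independent parts) reconciled
# with holism (a property examined only at the level of the whole).

def parse(lines):
    """Concern 1: turn raw text lines into (name, value) pairs."""
    return [tuple(line.split(",")) for line in lines if line.strip()]

def validate(records):
    """Concern 2: keep only records whose value is a number."""
    return [(name, value) for name, value in records if value.strip().isdigit()]

def store(records, database):
    """Concern 3: persist records into an in-memory store."""
    for name, value in records:
        database[name] = int(value)
    return database

def process(lines, database):
    """Holistic view: the end-to-end behavior of the whole pipeline."""
    return store(validate(parse(lines)), database)

raw = ["pump,42", "valve,notanumber", "", "fan,7"]
db = process(raw, {})
# Whole-system property: every well-formed input record survives the pipeline.
# No single concern guarantees this on its own; only the whole does.
assert db == {"pump": 42, "fan": 7}
print(db)
</syntaxhighlight>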

Rosen (1979) discussed “false dualisms” of systems paradigms that are considered incompatible but are in fact different aspects or views of reality. In the present context, they are thus reconcilable through yin yang harmonization. Edson (2008) emphasized viewpoints as an essential principle of systems thinking; specifically, as a way to understand opposing concepts.

Derek Hitchins (2003) produced a systems life cycle theory described by a set of seven principles forming an integrated set. This theory describes the creation, manipulation, and demise of engineered systems. These principles consider the factors which contribute to the stability and survival of man-made systems in an environment. Stability is associated with the principle of connected variety, in which stability is increased by variety, plus the cohesion and adaptability of that variety. Stability is limited by allowable relations, resistance to change, and patterns of interaction. Hitchins describes how interconnected systems tend toward a cyclic progression, in which variety is generated, dominance emerges to suppress variety, dominant modes decay and collapse, and survivors emerge to generate new variety.

Guidance on how to apply many of these principles to engineered systems is given in the topic Synthesizing Possible Solutions, as well as in System Definition and other knowledge areas in Part 3 of the SEBoK.

==Prerequisite Laws of Design Science==

John Warfield (1994) identified a set of laws of generic design science that are related to systems principles. Three of these laws are stated here:

  1. ''Law of Requisite Variety'': A design situation embodies a variety that must be matched by the specifications. The variety includes the diversity of stakeholders. This law is an application to design science of Ashby's (1956) Law of Requisite Variety, which was defined in the context of cybernetics and states that, to regulate a system successfully, the variety of the regulator must be at least as large as the variety of the regulated system (a small numerical sketch of this law follows the list).
  2. ''Law of Requisite Parsimony'': Information must be organized and presented in a way that prevents human information overload. This law derives from Miller's findings on the limits of human information processing capacity (Miller 1956). Warfield's structured dialog method is one possible way to help achieve the requisite parsimony.
  3. ''Law of Gradation'': Any conceptual body of knowledge can be graded in stages or varying degrees of complexity and scale, ranging from simplest to most comprehensive, and the degree of knowledge applied to any design situation should match the complexity and scale of the situation. A corollary, called the Law of Diminishing Returns, states that a body of knowledge should be applied to a design situation at the stage at which the point of diminishing returns is reached.
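
The Law of Requisite Variety lends itself to a small numerical illustration. The sketch below is an assumption-laden toy model, not Warfield's or Ashby's own example: it posits four equally likely disturbances and shows that a regulator offering only two responses cannot hold the essential variable at its goal in every case, whereas one whose variety matches that of the disturbances can.

<syntaxhighlight lang="python">
# Minimal sketch of the Law of Requisite Variety: the outcome stays at the
# goal only if the regulator can counter every distinct disturbance.

GOAL = 0
DISTURBANCES = [0, 1, 2, 3]          # variety of the environment: 4 states

def outcome(disturbance, response):
    """Toy system: the response must exactly cancel the disturbance (mod 4)."""
    return (disturbance + response) % 4

def fraction_regulated(responses):
    """Share of disturbances for which some available response reaches the goal."""
    hits = sum(
        any(outcome(d, r) == GOAL for r in responses) for d in DISTURBANCES
    )
    return hits / len(DISTURBANCES)

print(fraction_regulated([0, 2]))        # 0.5 -> variety 2 cannot match variety 4
print(fraction_regulated([0, 1, 2, 3]))  # 1.0 -> requisite variety achieved
</syntaxhighlight>

With only two responses, at best half of the disturbance states can be countered, which is the sense in which only variety in the regulator can absorb variety in the situation.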

==Heuristics and Pragmatic Principles==

A heuristic is a common-sense rule intended to increase the probability of solving some problem (WordWeb 2012b). In the present context, it may be regarded as an informal or pragmatic principle. Maier and Rechtin (2000) identified an extensive set of heuristics that are related to systems principles. A few of these heuristics are stated here:

  • Relationships among the elements are what give systems their added value. This is related to the ''Interaction'' principle.
  • Efficiency is inversely proportional to universality. This is related to the ''Leverage'' principle.
  • The first line of defense against complexity is simplicity of design. This is related to the ''Parsimony'' principle.
  • In order to understand anything, you must not try to understand everything (attributed to Aristotle). This is related to the ''Abstraction'' principle.

An International Council on Systems Engineering (INCOSE) working group (INCOSE 1993) defined a set of “pragmatic principles” for systems engineering (SE). They are essentially best-practice heuristics for engineering a system. For example:

  • Know the problem, the customer, and the consumer
  • Identify and assess alternatives to converge on a solution
  • Maintain the integrity of the system

Hitchins (2009) defines a set of SE principles, which include the principles of holism and synthesis discussed above, as well as principles describing how systems problems of particular relevance to a Systems Approach Applied to Engineered Systems should be resolved.

==References==

===Works Cited===

Ackoff, R. 1979. "The future of operational research is past," Journal of the Operational Research Society, vol. 30, no. 2, pp. 93–104, Pergamon Press.

Ashby, W.R. 1956. "Requisite variety and its implications for the control of complex systems," Cybernetica, vol. 1, no. 2, pp. 1–17.

Bertalanffy, L. von. 1968. General System Theory: Foundations, Development, Applications. Revised ed. New York, NY, USA: Braziller.

Bertalanffy, L. von. 1975. Perspectives on General System Theory. E. Taschdjian, ed. New York, NY, USA: George Braziller.

Boardman, J. and B. Sauser. 2008. Systems Thinking: Coping with 21st Century Problems. Boca Raton, FL, USA: Taylor & Francis.

Cybernetics (Web Dictionary of Cybernetics and Systems). 2012. "Principle of Parsimony or Principle of Simplicity." Available at: Web Dictionary of Cybernetics and Systems http://pespmc1.vub.ac.be/ASC/PRINCI_SIMPL.html. Accessed December 3, 2014.

Edson, R. 2008. Systems Thinking. Applied. A Primer. Arlington, VA, USA: Applied Systems Thinking (ASysT) Institute, Analytic Services Inc.

Erl, T. 2012. "SOA Principles: An Introduction to the Service Orientation Paradigm." Available at: Arcitura http://www.soaprinciples.com/p3.php. Accessed December 3, 2014.

Greer, D. 2008. "The Art of Separation of Concerns." Available at: Aspiring Craftsman http://aspiringcraftsman.com/tag/separation-of-concerns/. Accessed December 3, 2014.

Griswold, W. 1995. "Modularity Principle." Available at: William Griswold http://cseweb.ucsd.edu/users/wgg/CSE131B/Design/node1.html. Accessed December 3, 2014.

Hitchins, D.K. 2003. Advanced Systems Thinking, Engineering and Management. Boston, MA, USA: Artech House.

Hitchins, D. 2009. "What are the general principles applicable to systems?" INCOSE Insight, vol. 12, no. 4, pp. 59-63.

Hoagland, M., B. Dodson, and J. Mauck. 2001. Exploring the Way Life Works. Burlington, MA, USA: Jones and Bartlett Publishers, Inc.

Holland, J. 1992. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence. Cambridge, MA, USA: MIT Press.

Hybertson, D. 2009. Model-Oriented Systems Engineering Science: A Unifying Framework for Traditional and Complex Systems. Boca Raton, FL, USA: Auerbach/CRC Press.

IEEE. 1990. IEEE Standard Glossary of Software Engineering Terminology. Geneva, Switzerland: Institute of Electrical and Electronics Engineers. IEEE Std 610.12-1990.

IEP (Internet Encyclopedia of Philosophy). 2006. "Yinyang (Yin-yang)." Available at: Internet Encyclopedia of Philosophy http://www.iep.utm.edu/yinyang/. Accessed December 3, 2014.

INCOSE. 1993. An Identification of Pragmatic Principles - Final Report. SE Principles Working Group, January 21, 1993.

Klerer, S. 1993. "System management information modeling," IEEE Communications, vol. 31, no. 5, pp. 38-44.

Klir, G. 2001. Facets of Systems Science, 2nd ed. New York, NY, USA: Kluwer Academic/Plenum Publishers.

Lawson, H. 2010. A Journey Through the Systems Landscape. London, UK: College Publications, Kings College, UK.

Lawson, H. and J. Martin. 2008. "On the use of concepts and principles for improving systems engineering practice." INCOSE International Symposium 2008, The Netherlands, 15-19 June 2008.

Lipson, H. 2007. "Principles of modularity, regularity, and hierarchy for scalable systems," Journal of Biological Physics and Chemistry, vol. 7 pp. 125–128.

Maier, M. and E. Rechtin. 2000. The Art of Systems Architecting, 2nd ed. Boca Raton, FL, USA: CRC Press.

Miller, G. 1956. "The magical number seven, plus or minus two: Some limits on our capacity for processing information," The Psychological Review, vol. 63, pp. 81–97.

Odum, H. 1994. Ecological and General Systems: An Introduction to Systems Ecology (Revised Edition). Boulder, CO, USA: University Press of Colorado.

Pattee, H., Ed.1973. Hierarchy Theory: The Challenge of Complex Systems. New York, NY, USA: George Braziller.

Pearce, J. 2012. "The Abstraction Principle." Available at: Jon Pearce, San Jose State University http://www.cs.sjsu.edu/~pearce/modules/lectures/ood/principles/Abstraction.htm. Accessed December 3, 2014.

Rosen, R. 1979. "Old trends and new trends in general systems research," International Journal of General Systems, vol. 5, no. 3, pp. 173-184.

Sci-Tech Encyclopedia. 2009. "Abstract data type," in McGraw-Hill Concise Encyclopedia of Science and Technology, Sixth Edition, New York, NY, USA: The McGraw-Hill Companies, Inc.

SearchCIO. 2012. "Abstraction." Available at: SearchCIO http://searchcio-midmarket.techtarget.com/definition/abstraction. Accessed December 3, 2014.

Sillitto, H. 2010. "Design principles for ultra-large-scale (ULS) systems," Proceedings of INCOSE International Symposium 2010, Chicago, IL, 12-15 July 2010.

Simon, H. 1996. The Sciences of the Artificial, 3rd ed. Cambridge, MA, USA: MIT Press.

Warfield, J.N. 1994. A Science of Generic Design. Ames, IA, USA: Iowa State University Press.

Wikipedia. 2012a. "Modularity." Available at: Wikipedia http://en.wikipedia.org/wiki/Modularity. Accessed December 3, 2014.

WordWeb. 2012a. "Dualism." Available at: WordWeb http://www.wordwebonline.com/en/DUALISM. Accessed December 3, 2014.

WordWeb. 2012b. "Heuristic." Available at: WordWeb http://www.wordwebonline.com/en/HEURISTIC. Accessed December 3, 2014.

WordWeb. 2012c. "Principle." Available at: WordWeb http://www.wordwebonline.com/en/PRINCIPLE. Accessed December 3, 2014.

===Primary References===

Bertalanffy, L. von. 1968. General System Theory: Foundations, Development, Applications. Revised ed. New York, NY, USA: Braziller.

Hybertson, D. 2009. Model-Oriented Systems Engineering Science: A Unifying Framework for Traditional and Complex Systems. Boca Raton, FL, USA: Auerbach/CRC Press.

Klir, G. 2001. Facets of Systems Science, 2nd ed. New York, NY, USA: Kluwer Academic/Plenum Publishers.

===Additional References===

Francois, F. Ed. 2004. International Encyclopedia of Systems and Cybernetics, 2nd ed. Munich, Germany: K. G. Saur Verlag.

Meyers, R. Ed. 2009. Encyclopedia of Complexity and Systems Science. New York, NY, USA: Springer.

Midgley, G. Ed. 2003. Systems Thinking. Thousand Oaks, CA, USA: Sage Publications Ltd.

Volk, T. and J.W. Bloom. 2007. "The use of metapatterns for research into complex systems of teaching, learning, and schooling. Part I: Metapatterns in nature and culture," Complicity: An International Journal of Complexity and Education, vol. 4, no. 1, pp. 25-43.

