Principles of Systems Thinking

This topic forms part of the [[Systems Thinking]] knowledge area (KA). It identifies systems {{Term|Principle (glossary)|principles}} and systems {{Term|Pattern (glossary)|patterns}} as part of the basic ideas of {{Term|Systems Thinking (glossary)|systems thinking}}.

----

'''''Lead Author:''''' ''Rick Adcock'', '''''Contributing Authors:''''' ''Scott Jackson, Janet Singer, Duane Hybertson''

----

Some additional {{Term|Concept (glossary)|concepts}} more directly associated with {{Term|Engineered System (glossary)|engineered systems}} are described, and a summary of systems {{Term|Principle (glossary)|principles}} associated with the concepts already defined is provided. A number of additional "laws" and {{Term|Heuristic (glossary)|heuristics}} are also discussed. The second part of the topic presents the general idea of systems {{Term|Pattern (glossary)|patterns}} and a number of examples; a brief conclusion discusses the maturity of systems science from the perspective of principles and patterns.

==Systems Principles, Laws, and Heuristics==
 
===Systems Principles===

A principle is a general rule of conduct or behavior (Lawson and Martin 2008). It can also be defined as a basic generalization that is accepted as true and that can be used as a basis for reasoning or conduct (WordWeb 2012c). Thus, systems principles can be used as a basis for reasoning about {{Term|System (glossary)|systems}} thinking or associated conduct ({{Term|Systems Approach (glossary)|systems approaches}}). A set of systems principles is given in Table 1 below; the name of each principle points to the concept underlying it (see [[Concepts of Systems Thinking]]). Following the table, several of these concepts are examined more closely, and two additional sets of items related to systems principles are noted and briefly discussed: prerequisite laws for design science, and heuristics and pragmatic principles.
 
 
 
<center>'''Table 1. A Set of Systems Principles.'''  (SEBoK Original)</center>
 
{| align="center"
 
! Name
 
! Statement of Principle
 
|-
 
|'''Regularity'''
 
|Systems science should find and capture regularities in systems, because those regularities promote systems understanding and facilitate systems practice. (Bertalanffy 1968)
 
|-
 
|'''Similarity/ Difference'''
 
|Both the similarities and differences in systems should be recognized and accepted for what they are. (Bertalanffy 1975 p. 75; Hybertson 2009). Avoid forcing one size fits all, and avoid treating everything as entirely unique.
 
|-
 
|[[Dualism (glossary)]] '''and Yin Yang'''
 
|''Dualism'': Reality consists of two basic opposing elements (WordWeb 2012a). Examples: The Universe is divided into system and not-system (Fuller 1975); top-down/bottom-up; static/dynamic; continuous/ discrete
 
''Yin yang'': Interaction and harmonization of dual elements ensures a constant, dynamic balance of all things (IEP 2006)
 
''Combined principle'': Recognize dualities and consider how they are, or can be, harmonized in the context of a larger whole (Hybertson 2009)
 
|-
 
|[[Leverage (glossary)]]
 
|Achieve maximum leverage (Hybertson 2009). Because of the power versus generality tradeoff, leverage can be achieved by a complete solution (power) for a narrow class of problems, or by a partial solution for a broad class of problems (generality).
 
|-
 
|'''Interaction'''
 
|The properties, capabilities, and behavior of a system derive from its parts, from interactions between those parts, and from interactions with other systems. (Hitchins 2009 p. 60)
 
|-
 
|'''Relations'''
 
|A system is characterized by its relations: the interconnections between the elements. Feedback is a type of relation. The set of relations defines the network of the system. (Odum 1994)
 
|-
 
|'''Change'''
 
|Change is necessary for growth and adaptation, and should be accepted and planned for as part of the natural order of things, rather than something to be ignored, avoided, or prohibited. (Bertalanffy 1968; Hybertson 2009)
 
|-
 
|'''Stability/ Change'''
 
|Things change at different rates, and entities or concepts at the stable end of the spectrum can and should be used to provide a guiding context for rapidly changing entities at the volatile end of the spectrum (Hybertson 2009). The study of complex adaptive systems can give guidance to system behavior and design in changing environments (Holland 1992).
 
|-
 
|'''Equifinality'''
 
|In open systems, the same final state may be reached from different initial conditions and in different ways. (Bertalanffy 1968). This principle can be exploited especially in systems of purposeful agents.
 
|-
 
|[[Holism (glossary)]]
 
|A system should be considered as a single entity, a whole, not just as a set of parts. (Ackoff 1979; Klir 2001)
 
|-
 
|'''Parsimony'''
 
|One should choose the simplest explanation of a phenomenon, the one that requires the fewest assumptions. (Cybernetics 2012). This applies not only to choosing a design, but also operations and requirements.
 
|-
 
|'''Separation of Concerns'''
 
|A larger problem is more effectively solved when decomposed into a set of smaller problems or concerns. (Erl 2012; Greer 2008)
 
|-
 
|[[Abstraction (glossary)]]
 
|A focus on essential characteristics is important in problem solving because it allows problem solvers to ignore the nonessential, thus simplifying the problem. (Sci-Tech Encyclopedia 2009; SearchCIO 2012; Pearce 2012)
 
|-
 
|[[Modularity (glossary)]]
 
|Unrelated parts of the system should be separated, and related parts of the system should be grouped together. (Griswold 1995; Wikipedia 2012a)
 
|-
 
|'''Encapsulation'''
 
|Hide internal parts and their interactions from the external environment. (Klerer 1993; IEEE 1990)
 
|-
 
|[[Boundary (glossary)]]
 
|A boundary or membrane separates the system from the external world. It serves to concentrate interactions inside the system while allowing exchange with external systems. (Hoagland, Dodson, and Mauck 2001)
 
|-
 
|[[View (glossary)]]
 
|Multiple views, each based on a system aspect or concern, are essential to understand a complex system or problem situation. One critical view is how each concern relates to properties of the whole. (Edson 2008; Hybertson 2009)
 
|-
 
|'''Layer, ''' [[Hierarchy (glossary)]]
 
|The evolution of complex systems is facilitated by their hierarchical structure (including stable intermediate forms), and the understanding of complex systems is facilitated by their hierarchical description. (Pattee 1973; Bertalanffy 1968; Simon 1996)
 
|-
 
|[[Network (glossary)]]
 
|The network is a fundamental topology for systems that forms the basis of togetherness, connection, and dynamic interaction of parts that yield the behavior of complex systems. (Lawson 2010; Martin et al. 2004; Sillitto 2010)

|-

|'''Synthesis'''

|Systems can be created by “choosing (conceiving, designing, selecting) the right parts, bringing them together to interact in the right way, and in orchestrating those interactions to create requisite properties of the whole, such that it performs with optimum effectiveness in its operational environment, so solving the problem that prompted its creation.” (Hitchins 2009: 120)

|}
 
 
 
The principles are not independent. They have synergies and tradeoffs. Lipson (2007), for example, argued that “Scalability of open-ended evolutionary processes depends on their ability to exploit functional modularity, structural regularity and hierarchy.” He proposed a formal model for examining the properties, dependencies, and tradeoffs among these principles. Edson (2008) related many of the above principles in a structure called the conceptagon, which he modified from (Boardman and Sauser 2008), and also provided guidance on how to apply the principles. Not all principles apply to every system or engineering decision. Judgment, experience, and heuristics (see below) help understand which principles apply in a given situation.
 
 
 
Several principles illustrate the relation of view with the dualism and yin yang principle. An important example is the Holism and Separation of Concerns pair of principles. These look contradictory, but they are dual ways of dealing with complexity. Holism deals with complexity by focusing on the whole system, and Separation of Concerns deals with complexity by dividing a problem or system into smaller more manageable elements that focus on particular concerns. They are reconciled by the fact that both views are needed to understand systems and to engineer systems; focusing on only one or the other does not give sufficient understanding or a good overall solution. This dualism is closely related to the Systems Thinking Paradox described in [[What is Systems Thinking?]]. Rosen (1979) discussed “false dualisms” of systems paradigms that are considered incompatible but are in fact different aspects or views of reality. In the present context, they are thus reconcilable through yin yang harmonization. Edson (2008) emphasized viewpoints as an essential principle of systems thinking and specifically as a way to understand opposing concepts.
 
  
Derek Hitchins (2003) produced a systems life cycle theory described by a set of seven principles forming an integrated set. This theory describes the creation, manipulation, and demise of engineered systems. The principles consider the factors which contribute to the stability and survival of man-made systems in an environment. Stability is associated with the principle of '''connected variety''', in which stability is increased by variety plus the '''cohesion''' and '''adaptability''' of that variety. Stability is limited by allowable relations, resistance to change, and patterns of interaction. Hitchins describes how interconnected systems tend toward a '''cyclic progression''', in which variety is generated, dominance emerges to suppress the variety, dominant modes decay and collapse, and survivors emerge to generate new variety.


Guidance on how to apply many of these principles to engineered systems is given in the topic [[Synthesizing Possible Solutions]], as well as in [[System Definition]] and other knowledge areas in Part 3 of the SEBoK.


===Separation of Concerns===

A systems approach is focused on a {{Term|System-of-Interest (glossary)|system-of-interest}} (SoI) that is an {{Term|Open System (glossary)|open system}}. This SoI consists of open, interacting subsystems that as a whole interact with and adapt to other systems in an {{Term|Environment (glossary)|environment}}. The systems approach also considers the SoI in its environment to be part of a larger, wider, or containing system (Hitchins 2009).


In the [[What is Systems Thinking?]] topic, a "systems thinking paradox" is discussed: how is it possible to take a holistic system view while still being able to focus on changing or creating systems?


Separation of concerns describes a balance between considering parts of a system {{Term|Problem (glossary)|problem}} or {{Term|Solution (glossary)|solution}} while not losing sight of the whole (Greer 2008). {{Term|Abstraction (glossary)|Abstraction}} is the process of taking away characteristics from something in order to reduce it to a set of base characteristics (SearchCIO 2012). In attempting to understand {{Term|Complex (glossary)|complex}} situations, it is easier to focus on {{Term|Boundary (glossary)|bounded}} problems whose {{Term|Solution (glossary)|solutions}} remain agnostic to the greater problem (Erl 2012). This process sounds {{Term|Reductionism (glossary)|reductionist}}, but it can be applied effectively to systems. The key to the success of this approach is ensuring that one of the selected problems is the concern of the system as a whole. Finding a balance between using abstraction to focus on specific concerns and ensuring that the whole is continually considered is at the center of {{Term|Systems Approach (glossary)|systems approaches}}.


A {{Term|View (glossary)|view}} is a subset of information observed of one or more entities, such as systems. The physical or conceptual point from which a view is observed is the {{Term|Viewpoint (glossary)|viewpoint}}, which can be motivated by one or more observer concerns. Different views of the same target must be both separate, to reflect separation of concerns, and integrated, such that all views of a given target are consistent and form a coherent whole (Hybertson 2009). Some sample views of a system are internal (Of what does it consist?), external (What are its properties and {{Term|Behavior (glossary)|behavior}} as a whole?), static (What are its parts or structures?), and dynamic (How do its parts interact?).


{{Term|Encapsulation (glossary)|Encapsulation}}, which hides {{Term|System Element (glossary)|system elements}} and their interactions from the external environment, is discussed in [[Concepts of Systems Thinking]]. Encapsulation is associated with {{Term|Modularity (glossary)|modularity}}, the degree to which a system's {{Term|Component (glossary)|components}} may be separated and recombined (Griswold 1995). Modularity applies to systems in natural, social, and engineered domains. In {{Term|Engineering (glossary)|engineering}}, encapsulation is the isolation of a system {{Term|Function (glossary)|function}} within a module; it provides precise specifications for the module (IEEE Std. 610.12-1990).
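
In software-intensive systems these ideas have a direct expression in code. The short Python sketch below is an illustration only, not part of the cited sources; the sensor and reporter classes and their names are invented for the example. It shows encapsulation (internal state hidden behind a narrow interface), modularity (separable, recombinable components), and separation of concerns (measurement and reporting handled by different modules, composed at the level of the whole).

<syntaxhighlight lang="python">
"""Illustrative sketch (not from the cited sources): encapsulation, modularity,
and separation of concerns in a tiny monitoring example."""


class TemperatureSensor:
    """Module for the 'measurement' concern; internals are encapsulated."""

    def __init__(self):
        self._raw_counts = 0          # hidden internal state (leading underscore)

    def read_celsius(self) -> float:
        """Narrow public interface; callers never see raw counts."""
        self._raw_counts = self._sample_hardware()
        return self._raw_counts * 0.125  # internal conversion detail

    def _sample_hardware(self) -> int:
        return 200                     # stand-in for a device read


class AlarmReporter:
    """Module for the separate 'reporting' concern."""

    def __init__(self, limit_celsius: float):
        self.limit_celsius = limit_celsius

    def report(self, value: float) -> str:
        return "ALARM" if value > self.limit_celsius else "OK"


# The whole-system concern: compose the modules through their interfaces only.
if __name__ == "__main__":
    sensor = TemperatureSensor()
    reporter = AlarmReporter(limit_celsius=30.0)
    print(reporter.report(sensor.read_celsius()))   # -> OK (reading is 25.0 degrees)
</syntaxhighlight>

Either module can be changed internally without affecting the other, provided the interfaces are preserved; this is the practical benefit that the encapsulation and modularity principles describe.
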
{{Term|Dualism (glossary)|Dualism}} is a characteristic of systems in which they exhibit seemingly contradictory characteristics that are important for the system (Hybertson 2009). The yin yang concept in Chinese philosophy emphasizes the interaction between dual {{Term|Element (glossary)|elements}} and their harmonization, ensuring a constant dynamic balance through a cyclic dominance of one element and then the other, such as day and night (IEP 2006).


From a systems perspective, the interaction, harmonization, and balance between system properties is important. Hybertson (2009) defines '''leverage''' as the duality between:

*'''power''', the extent to which a system solves a specific problem, and
*'''generality''', the extent to which a system solves a whole class of problems.

While some systems or elements may be optimized for one extreme of such dualities, a dynamic balance is needed to be effective in solving complex problems.


===Prerequisite Laws of Design Science===

John Warfield (1994) identified a set of laws of generic design science that are related to systems principles. Three of these laws are stated here, and a small numerical illustration of the first follows the list:

#''Law of Requisite Variety'': A design situation embodies a variety that must be matched by the specifications. The variety includes the diversity of {{Term|Stakeholder (glossary)|stakeholders}}. This law is an application to design science of the Ashby (1956) Law of Requisite Variety, which was defined in the context of {{Term|Cybernetics (glossary)|cybernetics}} and states that to successfully regulate a system, the variety of the regulator must be at least as large as the variety of the regulated system.
#''Law of Requisite Parsimony'': Information must be organized and presented in a way that prevents human information overload. This law derives from Miller's (1956) findings on the limits of human information processing capacity. Warfield's structured dialog method is one possible way to help achieve the requisite parsimony.
#''Law of Gradation'': Any conceptual body of knowledge can be graded in stages or varying degrees of complexity and scale, ranging from simplest to most comprehensive, and the degree of knowledge applied to any design situation should match the complexity and scale of the situation. A corollary, called the Law of Diminishing Returns, states that a body of knowledge should be applied to a design situation only up to the point at which diminishing returns are reached.
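
The Law of Requisite Variety can be illustrated with a small counting experiment. The Python sketch below is illustrative only; the toy "dynamics" and the numbers are invented. It shows that a regulator with as many distinct responses as there are distinct disturbances can hold the outcome at a single goal value, while a regulator with less variety cannot.

<syntaxhighlight lang="python">
"""Illustrative sketch (not from the cited sources) of Ashby's Law of Requisite Variety."""


def outcome(disturbance: int, response: int, n: int) -> int:
    # Invented toy dynamics: a response can exactly cancel one disturbance.
    return (disturbance + response) % n


def regulated_outcome_variety(n_disturbances: int, n_responses: int) -> int:
    outcomes = set()
    for d in range(n_disturbances):
        # The regulator picks whichever available response gets closest to the goal (0).
        best = min(outcome(d, r, n_disturbances) for r in range(n_responses))
        outcomes.add(best)
    return len(outcomes)


print(regulated_outcome_variety(8, 8))  # 1: full response variety -> outcome held at the goal
print(regulated_outcome_variety(8, 4))  # 5: insufficient variety -> residual outcome variety
</syntaxhighlight>

Only when the response variety matches the disturbance variety can every disturbance be countered and the outcome held to a single goal state.
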
 
  
===Heuristics and Pragmatic Principles===
A heuristic is a common-sense rule intended to increase the probability of solving some problem (WordWeb 2012b). In the present context, it may be regarded as an informal or pragmatic principle. Maier and Rechtin (2000) identified an extensive set of heuristics that are related to systems principles. A few of these heuristics are stated here, each related to one of the principles described above:

*Relationships among the elements are what give systems their added value. This is related to the ''Interaction'' principle.

*Efficiency is inversely proportional to universality. This is related to the ''Leverage'' principle.

*The first line of defense against complexity is simplicity of design. This is related to the ''Parsimony'' principle.

*In order to understand anything, you must not try to understand everything (attributed to Aristotle). This is related to the ''Abstraction'' principle.

An International Council on Systems Engineering (INCOSE) working group (INCOSE 1993) defined a set of "pragmatic principles" for systems engineering (SE). They are essentially best practice heuristics for engineering a system. A large number of heuristics are given; three examples are:

*Know the problem, the customer, and the consumer.

*Identify and assess alternatives so as to converge on a solution.

*Maintain the integrity of the system.

Hitchins also defines a set of SE principles, which include principles of holism and synthesis as discussed above, as well as principles describing how systems problems of particular relevance to a [[Systems Approach Applied to Engineered Systems]] should be resolved (Hitchins 2009).

==Systems Patterns==
This section first discusses definitions, types, and pervasiveness of patterns. This is followed by samples of basic patterns in the form of hierarchy and network patterns, metapatterns, and systems engineering patterns. Then samples of patterns of failure (or “[[Antipattern (glossary)|antipatterns]]”) are presented in the form of system archetypes, along with antipatterns in software engineering and other fields. Finally, a brief discussion of patterns as maturity indicators is given.
 
  
===Pattern Definitions and Types===
The most general definition of pattern is that it is an expression of an observed regularity. Patterns exist in both natural and artificial systems, and are used in both systems science and systems engineering. Theories in science are patterns. Building architecture styles are patterns. Engineering uses patterns extensively.  
  
Patterns are a representation of similarities in a set or class of problems, solutions, or systems. In addition, some patterns can also represent uniqueness or differences. An example is the uniqueness pattern of a unique identifier, such as an automobile vehicle identification number (VIN), the serial number on a consumer product, human fingerprints, or DNA. The pattern is that a unique identifier, common to all instances in a class (such as fingerprints), distinguishes between all instances in that class.
  
The term pattern has been used primarily in building architecture and urban planning by Alexander (Alexander et al. 1977, Alexander 1979) and in software engineering (e.g., Gamma et al. 1995; Buschmann et al. 1996). Their definitions portray a pattern as capturing design ideas as an archetypal and reusable description. A design pattern provides a generalized solution in the form of templates to a commonly occurring real-world problem within a given context. A design pattern is not a finished design that can be transformed directly into a specific solution. It is a description or template for how to solve a problem that can be used in many different specific situations (Gamma et al. 1995; Wikipedia 2012b). Alexander placed significant emphasis on the pattern role of reconciling and resolving competing forces, which is an important application of the yin yang principle.
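
The idea of a design pattern as a reusable template rather than a finished design can be made concrete with a small code sketch. The Python fragment below shows one of the patterns catalogued by Gamma et al. (1995), the Strategy pattern; the concrete route-planning classes and their names are invented purely as an example of filling in the template.

<syntaxhighlight lang="python">
"""Illustrative sketch (not from the cited sources): the Strategy design pattern
as a reusable template.  The pattern is only the shape -- an interchangeable
'strategy' behind a common interface -- while the concrete classes below are
invented examples of applying it."""

from abc import ABC, abstractmethod


class RouteStrategy(ABC):
    """The template's variable part: any way of choosing a route."""

    @abstractmethod
    def plan(self, start: str, end: str) -> str: ...


class ShortestRoute(RouteStrategy):
    def plan(self, start: str, end: str) -> str:
        return f"shortest path from {start} to {end}"


class ScenicRoute(RouteStrategy):
    def plan(self, start: str, end: str) -> str:
        return f"scenic path from {start} to {end}"


class Navigator:
    """The template's fixed part: context code written once against the interface."""

    def __init__(self, strategy: RouteStrategy):
        self.strategy = strategy

    def guide(self, start: str, end: str) -> str:
        return self.strategy.plan(start, end)


print(Navigator(ShortestRoute()).guide("A", "B"))  # same template ...
print(Navigator(ScenicRoute()).guide("A", "B"))    # ... two specific solutions
</syntaxhighlight>
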
  
Other examples of general patterns in both natural and engineered systems include conventional designs in engineering handbooks, complex system models such as evolution and predator-prey models that apply to multiple application domains, domain taxonomies, architecture frameworks, standards, templates, architecture styles, reference architectures, product lines, abstract data types, and classes in class hierarchies (Hybertson 2009). Shaw and Garlan (1996) used the terms pattern and style interchangeably in discussing software architecture. Lehmann and Belady (1985) examined a set of engineered software systems and tracked their change over time, and observed regularities that they captured as evolution laws or patterns.
 
 
Patterns have been combined with model-based systems engineering (MBSE) to lead to pattern-based systems engineering (PBSE) (Schindel and Smith 2002, Schindel 2005).
 
 
 
Patterns also exist in systems practice, both science and engineering. At the highest level, Gregory (1966) defined science and design as behavior patterns: “The scientific method is a pattern of problem-solving behaviour employed in finding out the nature of what exists, whereas the design method is a pattern of behaviour employed in inventing things of value which do not yet exist.”
 
 
 
Regularities exist not only as positive solutions to recurring problems, but also as patterns of failure, i.e., as commonly attempted solutions that consistently fail to solve recurring problems. In software engineering these are called antipatterns, originally coined and defined by Koenig (1995): An antipattern is just like a pattern, except that instead of a solution it gives something that looks superficially like a solution but isn’t one. Koenig’s rationale was that if one does not know how to solve a problem, it may nevertheless be useful to know about likely blind alleys. Antipatterns may include patterns of pathologies (i.e., common diseases), common impairment of normal functioning, and basic recurring problematic situations. These antipatterns can be used to help identify the root cause of a problem and eventually lead to solution patterns. The concept was expanded beyond software to include project management, organization, and other antipatterns (Brown et al. 1998; AntiPatterns Catalog 2012).
 
 
 
Patterns are grouped in the remainder of this section into basic foundational patterns and antipatterns (or patterns of failure).
 
 
 
===Basic Foundational Patterns===
 
The basic patterns in this section consist of a set of hierarchy and network patterns, followed by a set of metapatterns and systems engineering patterns.
 
 
 
====Hierarchy and Network Patterns====
 
The first group of patterns consists of representative types of hierarchy patterns, distinguished by the type of one-to-many relation (extended from Hybertson 2009, p. 90), as shown in the table below. These are presented first because hierarchy patterns infuse many of the other patterns discussed in this section.
 
 
 
<center>'''Table 2. Hierarchy Patterns.'''  (SEBoK Original)</center>
 
 
{| align="center"
 
{| align="center"
! Relation
+
! Name
! Hierarchy Type or Pattern
+
! Statement of Principle
 
|-
 
|-
|'''Basic: repeating one-to-many relation'''
+
|{{Term|Abstraction (glossary)|Abstraction}}
|General: Tree structure
+
|A focus on essential characteristics is important in problem solving because it allows problem solvers to ignore the nonessential, thus simplifying the problem (Sci-Tech Encyclopedia 2009; SearchCIO 2012; Pearce 2012).
 
|-
 
|-
|'''Part of a whole'''
+
|{{Term|Boundary (glossary)|Boundary}}
|Composition (or Aggregation) hierarchy
+
|A boundary or membrane separates the system from the external world. It serves to concentrate interactions inside the system while allowing exchange with external systems (Hoagland, Dodson, and Mauck 2001).
 
|-
 
|-
|'''Part of + dualism: Each element in the hierarchy is a holon, i.e., is both a whole that has parts and a part of a larger whole'''
+
|'''Change'''
|Holarchy (composition hierarchy of holons) (Koestler 1967). Helps recognize similarities across levels in multi-level systems
+
|Change is necessary for growth and adaptation, and should be accepted and planned for as part of the natural order of things rather than something to be ignored, avoided, or prohibited (Bertalanffy 1968; Hybertson 2009).
 
|-
 
|-
|'''Part of + interchangeability: The parts are clonons, i.e., interchangeable'''
+
|{{Term|Dualism (glossary)|Dualism}}
|Composition hierarchy of clonons (Bloom 2005).
+
| Recognize dualities and consider how they are, or can be, harmonized in the {{Term|Context (glossary)|context}} of a larger whole (Hybertson 2009).
Note: This pattern reflects horizontal similarity.
 
 
|-
 
|-
|'''Part of + self-similarity: At each level, the shape or structure of the whole is repeated in the parts, i.e., the hierarchy is self-similar at all scales.'''
+
|'''Encapsulation'''
|Fractal.
+
|Hide internal parts and their interactions from the external environment (Klerer 1993; IEEE 1990).  
Note: This pattern reflects vertical similarity.
 
 
|-
 
|-
|'''Part of + connections or interactions among parts'''
+
|'''Equifinality'''
|System composition hierarchy
+
|In open systems, the same final state may be reached from different initial conditions and in different ways (Bertalanffy 1968). This principle can be exploited, especially in systems of purposeful agents.
 
|-
 
|-
|'''Control of many by one'''
+
|{{Term|Holism (glossary)|Holism}}
|Control hierarchy—e.g., a command structure
+
|A system should be considered as a single entity, a whole, not just as a set of parts (Ackoff 1979; Klir 2001).
 
|-
 
|-
|'''Subtype or sub-class'''
+
|'''Interaction'''
|Type or specialization hierarchy; a type of commonization
+
|The properties, {{Term|Capability (glossary)|capabilities}}, and behavior of a system are derived from its parts, from interactions between those parts, and from interactions with other systems (Hitchins 2009 p. 60).
 
|-
 
|-
|'''Instance of category'''
+
|'''Layer Hierarchy'''
| Categorization (object-class; model-metamodel…) hierarchy; a type of commonization
+
|The evolution of complex systems is facilitated by their hierarchical structure (including stable intermediate forms) and the understanding of complex systems is facilitated by their hierarchical description (Pattee 1973; Bertalanffy 1968; Simon 1996).  
|}
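
Several of the hierarchy patterns in Table 2 map directly onto simple data structures. The Python sketch below is illustrative only; the node names are invented. It builds a small composition hierarchy in which every element is a holon: a whole with parts that, except for the root, is also a part of a larger whole.

<syntaxhighlight lang="python">
"""Illustrative sketch (not from the cited sources): a composition hierarchy of holons."""


class Holon:
    """Each node is both a whole (it has parts) and a part (it has a parent)."""

    def __init__(self, name: str):
        self.name = name
        self.parent = None
        self.parts: list["Holon"] = []

    def add_part(self, part: "Holon") -> "Holon":
        part.parent = self
        self.parts.append(part)
        return part

    def depth(self) -> int:
        return 0 if self.parent is None else 1 + self.parent.depth()


# Invented example: a vehicle as a three-level holarchy.
vehicle = Holon("vehicle")
powertrain = vehicle.add_part(Holon("powertrain"))
engine = powertrain.add_part(Holon("engine"))
wheels = vehicle.add_part(Holon("wheels"))

print([(h.name, h.depth()) for h in (vehicle, powertrain, engine, wheels)])
# [('vehicle', 0), ('powertrain', 1), ('engine', 2), ('wheels', 1)]
</syntaxhighlight>
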
 
 
 
 
 
Network patterns are of two flavors. First, traditional patterns are network topology types, such as bus (common backbone), ring, star (central hub), tree, and mesh (multiple routes) (ATIS 2008). Second, the relatively young science of networks has been investigating social and other complex patterns, such as percolation, cascades, power law, scale-free, small worlds, semantic networks, and neural networks (Boccara 2004; Neumann et al. 2006).
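
The traditional topology types can be compared concretely by counting links. The Python sketch below is illustrative only (the node labels are invented); it builds a star and a full mesh over the same five nodes, showing the trade-off between the economy of the star and the redundancy of the mesh.

<syntaxhighlight lang="python">
"""Illustrative sketch (not from the cited sources): star vs. mesh network topologies."""

from itertools import combinations


def star(nodes: list[str]) -> set[frozenset]:
    hub, *spokes = nodes
    return {frozenset({hub, s}) for s in spokes}


def mesh(nodes: list[str]) -> set[frozenset]:
    return {frozenset(pair) for pair in combinations(nodes, 2)}


nodes = ["A", "B", "C", "D", "E"]
print(len(star(nodes)), len(mesh(nodes)))  # 4 edges vs. 10 edges
# The mesh offers alternative routes if a link fails; the star is cheaper
# but depends entirely on the hub node.
</syntaxhighlight>
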
 
 
 
====Metapatterns====
 
The metapatterns identified and defined in the table below are from Bloom (2005), Volk and Bloom (2007), and Kappraff (1991). Volk and Bloom (2007) describe metapatterns as convergences exhibited in the similar structures of evolved systems across widely separated scales.
 
 
 
<center>'''Table 3. Metapatterns.'''  (SEBoK Original)</center>
 
{| align="center"
 
! Name
 
! Brief Definition
 
! Examples
 
 
|-
 
|-
|'''Spheres'''
+
|{{Term|Leverage (glossary)|Leverage}}
|Shape of maximum volume, minimum surface, containment
+
|Achieve maximum leverage (Hybertson 2009). Because of the power versus generality tradeoff, leverage can be achieved by a complete solution (power) for a narrow class of problems, or by a partial solution for a broad class of problems (generality).
|Cell, planet, dome, ecosystem, community
 
 
|-
 
|-
|'''Centers'''
+
|{{Term|Modularity (glossary)|Modularity}}
|Key components of system stability
+
|Unrelated parts of the system should be separated, and related parts of the system should be grouped together (Griswold 1995; Wikipedia 2012a).  
|Prototypes, purpose, causation; DNA, social insect centers, political constitutions and government, attractors.
 
 
|-
 
|-
|'''Tubes'''
+
|{{Term|Network (glossary)|Network}}
|Surface transfer, connection, support
+
|The network is a fundamental topology for systems that forms the basis of togetherness, connection, and dynamic interaction of parts that yield the behavior of complex systems (Lawson 2010; Martin et al. 2004; Sillitto 2010).
|Networks, lattices, conduits, relations; leaf veins, highways, chains of command.
 
 
|-
 
|-
|'''Binaries plus'''
+
|'''Parsimony'''
|Minimal and thus efficient system
+
|One should choose the simplest explanation of a phenomenon, the one that requires the fewest assumptions (Cybernetics 2012). This applies not only to choosing a design, but also to operations and {{Term|Requirement (glossary)|requirements}}.
|Contrast, duality, reflections, tensions, complementary/symmetrical/reciprocal relationships; two sexes, two-party politics, bifurcating decision process.
 
 
|-
 
|-
|'''Clusters, Clustering'''
+
|{{Term|Regularity (glossary)|Regularity}}
|Subset of webs, distributed systems of parts with mutual attractions
+
|{{Term|Systems Science (glossary)|Systems science}} should find and capture regularities in systems, because those regularities promote systems understanding and facilitate systems practice (Bertalanffy 1968).
|Bird flocks, ungulate herds, children playing, egalitarian social groups
 
 
|-
 
|-
|'''Webs or Networks'''
+
|'''Relations'''
|Parts in relationships within systems (can be centered or clustered, using clonons or holons)
+
|A system is characterized by its relations: the interconnections between the elements. Feedback is a type of relation. The set of relations defines the {{Term|Network (glossary)|network}} of the system (Odum 1994).  
|Subsystems of cells, organisms, ecosystems, machines, society.
 
 
|-
 
|-
|'''Sheets'''
+
|'''Separation of Concerns'''
|Transfer surface for matter, energy, or information
+
|A larger problem is more effectively solved when decomposed into a set of smaller problems or concerns (Erl 2012; Greer 2008).
|Films; fish gills, solar collectors
 
 
|-
 
|-
|'''Borders and Pores'''
+
|'''Similarity/Difference'''
|Protection, openings for controlled exchange
+
|Both the similarities and differences in systems should be recognized and accepted for what they are (Bertalanffy 1975 p. 75; Hybertson 2009). Avoid forcing one size fits all, and avoid treating everything as entirely unique.
|Boundaries, containers, partitions; cell membranes, national borders.
 
 
|-
 
|-
|'''Layers'''
+
|'''Stability/Change'''
|Combination of other patterns that builds up order, structure, and stabilization
+
|Things change at different rates, and entities or concepts at the stable end of the spectrum can and should be used to provide a guiding context for rapidly changing entities at the volatile end of the spectrum (Hybertson 2009). The study of complex adaptive systems can give guidance to system behavior and design in changing environments (Holland 1992).
|Levels of scale, parts and wholes, packing, proportions, tilings
 
|-
 
|'''Similarity'''
 
|Figures of the same shape but different sizes
 
|Similar triangles; infant-adult
 
|-
 
|'''Emergence'''
 
|General phenomenon when a new type of functionality derives from binaries or webs.
 
|Creation (birth); life from molecules, cognition from neurons
 
|-
 
|'''Holarchies'''
 
|Levels of webs, in which successive systems are parts of larger systems
 
|Biological nesting from biomolecules to ecosystems, human social nesting, engineering designs, computer software
 
|-
 
|'''Holons'''
 
|Parts of systems as functionally unique
 
|Heart-lungs-liver (holons) of body
 
|-
 
|'''Clonons'''
 
|Parts of systems as interchangeable.
 
|Skin cells (clonons) of the skin; bricks in constructing a house
 
|-
 
|'''Arrows'''
 
|Stability or gradient-like change over time
 
|Stages, sequence, orientation, stress, growth, meanders; biological homeostasis, growth, self-maintaining social structures.
 
 
|-
 
|-
|'''Cycles'''
+
|{{Term|Synthesis (glossary)|Synthesis}}
|Recurrent patterns in systems over time
+
|Systems can be created by “choosing (conceiving, designing, selecting) the right parts, bringing them together to interact in the right way, and in orchestrating those interactions to create requisite properties of the whole, such that it performs with optimum effectiveness in its operational {{Term|Environment (glossary)|environment}}, so solving the problem that prompted its creation” (Hitchins 2009: 120).
|Alternating repetition, vortex, spiral, turbulence, helices, rotations; protein degradation and synthesis, life cycles, power cycles of electricity generating plants, feedback cycles
 
 
|-
 
|-
|'''Breaks'''
+
|{{Term|View (glossary)|View}}
|Relatively sudden changes in system behavior
+
|Multiple views, each based on a system aspect or concern, are essential to understand a complex system or problem situation. One critical view is how concern relates to properties of the whole (Edson 2008; Hybertson 2009).  
|Transformation, change, branching, explosion, cracking, translations; cell division, insect metamorphosis, coming-of-age ceremonies, political elections, bifurcation points
 
|-
 
|'''Triggers'''
 
|Initiating agents of breaks, both internal and external
 
|Sperm entering egg, precipitating events of war.
 
|-
 
|'''Gradients'''
 
|Continuum of variation between binary poles
 
|Chemical waves in cell development, human quantitative and qualitative values
 
 
|}
 
|}
  
===Systems Engineering Patterns===
Some work has been done on various aspects of explicitly applying patterns to systems engineering. A review article of much of this work is by Bagnulo and Addison (2010), covering patterns in general, capability engineering, pattern languages, pattern modelling, and other SE-related pattern topics. Cloutier (2005) discussed applying patterns to SE, based on architecture and software design patterns. Haskins (2005) and Simpson and Simpson (2006) discussed the use of SE pattern languages to enhance the adoption and use of SE patterns. Simpson and Simpson identified three high-level, global patterns that can be used as a means of organizing systems patterns:
 
#Anything can be described as a system.
 
#The problem system is always separate from the solution system.
 
#Three systems, at a minimum, are always involved in any system activity: the environmental system, the product system, and the process system.
 
Haskins (2008) also proposed the use of patterns as a way to facilitate the extension of SE from traditional technological systems to address social and socio-technical systems. Some patterns have been applied and identified in this extended arena, described as patterns of success by Rebovich and DeRosa (2012). Stevens (2010) also discussed patterns in the engineering of large-scale, complex “mega-systems.”
 
  
A common systems engineering activity in which patterns are applied is in system design, especially in defining one or more solution options for a system of interest. See [[Synthesizing Possible Solutions]] for a discussion. The more specific topic of using patterns (and antipatterns, as described below) to understand and exploit emergence is discussed in the [[Emergence]] article.
  
===Patterns of Failure: Antipatterns===
====System Archetypes====
 
The system dynamics community has developed a collection of what are called system archetypes. The concept was originated by Forrester (1969), while Senge (1990) appears to have introduced the system archetype term. According to Braun (2002), the archetypes describe common patterns of behavior that help answer the question, “Why do we keep seeing the same problems recur over time?” They focus on behavior in organizations and other complex social systems that are repeatedly but unsuccessfully used to solve recurring problems. This is why they are grouped here under antipatterns, even though the system dynamics community does not refer to the archetypes as antipatterns. The table below summarizes the archetypes. There is not a fixed set, or even fixed names for a given archetype. The table shows alternative names for some archetypes. The references coded in Column 3 of the table are: B—Braun 2002; F1—Forrester 1969; F2—Forrester 1995; F3—Forrester 2009; H—Hardin 1968; M—Meadows 1982; S—Senge 1990.
 
  
<center>'''Table 4. System Archetypes.'''  (SEBoK Original)</center>
{| align="center"
 
! Name (Alternates)
 
! Description
 
! Reference
 
|-
 
|''' Counterintuitive behavior '''
 
|Forrester identified three “especially dangerous” counterintuitive behaviors of social systems, which correspond respectively to three of the archetypes discussed below: (1) Low-Leverage Policies: Ineffective Actions; (2) High Leverage Policies: Often Wrongly Applied; and (3) Long-Term vs. Short-Term Tradeoffs
 
|F1, F2
 
|-
 
|''' Low-Leverage Policies: Ineffective Actions (Policy Resistance)'''
 
|Most intuitive policy changes in a complex system have very little leverage to create change, because the change causes reactions in other parts of the system that counteract the new policy.
 
|F1, F3, M
 
|-
 
|''' High Leverage Policies: Often Wrongly Applied (High leverage, Wrong Direction) '''
 
|A system problem is often correctable with a small change, but this high-leverage solution is typically counterintuitive in two ways: First, the leverage point is difficult to find because it is usually far removed in time and place from where the problem appears; and second, if the leverage point is identified, the change is typically made in the wrong direction, thereby intensifying the problem.
 
|F1, F3, M
 
|-
 
|''' Long-Term vs. Short-Term Tradeoffs (Fixes that Fail; Shifting the Burden; Addiction)'''
 
|Short-term solutions are intuitive, but in complex systems there is nearly always a conflict or tradeoff between short-term and long-term goals. Thus, a quick fix produces immediate positive results, but its unforeseen and unintended long-term consequences worsen the problem. Furthermore, a repeated quick fix approach makes it harder to change to a more fundamental solution approach later.
 
|F1, F3, M, S, B
 
|-
 
|''' Drift to Low Performance (Eroding Goals; Collapse of Goals) '''
 
|There is a strong tendency for complex system goals to drift downward. A gap between current state and goal state creates pressure to lower the goal rather than taking difficult corrective action to reach the goal. Over time the continually lowered goals lead to crisis and possible collapse of the system.
 
|F1, F3, M, B
 
|-
 
|''' Official Addiction – Shifting the Burden to the Intervener '''
 
|The ability of a system to maintain itself deteriorates when an intervener provides help and the system then becomes dependent on the intervener.
 
|M, S
 
|-
 
|''' Limits to Growth (aka Limits to Success) '''
 
|A reinforcing process of accelerating growth (or expansion) will encounter a balancing process as the limit of that system is approached; continuing efforts will produce diminishing returns as one approaches the limits.
 
|S, B
 
|-
 
|''' Balancing Process with Delay '''
 
|Delay in the response of a system to corrective action causes the correcting agent to either over-correct or to give up due to no visible progress.
 
|S
 
|-
 
|''' Escalation '''
 
|Two systems compete for superiority, with each escalating its competitive actions to get ahead, to the point that both systems are harmed.
 
|B
 
|-
 
|''' Success to the Successful '''
 
|Growth leads to decline elsewhere; when two equally capable systems compete for a limited resource, if one system receives more resources, it is more likely to be successful, which results in its receiving even more resources, in a reinforcing loop.
 
|S, B
 
|-
 
|''' Tragedy of the Commons '''
 
|A shared resource is depleted as each system abuses it for individual gain, ultimately hurting all who share it.
 
|H, S, B
 
|-
 
|''' Growth and Underinvestment '''
 
|In a situation where capacity investments can overcome limits, if such investments are not made, then growth stalls, which then rationalizes further underinvestment.
 
|S, B
 
|-
 
|''' Accidental Adversaries '''
 
|Two systems destroy their relationship through escalating retaliations for perceived injuries.
 
|B
 
|-
 
|''' Attractiveness Principle '''
 
|In situations where a system faces multiple limiting or impeding factors, the tendency is to consider each factor separately to select which one to address first, rather than a strategy based on the interdependencies among the factors.
 
|B
 
|}
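
Several of the archetypes in Table 4 can be reproduced with very small system dynamics models. The Python sketch below is illustrative only; the single-stock equation and its coefficients are invented for the illustration rather than taken from the cited sources. It shows the "Limits to Growth" archetype: a reinforcing growth loop whose gains are progressively eroded by a balancing loop as a capacity limit is approached.

<syntaxhighlight lang="python">
"""Illustrative sketch (not from the cited sources): the 'Limits to Growth' archetype
as a one-stock system dynamics model (logistic-style growth)."""


def limits_to_growth(steps: int = 12, growth_rate: float = 0.6, capacity: float = 100.0):
    stock = 1.0
    history = []
    for _ in range(steps):
        # Reinforcing loop: growth proportional to the current stock.
        # Balancing loop: the (1 - stock/capacity) factor erodes the gain
        # as the limit is approached.
        stock += growth_rate * stock * (1.0 - stock / capacity)
        history.append(round(stock, 1))
    return history


print(limits_to_growth())
# Early accelerating growth, then diminishing returns as the stock
# approaches the capacity limit -- continued effort yields less and less.
</syntaxhighlight>
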
 
  
Relations among system archetypes were defined by Goodman and Kleiner (1993/1994) and republished in Senge et al. (1994).
  
====Software and Other Antipatterns====
Antipatterns have been identified and collected in the software community in areas that include architecture, development, project management, user interface, organization, analysis, software design, programming, methodology, and configuration management (AntiPatterns Catalog 2012; Wikibooks 2012). A brief statement of three of them follows; the first two are organizational antipatterns and the third is a software design antipattern.
#Escalation of commitment: Failing to revoke a decision when it proves wrong
 
#Moral hazard: Insulating a decision-maker from the consequences of his or her decision
 
#Big ball of mud: A system with no recognizable structure
 
A link between the software community and the system archetypes is represented in a project at SEI (2012), which is exploring the system archetypes in the context of identifying recurring software acquisition problems as “acquisition archetypes.” They refer to both types of archetypes as patterns of failure.
 
  
Another set of antipatterns in the general systems arena has been compiled by Troncale (2010; 2011) in his systems pathologies project. Sample pathology types or patterns include:
*Cyberpathologies: systems-level malfunctions in feedback architectures.

*Nexopathologies: systems-level malfunctions in network architectures or dynamics.

*Heteropathologies: systems-level malfunctions in hierarchical, modular structure & dynamics.
 
Some treatments of antipatterns, including Senge (1990) and SEI (2012), also provide some advice on dealing with or preventing the antipattern. Such guidance is a step toward patterns.
 
  
===Patterns and Maturity===
Patterns may be used as an indicator of the maturity of a domain of inquiry, such as systems science or systems engineering. In a mature and relatively stable domain, the problems and solutions are generally understood and their similarities are captured in a variety of what are here called patterns. A couple of observations can be made in this regard on the maturity of systems science in support of systems engineering.
  
In the arenas of physical systems and technical systems, systems science is relatively mature; many system patterns of both natural physical systems and engineered technical systems are reasonably well defined and understood.


In the arena of more complex systems, including social systems, systems science is somewhat less mature. Solution patterns in that arena are more challenging. A pessimistic view of the possibility of science developing solutions to social problems was expressed by Rittel and Webber (1973) in their classic paper on wicked problems: “The search for scientific bases for confronting problems of social policy is bound to fail, because … they are ‘wicked’ problems, whereas science has developed to deal with ‘tame’ problems.” A more optimistic stance toward social problems has characterized the system dynamics community. They have been pointing out for over 40 years the problems with conventional solutions to social problems, in the form of the system archetypes and associated feedback loop models. That was an important first step. Nevertheless, they have had difficulty achieving the second step: producing social patterns that can be applied to solve those problems. The antipatterns characterize problems, but the patterns for solving those problems are elusive.
  
Despite the difficulties, however, social systems do exhibit regularities, and social problems are often solved to some degree. The social sciences and complex systems community have limited sets of patterns, such as common types of organization structures, common macro-economic models, and even patterns of insurgency and counter-insurgency. The challenge for systems science is to capture those regularities and the salient features of those solutions more broadly, and make them explicit and available in the form of mature patterns. Then perhaps social problems can be solved on a more regular basis. As systems engineering expands its scope from the traditional emphasis on technical aspects of systems to the interplay of the social and technical aspects of socio-technical systems, such progress in systems science is becoming even more important to the practice of systems engineering.
 
==References==
 
 
===Works Cited===
 
Ackoff, R. 1979. The Future of Operational Research is Past, ''J. Opl. Res. Soc.'', 30(2): 93–104, Pergamon Press.
 
 
Alexander, C. 1979. ''[[The Timeless Way of Building]]''. New York: Oxford University Press.
 
 
 
Alexander, C., S. Ishikawa, M. Silverstein, M. Jacobson, I. Fiksdahl-King, and S. Angel. 1977. ''A Pattern Language: Towns – Buildings – Construction''. New York: Oxford University Press.
 
 
 
AntiPatterns Catalog. 2012. http://c2.com/cgi/wiki?AntiPatternsCatalog.
 
 
 
Ashby, W.R. 1956. Requisite variety and its implications for the control of complex systems, ''Cybernetica'', 1(2):1–17.
 
 
 
ATIS. 2008. ''ATIS Telecom Glossary 2007''. Washington, D.C.: Alliance for Telecommunications Industry Solutions. http://www.atis.org/glossary/definition.aspx?id=3516.
 
 
 
Bagnulo, A. and Addison, T. 2010. State of the Art Report on Patterns in Systems Engineering and Capability Engineering. Contract Report 2010-012 by CGI Group for Defence R&D Canada – Valcartier. March 2010.
 
 
 
Bertalanffy, L. von. 1968. ''[[General System Theory: Foundations, Development, Applications]]''. Revised ed. New York, NY: Braziller.
 
 
 
Bertalanffy, L. von. 1975. ''Perspectives on General System Theory''. E. Taschdjian, ed. New York: George Braziller.
 
 
 
Bloom, J. 2005. [[The application of chaos, complexity, and emergent (meta)patterns to research in teacher education]]. ''Proceedings of the 2004 Complexity Science and Educational Research Conference'' (pp. 155-191), Sep 30–Oct 3 • Chaffey’s Locks, Canada. http://www.complexityandeducation.ca.
 
 
 
Boardman, J. and B. Sauser. 2008. ''Systems Thinking: Coping with 21st Century Problems''. Boca Raton, FL: Taylor & Francis.
 
 
 
Boccara, N. 2004. ''Modeling Complex Systems''. New York: Springer-Verlag.
 
  
Braun, T. 2002. The System Archetypes. www.uni-klu.ac.at/~gossimit/pap/sd/wb_sysarch.pdf.
  
Brown, W., R. Malveau, H. McCormick, and T. Mowbray. 1998. ''AntiPatterns: Refactoring Software, Architectures, and Projects in Crisis''. John Wiley & Sons.
  
Buschmann, F., R. Meunier, H. Rohnert, P. Sommerlad, and M. Stal. 1996. ''Pattern-Oriented Software Architecture: A System of Patterns''. Chichester, U.K.: John Wiley.
  
Cloutier, R. 2005. Toward the Application of Patterns to Systems Engineering. ''Proceedings CSER 2005'', March 23-25, Hoboken, NJ, USA.
  
Cybernetics (Web Dictionary of Cybernetics and Systems). 2012. Principle of Parsimony or Principle of Simplicity. http://pespmc1.vub.ac.be/ASC/PRINCI_SIMPL.html  
  
 
Edson, R. 2008. ''Systems Thinking. Applied. A Primer''. Arlington, VA, USA: Applied Systems Thinking (ASysT) Institute, Analytic Services Inc.
 
  
Erl, T. 2012. SOA Principles: An Introduction to the Service Orientation Paradigm. http://www.soaprinciples.com/p3.php  
 
 
Flood, R. L., and E.R. Carson. 1993. ''Dealing with Complexity: An Introduction to the Theory and Application of Systems Science, 2nd ed''. New York, NY, USA: Plenum Press.
 
 
 
Forrester, J. 1969. ''Urban Dynamics''. Waltham, MA: Pegasus Communications.
 
 
 
Forrester, J. 1995. Counterintuitive Behavior of Social Systems. http://constitution.org/ps/cbss.pdf. Update of original paper in ''Technology Review'', Vol. 73, No. 3, Jan. 1971, pp. 52-68.
 
 
 
Forrester, J. 2009. Learning through System Dynamics as Preparation for the 21st Century. http://www.clexchange.com/ftp/documents/whyk12sd/Y_2009-02LearningThroughSD.pdf
 
 
 
Fuller, B. (1975) Synergetics, 876 pp. New York, USA: MacMillan. http://www.rwgrayprojects.com/synergetics/synergetics.html.
 
 
 
Gamma, E., R. Helm, R. Johnson, and J. Vlissides. 1995. ''Design Patterns: Elements of Reusable Object-Oriented Software''. Reading, MA: Addison-Wesley.
 
  
Goodman, G. and A. Kleiner. 1993/1994. “Using the Archetype Family Tree as a Diagnostic Tool”, ''The Systems Thinker'', December 1993/January 1994.
  
Greer, D. 2008. The Art of Separation of Concerns. http://aspiringcraftsman.com/tag/separation-of-concerns/  
  
Gregory, S. 1966. Design and the design method, in S. Gregory (ed.). ''The Design Method''. London: Butterworth.
  
Griswold, W. 1995. Modularity Principle. http://cseweb.ucsd.edu/users/wgg/CSE131B/Design/node1.html
  
Hardin, G. 1968. The Tragedy of the Commons. Science 162 (13 December 1968) 1243-1248. DOI: 10.1126/science.162.3859.1243.
+
Hoagland, M., B. Dodson, and J. Mauck. 2001. ''Exploring the Way Life Works''. Burlington, MA, USA: Jones and Bartlett Publishers, Inc.
  
Haskins, C. 2005. Application of Patterns and Pattern Languages to Systems Engineering. ''Proceedings of the INCOSE 15th Annual Int. Symp''. Rochester, NY, July 10-13, 2005.
+
Holland, J. 1992. ''Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence''. Cambridge, MA, USA: MIT Press.
  
Haskins, C. 2008. Using patterns to transition systems engineering from a technological to social context. ''Systems Engineering'', v. 11, no.2, May 2008, pp. 147-155.
+
Hybertson, D. 2009. ''[[Model-Oriented Systems Engineering Science]]: A Unifying Framework for Traditional and Complex Systems''. Boca Raton, FL, USA: Auerbach/CRC Press.
  
Hitchins, D. 2009. "What are the General Principles Applicable to Systems?" INCOSE ''Insight''. 12(4): 59-63.  
+
IEEE. 1990. ''IEEE Standard Glossary of Software Engineering Terminology''. Geneva, Switzerland: Institute of Electrical and Electronics Engineers. IEEE Std 610.12-1990.
  
Hoagland, M., B. Dodson, and J. Mauck. 2001. ''Exploring the Way Life Works''. Jones and Bartlett Publishers, Inc.
+
IEP (Internet Encyclopedia of Philosophy). 2006. "Yinyang (Yin-yang)." Available at: Internet Encyclopedia of Philosophy http://www.iep.utm.edu/yinyang/. Accessed December 3, 2014.
  
Holland, J. 1992. ''Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence''. Cambridge, MA: MIT Press.
+
INCOSE. 1993. ''An Identification of Pragmatic Principles - Final Report''. SE Principles Working Group, January 21, 1993.  
  
Hybertson, D. 2009. ''[[Model-Oriented Systems Engineering Science: A Unifying Framework for Traditional and Complex Systems]]''. Auerbach/CRC Press, Boca Raton, FL.
+
Klerer, S. “System management information modeling,” ''IEEE Communications'', vol. 31, no. 5 May 1993, pp. 38-44.
  
IEEE. 1990. ''IEEE Standard Glossary of Software Engineering Terminology''. IEEE Std 610.12-1990, IEEE, September 1990.
+
Klir, G. 2001. ''[[Facets of Systems Science]]'', 2nd ed. New York, NY, USA: Kluwer Academic/Plenum Publishers.  
 
 
IEP (Internet Encyclopedia of Philosophy). 2006. Yinyang (Yin-yang). http://www.iep.utm.edu/yinyang/
 
 
 
INCOSE 1993. ''An Identification of Pragmatic Principles -Final Report''. SE Principles Working Group, January 21, 1993. http://www.incose.org/productspubs/pdf/techdata/pitc/principlespragmaticdefoe_1993-0123_prinwg.pdf
 
 
 
Kappraff, J. (1991). ''Connections: The geometric bridge between art and science''. New York: McGraw-Hill.
 
 
 
Klerer, S. “System Management Information Modeling,” ''IEEE Comm'', Vol 31:No 5, May 1993, pp 38-44.
 
 
 
Klir, G. 2001. ''[[Facets of Systems Science, 2nd ed.]]'' New York: Kluwer Academic/Plenum Publishers.
 
 
 
Koenig, A. (March/April 1995). "Patterns and Antipatterns". ''Journal of Object-Oriented Programming'' 8, (1): 46–48.
 
 
 
Koestler, A. 1967. ''The Ghost in the Machine''. New York: Macmillan.
 
  
 
Lawson, H. 2010. ''A Journey Through the Systems Landscape''. London, UK: College Publications, Kings College, UK.
 
Lawson, H. 2010. ''A Journey Through the Systems Landscape''. London, UK: College Publications, Kings College, UK.
  
Lawson, H. and J. Martin. 2008. On the Use of Concepts and Principles for Improving Systems Engineering Practice. INCOSE International Symposium 2008, The Netherlands.
+
Lawson, H. and J. Martin. 2008. "On the use of concepts and principles for improving systems engineering practice." INCOSE International Symposium 2008, The Netherlands, 15-19 June 2008.
 
 
Lehmann, M. and L. Belady. 1985. ''Program Evolution''. London: Academic Press.
 
 
 
Lipson, H. 2007. Principles of modularity, regularity, and hierarchy for scalable systems. ''Journal of Biological Physics and Chemistry'' 7, 125–128.
 
 
 
Maier, M. and E. Rechtin. 2000. ''The Art of Systems Architecting, 2nd ed''. Boca Raton, FL: CRC Press.
 
 
 
Martin, R., E. Robertson, and J. Springer. 2004. ''Architectural Principles for Enterprise Frameworks''. Technical Report No. 594, Indiana University, April 2004. http://www.cs.indiana.edu/cgi-bin/techreports/TRNNN.cgi?trnum=TR594.
 
 
 
Meadows, D. 1982. Whole Earth Models and Systems. ''The Co-Evolution Quarterly'', Summer 1982, pp. 98-108. http://www.oss.net/dynamaster/file_archive/040324/48c97c243f534eee32d379e69b039289/WER-INFO-73.pdf.
 
 
 
Miller, G. 1956. The magical number seven, plus or minus two: some limits on our capacity for processing information. ''The Psychological Review'', 63, 81–97.
 
 
 
Newman, M., A.-L. Barabási, and D.J. Watts. 2006. ''The Structure and Dynamics of Networks''. Princeton, NJ: Princeton University Press.
 
 
 
Odum, H.1994. Ecological and General Systems: An Introduction to Systems Ecology (Revised Edition). University Press of Colorado.
 
 
 
Pattee, H. (ed.) 1973. ''Hierarchy Theory: The Challenge of Complex Systems''. New York: George Braziller.
 
 
 
Pearce, J. 2012. The Abstraction Principle. http://www.cs.sjsu.edu/~pearce/modules/lectures/ood/principles/Abstraction.htm [Posting date unknown; accessed June 2012.]
 
 
 
Rebovich, G. and J. DeRosa 2012. Patterns of Success in Systems Engineering of IT-Intensive Government Systems. ''Procedia Computer Science'' 8 (2012) 303 – 308.
 
 
 
Rittel, H. and M. Webber. 1973. Dilemmas in a general theory of planning. ''Policy Sciences'', 4:155–169. http://www.uctc.net/mwebber/Rittel+Webber+Dilemmas+General_Theory_of_Planning.pdf.  
 
  
Rosen, R. 1979. Old trends and new trends in general systems research. ''Int. J. of General Systems'' 5(3): 173-184. [Reprinted in Klir 2001]
+
Lipson, H. 2007. "Principles of modularity, regularity, and hierarchy for scalable systems," ''Journal of Biological Physics and Chemistry,'' vol. 7 pp. 125–128.
  
Schindel, W. 2005. Pattern-based systems engineering: An extension of model-based systems engineering. INCOSE TIES tutorial presented at 2005 INCOSE Symposium.
+
Maier, M. and E. Rechtin. 2000. ''The Art of Systems Architecting,'' 2nd ed. Boca Raton, FL, USA: CRC Press.
  
Schindel, W. and V. Smith. 2002. Results of applying a families-of-systems approach to systems engineering of product line families. Technical Report 2002-01-3086. SAE International.
+
Miller, G. 1956. "The magical number seven, plus or minus two: Some limits on our capacity for processing information," ''The Psychological Review,''  vol. 63, pp. 81–97.
  
Sci-Tech Encyclopedia. 2009. Abstract Data Type. ''McGraw-Hill Concise Encyclopedia of Science and Technology, Sixth Edition'', The McGraw-Hill Companies, Inc. http://www.answers.com/topic/abstract-data-type.  
+
Odum, H. 1994. ''Ecological and General Systems: An Introduction to Systems Ecology (Revised Edition).'' Boulder, CO, USA: University Press of Colorado.
  
SearchCIO. 2012. Abstraction. http://searchcio-midmarket.techtarget.com/definition/abstraction
+
Pattee, H., Ed.1973. ''Hierarchy Theory: The Challenge of Complex Systems''. New York, NY, USA: George Braziller.
  
SEI 2012. Patterns of Failure: System Archetypes. http://www.sei.cmu.edu/acquisition/research/pofsa.cfm
+
Pearce, J. 2012. "The Abstraction Principle."  Available at: Jon Pearce, San Jose State University http://www.cs.sjsu.edu/~pearce/modules/lectures/ood/principles/Abstraction.htm. Accessed December 3 2014.
  
Senge, P. 1990. ''The Fifth Discipline: Discipline: The Art and Practice of the Learning Organization''. New York: Currency Doubleday.
+
Rosen, R. 1979. "Old trends and new trends in general systems research," ''International Journal of General Systems,'' vol. 5, no. 3, pp. 173-184.
  
Senge, P., A. Kleiner, C. Roberts and R. Ross. 1994. ''The Fifth Discipline Fieldbook: Strategies and Tools for Building a Learning Organization''. New York: Currency Doubleday.
+
Sci-Tech Encyclopedia. 2009. "Abstract data type," in ''McGraw-Hill Concise Encyclopedia of Science and Technology, Sixth Edition'', New York, NY, USA: The McGraw-Hill Companies, Inc.  
  
Shaw, M. and D. Garlan. 1996. ''Software Architecture: Perspectives on an Emerging Discipline''. Prentice Hall.
+
SearchCIO. 2012. "Abstraction." Available at: SearchCIO http://searchcio-midmarket.techtarget.com/definition/abstraction. Accessed December 3 2014.
  
Sillitto, H. 2010. Design principles for Ultra-Large-Scale (ULS) Systems. ''Proceedings of INCOSE International Symposium 2010'', Chicago, Ill.
+
Sillitto, H. 2010. "Design principles for ultra-large-scale (ULS) systems," ''Proceedings of INCOSE International Symposium 2010,'', Chicago, IL,12-15 July 2010.
  
Simon, H. 1996. ''The Sciences of the Artificial, 3rd ed''. Cambridge, MA: MIT Press.
+
Simon, H. 1996. ''The Sciences of the Artificial,'' 3rd ed. Cambridge, MA, USA: MIT Press.
  
Simpson, J. and M. Simpson. 2006. Foundational Systems Engineering Patterns for a SE Pattern Language. ''Proc. 16th Annual INCOSE Symposium'', Orlando, FL July, 2006.
+
Warfield, J.N. 1994. ''A Science of Generic Design''. Ames, IA, USA: Iowa State University Press.
  
Stevens, R. 2011. ''Engineering Mega-Systems: The Challenge of Systems Engineering in the Information Age''. Boca Raton, FL: Auerbach/Taylor & Francis.
+
Wikipedia. 2012a. "Modularity." Available at: Wikipedia http://en.wikipedia.org/wiki/Modularity. Accessed December 3 2014.
  
Troncale, L. 2010. Would a Rigorous Knowledge Base in “Systems Pathology” Add to the S.E. Portfolio? Presented at 2010 LA Mini-Conference, 16 October 2010, Loyola Marymount University, Los Angeles, CA. http://www.incose-la.org/documents/events/conferences/mini/2010/presentations/Troncale.pdf.
+
WordWeb. 2012b. "Dualism." Available at: WordWeb http://www.wordwebonline.com/en/DUALISM. Accessed December 3 2014.
  
Troncale, L. 2011. “Would A Rigorous Knowledge Base in Systems Pathology Add Significantly to the SE Portfolio,” ''CSER’11 Proceedings'', Conference on Systems Engineering Research, April 14-16, Redondo Beach, Ca.
+
WordWeb. 2012c. "Heuristic." Available at: WordWeb http://www.wordwebonline.com/en/HEURISTIC. Accessed December 3 2014.
  
Volk, T., & Bloom, J. W. (2007). The use of metapatterns for research into complex systems of teaching, learning, and schooling. Part I: Metapatterns in nature and culture. ''Complicity: An International Journal of Complexity and Education'', 4(1), 25—43 (http://www.complexityandeducation.ualberta.ca/COMPLICITY4/documents/Complicity_41d_Volk_Bloom.pdf).
+
WordWeb. 2012d. "Principle." Available at: WordWeb http://www.wordwebonline.com/en/PRINCIPLE. Accessed December 3 2014.
 
 
Warfield, J.N. 1994. ''A Science of Generic Design''. Ames, IA: Iowa State University Press.
 
 
 
Wikibooks. 2012. AntiPatterns. http://en.wikibooks.org/wiki/Introduction_to_Software_Engineering/Architecture/Anti-Patterns.
 
 
 
Wikipedia. 2012a. Modularity. http://en.wikipedia.org/wiki/Modularity
 
 
 
Wikipedia. 2012b. Software design pattern. http://en.wikipedia.org/wiki/Software_design_pattern
 
 
 
WordWeb. 2012a. Dualism. http://www.wordwebonline.com/en/DUALISM.
 
 
 
WordWeb. 2012b. Heuristic. http://www.wordwebonline.com/en/HEURISTIC.
 
 
 
WordWeb. 2012c. Principle. http://www.wordwebonline.com/en/PRINCIPLE.
 
  
 
===Primary References===
 
===Primary References===
Alexander, C. 1979. ''[[The Timeless Way of Building]]''. New York: Oxford University Press.
+
Bertalanffy, L. von. 1968. ''[[General System Theory: Foundations, Development, Applications]]''. Revised ed. New York, NY, USA: Braziller.
  
Bertalanffy, L. von. 1968. ''[[General System Theory: Foundations, Development, Applications]]''. Revised ed. New York, NY: Braziller.
+
Hybertson, D. 2009. ''[[Model-Oriented Systems Engineering Science]]: A Unifying Framework for Traditional and Complex Systems''. Boca Raton, FL, USA: Auerbach/CRC Press.
  
Bloom, J. 2005. [[The application of chaos, complexity, and emergent (meta)patterns to research in teacher education]]. ''Proceedings of the 2004 Complexity Science and Educational Research Conference'' (pp. 155-191), Sep 30–Oct 3 • Chaffey’s Locks, Canada. http://www.complexityandeducation.ca.
+
Klir, G. 2001. ''[[Facets of Systems Science]],'' 2nd ed. New York, NY, USA: Kluwer Academic/Plenum Publishers.
 
 
Hybertson, D. 2009. ''[[Model-Oriented Systems Engineering Science: A Unifying Framework for Traditional and Complex Systems]]''. Auerbach/CRC Press, Boca Raton, FL.
 
 
 
Klir, G. 2001. ''[[Facets of Systems Science, 2nd ed.]]'' New York: Kluwer Academic/Plenum Publishers.  
 
  
 
===Additional References===
 
===Additional References===
Cybernetics and Systems Theory. http://pespmc1.vub.ac.be/CYBSYSTH.html
+
Francois, F. Ed. 2004. ''International Encyclopedia of Systems and Cybernetics,'' 2nd ed. Munich, Germany: K. G. Saur Verlag.
  
Erl, T. 2009. ''SOA: Design Patterns''. Prentice Hall.
+
Meyers, R. Ed. 2009. ''Encyclopedia of Complexity and Systems Science.'' New York, NY, USA: Springer.
  
Erl, T. 2008. ''SOA: Principles of Service Design''. Prentice Hall.
+
Midgley, G. Ed. 2003. ''Systems Thinking.'' Thousand Oaks, CA, USA: Sage Publications Ltd.
  
Francois, F. (ed.). 2004. ''International Encyclopedia of Systems and Cybernetics, 2nd ed''. K. G. Saur.
+
Volk, T., and J.W. Bloom. 2007. "The use of metapatterns for research into complex systems of teaching, learning, and schooling. Part I: Metapatterns in nature and culture," ''Complicity: An International Journal of Complexity and Education,'' vol. 4, no. 1, pp. 25—43.
 
 
Meyers, R. (ed.). 2009. ''Encyclopedia of Complexity and Systems Science'' (10 vol. set). Springer.
 
 
 
Midgley, G. (ed.). 2003. ''Systems Thinking'' (4 Vol. Set). Sage Publications Ltd.
 
 
 
Web Dictionary of Cybernetics and Systems. http://pespmc1.vub.ac.be/ASC/indexASC.html
 
  
 
Revision as of 21:20, 18 May 2021


Lead Author: Rick Adcock, Contributing Authors: Scott Jackson, Janet Singer, Duane Hybertson


This topic forms part of the Systems Thinking knowledge area (KA). It identifies systems principles as part of the basic ideas of systems thinking.

Some additional concepts more directly associated with engineered systems are described, and a summary of systems principles associated with the concepts already defined is provided. A number of additional "laws" and heuristics are also discussed.

Systems Principles, Laws, and Heuristics

A principle is a general rule of conduct or behavior (Lawson and Martin 2008). It can also be defined as a basic generalization that is accepted as true and that can be used as a basis for reasoning or conduct (WordWeb 2012d). Thus, systems principles can be used as a basis for reasoning about systems thinking or associated conduct (systems approaches).

Separation of Concerns

A systems approach is focused on a system-of-interest (SoI) that is itself an open system. This SoI consists of open, interacting subsystems that, as a whole, interact with and adapt to other systems in an environment. The systems approach also considers the SoI in its environment to be part of a larger, wider, or containing system (Hitchins 2009).

In the What is Systems Thinking? topic, a “systems thinking paradox” is discussed. How is it possible to take a holistic system view while still being able to focus on changing or creating systems?

Separation of concerns describes a balance between considering parts of a system problem or solution while not losing sight of the whole (Greer 2008). Abstraction is the process of taking away characteristics from something in order to reduce it to a set of base characteristics (SearchCIO 2012). In attempting to understand complex situations, it is easier to focus on bounded problems, whose solutions still remain agnostic to the greater problem (Erl 2012). This process sounds reductionist, but it can be applied effectively to systems. The key to the success of this approach is ensuring that one of the selected problems is the concern of the system as a whole. Finding balance between using abstraction to focus on specific concerns while ensuring the whole is continually considered is at the center of systems approaches.
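
As an informal illustration of this balance (a minimal sketch only; the component names and the Python rendering are assumptions, not drawn from the cited sources), the fragment below separates a simple regulation problem into two bounded concerns while a third element keeps the concern of the whole explicit:

# Minimal sketch of separation of concerns with an explicit whole-system concern.
# All names and values are illustrative only.

class Sensing:
    """Concern 1: acquire a measurement (agnostic to how it will be used)."""
    def read(self) -> float:
        return 21.7  # placeholder measurement

class Regulation:
    """Concern 2: decide on an action (agnostic to how measurements are made)."""
    def __init__(self, setpoint: float):
        self.setpoint = setpoint
    def decide(self, measurement: float) -> str:
        return "heat" if measurement < self.setpoint else "idle"

class RoomClimateSystem:
    """Whole-system concern: the parts only produce value through their interaction."""
    def __init__(self, sensing: Sensing, regulation: Regulation):
        self.sensing = sensing
        self.regulation = regulation
    def step(self) -> str:
        return self.regulation.decide(self.sensing.read())

if __name__ == "__main__":
    system = RoomClimateSystem(Sensing(), Regulation(setpoint=20.0))
    print(system.step())  # -> "idle"

Each bounded concern can be reasoned about separately, but the third element keeps the whole-system behavior, produced by their interaction, continually in view.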

A view is a subset of information observed of one or more entities, such as systems. The physical or conceptual point from which a view is observed is the viewpoint, which can be motivated by one or more observer concerns. Different views of the same target must be both separate, to reflect separation of concerns, and integrated such that all views of a given target are consistent and form a coherent whole (Hybertson 2009). Some sample views of a system are internal (Of what does it consist?), external (What are its properties and behavior as a whole?), static (What are its parts or structures?), and dynamic (How do the parts interact?).
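
The following is a small, hypothetical sketch of separate-but-consistent views (the model structure and element names are illustrative assumptions, not taken from the sources above). Both views are projections of the same underlying system description, so they remain consistent by construction:

# Minimal sketch: two views derived from one underlying system description.
# The dictionary structure and element names are illustrative only.

system_model = {
    "elements": ["pump", "controller", "tank"],
    "interactions": [("controller", "pump"), ("pump", "tank"), ("tank", "controller")],
}

def static_view(model):
    """Internal/static view: of what does the system consist?"""
    return sorted(model["elements"])

def dynamic_view(model):
    """Dynamic view: which interactions produce the behavior of the whole?"""
    return [f"{a} -> {b}" for a, b in model["interactions"]]

print(static_view(system_model))   # ['controller', 'pump', 'tank']
print(dynamic_view(system_model))  # ['controller -> pump', 'pump -> tank', 'tank -> controller']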

Encapsulation, which hides system elements and their interactions from the external environment, is discussed in Concepts of Systems Thinking. Encapsulation is associated with modularity, the degree to which a system's components may be separated and recombined (Griswold 1995). Modularity applies to systems in natural, social, and engineered domains. In engineering, encapsulation is the isolation of a system function within a module; it provides precise specifications for the module (IEEE Std. 610.12-1990).
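
A minimal sketch of encapsulation and modularity in code follows (the Battery module, its interface, and the Python rendering are illustrative assumptions): internal state is hidden, and other elements interact with the module only through a small, precisely specified interface.

# Minimal sketch of encapsulation: internal state is hidden behind a small,
# precisely specified interface. Names and values are illustrative only.

class Battery:
    def __init__(self, capacity_wh: float):
        self.__capacity_wh = capacity_wh   # internal detail, not part of the interface
        self.__stored_wh = 0.0

    # The module's interface: the only way other system elements interact with it.
    def charge(self, energy_wh: float) -> None:
        self.__stored_wh = min(self.__capacity_wh, self.__stored_wh + energy_wh)

    def draw(self, energy_wh: float) -> float:
        delivered = min(self.__stored_wh, energy_wh)
        self.__stored_wh -= delivered
        return delivered

    def state_of_charge(self) -> float:
        return self.__stored_wh / self.__capacity_wh

if __name__ == "__main__":
    b = Battery(capacity_wh=100.0)
    b.charge(40.0)
    print(b.draw(25.0), round(b.state_of_charge(), 2))  # 25.0 0.15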

Dualism is a characteristic of systems in which they exhibit seemingly contradictory characteristics that are important for the system (Hybertson 2009). The yin yang concept in Chinese philosophy emphasizes the interaction between dual elements and their harmonization, ensuring a constant dynamic balance through a cyclic dominance of one element and then the other, such as day and night (IEP 2006).

From a systems perspective, the interaction, harmonization, and balance among system properties are important. Hybertson (2009) defines leverage as the duality between:

  • Power, the extent to which a system solves a specific problem, and
  • Generality, the extent to which a system solves a whole class of problems.

While some systems or elements may be optimized for one extreme of such dualities, a dynamic balance is needed to be effective in solving complex problems.
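
A minimal sketch of this duality follows (the two solvers are illustrative assumptions, not examples from Hybertson): the first offers power, a complete closed-form answer for one narrow class of problems; the second offers generality, a partial answer that applies to a much wider class.

# Minimal sketch of the power/generality duality. Both functions are illustrative.
from math import sqrt

def solve_quadratic(a: float, b: float, c: float):
    """'Power': a complete, closed-form solution, but only for one narrow problem class."""
    d = b * b - 4 * a * c
    if d < 0:
        return []
    return [(-b + sqrt(d)) / (2 * a), (-b - sqrt(d)) / (2 * a)]

def bisect(f, lo: float, hi: float, tol: float = 1e-9):
    """'Generality': a partial solution (one root, needs a bracketing interval),
    but it applies to a whole class of continuous functions."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

print(solve_quadratic(1, -3, 2))                   # [2.0, 1.0]
print(round(bisect(lambda x: x**2 - 2, 0, 2), 6))  # ~1.414214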

Summary of Systems Principles

A set of systems principles is given in Table 1 below. The name of each principle points to the concept underlying it (see Concepts of Systems Thinking). Following the table, two additional sets of items related to systems principles are noted and briefly discussed: prerequisite laws for design science, and heuristics and pragmatic principles.

Table 1. A Set of Systems Principles. (SEBoK Original)

  • Abstraction: A focus on essential characteristics is important in problem solving because it allows problem solvers to ignore the nonessential, thus simplifying the problem (Sci-Tech Encyclopedia 2009; SearchCIO 2012; Pearce 2012).
  • Boundary: A boundary or membrane separates the system from the external world. It serves to concentrate interactions inside the system while allowing exchange with external systems (Hoagland, Dodson, and Mauck 2001).
  • Change: Change is necessary for growth and adaptation, and should be accepted and planned for as part of the natural order of things rather than something to be ignored, avoided, or prohibited (Bertalanffy 1968; Hybertson 2009).
  • Dualism: Recognize dualities and consider how they are, or can be, harmonized in the context of a larger whole (Hybertson 2009).
  • Encapsulation: Hide internal parts and their interactions from the external environment (Klerer 1993; IEEE 1990).
  • Equifinality: In open systems, the same final state may be reached from different initial conditions and in different ways (Bertalanffy 1968). This principle can be exploited, especially in systems of purposeful agents.
  • Holism: A system should be considered as a single entity, a whole, not just as a set of parts (Ackoff 1979; Klir 2001).
  • Interaction: The properties, capabilities, and behavior of a system are derived from its parts, from interactions between those parts, and from interactions with other systems (Hitchins 2009, p. 60).
  • Layer Hierarchy: The evolution of complex systems is facilitated by their hierarchical structure (including stable intermediate forms), and the understanding of complex systems is facilitated by their hierarchical description (Pattee 1973; Bertalanffy 1968; Simon 1996).
  • Leverage: Achieve maximum leverage (Hybertson 2009). Because of the power versus generality tradeoff, leverage can be achieved by a complete solution (power) for a narrow class of problems, or by a partial solution for a broad class of problems (generality).
  • Modularity: Unrelated parts of the system should be separated, and related parts of the system should be grouped together (Griswold 1995; Wikipedia 2012a).
  • Network: The network is a fundamental topology for systems that forms the basis of togetherness, connection, and dynamic interaction of parts that yield the behavior of complex systems (Lawson 2010; Martin et al. 2004; Sillitto 2010).
  • Parsimony: One should choose the simplest explanation of a phenomenon, the one that requires the fewest assumptions (Cybernetics 2012). This applies not only to choosing a design, but also to operations and requirements.
  • Regularity: Systems science should find and capture regularities in systems, because those regularities promote systems understanding and facilitate systems practice (Bertalanffy 1968).
  • Relations: A system is characterized by its relations: the interconnections between the elements. Feedback is a type of relation. The set of relations defines the network of the system (Odum 1994).
  • Separation of Concerns: A larger problem is more effectively solved when decomposed into a set of smaller problems or concerns (Erl 2012; Greer 2008).
  • Similarity/Difference: Both the similarities and differences in systems should be recognized and accepted for what they are (Bertalanffy 1975, p. 75; Hybertson 2009). Avoid forcing one size fits all, and avoid treating everything as entirely unique.
  • Stability/Change: Things change at different rates, and entities or concepts at the stable end of the spectrum can and should be used to provide a guiding context for rapidly changing entities at the volatile end of the spectrum (Hybertson 2009). The study of complex adaptive systems can give guidance on system behavior and design in changing environments (Holland 1992).
  • Synthesis: Systems can be created by "choosing (conceiving, designing, selecting) the right parts, bringing them together to interact in the right way, and in orchestrating those interactions to create requisite properties of the whole, such that it performs with optimum effectiveness in its operational environment, so solving the problem that prompted its creation" (Hitchins 2009, p. 120).
  • View: Multiple views, each based on a system aspect or concern, are essential to understand a complex system or problem situation. One critical view is how each concern relates to the properties of the whole (Edson 2008; Hybertson 2009).

The principles are not independent; they have synergies and tradeoffs. Lipson (2007), for example, argued that "scalability of open-ended evolutionary processes depends on their ability to exploit functional modularity, structural regularity and hierarchy." He proposed a formal model for examining the properties, dependencies, and tradeoffs among these principles. Edson (2008) related many of the above principles in a structure called the conceptagon, which he modified from the work of Boardman and Sauser (2008). Edson also provided guidance on how to apply these principles. Not all principles apply to every system or engineering decision. Judgment, experience, and heuristics (see below) provide insight into which principles apply in a given situation.

Several principles illustrate the relation of view with the dualism and yin yang principle, for example, holism and separation of concerns. These principles appear to be contradictory but are in fact dual ways of dealing with complexity. Holism deals with complexity by focusing on the whole system, while separation of concerns divides a problem or system into smaller, more manageable elements that focus on particular concerns. They are reconciled by the fact that both views are needed to understand systems and to engineer systems; focusing on only one or the other does not give sufficient understanding or a good overall solution. This dualism is closely related to the systems thinking paradox described in What is Systems Thinking?.

Rosen (1979) discussed “false dualisms” of systems paradigms that are considered incompatible but are in fact different aspects or views of reality. In the present context, they are thus reconcilable through yin yang harmonization. Edson (2008) emphasized viewpoints as an essential principle of systems thinking; specifically, as a way to understand opposing concepts.

Derek Hitchins (2003) produced a systems life cycle theory described by an integrated set of seven principles. This theory describes the creation, manipulation, and demise of engineered systems. These principles consider the factors which contribute to the stability and survival of man-made systems in an environment. Stability is associated with the principle of connected variety, in which stability is increased by variety, plus the cohesion and adaptability of that variety. Stability is limited by allowable relations, resistance to change, and patterns of interaction. Hitchins describes how interconnected systems tend toward a cyclic progression, in which variety is generated, dominance emerges to suppress the variety, dominant modes decay and collapse, and survivors emerge to generate new variety.

Guidance on how to apply many of these principles to engineered systems is given in the topic Synthesizing Possible Solutions, as well as in System Definition and other knowledge areas in Part 3 of the SEBoK.

Prerequisite Laws of Design Science

John Warfield (1994) identified a set of laws of generic design science that are related to systems principles. Three of these laws are stated here:

  1. Law of Requisite Variety: A design situation embodies a variety that must be matched by the specifications. The variety includes the diversity of stakeholders. This law is the design-science application of Ashby's (1956) Law of Requisite Variety, which was defined in the context of cybernetics and states that to successfully regulate a system, the variety of the regulator must be at least as large as the variety of the regulated system (a minimal numeric sketch of this counting argument follows this list).
  2. Law of Requisite Parsimony: Information must be organized and presented in a way that prevents human information overload. This law derives from Miller's findings on the limits of human information-processing capacity (Miller 1956). Warfield's structured dialog method is one possible way to help achieve the requisite parsimony.
  3. Law of Gradation: Any conceptual body of knowledge can be graded in stages or varying degrees of complexity and scale, ranging from simplest to most comprehensive, and the degree of knowledge applied to any design situation should match the complexity and scale of the situation. A corollary, called the Law of Diminishing Returns, states that a body of knowledge should be applied to a design situation at the stage at which the point of diminishing returns is reached.
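
The sketch below is an informal numeric illustration of the requisite variety argument only. The disturbance and response sets, and the simplifying assumption that each distinct disturbance needs its own distinct response, are hypothetical and are not drawn from Warfield or Ashby.

# Minimal counting sketch related to the Law of Requisite Variety.
# The disturbance and response sets are illustrative only.

def requisite_variety_met(disturbances, responses) -> bool:
    # If the regulator offers fewer distinct responses than there are distinct
    # disturbances, at least two disturbances must share a response, so some
    # disturbance cannot be fully regulated (the counting core of the law).
    return len(set(responses)) >= len(set(disturbances))

disturbances = ["over-temperature", "under-temperature", "sensor dropout"]
responses = ["cool", "heat"]  # two responses for three disturbance types

print(requisite_variety_met(disturbances, responses))                  # False: variety not matched
print(requisite_variety_met(disturbances, responses + ["safe-mode"]))  # True: variety matched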

Heuristics and Pragmatic Principles

A heuristic is a common-sense rule intended to increase the probability of solving some problem (WordWeb 2012c). In the present context, it may be regarded as an informal or pragmatic principle. Maier and Rechtin (2000) identified an extensive set of heuristics that are related to systems principles. A few of these heuristics are stated here:

  • Relationships among the elements are what give systems their added value. This is related to the Interaction principle.
  • Efficiency is inversely proportional to universality. This is related to the Leverage principle.
  • The first line of defense against complexity is simplicity of design. This is related to the Parsimony principle.
  • In order to understand anything, you must not try to understand everything (attributed to Aristotle). This is related to the Abstraction principle.

An International Council on Systems Engineering (INCOSE) working group (INCOSE 1993) defined a set of “pragmatic principles” for systems engineering (SE). They are essentially best practice heuristics for engineering a system. For example:

  • Know the problem, the customer, and the consumer
  • Identify and assess alternatives to converge on a solution
  • Maintain the integrity of the system

Hitchins (2009) also defines a set of SE principles that includes the principles of holism and synthesis discussed above, as well as principles describing how systems problems of particular relevance to a Systems Approach Applied to Engineered Systems should be resolved.

References

Works Cited

Ackoff, R. 1979. "The future of operational research is past," Journal of the Operational Research Society, vol. 30, no. 2, pp. 93–104.

Ashby, W.R. 1956. "Requisite variety and its implications for the control of complex systems," Cybernetica, vol. 1, no. 2, pp. 1–17.

Bertalanffy, L. von. 1968. General System Theory: Foundations, Development, Applications. Revised ed. New York, NY, USA: Braziller.

Bertalanffy, L. von. 1975. Perspectives on General System Theory. E. Taschdjian, ed. New York, NY, USA: George Braziller.

Boardman, J. and B. Sauser. 2008. Systems Thinking: Coping with 21st Century Problems. Boca Raton, FL, USA: Taylor & Francis.

Cybernetics (Web Dictionary of Cybernetics and Systems). 2012. "Principle of Parsimony or Principle of Simplicity." Available at: Web Dictionary of Cybernetics and Systems http://pespmc1.vub.ac.be/ASC/PRINCI_SIMPL.html. Accessed December 3, 2014.

Edson, R. 2008. Systems Thinking. Applied. A Primer. Arlington, VA, USA: Applied Systems Thinking (ASysT) Institute, Analytic Services Inc.

Erl, T. 2012. "SOA Principles: An Introduction to the Service Orientation Paradigm." Available at: Arcitura http://www.soaprinciples.com/p3.php. Accessed December 3 2014.

Greer, D. 2008. "The Art of Separation of Concerns." Available at: Aspiring Craftsman http://aspiringcraftsman.com/tag/separation-of-concerns/. Accessed December 3 2014.

Griswold, W. 1995. "Modularity Principle." Available at: William Griswold http://cseweb.ucsd.edu/users/wgg/CSE131B/Design/node1.html. Accessed December 3 2014.

Hitchins, D.K. 2003. Advanced Systems Thinking, Engineering, and Management. Boston, MA, USA: Artech House.

Hitchins, D. 2009. "What are the general principles applicable to systems?" INCOSE Insight, vol. 12, no. 4, pp. 59-63.

Hoagland, M., B. Dodson, and J. Mauck. 2001. Exploring the Way Life Works. Burlington, MA, USA: Jones and Bartlett Publishers, Inc.

Holland, J. 1992. Adaptation in Natural and Artificial Systems: An Introductory Analysis with Applications to Biology, Control, and Artificial Intelligence. Cambridge, MA, USA: MIT Press.

Hybertson, D. 2009. Model-Oriented Systems Engineering Science: A Unifying Framework for Traditional and Complex Systems. Boca Raton, FL, USA: Auerbach/CRC Press.

IEEE. 1990. IEEE Standard Glossary of Software Engineering Terminology. New York, NY, USA: Institute of Electrical and Electronics Engineers (IEEE). IEEE Std 610.12-1990.

IEP (Internet Encyclopedia of Philosophy). 2006. "Yinyang (Yin-yang)." Available at: Internet Encyclopedia of Philosophy http://www.iep.utm.edu/yinyang/. Accessed December 3, 2014.

INCOSE. 1993. An Identification of Pragmatic Principles - Final Report. SE Principles Working Group, January 21, 1993.

Klerer, S. 1993. "System management information modeling," IEEE Communications, vol. 31, no. 5, pp. 38-44.

Klir, G. 2001. Facets of Systems Science, 2nd ed. New York, NY, USA: Kluwer Academic/Plenum Publishers.

Lawson, H. 2010. A Journey Through the Systems Landscape. London, UK: College Publications, Kings College, UK.

Lawson, H. and J. Martin. 2008. "On the use of concepts and principles for improving systems engineering practice." INCOSE International Symposium 2008, The Netherlands, 15-19 June 2008.

Lipson, H. 2007. "Principles of modularity, regularity, and hierarchy for scalable systems," Journal of Biological Physics and Chemistry, vol. 7, pp. 125–128.

Maier, M. and E. Rechtin. 2000. The Art of Systems Architecting, 2nd ed. Boca Raton, FL, USA: CRC Press.

Martin, R., E. Robertson, and J. Springer. 2004. Architectural Principles for Enterprise Frameworks. Technical Report No. 594, Indiana University, April 2004.

Miller, G. 1956. "The magical number seven, plus or minus two: Some limits on our capacity for processing information," The Psychological Review, vol. 63, pp. 81–97.

Odum, H. 1994. Ecological and General Systems: An Introduction to Systems Ecology (Revised Edition). Boulder, CO, USA: University Press of Colorado.

Pattee, H., Ed.1973. Hierarchy Theory: The Challenge of Complex Systems. New York, NY, USA: George Braziller.

Pearce, J. 2012. "The Abstraction Principle." Available at: Jon Pearce, San Jose State University http://www.cs.sjsu.edu/~pearce/modules/lectures/ood/principles/Abstraction.htm. Accessed December 3 2014.

Rosen, R. 1979. "Old trends and new trends in general systems research," International Journal of General Systems, vol. 5, no. 3, pp. 173-184.

Sci-Tech Encyclopedia. 2009. "Abstract data type," in McGraw-Hill Concise Encyclopedia of Science and Technology, Sixth Edition, New York, NY, USA: The McGraw-Hill Companies, Inc.

SearchCIO. 2012. "Abstraction." Available at: SearchCIO http://searchcio-midmarket.techtarget.com/definition/abstraction. Accessed December 3 2014.

Sillitto, H. 2010. "Design principles for ultra-large-scale (ULS) systems," Proceedings of INCOSE International Symposium 2010, Chicago, IL, 12-15 July 2010.

Simon, H. 1996. The Sciences of the Artificial, 3rd ed. Cambridge, MA, USA: MIT Press.

Warfield, J.N. 1994. A Science of Generic Design. Ames, IA, USA: Iowa State University Press.

Wikipedia. 2012a. "Modularity." Available at: Wikipedia http://en.wikipedia.org/wiki/Modularity. Accessed December 3 2014.

WordWeb. 2012b. "Dualism." Available at: WordWeb http://www.wordwebonline.com/en/DUALISM. Accessed December 3 2014.

WordWeb. 2012c. "Heuristic." Available at: WordWeb http://www.wordwebonline.com/en/HEURISTIC. Accessed December 3 2014.

WordWeb. 2012d. "Principle." Available at: WordWeb http://www.wordwebonline.com/en/PRINCIPLE. Accessed December 3 2014.

Primary References

Bertalanffy, L. von. 1968. General System Theory: Foundations, Development, Applications. Revised ed. New York, NY, USA: Braziller.

Hybertson, D. 2009. Model-Oriented Systems Engineering Science: A Unifying Framework for Traditional and Complex Systems. Boca Raton, FL, USA: Auerbach/CRC Press.

Klir, G. 2001. Facets of Systems Science, 2nd ed. New York, NY, USA: Kluwer Academic/Plenum Publishers.

Additional References

Francois, F. Ed. 2004. International Encyclopedia of Systems and Cybernetics, 2nd ed. Munich, Germany: K. G. Saur Verlag.

Meyers, R. Ed. 2009. Encyclopedia of Complexity and Systems Science. New York, NY, USA: Springer.

Midgley, G. Ed. 2003. Systems Thinking. Thousand Oaks, CA, USA: Sage Publications Ltd.

Volk, T., and J.W. Bloom. 2007. "The use of metapatterns for research into complex systems of teaching, learning, and schooling. Part I: Metapatterns in nature and culture," Complicity: An International Journal of Complexity and Education, vol. 4, no. 1, pp. 25—43.


< Previous Article | Parent Article | Next Article >
SEBoK v. 2.4, released 19 May 2021