Emergence

This topic forms part of the Systems Fundamentals Knowledge Area. It gives the background to some of the ways in which emergence has been described and an indication of current thinking on what it is and how it influences systems engineering (SE) practice. It discusses how these ideas relate to the general definitions of system given in What is a System?, and in particular to the different engineered system contexts. This topic is closely related to the Complexity topic that precedes it.

Emergence is a consequence of the fundamental system concepts of holism and interaction (Hitchins 2007, p. 27). System wholes have behavior and properties arising from the organization of their elements and their relationships, which only become apparent when the system is placed in different environments.

Questions that arise from this definition include: What kinds of systems exhibit different kinds of emergence, and under what conditions? Can emergence be predicted, and is it beneficial or detrimental to a system? How do we deal with emergence in the development and use of engineered systems? Can it be planned for, and how?

There are many varied and even conflicting views on emergence. This topic presents the prevailing views. References for other views are also provided.

Overview of Emergence

Emergent system behavior can be viewed as a consequence of the interactions and relationships between system elements rather than the behavior of individual elements. It emerges from a combination of the behavior and properties of the system elements and the system's structure, or allowable interactions between the elements, and may be triggered or influenced by stimuli from the system's environment. Checkland, for example, defines emergence as “the principle that entities exhibit properties which are meaningful only when attributed to the whole, not to its parts.” (Checkland 1999, p. 314).
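As a minimal illustration of this idea (a sketch, not drawn from the cited sources), consider Conway's Game of Life: each cell obeys the same simple local rule, yet patterns such as the "glider" below are meaningful only as properties of the whole grid, not of any individual cell.

    from collections import Counter

    def step(live_cells):
        """Advance one generation of Conway's Game of Life.
        live_cells is a set of (x, y) coordinates of live cells."""
        neighbor_counts = Counter(
            (x + dx, y + dy)
            for (x, y) in live_cells
            for dx in (-1, 0, 1)
            for dy in (-1, 0, 1)
            if (dx, dy) != (0, 0)
        )
        # A live cell with 2 or 3 neighbors survives; a dead cell with exactly 3 is born.
        return {cell for cell, n in neighbor_counts.items()
                if n == 3 or (n == 2 and cell in live_cells)}

    # The "glider" pattern travels diagonally across the grid: a behavior of the
    # pattern as a whole that no individual cell possesses.
    glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
    for _ in range(4):
        glider = step(glider)
    print(sorted(glider))  # the same shape, shifted one cell diagonally after 4 steps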

Emergence is common in nature. The pungent gas ammonia results from the chemical combination of two odorless gases, hydrogen and nitrogen. As parts, feathers, beaks, wings, and gullets do not have the ability to overcome gravity. Properly connected up in a bird, they together create the emergent behavior of flight. The emergent behavior “self-awareness” results from the combined effect of the interconnected and interacting neurons that make up the brain (Hitchins 2007, p. 7).

Hitchins also notes that technological systems exhibit emergence. We can observe a number of levels of outcome which arise from interaction between elements in an engineered system context. At a simple level, some system outcomes or attributes have a fairly simple and well-defined mapping to the system elements; for example, the center of gravity or top speed of a vehicle results from a combination of element properties and how they are combined. Other behaviors can be associated with these simple outcomes, but their value emerges in complex and less predictable ways across a system; for example, the single-lap performance of a vehicle around a track is related to center of gravity and speed, but is also affected by driver skill, external conditions, component wear, etc. Getting the 'best' performance from a vehicle can only be achieved by a combination of good design and feedback from real laps under race conditions. There are also outcomes which are less tangible and which come as a surprise to system developers and users. How does lap time translate into a winning motor racing team? Why is a sports car more desirable to many than other vehicles with as good or better performance?
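To make the first level of outcome concrete, a minimal sketch follows (the component names, masses, and positions are hypothetical, not taken from any cited source). It shows how a whole-vehicle property such as center of gravity follows directly from element properties and how they are combined.

    # Illustrative sketch: center of gravity as a simple, well-defined whole-system property.
    # The component names, masses, and positions below are hypothetical.

    components = {
        # name: (mass in kg, longitudinal position in m behind the front axle)
        "engine":    (180.0, 0.4),
        "gearbox":   (60.0, 1.0),
        "chassis":   (250.0, 1.3),
        "fuel_tank": (45.0, 2.1),
        "driver":    (75.0, 1.5),
    }

    total_mass = sum(mass for mass, _ in components.values())
    center_of_gravity = sum(mass * pos for mass, pos in components.values()) / total_mass

    print(f"total mass: {total_mass:.0f} kg")
    print(f"center of gravity: {center_of_gravity:.2f} m behind the front axle")

No single component has a "center of gravity of the vehicle"; the property exists only for the assembled whole, yet it is fully predictable from the parts and their arrangement, corresponding to what Page calls simple emergence (discussed under Types of Emergence below).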

Thus, emergence can always be observed at the highest level of system. However, Hitchins (2007, p. 7) also points out that, to the extent that the system's elements can themselves be considered as systems, they also exhibit emergence. Page (2009) also refers to emergence as a “macro-level property.” Ryan (2007) contends that emergence is coupled to scope rather than system hierarchical levels. In Ryan's terms, scope has to do with spatial dimensions, that is, how system elements are related to each other, rather than with hierarchical levels.

Abbott (2006) does not disagree with the general definition of emergence as discussed above. However, he takes issue with the notion that emergence operates outside the bounds of classical physics. He says that “such higher-level entities…can always be reduced to primitive physical forces.”

Bedau and Humphreys (2008) and Francois (2004) provide comprehensive descriptions of the philosophical and scientific background of emergence.

Types of Emergence

A variety of definitions of types of emergence exists. See Emmeche et al. (1997), Chroust (2003), and O'Connor and Wong (2006) for discussion of some of the variants. Page (2009) describes three types of emergence, which he calls "simple", "weak", and "strong".

According to Page (2009), Simple Emergence is generated by the combination of element properties and relationships, and occurs in non-complex or “ordered” systems (see the Complexity article). To achieve the emergent property of “controlled flight”, we cannot consider only the wings, or just the control system, or the propulsion system. All three (plus all other parts of an aircraft) and how they are interconnected must be considered. Page suggests that simple emergence is the only type of emergence that can be predicted. This view of emergence is also referred to as synergy (Hitchins 2009).

Page (2009) uses the term Weak Emergence to describe emergence which is expected and either desired or at least allowed for in the system structure. However, since weak emergence is a product of a complex system, the actual level of emergence cannot be predicted just from knowledge of the characteristics of the individual system components.

Page (2009) uses the term Strong Emergence to describe unexpected emergence, that is, emergence not observed until the system is simulated or tested or, more alarmingly, until the system encounters in operation a situation that was not anticipated during design and development. Strong emergence may be evident in failures or shutdowns. For example, the US-Canada Blackout of 2003, as described by the US-Canada Power System Outage Task Force (2004), was a case of cascading shutdown that resulted from the design of the system, even though there were no equipment failures. The shutdown was systemic. As Hitchins (2007, p. 15) points out, this example shows that emergent properties are not always beneficial.

Other authors make a different distinction between the ideas of strong, or unexpected, emergence and unpredictable emergence (Chroust 2002):

  • Firstly, there are unexpected properties that could have been predicted but were not considered in a system's development: "properties which are unexpected by the observer because of his incomplete data set, with regard to the phenomenon at hand" (Francois 2004, p. 737). A typical example of this would be unwanted vibration in a vehicle. According to Jackson et al. (2010), a desired level of emergence is usually achieved by iteration. This may be by evolutionary processes, in which element properties and combinations are "selected for" depending on how well they contribute to a system's effectiveness against environmental pressures, or by iteration of design parameters through simulation or build/test cycles. Taking this view, the specific values of Weak Emergence can be refined, and examples of Strong Emergence can be considered in subsequent iterations, so long as they are amenable to analysis.
  • Secondly, there are unexpected properties which cannot be predicted from the properties of the system's components: "properties which are, in and of themselves, not derivable a priori from the behavior of the parts of the system" (Francois 2004, p. 737). This view of emergence is more familiar in the social or natural sciences, but more controversial in engineering. We should distinguish between theoretical and practical unpredictability (Chroust 2002). The weather is theoretically predictable, but beyond a certain limited accuracy prediction is practically impossible due to its chaotic nature. The emergence of consciousness in human beings cannot be deduced from the physiological properties of the brain. For many, this genuinely unpredictable type of complexity has limited value for engineering. See Practical Considerations below.

A type of system particularly subject to strong emergence is the system of systems (SoS). The reason for this is that an SoS, by definition, is composed of different systems that were designed to operate independently. When these systems are operated together, the interaction among the parts of the system is likely to result in unexpected emergence. Chaotic, or truly unpredictable, emergence is likely for this class of systems.

Emergent Properties

Emergent properties can be defined as follows: “A property of a complex system is said to be ‘emergent’ [in the case when], although it arises out of the properties and relations characterizing its simpler constituents, it is neither predictable from, nor reducible to, these lower-level characteristics” (Honderich 1995, p. 224).

All systems can have emergent properties, which may or may not be predictable or amenable to modeling, as discussed above. Much of the complexity literature includes emergence as a defining characteristic of complex systems. For example, Boccara (2004) states that “The appearance of emergent properties is the single most distinguishing feature of complex systems”. In general, the more ordered a system is, the easier its emergent properties are to predict; the more complex it is, the more difficult prediction becomes.

Some practitioners use the term “emergence” only to refer to “strong emergence”, referring to the other two types as synergy or “system-level behavior” (Chroust 2002). Taking this view, we would reserve the term emergent property for unexpected properties, which can be modeled or refined through iterations of the system's development.

Unforeseen emergence causes nasty shocks. Many believe that the main job of the systems approach is to prevent undesired emergence, minimizing the risk of unexpected and potentially undesirable outcomes. This consideration of emergent properties is often associated particularly with identifying and avoiding system failures (Hitchins 2007).

But good systems engineering also involves maximizing opportunity: understanding and exploiting emergence in engineered systems to create the required system-level characteristics from synergistic interactions between the components, not just from the components themselves (Sillitto 2010).

One important group of emergent properties includes properties such as agility and resilience, because these are critical system properties that are not meaningful except at the whole-system level.

Practical Considerations

As discussed above, one way to manage emergent properties is through iteration. The requirement to iterate the design of an engineered system to achieve desired emergence results in a design process that is lengthier than the one needed to design an “ordered” system. To create an engineered system capable of such iteration may also require a more configurable or modular solution. The result is that complex systems may be more costly and time-consuming to develop than “ordered” ones, and the cost and time to develop them is inherently less predictable.

Sillitto (2010) observes that “engineering design domains that exploit emergence have good mathematical models of the domain, and rigorously control variability of components and subsystems, and of process, in both design and operation”. The iterations discussed above can be accelerated by using simulation and modeling, so that not all of the iterations need to involve building real systems and operating them in the real environment.
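A minimal sketch of such model-based iteration follows. The system model and parameter names are hypothetical (they are not taken from Sillitto 2010); the point is only that a design parameter can be adjusted against a simulation until a desired value of an emergent property is reached, before any real system is built and operated.

    # Hypothetical sketch: iterating a design parameter against a simulation
    # until a desired emergent property is achieved.

    def simulated_emergent_property(design_parameter: float) -> float:
        """Stand-in for a validated domain model or whole-system simulation.
        The quadratic form below is purely illustrative."""
        return 4.0 * design_parameter * (1.0 - design_parameter)

    def refine(target: float, parameter: float = 0.1, step_size: float = 0.05,
               tolerance: float = 0.01, max_iterations: int = 100) -> float:
        """Adjust the design parameter until the simulated emergent property
        is within tolerance of the target, or the iteration budget is spent."""
        for _ in range(max_iterations):
            value = simulated_emergent_property(parameter)
            if abs(value - target) <= tolerance:
                break
            # Nudge the parameter in the direction that reduces the error.
            parameter += step_size if value < target else -step_size
        return parameter

    print(refine(target=0.9))  # converges to a parameter giving the desired property

Build/test cycles with real hardware can then be reserved for confirming the values that the model predicts.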

The idea of domain models is explored further in Hybertson (2009) in the context of general models or patterns learned over time and captured in a model space. This orientation reflects the general constraint that knowing what emergence will appear from a given design, including side effects, requires hindsight. For a new type of problem that has not been solved, or a new type of system that has not been built, it is virtually impossible to predict the emergent behavior of the solution or system. Some hindsight, or at least some insight, can be obtained by modeling and iterating a specific system design; but iterating the design within the development of one system yields only limited hindsight and often does not give a full sense of emergence and side effects.

The real hindsight and understanding comes from building multiple systems of the same type and deploying them, then observing their emergent behavior in operation and the side effects of placing them in their environments. If those observations are done systematically, and the emergence and side effects are distilled and captured in relation to the design of the systems—including the variations in those designs—and made available to the community, then we are in a position to predict and exploit the emergence. The learning that takes place is of two types: what works (that is, what emergent behavior and side effects are desirable); and what does not work (that is, what emergent behavior and side effects are undesirable). What works affirms the design; what does not work calls for corrections in the design. This is why multiple systems—especially if they are complex—must be built and deployed over time and in different environments, to learn and understand the relations among the design, emergent behavior, side effects, and environment.

The two types of captured learning correspond respectively to patterns and “antipatterns” or patterns of failure, both of which are discussed in a broader context in the Principles of Systems Thinking and Patterns of Systems Thinking articles.

The use of iterations to refine the values of emergent properties, either across the life of a single system or through the development of patterns encapsulating knowledge gained from multiple developments, applies most easily to the discussion of Strong Emergence above. In this sense, those properties which can be observed but cannot be related to design choices are not relevant to a systems approach. However, they can have value when dealing with a combination of engineering and managed problems (Sillitto 2010), which occur in system of systems contexts; see Systems Approach Applied to Engineered Systems.

References

Works Cited

Abbott, R. 2006. "Emergence Explained: Getting Epiphenomena to Do Real Work". Complexity, 12(1) (September-October): 13-26.

Bedau, M.A. and P. Humphreys (eds.). 2008. Emergence: Contemporary Readings in Philosophy and Science. Cambridge, MA, USA: The MIT Press.

Boccara, N. 2004. Modeling Complex Systems. New York: Springer-Verlag.

Checkland, P. 1999. Systems Thinking, Systems Practice. New York, NY, USA: John Wiley & Sons.

Chroust, G. 2002. "Emergent Properties in Software Systems." In Hofer, C. and G. Chroust (eds.), Proceedings of the 10th Interdisciplinary Information Management Talks, pp. 277-289. Linz, Austria: Verlag Trauner. ISBN 3-85487-424-3.

Chroust, G. 2003. "The Concept of Emergence in Systems Engineering." In Hofer, C. and G. Chroust (eds.), The Eleventh Fuschl Conversation. Vienna, Austria: Reports of the Austrian Society for Cybernetic Studies (OSGK). ISBN 3-85206-166-0.

Emmeche, C., S. Koppe, and F. Stjernfelt. 1997. "Explaining Emergence: Towards an Ontology of Levels." Journal for General Philosophy of Science, 28: 83-119. http://www.nbi.dk/~emmeche/coPubl/97e.EKS/emerg.html.

Francois, C. 2004. International Encyclopedia of Systems and Cybernetics, 2nd ed., 2 volumes. Munich, Germany: K.G. Saur. ISBN 3-598-11630-6.

Hitchins, D. 2007. Systems Engineering: A 21st Century Systems Methodology. Hoboken, NJ, USA: John Wiley & Sons.

Honderich, T. 1995. The Oxford Companion to Philosophy. New York, NY, USA: Oxford University Press.

Hybertson, D. 2009. Model-Oriented Systems Engineering Science: A Unifying Framework for Traditional and Complex Systems. Boca Raton, FL, USA: Auerbach/CRC Press.

Jackson, S., D. Hitchins, and H. Eisner. 2010. "What is the Systems Approach?" INCOSE Insight 13(1) (April 2010): 41-43.

O'Connor, T. and H. Wong. 2006. "Emergent Properties." Stanford Encyclopedia of Philosophy. http://plato.stanford.edu/entries/properties-emergent/.

Page, S.E. 2009. Understanding Complexity. The Great Courses. Chantilly, VA, USA: The Teaching Company.

Ryan, A. 2007. "Emergence is Coupled to Scope, Not Level." Complexity, 13(2) (November-December).

Sheard, S.A. and A. Mostashari. 2008. "Principles of Complex Systems for Systems Engineering." Systems Engineering. 12: 295-311.

Sillitto, H.G. 2010. "Design Principles for Ultra-Large-Scale Systems." Proceedings of the INCOSE International Symposium, Chicago, IL, USA, July 2010. Reprinted in The Singapore Engineer, April 2011.

US-Canada Power System Outage Task Force. 2004. Final Report on the August 14, 2003 Blackout in the United States and Canada: Causes and Recommendations. April, 2004. Washington-Ottawa. Available at https://reports.energy.gov/BlackoutFinal-Web.pdf.

Primary References

Emmeche, C., S. Koppe, and F. Stjernfelt. 1997. "Explaining Emergence: Towards an Ontology of Levels." Journal for General Philosophy of Science, 28: 83-119 (1997). http://www.nbi.dk/~emmeche/coPubl/97e.EKS/emerg.html.

Hitchins, D. 2007. Systems Engineering: A 21st Century Systems Methodology. Hoboken, NJ, USA: John Wiley & Sons.

Page, S. E. 2009. Understanding Complexity. The Great Courses. Chantilly, VA, USA: The Teaching Company.

Additional References

Jackson, S., D. Hitchins, and H. Eisner. 2010. "What is the Systems Approach?" INCOSE Insight. 13(1) (April): 41-43.

Shalizi, C. 2007. Emergent Properties. http://www.cscs.umich.edu/~crshalizi/notebooks/emergentproperties.html.

Sheard, S.A. and A. Mostashari. 2008. "Principles of Complex Systems for Systems Engineering." Systems Engineering. 12: 295-311.

US-Canada Power System Outage Task Force. 2004. Final Report on the August 14, 2003 Blackout in the United States and Canada: Causes and Recommendations. April, 2004. Washington-Ottawa. Available at https://reports.energy.gov/BlackoutFinal-Web.pdf.

