Emergence
According to (Checkland 1999, p. 314), emergence is “the principle that entities exhibit properties which are meaningful only when attributed to the whole, not to its parts.”
Emergence is a consequence of the fundamental system principles of holism and interaction. All systems rest on the simple idea that it is the relationships between the System Elements (glossary) that allow them to be considered as a whole, and that these relationships determine how the whole can be observed to interact with the things around it. All system wholes have behavior and properties arising from the organisation of their elements and their interactions, which may become apparent when the system is placed in different environments.
Questions that arise from this definition include: What kinds of systems exhibit different kinds of emergence, and under what conditions? Can emergence be predicted, and is it beneficial or detrimental to a system? How do we deal with emergence in the development and use of Engineered Systems, and can it be planned for?
There are many varied and even conflicting views on emergence. This article presents what is believed to be the prevailing view. Some references for other views are also provided.
Overview of Emergence
Emergent system behavior is a consequence of the interactions and relationships between system elements rather than of the behavior of individual elements. It emerges from a combination of the behavior and properties of the system elements and the system's structure, or allowable interactions between the elements, and may be triggered or influenced by a stimulus from the system's environment.
According to (Hitchins 2007, p. 27), emergence depends on the concept of holism, which holds that “an open system is a whole” and that “the whole is different from, and may be greater than, the sum of its parts.” (Page 2009) says that emergence “refers to the spontaneous creation of order and functionality from the bottom up.”
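Page's description of emergence as order created “from the bottom up” can be illustrated with a toy computational example. The sketch below (illustrative only; it is not drawn from the cited sources, and the choice of rule and grid size is arbitrary) runs an elementary cellular automaton in which every cell repeatedly applies the same trivial three-neighbour rule, yet the row as a whole develops a structured pattern that no individual cell possesses.

```python
# Minimal illustration of bottom-up emergence: each cell applies the same
# local rule, yet the row as a whole develops structure that no individual
# cell "contains". Rule 30 and the sizes are arbitrary illustrative choices.

RULE = 30    # Wolfram rule number
WIDTH = 64   # number of cells in the (circular) row
STEPS = 32   # number of generations to print

def step(cells, rule=RULE):
    """Apply the elementary cellular-automaton rule to every cell at once."""
    new = []
    for i in range(len(cells)):
        left = cells[(i - 1) % len(cells)]
        centre = cells[i]
        right = cells[(i + 1) % len(cells)]
        index = (left << 2) | (centre << 1) | right   # neighbourhood code 0..7
        new.append((rule >> index) & 1)               # look up the rule bit
    return new

cells = [0] * WIDTH
cells[WIDTH // 2] = 1        # a single "seed" cell
for _ in range(STEPS):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

The triangular, partly irregular pattern printed by this sketch belongs to the interacting row as a whole; in the sense used above, it emerges from the elements, their allowable interactions, and the initial stimulus, not from the rule applied by any single cell.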
All systems can have emergent behavior. Unforeseen emergence can cause unpleasant shocks. Many believe that the main job of the systems approach is to prevent undesired emergence, in order to minimise the risk of unexpected and potentially undesirable outcomes. But good systems engineering also involves maximising opportunity: exploiting emergence to create the required system-level characteristics from synergistic interactions between the components, not just from the components themselves. Understanding and exploiting emergence can make an engineered system simpler, cheaper, more predictable, and potentially safer, with fewer spurious modes arising from unnecessary interactions between components. (Sillitto 2010)
According to (Hitchins 2007, p. 7) emergence is common in nature. The pungent gas ammonia results from the chemical combination of two odorless gases, hydrogen and nitrogen. As parts, feathers, beaks, wings, and gullets do not have the ability to overcome gravity. Properly connected up in a bird, they together create the emergent behavior of flight. The emergent behavior “self-awareness” results from the combined effect of the interconnected and interacting neurons that make up the brain.
Hitchins notes that technological systems also exhibit emergence. A number of levels of outcome can be observed arising from interaction between elements in an engineered system context. At a simple level, some system outcomes or attributes have a fairly simple and well-defined mapping to the system's elements; for example, the centre of gravity or the top speed of a vehicle is a straightforward combination of element properties and the way they are combined. Other behaviors can be associated with these simple outcomes, but their value emerges in complex and less predictable ways across the system; for example, the single-lap performance of a vehicle around a track is related to centre of gravity and top speed but is also affected by driver skill, external conditions, component wear, and so on. Getting the 'best' performance from a vehicle can only be achieved by a combination of good design and practice. There are also outcomes which are less tangible and which come as a surprise to system developers and users: how does lap time translate into a winning motor racing team, and why is a Ferrari sports car more desirable to many than other vehicles with as good or better performance?
Thus, emergence can be observed at the highest level of the system hierarchy. However, (Hitchins 2007, p. 7) also points out that to the extent that the subsystems themselves can be considered systems, they also exhibit emergence. (Page 2009) also refers to emergence as a “macro-level property.” Ryan (2007) contends that emergence is coupled to scope rather than system hierarchical levels. In Ryan’s terms, scope has to do with spatial dimensions rather than hierarchical levels.
(Bedau and Humphreys 2008) provide a comprehensive description of the philosophical and scientific background of emergence. Abbott (2006) does not disagree with the general definition of emergence as discussed above. However, he takes issue with the notion that emergence operates outside the bounds of classical physics. He says that “such higher-level entities…can always be reduced to primitive physical forces.”
Emergent Properties are the resulting properties of a system that displays emergence. A more formal definition of emergent properties, from Honderich (1995, 224), is: “A property of a complex system is said to be ‘emergent’ just in case, although it arises out of the properties and relations characterizing its simpler constituents, it is neither predictable from, nor reducible to, these lower-level characteristics.”
One important group of emergent properties includes properties such as agility (glossary) and resilience (glossary), because these are critical whole-system properties that are not meaningful except at the whole-system level. Emergent properties may or may not be predictable, as discussed below. In general, the more ordered a system is, the easier its emergent properties are to predict; the more complex it is, the more difficult they are to predict. Much of the complex systems literature includes emergence as a defining characteristic of complex systems. For example, Boccara (2004, 3) states that “The appearance of emergent properties is the single most distinguishing feature of complex systems.” On the other hand, some researchers in complex systems call themselves reductionists; Shalizi (2007) is one example.
Note that there is a strong correlation between the kinds of emergent properties and the dimensions of Complexity (glossary) discussed in the related Complexity article.
Types of Emergence
A variety of definitions and types of emergence exist; see Emmeche et al. (1997) and O’Connor and Wong (2006) for discussion of some of the variants. (Page 2009) describes three types of emergence, which he calls “simple,” “weak,” and “strong.” (Some practitioners use the term “emergence” only to refer to the “strong” kind, referring to the other two as “system-level behaviour.” This usage is, however, not consistent with the generally accepted definition used here.)
According to Page (2009), simple emergence is the only type of emergence that can be predicted. Simple emergence occurs in non-complex systems (see the Complexity article). Sheard and Mostashari (2009) refer to such systems as “ordered.” For example, the physics of aircraft flight is well known. To achieve the emergent property of “controlled flight”, all parts of the aircraft need to be considered; it cannot be achieved by considering only the wings, the control system, or the propulsion system alone. All three (plus other elements) must be considered together.
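As a minimal sketch of simple emergence, consider the vehicle centre-of-gravity attribute mentioned earlier. The whole-system property is not held by any single element, yet it maps to the element properties in a simple, fully predictable way (the component names and figures below are invented for illustration).

```python
# Simple (predictable) emergence: the centre of gravity of an assembly is a
# well-defined combination of element properties (mass and position).
# Component names and numbers are invented for illustration.

components = [
    # (name, mass in kg, longitudinal position in m from the front axle)
    ("engine",    180.0, 0.4),
    ("gearbox",    60.0, 1.0),
    ("fuel_tank",  45.0, 2.1),
    ("driver",     75.0, 1.6),
]

total_mass = sum(mass for _, mass, _ in components)
centre_of_gravity = sum(mass * x for _, mass, x in components) / total_mass

print(f"total mass: {total_mass:.1f} kg")
print(f"centre of gravity: {centre_of_gravity:.2f} m behind the front axle")
```

Because the mapping from element properties to the whole-system attribute is known and closed-form, this kind of emergent property can be predicted before the system is built.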
Page (2009) uses the term “weak emergence” to describe emergence which is expected and presumably desired. However, since weak emergence is a product of a complex system, the actual level of emergence cannot be predicted just from knowledge of the characteristics of the individual system components. So how is the desired level of emergence achieved? According to (Jackson et al. 2010), the desired level of emergence is usually achieved by iteration: the design parameters of the system are adjusted until the desired level of emergence is achieved, as determined through simulation or build/test cycles. One form this iteration can take is sketched below.
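In the sketch, the "system" is a toy damped oscillator, and the design parameter, model, target, and adjustment rule are all invented for illustration; the point is only that the whole-system measure (here, settling time) is obtained by simulation, and the design parameter is adjusted until that measure is acceptable.

```python
# Iterate-simulate-adjust: tune a design parameter (damping) until a
# whole-system measure (settling time) reaches its target. The model,
# numbers, and adjustment rule are invented purely for illustration.

def settling_time(damping, mass=1.0, stiffness=40.0, dt=0.001, tol=0.02):
    """Simulate a unit-step response and return the last time the output
    lies outside a +/- 2% band around its final value."""
    x, v, t, last_outside = 0.0, 0.0, 0.0, 0.0
    final = 1.0 / stiffness                      # steady-state displacement
    while t < 10.0:
        a = (1.0 - stiffness * x - damping * v) / mass   # unit step force
        v += a * dt
        x += v * dt
        t += dt
        if abs(x - final) > tol * final:
            last_outside = t
    return last_outside

target = 1.5       # desired settling time in seconds (invented target)
damping = 1.0      # initial design guess
for _ in range(20):                     # iterate: simulate, compare, adjust
    measured = settling_time(damping)
    if abs(measured - target) < 0.15:   # close enough to the target
        break
    damping *= measured / target        # crude proportional adjustment
print(f"chosen damping: {damping:.2f}, settling time: {measured:.2f} s")
```

The same pattern applies whether the evaluation step is a simulation, as here, or a physical build/test cycle; only the cost per iteration changes.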
Page (2009) uses the term “strong emergence” to describe unexpected emergence. That is, the emergence is not observed until the system is simulated or tested – or, more alarmingly, until the system encounters in operation a situation that was not anticipated during design and development. Strong emergence may be evident in failures or shutdowns. For example, the US-Canada blackout of 2003, as described in (US-Canada Power System Outage Task Force 2004), was a case of cascading shutdown that resulted from the design of the system, even though there were no equipment failures. The shutdown was completely systemic (glossary). As Hitchins (2007, p. 15) points out, this example shows that emergent properties are not always beneficial.
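The blackout involved cascading overload rather than component failure. The sketch below uses a generic load-redistribution model (the topology, loads, and capacities are invented, and it is not a model of the actual North American grid) to show how tripping a single line can propagate into a system-wide outage even though every remaining element behaves exactly as designed.

```python
# Toy cascading-failure model: when a line trips, its load is shared among
# the surviving lines; any line pushed over its capacity trips in turn.
# Topology, loads, and capacities are invented for illustration only.

lines = {
    # name: [current load in MW, capacity in MW]
    "A": [90.0, 100.0],
    "B": [80.0, 100.0],
    "C": [85.0, 100.0],
    "D": [70.0, 100.0],
    "E": [95.0, 100.0],
}

def trip(name, state):
    """Remove a line and redistribute its load equally over the survivors."""
    load, _capacity = state.pop(name)
    if state:
        share = load / len(state)
        for remaining in state.values():
            remaining[0] += share

def cascade(first_failure, state):
    """Trip one line, then keep tripping any line that exceeds its capacity."""
    trip(first_failure, state)
    tripped = [first_failure]
    overloaded = [n for n, (load, cap) in state.items() if load > cap]
    while overloaded:
        name = overloaded[0]
        trip(name, state)
        tripped.append(name)
        overloaded = [n for n, (load, cap) in state.items() if load > cap]
    return tripped

failed = cascade("A", {k: list(v) for k, v in lines.items()})
print("cascade sequence:", " -> ".join(failed))
print("lines still in service:", len(lines) - len(failed))
```

With these invented numbers, tripping line A pushes the survivors over capacity one after another until the whole network is out of service; the outcome is systemic, arising from interactions between normally functioning elements rather than from any individual equipment fault.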
A type of system particularly subject to strong emergence is the system of systems (SoS) (glossary). The reason for this is that the SoS, by definition, is composed of different systems that were designed to operate independently. When these systems are operated together, the interaction among the parts of the system is likely to result in unexpected emergence.
Practical Considerations
The requirement to iterate the design to achieve the desired emergence, as discussed above, results in a design process that is itself lengthier than the one needed to design an “ordered” system. The result is that complex systems may be more costly and time-consuming to develop than “ordered” ones, and the cost and time to develop them are inherently less predictable.
Sillitto (2010) observes that “engineering design domains that exploit emergence have good mathematical models of the domain, and rigorously control variability of components and subsystems, and of process, in both design and operation”.
The idea of domain models is explored further in Hybertson (2009) in the context of general models or patterns learned over time and captured in a model space. This orientation reflects the general constraint that knowing what emergence will appear from a given design, including side effects, requires hindsight. For a new type of problem that has not been solved, or a new type of system that has not been built, it is virtually impossible to predict emergent behavior of the solution or system. Some hindsight, or at least some insight, can be obtained by modelling and iterating a specific system design; but iterating the design within the development of one system yields only limited hindsight and often does not give a full sense of emergence and side effects.
The real hindsight and understanding come from building multiple systems of the same type and deploying them, then observing their emergent behavior in operation and the side effects of placing them in their environments. If those observations are done systematically, and the emergence and side effects are distilled and captured in relation to the design of the systems—including the variations in those designs—and made available to the community, then we are in a position to predict and exploit the emergence. The learning that takes place is of two types: what works (that is, what emergent behavior and side effects are desirable); and what does not work (that is, what emergent behavior and side effects are undesirable). What works affirms the design; what does not work calls for corrections in the design. This is why multiple systems—especially if they are complex—must be built and deployed over time and in different environments, to learn and understand the relations among the design, emergent behavior, side effects, and environment.
The two types of captured learning correspond respectively to patterns and “antipatterns” or patterns of failure, both of which are discussed in a broader context in the Principles and Patterns article.
References
Works Cited
Abbott, R. 2006. "Emergence Explained: Getting Epiphenomena to Do Real Work". Complexity, 12(1) (September-October): 13-26.
Bedau, M.A. and P. Humphreys (eds.). 2008. Emergence: Contemporary Readings in Philosophy and Science. Cambridge, MA, USA: The MIT Press.
Boccara, N. 2004. Modeling Complex Systems. New York, NY, USA: Springer-Verlag.
Checkland, P. 1999. Systems Thinking, Systems Practice. New York, NY, USA: John Wiley & Sons.
Emmeche, C., S. Koppe, and F. Stjernfelt. 1997. "Explaining Emergence: Towards an Ontology of Levels." Journal for General Philosophy of Science 28: 83-119. http://www.nbi.dk/~emmeche/coPubl/97e.EKS/emerg.html.
Hitchins, D. 2007. Systems Engineering: A 21st Century Systems Methodology. Hoboken, NJ, USA: John Wiley & Sons.
Honderich, T. 1995. The Oxford Companion to Philosophy. New York, NY, USA: Oxford University Press.
Hybertson, D. 2009. Model-Oriented Systems Engineering Science: A Unifying Framework for Traditional and Complex Systems. Boca Raton, FL, USA: Auerbach/CRC Press.
Jackson, S., D. Hitchins, and H. Eisner. 2010. "What is the Systems Approach?" INCOSE Insight 13(1) (April 2010): 41-43.
O’Connor, T. and H. Wong. 2006. Emergent Properties. Stanford Encyclopedia of Philosophy. http://plato.stanford.edu/entries/properties-emergent/.
Page, S.E. 2009. Understanding Complexity. The Great Courses. Chantilly, VA, USA: The Teaching Company.
Ryan, A. 2007. "Emergence is Coupled to Scope, Not Level." Complexity, 13(2) (November-December).
Shalizi, C. 2007. Emergent Properties. http://www.cscs.umich.edu/~crshalizi/notebooks/emergentproperties.html.
Sheard, S.A. and A. Mostashari. 2009. "Principles of Complex Systems for Systems Engineering." Systems Engineering 12: 295-311.
Sillitto, H.G. 2010. "Design Principles for Ultra-Large-Scale Systems." Proceedings of the INCOSE International Symposium, Chicago, IL, USA, July 2010. Reprinted in The Singapore Engineer, April 2011.
US-Canada Power System Outage Task Force. 2004. Final Report on the August 14, 2003 Blackout in the United States and Canada: Causes and Recommendations. April 2004. Washington-Ottawa. Available at https://reports.energy.gov/BlackoutFinal-Web.pdf.
Primary References
Hitchins, D. 2007. Systems Engineering: A 21st Century Systems Methodology. Hoboken, NJ, USA: John Wiley & Sons.
Page, S. E. 2009. Understanding Complexity. The Great Courses. Chantilly, VA, USA: The Teaching Company.
Additional References
Jackson, S., D. Hitchins, and H. Eisner. 2010. "What is the Systems Approach?" INCOSE Insight. 13(1) (April): 41-43.
Sheard, S.A. and A. Mostashari. 2009. "Principles of Complex Systems for Systems Engineering." Systems Engineering 12: 295-311.
US-Canada Power System Outage Task Force. 2004. Final Report on the August 14, 2003 Blackout in the United States and Canada: Causes and Recommendations. April, 2004. Washington-Ottawa. Available at https://reports.energy.gov/BlackoutFinal-Web.pdf.