Socio-Technical Features of Systems of Systems
Lead Authors: Judith Dahmann, Mike Henshaw, Bud Lawson, Contributing Authors: Heidi Davidz, Alan Faisandier
In perhaps the earliest reference to Systems of Systems (SoS), Ackoff (1971) describes a concept that is mostly concerned with organizations, i.e. it is social in nature. However, this section is concerned with the socio-technical aspects of technical SoS, which are composed of interdependent resources, such as people, processes, information, and technology, that interact with each other and with their environment in support of a common mission.
The Socio-Technical Nature of Systems of Systems
Rebovich (2009) has captured the essence of the SoS problem as follows:
“From a single-system community’s perspective, its part of the SoS capability represents additional obligations, constraints and complexities. Rarely is participation in an (sic) SoS seen as a net gain from the viewpoint of single-system stakeholders.”
Three of the persistent SoS challenges, or pain points, identified by Dahmann (2015) are directly related to this problem of stakeholder perspective and the local optimization of constituent system performance at the expense, or to the detriment, of the overall SoS performance. These are: SoS Authority; Leadership; and Autonomy, Interdependencies & Emergence. Thus, the sociological aspects affecting decision making and human behaviors must be given similar weight to the technical aspects of SoS.
Turning to views outside Systems Engineering, ergonomists regard socio-technical systems as having the following characteristics (Maguire, 2014):
- They involve collective operational tasks,
- They contain social and technical sub-systems,
- They are open systems (i.e. strongly interacting with their environments), and
- They are unfinished systems, i.e. they continue to evolve.
These are also characteristics of Systems of Systems. Klein (2014) has noted that approaches to socio-technical systems can take the two perspectives of “system affects people” or “people affect system”, depending upon how the system boundary is drawn. It is generally true that consideration of a system's context requires socio-technical aspects to be taken into account.
Although focused largely on IT systems, Baxter and Sommerville (2011) have noted that the introduction of new business SoS is generally carried out in conjunction with a change process. They argue that frequently the social and organizational aspects are disruptive and that inadequate attention is paid to the connection between change processes and systems development processes. They propose two types of Socio-Technical Systems Engineering activities:
- Sensitizing and awareness activities, designed to sensitize stakeholders to the concerns of other stakeholders.
- Constructive engagement activities, which are largely concerned with deriving requirements accurately and meaningfully.
The extent to which these activities can be effective may be challenged by independent management or operation of constituent systems in a SoS.
Although there are many matters concerning the socio-technical aspects of SoS, two important issues are dealt with here. The first is the need for appropriate governance structures, given that operational and/or managerial independence affects top-down direction of the SoS and may compromise achievement of the SoS goal(s). The second is a lack of situational awareness among the managers, operators, or other stakeholders of the SoS, such that they may not understand the impact of their local decisions on the wider SoS.
SoS Governance
Generally, the design and operation of complex systems are concerned with control, but the classification of SoS (Dahmann & Baldwin, 2008) is based on the notion of diminishing central control as the types go from directed to virtual. Sauser et al. (2009) have described the ‘control paradox of SoS’ and asserted that, for SoS, ‘management’ is replaced by ‘governance’: ‘Control is a function of rules, time, and bandwidth; whereas command is a function of trusts, influence, fidelity, and agility’.
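As a small, purely illustrative sketch (an expository assumption, not a construct from Dahmann & Baldwin, 2008, or Sauser et al., 2009), the four SoS types can be represented as an ordered enumeration in which central authority over constituent systems diminishes from directed to virtual, so that reliance on governance rather than management grows as central control decreases:

```python
from enum import IntEnum

class SoSType(IntEnum):
    """SoS types ordered by decreasing central authority over constituents.

    Illustrative only: the numeric ordering is an expository assumption,
    not a quantitative scale from the cited literature.
    """
    DIRECTED = 4       # centrally managed SoS built to fulfil specific purposes
    ACKNOWLEDGED = 3   # recognized objectives and manager; constituents keep independence
    COLLABORATIVE = 2  # constituents voluntarily collaborate on agreed purposes
    VIRTUAL = 1        # no central management authority or centrally agreed purpose

def relies_more_on_governance(a: SoSType, b: SoSType) -> bool:
    """True if type `a` has less central control than `b`, and so must rely
    more on governance (trust and influence) than on management (control)."""
    return a < b

print(relies_more_on_governance(SoSType.VIRTUAL, SoSType.DIRECTED))  # True
```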
Some practitioners have found the Cynefin framework, developed by David Snowden, helpful in understanding the nature of complexity that may arise in SoS. Developed from knowledge management considerations, the framework reflects three reasons proposed by Kurtz and Snowden (2003) for why the behavior of systems involving people may be difficult to predict. Firstly, humans are not limited to one identity, so modelling human behaviors using norms may not be reliable. Secondly, humans are not limited to acting in accordance with predetermined rules. Thirdly, humans are not limited to acting on local patterns. These reasons all undermine control, so the sociological aspects of SoS make their behaviors hard to predict and possibly indeterminate. The Cynefin framework classifies systems into four domains:
- Known – simple systems with predictable and repeatable cause and effect
- Knowable – amenable to systems thinking and analytical/reductionist methods
- Complex – adaptive systems where cause and effect are only discernible in retrospect and do not repeat
- Chaotic – no cause and effect relationships are perceivable
The different types of SoS (directed, acknowledged, collaborative, and virtual) could all be described in any of the above domains, depending on many factors internal to the SoS, but in all cases it is the sociological element of the socio-technical SoS that is most likely to give rise to ambiguity in predicting behavior.
A major governance issue for SoS is understanding the ownership of, and making reliable estimates of, risk (Fovino & Masera, 2007). High levels of connectivity, and the potential for emergent behavior due to the interactions of separately owned or operated constituent systems, mean that significant risks may go unacknowledged and their mitigations unplanned.
In general, governance can be summed up by asking three connected questions (Siemieniuch and Sinclair, 2014):
- Are we doing the right things (leadership)?
- Are we doing those things right (management)?
- How do we know this (metrics and measurements)?
Currently, there is no accepted framework for addressing these questions in a SoS context, but Henshaw et al. (2013) highlighted architectures as an important means through which governance may be clarified. They postulate that a SoS can be regarded as a set of trust and contract relationships between systems (i.e. including both informal and formal relationships). The systems architect of a constituent system must, therefore, address trust issues for each participating organization in the overall enterprise with which his/her system must interoperate. For SoS, technical engineering governance is concerned with defining and ensuring compliance with trust at the interfaces between constituent systems. An example of the difficulty of managing the interfaces in a SoS is provided in the Cassini-Huygens mission case study.
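To make this relationship view concrete, the following minimal sketch (in Python; the class names, relationship kinds, and example systems are illustrative assumptions rather than constructs from Henshaw et al., 2013) represents a SoS as a set of constituent systems connected by formal (contract) and informal (trust) relationships, and lists the interfaces that carry a contractual obligation without an established trust relationship; surfacing such interfaces is one possible starting point for deciding where engineering governance attention is needed.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: the structure below is an assumption for
# exposition, not an implementation from Henshaw et al. (2013).

@dataclass(frozen=True)
class Relationship:
    """A directed relationship between two constituent systems."""
    source: str
    target: str
    kind: str  # "contract" (formal) or "trust" (informal)

@dataclass
class SoSModel:
    systems: set = field(default_factory=set)
    relationships: list = field(default_factory=list)

    def add(self, source: str, target: str, kind: str) -> None:
        """Record a relationship and register both constituent systems."""
        self.systems.update({source, target})
        self.relationships.append(Relationship(source, target, kind))

    def untrusted_contracts(self):
        """Interfaces with a formal contract but no corresponding trust link."""
        contracts = {(r.source, r.target) for r in self.relationships if r.kind == "contract"}
        trust = {(r.source, r.target) for r in self.relationships if r.kind == "trust"}
        return sorted(contracts - trust)

# Hypothetical example: three constituent systems of a notional SoS
sos = SoSModel()
sos.add("AirTrafficControl", "AirlineOps", "contract")
sos.add("AirTrafficControl", "AirlineOps", "trust")
sos.add("AirlineOps", "GroundHandling", "contract")  # no trust link yet
print(sos.untrusted_contracts())  # [('AirlineOps', 'GroundHandling')]
```

In practice such relationships would be captured in an architecture framework rather than code, but the underlying question, namely which interfaces carry obligations without established trust, is the one each constituent system's architect must answer.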
Situational Awareness
Situational awareness is a decision maker’s understanding of the environment in which he/she makes a decision; it concerns information, awareness, perception, and cognition. Endsley (1995) emphasizes that situational awareness is a state of knowledge. There are numerous examples of SoS failure due to the operator of one constituent system making decisions based on inadequate knowledge of the overall SoS (the ‘big picture’).
On the other hand, SoS development is also viewed as a means through which improved situational awareness may be achieved (Van der Laar et al., 2013). In the defense environment, Network Enabled Capability (NEC) was a system-of-systems approach motivated by the objective of making better use of information sharing to achieve military objectives. NEC was predicated on the ability to share useful information effectively among the stakeholders that need it. Improving situational awareness can therefore be expected to improve SoS performance, or at least to reduce the risk of failures at the SoS level. The principles which govern the organization of the SoS should thus support sharing information effectively across the network; in essence, ensuring that every level of the interoperability spectrum is adequately serviced. Operators need insight into the effect that their own local decisions may have on the changing SoS or environment; similarly, they need to understand how external changes will affect the systems that they own.
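A minimal sketch of this 'every level serviced' idea follows (in Python; the level names and the example links are assumptions chosen for illustration, not the SEBoK glossary definition of the interoperability spectrum). For each information-sharing link in the network it reports the levels of a simple ordered spectrum that are not yet serviced:

```python
# Illustrative sketch: the spectrum levels and the link data below are
# hypothetical, chosen only to show the idea of checking coverage of
# every interoperability level for every information-sharing link.

SPECTRUM = ("technical", "syntactic", "semantic", "organizational")

# Hypothetical links between constituent systems, with the levels each
# link currently services.
links = {
    ("Sensor", "CommandPost"): {"technical", "syntactic", "semantic"},
    ("CommandPost", "Logistics"): {"technical", "syntactic"},
}

def coverage_gaps(links):
    """Return, per link, the spectrum levels that are not yet serviced."""
    gaps = {}
    for link, serviced in links.items():
        missing = [level for level in SPECTRUM if level not in serviced]
        if missing:
            gaps[link] = missing
    return gaps

print(coverage_gaps(links))
# {('Sensor', 'CommandPost'): ['organizational'],
#  ('CommandPost', 'Logistics'): ['semantic', 'organizational']}
```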
Increasingly, SoS include constituent systems with high levels of autonomous decision-making ability, a class of system that can be described as cyber-physical systems (of systems); the relationship to SoS is described by Henshaw (2016). Issues arise because autonomy can degrade human situational awareness regarding the behavior of the SoS, and because the autonomous systems within the SoS may themselves have inadequate situational awareness due to a lack of competent models of humans (Sowe et al., 2016).
References
Works Cited
Ackoff, R.L. (1971) “Towards a System of Systems Concepts,” Manage. Sci., vol. 17, no. 11, pp. 661–671.
Baxter, G. and I. Sommerville, (2011) “Socio-technical systems: From design methods to systems engineering,” Interact. Comput., vol. 23, no. 1, pp. 4–17.
Dahmann, J. S. & Baldwin, K. J. (2008) Understanding the Current State of US Defense Systems of Systems and the Implications for Systems Engineering, 2nd Annual IEEE Systems Conference, 1–7. http://doi.org/10.1109/SYSTEMS.2008.4518994
Dahmann, J.S. (2015) “Systems of Systems Characterization and Types,” in Systems of Systems Engineering for NATO Defence Applications (STO-EN-SCI-276), pp. 1–14.
Endsley, M. R. (1995) Toward a Theory of Situation Awareness in Dynamic Systems, J. Human Factors and Ergonomics Soc., 37(1), 32–64. http://doi.org/10.1518/001872095779049543
Fovino, I. N., & Masera, M. (2007) Emergent disservices in interdependent systems and system-of-systems, in Proc. IEEE International Conference on Systems, Man and Cybernetics, Vol. 1, pp. 590–595. http://doi.org/10.1109/ICSMC.2006.384449
Henshaw, M. J. de C., Siemieniuch, C. E., & Sinclair, M. A. (2013) Technical and Engineering Governance in the Context of Systems of Systems, in NATO SCI Symp. Architecture Assessment for NEC (pp. 1–10). Tallinn, Estonia: NATO STO.
Henshaw, M. (2014) A Socio-Technical Perspective on SoSE, in Lecture Series in Systems of Systems Engineering for NATO Defence Applications (SCI-276). NATO CSO.
Henshaw, M. (2016). Systems of Systems, Cyber-Physical Systems, The Internet-of-Things…Whatever Next? INSIGHT, 19(3), pp. 51–54.
Klein, L. (2014) What do we actually mean by ‘sociotechnical’? On values, boundaries and the problems of language, Appl. Ergon., vol. 45, no. 2, Part A, pp. 137–142.
Kurtz, C.F. and D. J. Snowden (2003) “The New Dynamics of Strategy: Sense-making in a Complex-Complicated World,” IBM Syst. J., vol. 42, no. 3, pp. 462–483.
Maguire, M. (2014) Socio-technical systems and interaction design - 21st century relevance, Appl. Ergon., vol. 45, no. 2, Part A, pp. 162–170.
Rebovich, G. (2009) “Enterprise Systems of Systems,” in Systems of Systems Engineering - Principles and Applications, M. Jamshidi, Ed. Boca Raton: CRC Press, pp. 165–191.
Sauser, B., Boardman, J., & Gorod, A. (2009) System of Systems Management, in System of Systems Engineering: Innovations for the 21st Century, M. Jamshidi (Ed.), (pp. 191–217) Wiley.
Siemieniuch, C.E. & Sinclair, M.A. (2014) Extending systems ergonomics thinking to accommodate the socio-technical issues of Systems of Systems, Appl. Ergon., vol. 45, no. 1, pp. 85–98.
Sowe, S.K. et al. (2016) Cyber-Physical-Human Systems - putting people in the loop. IT Professional, 18(February), pp. 10–13.
Van der Laar, P., Tretmans, J., & Borth, M. (2013) Situational Awareness with Systems of Systems. Springer.
Primary References
Checkland, P.B. 1981. Systems Thinking, Systems Practice. Chichester, West Sussex, England, UK: John Wiley & Sons, Ltd.
Additional References
Bruesburg, A., and G. Fletcher. 2009. The Human View Handbook for MODAF, draft version 2, second issue. Bristol, England, UK: Systems Engineering & Assessment Ltd. Available: http://www.hfidtc.com/research/process/reports/phase-2/hv-handbook-issue2-draft.pdf.
IFIP-IFAC Task Force. 1999. "The Generalised Enterprise Reference Architecture and Methodology," V1.6.3. Available: http://www.cit.gu.edu.au/~bernus/taskforce/geram/versions/geram1-6-3/v1.6.3.html.
ISO. 1998. ISO 14258:1998, Industrial automation systems — Concepts and rules for enterprise models. Geneva, Switzerland: International Organization for Standardization.
ISO. 2006. ISO 19439:2006, Enterprise integration — Framework for enterprise modelling. Geneva, Switzerland: International Organization for Standardization.
ISO. 2007. ISO 19440:2007, Enterprise integration — Constructs for enterprise modelling. Geneva, Switzerland: International Organization for Standardization.
Miller, F.P., A.F. Vandome, and J. McBrewster. 2009. Enterprise Modelling. Mauritius: Alphascript Publishing, VDM Verlag Dr. Müller GmbH & Co. KG.