Analysis and Selection between Alternative Solutions
Lead Author: Rick Adcock, Contributing Authors: Brian Wells, Scott Jackson, Janet Singer, Duane Hybertson
This topic is part of the Systems Approach Applied to Engineered Systems knowledge area (KA). It describes knowledge related to the analysis and selection of a preferred solution from the possible options, which may have been proposed by Synthesizing Possible Solutions. Selected solution options may form the starting point for Implementing and Proving a Solution. Any of the activities described below may also need to be considered concurrently with other activities in the systems approach at a particular point in the life of a system-of-interest (SoI).
The activities described below should be considered in the context of the Overview of the Systems Approach topic at the start of this KA. The final topic in this KA, Applying the Systems Approach, considers the dynamic aspects of how these activities are used as part of the systems approach and how this relates in detail to elements of systems engineering (SE).
System Analysis
System analysis is an activity in the systems approach that evaluates the system artifacts created during Synthesizing Possible Solutions through activities such as:
- Defining assessment criteria based on the required properties and behavior of an identified problem or opportunity system situation.
- Assessing the properties and behavior of each candidate solution against those criteria.
- Comparing the assessments of the candidate solutions, identifying any that could resolve the problem or exploit the opportunity, and selecting the candidates that should be explored further (a minimal sketch of this screening is given after the list).
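The following is a minimal sketch, in Python, of how these screening activities might be represented; the criterion names, thresholds, and candidate figures are illustrative assumptions rather than values defined in this topic.

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    """An assessment criterion derived from the problem or opportunity context."""
    name: str
    threshold: float          # acceptable limit, in the same units as the measured property
    higher_is_better: bool = True

def meets(criterion: Criterion, measured: float) -> bool:
    """Check one measured property or behavior against its criterion."""
    if criterion.higher_is_better:
        return measured >= criterion.threshold
    return measured <= criterion.threshold

def screen(candidates: dict[str, dict[str, float]], criteria: list[Criterion]) -> list[str]:
    """Return the candidate solutions whose assessed properties satisfy every criterion."""
    return [
        name for name, properties in candidates.items()
        if all(meets(c, properties[c.name]) for c in criteria)
    ]

# Hypothetical criteria and candidate assessments
criteria = [
    Criterion("availability", 0.99),                        # at least 99% availability
    Criterion("unit_cost", 1200.0, higher_is_better=False), # at most $1200 per unit
]
candidates = {
    "Option A": {"availability": 0.995, "unit_cost": 1100.0},
    "Option B": {"availability": 0.985, "unit_cost": 900.0},
}
print(screen(candidates, criteria))  # -> ['Option A']
```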
As discussed in the Synthesizing Possible Solutions topic, the problem context for an engineered system will include a logical or ideal system solution description. It is assumed that the solution that “best” matches the ideal one will be the most acceptable solution to the stakeholders. Note that, as discussed below, the “best” solution should include an understanding of cost and risk, as well as effectiveness. The problem context may include a soft system conceptual model describing the logical elements of a system to resolve the problem situation and how these are perceived by different stakeholders (Checkland 1999). This soft context view will provide additional criteria for the analysis process, which may become the critical issue in selecting between two equally effective solution alternatives.
Hence, analysis is often not a one-time process of solution selection; rather, it is used in combination with problem understanding and solution synthesis to progress towards a more complete understanding of problems and solutions over time (see the Applying the Systems Approach topic for a more complete discussion of the dynamics of this aspect of the approach).
Effectiveness Analysis
Effectiveness studies use the problem or opportunity system context as a starting point.
The effectiveness of a synthesized system solution will include performance criteria associated with both the system’s primary and enabling functions. These are derived from the system’s purpose to enable the realization of stakeholder needs in one or more wider system contexts.
For a product system, there is a set of generic non-functional qualities associated with different types of solution patterns or technology, e.g., safety, security, reliability, maintainability, usability, etc. These criteria are often explicitly stated as part of the domain knowledge of the related technical disciplines in those technology domains.
For a service system or enterprise system, the criteria will be more directly linked to the identified user needs or enterprise goals. Typical qualities for such systems include agility, resilience, flexibility, upgradeability, etc.
In addition to assessments of the absolute effectiveness of a given solution system, systems engineers must also be able to combine effectiveness with the limitations of cost and timescales included in the problem context. In general, the role of system analysis is to identify the proposed solutions which can provide some effectiveness within the cost and time allocated to any given iteration of the systems approach (see Applying the Systems Approach for details). If none of the solutions can deliver an effectiveness level that justifies the proposed investment, then it is necessary to return to the original framing of the problem. If at least one solution is assessed as sufficiently effective, then a choice between solutions can be proposed.
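As a sketch of the decision described above (the function name, effectiveness scores, and cost and time allocations below are assumptions used only for illustration), the viable candidates for a given iteration might be identified as follows:

```python
def solutions_worth_pursuing(assessments, cost_allocated, time_allocated, min_effectiveness):
    """Return the candidates offering acceptable effectiveness within the cost and
    time allocated to this iteration of the systems approach.

    `assessments` maps a candidate name to (effectiveness, cost, time) estimates.
    """
    return {
        name: (eff, cost, time)
        for name, (eff, cost, time) in assessments.items()
        if eff >= min_effectiveness and cost <= cost_allocated and time <= time_allocated
    }

# Hypothetical assessments: (effectiveness score, cost in $k, duration in months)
assessments = {
    "Option A": (0.80, 450, 10),
    "Option B": (0.65, 300, 8),
    "Option C": (0.90, 700, 14),
}
viable = solutions_worth_pursuing(assessments, cost_allocated=500,
                                  time_allocated=12, min_effectiveness=0.7)
if not viable:
    print("No solution justifies the investment - revisit the framing of the problem.")
else:
    print("Candidates to carry into trade-off studies:", sorted(viable))
# -> Candidates to carry into trade-off studies: ['Option A']
```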
Trade-Off Studies
In the context of the definition of a system, a trade-off study consists of comparing the characteristics of each candidate system element to those of each candidate system architecture in order to determine the solution that best balances the assessment criteria overall. The various characteristics analyzed are gathered in cost analysis, technical risk analysis, and effectiveness analysis (NASA 2007). A trade-off study can be accomplished using a variety of methods, often supported by tooling, and typically involves the elements described below (a minimal weighted-scoring sketch follows the list):
- Assessment criteria are used to classify the various candidate solutions. They are either absolute or relative; for example, the maximum cost per unit produced shall be $c, cost reduction shall be x%, effectiveness improvement shall be y%, and risk mitigation shall be z%.
- Boundaries identify and limit the characteristics or criteria to be taken into account at the time of analysis (e.g., the kind of costs to be taken into account, acceptable technical risks, and the type and level of effectiveness).
- Scales are used to quantify the characteristics, properties, and/or criteria and to make comparisons. Their definition requires knowledge of the highest and lowest limits, as well as the type of evolution of the characteristic (linear, logarithmic, etc.).
- An assessment score is assigned to a characteristic or criterion for each candidate solution. The goal of the trade-off study is to quantify the three variables of cost, risk, and effectiveness (and their decomposition into sub-variables) for each candidate solution. This operation is generally complex and requires the use of models.
- The optimization of the characteristics or properties improves the scoring of interesting solutions.
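A minimal weighted-scoring sketch of these elements is shown below, in Python; the criteria, weights, scales, and candidate values are assumptions chosen for illustration, and real trade-off studies normally rely on richer models and dedicated tooling.

```python
# Criteria carry a weight and a direction; raw values are mapped onto a common
# linear 0-1 scale before the weighted composite score is computed.
criteria = {
    # name: (weight, higher_is_better)
    "effectiveness": (0.5, True),
    "cost":          (0.3, False),
    "risk":          (0.2, False),
}

candidates = {
    # name: raw assessed value per criterion (hypothetical figures)
    "Option A": {"effectiveness": 0.80, "cost": 450, "risk": 0.20},
    "Option B": {"effectiveness": 0.65, "cost": 300, "risk": 0.30},
    "Option C": {"effectiveness": 0.90, "cost": 700, "risk": 0.45},
}

def normalize(values: dict[str, float], higher_is_better: bool) -> dict[str, float]:
    """Map raw values onto a linear 0-1 scale, where 1 is best."""
    lo, hi = min(values.values()), max(values.values())
    span = (hi - lo) or 1.0
    return {k: ((v - lo) / span if higher_is_better else (hi - v) / span)
            for k, v in values.items()}

def trade_off_scores(candidates, criteria) -> dict[str, float]:
    """Compute a weighted composite score for each candidate solution."""
    scores = {name: 0.0 for name in candidates}
    for crit, (weight, higher) in criteria.items():
        scaled = normalize({name: vals[crit] for name, vals in candidates.items()}, higher)
        for name in candidates:
            scores[name] += weight * scaled[name]
    return scores

# With these illustrative figures, Option A ranks first.
for name, score in sorted(trade_off_scores(candidates, criteria).items(),
                          key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {score:.2f}")
```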
Decision making is not an exact science, and trade-off studies therefore have limits. The following concerns should be taken into account:
- Subjective Criteria – personal bias of the analyst; for example, if the component has to be beautiful, what constitutes a “beautiful” component?
- Uncertain Data – for example, inflation has to be taken into account to estimate the cost of maintenance during the complete life cycle of a system; how can a systems engineer predict the evolution of inflation over the next five years?
- Sensitivity Analysis – The global assessment score assigned to each candidate solution is not absolute; it is therefore recommended that the robustness of the selection be checked by performing a sensitivity analysis that considers small variations of the assessment criteria values (weights). The selection is robust if these variations do not change the order of the scores (a minimal sketch of such a check is given after this list).
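A minimal sketch of such a sensitivity check is given below; the pre-scaled scores, the weights, and the ±0.05 perturbation size are assumptions for illustration only.

```python
import itertools

# Hypothetical pre-scaled (0-1) assessment scores per candidate and criterion,
# such as those produced by a weighted-scoring trade-off study.
scaled_scores = {
    "Option A": {"effectiveness": 0.9, "cost": 0.7, "risk": 0.8},
    "Option B": {"effectiveness": 0.6, "cost": 1.0, "risk": 0.5},
    "Option C": {"effectiveness": 0.3, "cost": 0.2, "risk": 1.0},
}
weights = {"effectiveness": 0.5, "cost": 0.3, "risk": 0.2}

def ranking(weights):
    """Order the candidates by weighted composite score, best first."""
    totals = {name: sum(weights[c] * s for c, s in scores.items())
              for name, scores in scaled_scores.items()}
    return sorted(totals, key=totals.get, reverse=True)

def selection_is_robust(weights, delta=0.05):
    """Check whether small weight perturbations (+/- delta, renormalized to sum
    to 1) leave the rank order of the candidate solutions unchanged."""
    baseline = ranking(weights)
    for crit, sign in itertools.product(weights, (+1, -1)):
        perturbed = dict(weights)
        perturbed[crit] = max(0.0, perturbed[crit] + sign * delta)
        total = sum(perturbed.values())
        perturbed = {c: w / total for c, w in perturbed.items()}
        if ranking(perturbed) != baseline:
            return False
    return True

print("Baseline ranking:", ranking(weights))
print("Selection robust to +/-5% weight changes:", selection_is_robust(weights))
```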
A thorough trade-off study specifies the assumptions, variables, and confidence intervals of the results.
Systems Principles of System Analysis
From the discussions above, the following general principles of systems analysis can be defined:
- Systems analysis is an iterative activity consisting of trade studies between the various solution options arising from the systems synthesis activity.
- Systems analysis uses assessment criteria based upon a problem or opportunity system description.
- These criteria will be based around an ideal system description that assumes a hard system problem context can be defined.
- The criteria must consider required system behavior and properties of the complete solution in all of the possible wider system contexts and environments.
- Trade studies require equal consideration of the primary system and the enabling systems, working as a single system to address the user need. These studies need to consider system requirements for Key Performance Parameters (KPPs), system safety, security, and affordability across the entire life cycle.
- This ideal system description may be supported by soft system descriptions from which additional “soft” criteria may be defined (e.g., a stakeholder preference for or against certain kinds of solutions and relevant social, political, or cultural conventions to be considered in the likely solution environment, etc.).
- At a minimum, the assessment criteria should include the constraints on cost and time scales acceptable to stakeholders.
- Trade studies provide a mechanism for conducting analysis of alternative solutions.
- A trade study should consider a “system of assessment criteria,” paying special attention to the limitations of and dependencies between individual criteria.
- Trade studies need to deal with both objective and subjective criteria. Care must be taken to assess the sensitivity of the overall assessment to particular criteria.
References
Works Cited
Checkland, P.B. 1999. Systems Thinking, Systems Practice. Chichester, UK: John Wiley & Sons Ltd.
NASA. 2007. Systems Engineering Handbook, Revision 1. Washington, D.C., USA: National Aeronautics and Space Administration (NASA). NASA/SP-2007-6105.
Primary References
ISO/IEC/IEEE. 2015. Systems and Software Engineering -- System Life Cycle Processes. Geneva, Switzerland: International Organisation for Standardisation/International Electrotechnical Commission/Institute of Electrical and Electronics Engineers. ISO/IEC/IEEE 15288:2015.
Jackson, S., D. Hitchins and H. Eisner. 2010. "What is the systems approach?" INCOSE Insight, vol. 13, no. 1, April, pp. 41-43.
Additional References
None.