Decision Management

The purpose of [[Decision Management (glossary)|decision management]] is to analyze possible decisions using a formal evaluation process that evaluates identified alternatives against established [[Decision Criteria (glossary)|decision criteria]]. It involves establishing guidelines to determine which issues should be subjected to a formal evaluation process, and then applying formal evaluation processes to these issues.
'''''Lead Author:''''' ''Ray Madachy'', '''''Contributing Authors:''''' ''Garry Roedler, Greg Parnell, Scott Jackson''
----
Many systems engineering decisions are difficult because they include numerous stakeholders, multiple competing objectives, substantial uncertainty, and significant consequences.  In these cases, good decision making requires a formal {{Term|Decision Management (glossary)|decision management}} process.  The purpose of the decision management process is:
<blockquote>“…to provide a structured, analytical framework for objectively identifying, characterizing and evaluating a set of alternatives for a decision at any point in the life cycle and select the most beneficial course of action.”([[ISO/IEC/IEEE 15288]])</blockquote>
Decision situations ({{Term|Opportunity (glossary)|opportunities}}) are commonly encountered throughout a {{Term|Life Cycle (glossary)|system’s lifecycle}}. The decision management method most commonly employed by systems engineers is the trade study.  Trade studies aim to define, measure, and assess shareholder and {{Term|stakeholder (glossary)|stakeholder}} {{Term|Value (glossary)|value}} to facilitate the decision maker’s search for an alternative that represents the best balance of competing objectives.  By providing techniques for decomposing a trade decision into logical segments and then synthesizing the parts into a coherent whole, a decision management process allows the decision maker to work within human cognitive limits without oversimplifying the problem. Furthermore, by decomposing the overall decision problem, experts can provide assessments of alternatives in their area of expertise. 
==Decision Management Process==
The decision analysis process is depicted in Figure 1 below.  The decision management process is based on several best practices, including:
*Utilizing sound mathematical techniques of decision analysis for trade studies. Parnell (2009) provides a list of decision analysis concepts and techniques.
*Developing one master decision model, followed by its refinement, update, and use, as required for trade studies throughout the system life cycle.
*Using Value-Focused Thinking (Keeney 1992) to create better alternatives.
*Identifying uncertainty and assessing risks for each decision.
[[File:Decision_Mgt_Process_DM.png|thumb|center|500px|center|'''Figure 1. Decision Management Process (INCOSE DAWG 2013).''' Permission granted by Matthew Cilli who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.]]
The center of the diagram shows the five trade space objectives (listed clockwise): Performance, Growth Potential, Schedule, Development & Procurement Costs, and Sustainment Costs. The ten blue arrows represent the decision management process activities, and the white text within the green ring represents SE process elements. Interactions are represented by the small dotted green or blue arrows. The decision analysis process is iterative.
A hypothetical UAV decision problem is used to illustrate each of the activities in the following sections.
===Framing and Tailoring the Decision===
To ensure the decision team fully understands the decision context, the analyst should describe the system baseline, boundaries and interfaces.  The decision context includes: the system definition, the life cycle stage, decision milestones, a list of decision makers and stakeholders, and available resources. The best practice is to identify a decision problem statement that defines the decision in terms of the system life cycle.
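
The elements of the decision context listed above can be captured in a simple record so that they are stated explicitly and reviewed with the decision makers. The sketch below is illustrative only; the field names and the UAV entries are assumptions chosen to mirror this section, not a prescribed SEBoK schema.

<syntaxhighlight lang="python">
from dataclasses import dataclass, field

@dataclass
class DecisionFrame:
    """Hypothetical record of the decision context described above."""
    problem_statement: str                  # the decision, stated in terms of the system life cycle
    system_definition: str                  # baseline, boundaries, and interfaces
    life_cycle_stage: str
    decision_milestones: list = field(default_factory=list)
    decision_makers: list = field(default_factory=list)
    stakeholders: list = field(default_factory=list)
    available_resources: str = ""

uav_frame = DecisionFrame(
    problem_statement="Select a UAV configuration for the next development increment",
    system_definition="UAV air vehicle, ground control station, and data links",
    life_cycle_stage="Concept",
    decision_milestones=["Alternatives review", "Decision briefing"],
    decision_makers=["Program manager"],
    stakeholders=["Operators", "Maintainers", "Sponsor"],
)
</syntaxhighlight>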
===Developing Objectives and Measures===
Defining how an important decision will be made is difficult.  As Keeney (2002) puts it:
<blockquote>''Most important decisions involve multiple objectives, and usually with multiple-objective decisions, you can't have it all.  You will have to accept less achievement in terms of some objectives in order to achieve more on other objectives.  But how much less would you accept to achieve how much more?''</blockquote>
The first step is to develop objectives and measures using interviews and focus groups with subject matter experts (SMEs) and stakeholders.
For systems engineering trade-off analyses, stakeholder value often includes competing objectives of performance, development schedule, unit cost, support costs, and growth potential.  For corporate decisions, shareholder value would also be added to this list. For performance, a functional decomposition can help generate a thorough set of potential objectives.  Test this initial list of fundamental objectives by checking that each fundamental objective is essential and controllable and that the set of objectives is complete, non-redundant, concise, specific, and understandable (Edwards et al. 2007).  Figure 2 provides an example of an objectives hierarchy.
[[File:Fund_Obj_Hierarchy_DM.png|thumb|center|650px|center|'''Figure 2. Fundamental Objectives Hierarchy (INCOSE DAWG 2013).''' Permission granted by Matthew Cilli who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.]]
For each objective, a measure must be defined to assess the value of each alternative for that objective. A measure (also called an attribute, criterion, or metric) must be unambiguous, comprehensive, direct, operational, and understandable (Keeney & Gregory 2005).
A defining feature of multi-objective decision analysis is the transformation from measure space to value space. This transformation is performed by a value function, which shows the returns to scale over the measure range. When creating a value function, the walk-away point on the measure scale (x-axis) must be ascertained and mapped to a value of 0 on the value scale (y-axis). A walk-away point is the measure score at which, regardless of how well an alternative performs in other measures, the decision maker will walk away from the alternative. Similarly, working with the user, the analyst finds the measure score beyond which an alternative provides no additional value, labels it the "stretch goal" (ideal), and maps it to 100 (or to 1 or 10) on the value scale (y-axis). Figure 3 provides the most common value curve shapes. The rationale for the shape of each value function should be documented for traceability and defensibility (Parnell et al. 2011).
[[File:Value_Function_Example_DM.png|thumb|center|750px|center|'''Figure 3. Value Function Examples (INCOSE DAWG 2013).''' Permission granted by Matthew Cilli who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.]]
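
As a minimal sketch of the transformation from measure space to value space, the function below assumes simple linear returns to scale between the walk-away point and the stretch goal; an actual study would select one of the curve shapes in Figure 3 and document the rationale. The endurance numbers are hypothetical and are not taken from the UAV example.

<syntaxhighlight lang="python">
def linear_value(score, walk_away, stretch_goal):
    """Map a measure score to value: walk-away point -> 0, stretch goal -> 100.

    Assumes 'more is better'; for 'less is better' measures, swap the endpoints.
    """
    if stretch_goal == walk_away:
        raise ValueError("walk-away point and stretch goal must differ")
    fraction = (score - walk_away) / (stretch_goal - walk_away)
    return 100 * min(max(fraction, 0.0), 1.0)   # clip outside the defined range

# Hypothetical measure: endurance in hours, walk away below 8 h, no added value above 24 h
print(linear_value(8, 8, 24))    # 0.0
print(linear_value(16, 8, 24))   # 50.0
print(linear_value(30, 8, 24))   # 100.0
</syntaxhighlight>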
The mathematics of multiple objective decision analysis (MODA) requires that the weights depend on the importance of the measure and on the range of the measure (walk-away to stretch goal). A useful tool for determining priority weighting is the swing weight matrix (Parnell et al. 2011). For each measure, consider its importance by determining whether the measure corresponds to a defining, critical, or enabling function, consider the gap between the current capability and the desired capability, and then put the name of the measure in the appropriate cell of the matrix (Figure 4). The highest priority weighting is placed in the upper-left corner and assigned an unnormalized weight of 100. The unnormalized weights decrease monotonically to the right and down the matrix. Swing weights are then assessed by comparing them to the most important value measure or to another already-assessed measure. Finally, the swing weights are normalized to sum to one for the additive value model used to calculate value in a subsequent section.
[[File:Swing_Weight_Matrix_DM.png|thumb|center|750px|center|'''Figure 4. Swing Weight Matrix (INCOSE DAWG 2013).''' Permission granted by Gregory Parnell who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.]]
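
The normalization step described above is straightforward; the sketch below assumes hypothetical measure names and unnormalized weights read off a swing weight matrix (100 in the upper-left cell, monotonically decreasing elsewhere) and divides by their sum so that the weights used in the additive value model sum to one.

<syntaxhighlight lang="python">
# Hypothetical unnormalized swing weights read off a swing weight matrix
unnormalized = {
    "Endurance": 100,          # upper-left cell: defining function, largest capability gap
    "Payload capacity": 80,
    "Unit cost": 60,
    "Maintainability": 40,
}

total = sum(unnormalized.values())
swing_weights = {measure: weight / total for measure, weight in unnormalized.items()}

print(swing_weights)                  # {'Endurance': 0.357..., 'Payload capacity': 0.285..., ...}
print(sum(swing_weights.values()))    # 1.0
</syntaxhighlight>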
===Generating Creative Alternatives===
To help generate a creative and comprehensive set of alternatives that span the decision space, consider developing an alternative generation table (also called a morphological box) (Buede 2009; Parnell et al. 2011). It is a best practice to establish a meaningful product structure for the system and to report alternatives against it consistently in all decision presentations (Figure 5).
[[File:Descript_of_Alt_DM.png|thumb|center|750px|center|'''Figure 5. Descriptions of Alternatives (INCOSE DAWG 2013).''' Permission granted by Matthew Cilli who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.]]
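
One way to see how an alternative generation table spans the decision space is to enumerate its combinations: each column lists candidate choices for one element of the product structure, and the Cartesian product yields candidate alternatives for screening. The design choices below are hypothetical UAV options, not those of Figure 5.

<syntaxhighlight lang="python">
from itertools import product

# Hypothetical alternative generation table (morphological box)
design_choices = {
    "Airframe": ["Fixed wing", "Quad rotor", "Hybrid VTOL"],
    "Propulsion": ["Electric", "Gasoline"],
    "Sensor package": ["EO only", "EO/IR", "EO/IR + radar"],
}

alternatives = [
    dict(zip(design_choices, combination))
    for combination in product(*design_choices.values())
]

print(len(alternatives))   # 3 * 2 * 3 = 18 candidate alternatives
print(alternatives[0])     # {'Airframe': 'Fixed wing', 'Propulsion': 'Electric', 'Sensor package': 'EO only'}
</syntaxhighlight>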
  
===Assessing Alternatives via Deterministic Analysis===
  
With objectives and measures established and alternatives defined, the decision team should engage SMEs equipped with operational data, test data, simulations, models, and expert knowledge. Scores are best captured on scoring sheets for each alternative/measure combination, documenting the source and rationale for each score. Figure 6 provides a summary of the scores.
[[File:ALT_Scores_DM.png|thumb|center|750px|center|'''Figure 6. Alternative Scores (INCOSE DAWG 2013).''' Permission granted by Richard Swanson who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.]]
Note that in addition to the identified alternatives, the score matrix includes a row for the ideal alternative. The ideal is a tool for value-focused thinking, which is covered later.
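
A small sketch of a score matrix with an ideal row is shown below. The alternatives, measures, and scores are hypothetical, and the ideal row here is built from the best score achieved on each measure; another common convention is to use the stretch goals.

<syntaxhighlight lang="python">
# Hypothetical raw scores: alternative -> {measure: score}
scores = {
    "Alternative A": {"Endurance (h)": 12, "Payload (kg)": 20, "Unit cost ($M)": 4.0},
    "Alternative B": {"Endurance (h)": 18, "Payload (kg)": 15, "Unit cost ($M)": 5.5},
}

# Direction of preference for each measure
higher_is_better = {"Endurance (h)": True, "Payload (kg)": True, "Unit cost ($M)": False}

# Ideal row: the best score achieved by any alternative, measure by measure
scores["Ideal"] = {
    measure: (max if higher_is_better[measure] else min)(alt[measure] for alt in scores.values())
    for measure in higher_is_better
}

print(scores["Ideal"])   # {'Endurance (h)': 18, 'Payload (kg)': 20, 'Unit cost ($M)': 4.0}
</syntaxhighlight>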
===Synthesizing Results===
Next, the scores can be transformed into a value table using the value functions developed previously. A color heat map can be useful to visualize value tradeoffs between alternatives and to identify where alternatives need improvement (Figure 7).
  
[[File:Value_Scorecard_w_Heat_Map_DM.png|thumb|center|850px|center|'''Figure 7. Value Scorecard with Heat Map (INCOSE DAWG 2013).''' Permission granted by Richard Swanson who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.]]
  
The additive value model uses the following equation to calculate each alternative’s value:
  
[[File:Eq_1.jpg|200px]]
  
where
  
[[File:Eq_2.jpg|400px]]
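
If the equation images above do not reproduce, the standard form of the additive value model that they represent (see, e.g., Parnell et al. 2011) can be written as:

<math>v(x) = \sum_{i=1}^{n} w_i \, v_i(x_i), \qquad \sum_{i=1}^{n} w_i = 1</math>

where <math>n</math> is the number of value measures, <math>x_i</math> is the alternative's score on measure <math>i</math>, <math>v_i(x_i)</math> is the single-dimensional value of that score (for example, on a 0 to 100 scale), and <math>w_i</math> is the swing weight of measure <math>i</math>. The notation here is the conventional one and is not transcribed from the original equation images.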
The value component chart (Figure 8) shows the total value and the weighted value measure contribution of each alternative (Parnell et al. 2011).  
  
[[File:Value_Comp_Graph_DM.png|thumb|center|700px|center|'''Figure 8. Value Component Graph (INCOSE DAWG 2013).''' Permission granted by Richard Swanson who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.]]
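
The calculation behind a value component chart can be sketched as follows: each alternative's total value is the sum over the measures of swing weight times single-dimensional value, so the weighted terms themselves are the stacked contributions plotted in Figure 8. The single-dimensional values below are hypothetical, and the swing weights are the rounded ones from the earlier sketch.

<syntaxhighlight lang="python">
# Hypothetical single-dimensional values (0-100) already obtained from the value functions
values = {
    "Alternative A": {"Endurance": 40, "Payload capacity": 90, "Unit cost": 70, "Maintainability": 50},
    "Alternative B": {"Endurance": 80, "Payload capacity": 60, "Unit cost": 40, "Maintainability": 65},
}
swing_weights = {"Endurance": 0.357, "Payload capacity": 0.286, "Unit cost": 0.214, "Maintainability": 0.143}

for alternative, vals in values.items():
    contributions = {m: swing_weights[m] * vals[m] for m in swing_weights}   # per-measure weighted value
    total = sum(contributions.values())
    print(alternative, round(total, 1), {m: round(c, 1) for m, c in contributions.items()})
</syntaxhighlight>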
  
The heart of a decision management process for systems engineering trade-off analysis is the ability to assess all dimensions of shareholder and stakeholder value. The stakeholder value scatter plot in Figure 9 shows five dimensions for all alternatives: unit cost, performance, development risk, growth potential, and operation and support costs.
  
[[File:Ex_Stakeholder_Value_Scat_DM.png|thumb|center|700px|center|'''Figure 9. Example of a Stakeholder Value Scatterplot (INCOSE DAWG 2013).''' Permission granted by Richard Swanson who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.]]
  
Each system alternative is represented by a scatter plot marker (Figure 9). An alternative’s unit cost and performance value are indicated by x and y positions respectively. An alternative’s development risk is indicated by the color of the marker (green = low, yellow= medium, red = high), while the growth potential is shown as the number of hats above the circular marker (1 hat = low, 2 hats = moderate, 3 hats = high).
  
===Identifying Uncertainty and Conducting Probabilistic Analysis===
As part of the assessment, the SME should discuss the potential uncertainty of the independent variables, i.e., the variables that impact one or more scores. Many times the SME can assess an upper, nominal, and lower bound by assuming low, moderate, and high performance. Using this data, a Monte Carlo simulation summarizes the impact of the uncertainties and can identify the uncertainties that have the most impact on the decision.
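
A minimal Monte Carlo sketch of this step is shown below, assuming for illustration that each uncertain score follows a triangular distribution between the SME's lower and upper bounds with the nominal value as the mode; the samples are pushed through simple linear value functions and swing weights, and the resulting distribution of value is summarized. All bounds, endpoints, and weights are hypothetical.

<syntaxhighlight lang="python">
import random

random.seed(1)

# SME-assessed (lower, nominal, upper) raw scores for one alternative
bounds = {"Endurance": (10, 14, 20), "Payload capacity": (12, 16, 18)}
swing_weights = {"Endurance": 0.6, "Payload capacity": 0.4}
# Walk-away and stretch-goal endpoints for simple linear value functions
endpoints = {"Endurance": (8, 24), "Payload capacity": (10, 25)}

def linear_value(score, walk_away, stretch_goal):
    fraction = (score - walk_away) / (stretch_goal - walk_away)
    return 100 * min(max(fraction, 0.0), 1.0)

totals = []
for _ in range(10_000):
    total = 0.0
    for measure, (low, nominal, high) in bounds.items():
        score = random.triangular(low, high, nominal)    # sample the uncertain score
        total += swing_weights[measure] * linear_value(score, *endpoints[measure])
    totals.append(total)

totals.sort()
print("mean value:", round(sum(totals) / len(totals), 1))
print("10th-90th percentile:", round(totals[1000], 1), "-", round(totals[9000], 1))
</syntaxhighlight>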
  
===Assessing Impact of Uncertainty - Analyzing Risk and Sensitivity===
  
Decision analysis uses many forms of sensitivity analysis, including line diagrams, tornado diagrams, and waterfall diagrams, and several uncertainty analyses, including Monte Carlo simulation, decision trees, and influence diagrams (Parnell et al. 2013). A line diagram is used to show the sensitivity to the swing weight judgment (Parnell et al. 2011). Figure 10 shows the results of a Monte Carlo simulation of performance value.
  
[[File:Uncertainty_on_Perf_Value_from_Monte_DM.png|thumb|center|700px|center|'''Figure 10. Uncertainty on Performance Value from Monte Carlo Simulation (INCOSE DAWG 2013).''' Permission granted by Matthew Cilli who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.]]
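
A sketch of the one-way swing weight sensitivity behind a line diagram is shown below: one weight is swept from 0 to 1 while the other weights keep their relative proportions (one common convention, assumed here), and each alternative's total value is recomputed to see where the preferred alternative changes. The values and weights are the hypothetical ones from the earlier sketches.

<syntaxhighlight lang="python">
values = {
    "Alternative A": {"Endurance": 40, "Payload capacity": 90, "Unit cost": 70, "Maintainability": 50},
    "Alternative B": {"Endurance": 80, "Payload capacity": 60, "Unit cost": 40, "Maintainability": 65},
}
base_weights = {"Endurance": 0.357, "Payload capacity": 0.286, "Unit cost": 0.214, "Maintainability": 0.143}

def total_value(vals, weights):
    return sum(weights[m] * vals[m] for m in weights)

swept_measure = "Endurance"
others = {m: w for m, w in base_weights.items() if m != swept_measure}
others_sum = sum(others.values())

for w in [0.0, 0.2, 0.357, 0.5, 0.8, 1.0]:
    # Re-normalize the remaining weights so that all weights still sum to one
    weights = {m: (1 - w) * v / others_sum for m, v in others.items()}
    weights[swept_measure] = w
    best = max(values, key=lambda alt: total_value(values[alt], weights))
    print(f"w(Endurance) = {w:.3f}: preferred alternative = {best}")
</syntaxhighlight>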
  
===Improving Alternatives===
  
Mining the data generated for the alternatives will likely reveal opportunities to modify some design choices to claim untapped value and/or reduce risk.  Taking advantage of initial findings to generate new and creative alternatives starts the process of transforming the decision process from "alternative-focused thinking" to "value-focused thinking" (Keeney 1993).
===Communicating Tradeoffs===
This is the point in the process where the decision analysis team identifies key observations about tradeoffs and the important uncertainties and risks.
===Presenting Recommendations and Implementing Action Plan===
It is often helpful to describe the recommendation(s) in the form of a clearly worded, actionable task list in order to increase the likelihood that the decision will be implemented. Reports are important for historical traceability and for future decisions. Take the time and effort to create a comprehensive, high-quality report detailing the study findings and supporting rationale. Consider static paper reports augmented with dynamic, hyperlinked e-reports.
==The Cognitive Bias Effect on Decisions==
Research by Kahneman (2011) and Thaler and Sunstein (2008) has concluded that [[Cognitive Bias (glossary)|cognitive bias]] can seriously distort decisions made by any decision maker; both Kahneman and Thaler were awarded the Nobel Prize for their work. These distorted decisions have contributed to major catastrophes, such as the Challenger and Columbia accidents. Other sources attributing major catastrophes to cognitive bias are Murata, Nakamura, and Karwowski (2015) and Murata (2017).
Kahneman (2011) and Thaler and Sunstein (2008) have identified a large number of individual biases, the most well-known of which is the confirmation bias: the tendency to interpret new evidence as confirmation of one's existing beliefs or theories. Regarding mitigation of these biases, there is general agreement that self-mitigation by the decision maker is not feasible for most biases. Thaler and Sunstein (2008) provide methods to influence the mitigation of most biases; they refer to these influences as “nudges”.
The consideration of cognitive biases in a systems engineering context is discussed by Jackson (2017), Jackson and Harel (2017), and Jackson (2018). The primary theme of these references is that rational decisions are rarely possible and that cognitive bias must be taken into account.
===Decisions with Cognitive Bias===
According to INCOSE (2015), ideal decisions are made by “objectively identifying, characterizing, and evaluating a set of alternatives for a decision…” Research in the field of behavioral economics has shown that these decisions can be distorted by a phenomenon known as cognitive bias, and most decision makers are unaware of these biases. The literature also provides methods for mitigating these biases.
According to Haselton, Nettle, and Andrews (2005, p. 2), a cognitive bias represents a situation in which “human cognition reliably produces representations that are systematically distorted compared to some aspect of objective reality.” Cognitive biases are typically stimulated by emotion and prior belief. The literature reveals a large number of cognitive biases, of which the following three are typical:
#The rankism bias. According to Fuller (2011), rankism is simply the idea that persons of higher rank in an organization are better able to assert their authority over persons of lower rank, regardless of the decision involved. Rankism frequently occurs in aircraft cockpits; according to McCreary et al. (1998), it was a factor in the famous Tenerife disaster.
#The complacency bias. According to Leveson (1995, pp. 54-55), complacency is the disregard for safety and the belief that current safety measures are adequate; Leveson notes that complacency played a role in the Three Mile Island and Bhopal disasters.
#The optimism bias. According to Leveson (1995, pp. 54-55), physicist Richard Feynman stated that NASA “exaggerates the reliability of the system.” This is an example of the optimism bias.
===Mitigation of Cognitive Bias===
Various sources have suggested methods to mitigate the effects of cognitive bias. Following are some of the major ones.
#Independent Review. The idea of independent review is that advice on decisions should come from an outside body, which the Columbia Accident Investigation Board (CAIB) (NASA 2003, 227) calls the Independent Technical Authority (ITA). This authority must be both organizationally and financially independent of the program in question; that is, the ITA cannot be subordinate to the program manager.
#Crew Resource Management. Following a period of high accident rates, several airlines adopted the crew resource management (CRM) method. The primary purposes of this method are, first, to assure that all crew members do their jobs properly and, second, that they communicate effectively with the pilot when they have a concern. The impetus for this method was the judgment that many pilots were experiencing the rankism bias, or were preoccupied with other tasks and simply did not understand the concerns of the other crew members. This strategy has been successful, and the accident rate has fallen.
#The Premortem. Kahneman (2011, pp. 264-265) suggests this method of nudging in an organizational context. This method, like the others, requires a certain amount of willingness on the part of the decision maker to participate in the process. It calls for decision makers to surround themselves with trusted experts in advance of major decisions. According to Kahneman, the primary job of the experts is to present the negative argument against any decision; for example, that the launch should not be authorized now, but perhaps later.
  
 
 
==References==  
  
 
 
===Works Cited===
Buede, D.M. 2009. ''The engineering design of systems: Models and methods''. 2nd ed. Hoboken, NJ: John Wiley & Sons Inc.
Edwards, W., R.F. Miles Jr., and D. Von Winterfeldt. 2007. ''Advances In Decision Analysis: From Foundations to Applications.'' New York, NY: Cambridge University Press.
Fuller, R.W. 2011. "What is Rankism and Why Do We "Do" It?" ''Psychology Today''. 25 May 2011. https://www.psychologytoday.com/us/blog/somebodies-and-nobodies/201002/what-is-rankism-and-why-do-we-do-it
  
Haselton, M.G., D. Nettle, and P.W. Andrews. 2005. "The Evolution of Cognitive Bias." ''Handbook of Psychology''.
  
INCOSE. 2015. ''Systems Engineering Handbook,'' 4th Ed. Edited by D.D. Walden, G.J. Roedler, K.J. Forsberg, R.D. Hamelin, and T.M. Shortell. San Diego, CA: International Council on Systems Engineering (INCOSE).
  
ISO/IEC/IEEE. 2015. ''[[ISO/IEC/IEEE 15288|Systems and Software Engineering -- System Life Cycle Processes]]''. Geneva, Switzerland: International Organisation for Standardisation / International Electrotechnical Commissions / Institute of Electrical and Electronics Engineers. ISO/IEC/IEEE 15288:2015.
  
Kahneman, D. 2011. ''Thinking, Fast and Slow.'' New York, NY: Farrar, Straus and Giroux.
  
Keeney, R.L. and H. Raiffa. 1976. ''Decisions with Multiple Objectives - Preferences and Value Tradeoffs.'' New York, NY: Wiley.
  
Keeney, R.L. 1992. ''Value-Focused Thinking: A Path to Creative Decision-Making.'' Cambridge, MA: Harvard University Press.
  
Keeney, R.L. 1993. "Creativity in MS/OR: Value-focused thinking—Creativity directed toward decision making." ''Interfaces'', 23(3), p.62–67.
  
Leveson, N. 1995. ''Safeware: System Safety and Computers''. Reading, MA: Addison Wesley.
  
McCreary, J., M. Pollard, K. Stevenson, and M.B. Wilson. 1998. "Human Factors: Tenerife Revisited." ''Journal of Air Transportation World Wide''. 3(1).
  
Murata, A. 2017. "Cultural Difference and Cognitive Biases as a Trigger of Critical Crashes or Disasters - Evidence from Case Studies of Human Factors Analysis." ''Journal of Behavioral and Brain Science.'' 7:299-415.
  
Murata, A., T. Nakamura, and W. Karwowski. 2015. "Influences of Cognitive Biases in Distorting Decision Making and Leading to Critical Unfavorable Incidents." ''Safety.'' 1:44-58.
  
Parnell, G.S. 2009. "Decision Analysis in One Chart," ''Decision Line, Newsletter of the Decision Sciences Institute''. May 2009.
  
Parnell, G.S., P.J. Driscoll, and D.L Henderson (eds). 2011. ''Decision Making for Systems Engineering and Management'', 2nd ed. Wiley Series in Systems Engineering. Hoboken, NJ: Wiley & Sons Inc.
  
Parnell, G.S., T. Bresnick, S. Tani, and E. Johnson. 2013. ''Handbook of Decision Analysis.'' Hoboken, NJ: Wiley & Sons.
  
Thaler, R.H. and C.R. Sunstein. 2008. ''Nudge: Improving Decisions about Health, Wealth, and Happiness.'' New York, NY: Penguin Books.
  
 
 
===Primary References===
Buede, D.M. 2004. "[[On Trade Studies]]." Proceedings of the 14th Annual International Council on Systems Engineering International Symposium, 20-24 June, 2004, Toulouse, France.
Keeney, R.L. 2004. "[[Making Better Decision Makers]]." ''Decision Analysis'', 1(4), pp.193–204.
  
Keeney, R.L. & R.S. Gregory. 2005. "[[Selecting Attributes to Measure the Achievement of Objectives]]". ''Operations Research'', 53(1), pp.1–11.
  
Kirkwood, C.W. 1996. ''[[Strategic Decision Making]]: Multiobjective Decision Analysis with Spreadsheets.'' Belmont, California: Duxbury Press.
  
===Additional References===
Buede, D.M. and R.W. Choisser. 1992. "Providing an Analytic Structure for Key System Design Choices." ''Journal of Multi-Criteria Decision Analysis'', 1(1), pp.17–27.
  
Felix, A. 2004. "Standard Approach to Trade Studies." Proceedings of the International Council on Systems Engineering (INCOSE) Mid-Atlantic Regional Conference, November 2-4 2004, Arlington, VA.
  
Felix, A. 2005. "How the Pro-Active Program (Project) Manager Uses a Systems Engineer’s Trade Study as a Management Tool, and not just a Decision Making Process." Proceedings of the International Council on Systems Engineering (INCOSE) International Symposium, July 10-15, 2005, Rochester, NY.
  
Jackson, S. 2017. "Irrationality in Decision Making: A Systems Engineering Perspective." INCOSE ''Insight'', 74.
Jackson, S. 2018. "Cognitive Bias: A Game-Changer for Decision Management?" INCOSE ''Insight'', 41-42.
  
Jackson, S. and A. Harel. 2017. "Systems Engineering Decision Analysis can benefit from Added Consideration of Cognitive Sciences." ''Systems Engineering.'' 55, 19 July.
  
Miller, G.A. 1956. "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information." ''Psychological Review'', 63(2), p.81.
  
Ross, A.M. and D.E. Hastings. 2005. "Tradespace Exploration Paradigm." Proceedings of the International Council on Systems Engineering (INCOSE) International Symposium, July 10-15, 2005, Rochester, NY.
  
Sproles, N. 2002. "Formulating Measures of Effectiveness." ''Systems Engineering'', 5(4), pp. 253-263.
  
Silletto, H. 2005. "Some Really Useful Principles: A new look at the scope and boundaries of systems engineering."  Proceedings of the International Council on Systems Engineering (INCOSE) International Symposium, July 10-15, 2005, Rochester, NY.
  
Ullman, D.G. and B.P. Spiegel. 2006. "Trade Studies with Uncertain Information." Proceedings of the International Council on Systems Engineering (INCOSE) International Symposium, July 9-13, 2006, Orlando, FL.
  
 
----
<center>[[Assessment and Control|< Previous Article]]  |  [[Systems Engineering Management|Parent Article]]  |  [[Risk Management|Next Article >]]</center>
  
{{DISQUS}}
<center>'''SEBoK v. 2.9, released 20 November 2023'''</center>
  
 
[[Category: Part 3]][[Category:Topic]]
 
 
[[Category:Systems Engineering Management]]
 
[[Category:Systems Engineering Management]]

Latest revision as of 22:22, 18 November 2023


Lead Author: Ray Madachy, Contributing Authors: Garry Roedler, Greg Parnell, Scott Jackson


Many systems engineering decisions are difficult because they include numerous stakeholders, multiple competing objectives, substantial uncertainty, and significant consequences. In these cases, good decision making requires a formal decision managementdecision management process. The purpose of the decision management process is:

“…to provide a structured, analytical framework for objectively identifying, characterizing and evaluating a set of alternatives for a decision at any point in the life cycle and select the most beneficial course of action.”(ISO/IEC/IEEE 15288)

Decision situations (opportunitiesopportunities) are commonly encountered throughout a system’s lifecyclesystem’s lifecycle. The decision management method most commonly employed by systems engineers is the trade study. Trade studies aim to define, measure, and assess shareholder and stakeholderstakeholder valuevalue to facilitate the decision maker’s search for an alternative that represents the best balance of competing objectives. By providing techniques for decomposing a trade decision into logical segments and then synthesizing the parts into a coherent whole, a decision management process allows the decision maker to work within human cognitive limits without oversimplifying the problem. Furthermore, by decomposing the overall decision problem, experts can provide assessments of alternatives in their area of expertise.

Decision Management Process

The decision analysis process is depicted in Figure 1 below. The decision management process is based on several best practices, including:

  • Utilizing sound mathematical technique of decision analysis for trade studies. Parnell (2009) provided a list of decision analysis concepts and techniques.
  • Developing one master decision model, followed by its refinement, update, and use, as required for trade studies throughout the system life cycle.
  • Using Value-Focused Thinking (Keeney 1992) to create better alternatives.
  • Identifying uncertainty and assessing risks for each decision.
Figure 1. Decision Management Process (INCOSE DAWG 2013). Permission granted by Matthew Cilli who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.

The center of the diagram shows the five trade space objectives (listed clockwise): Performance, Growth Potential, Schedule, Development & Procurement Costs, and Sustainment Costs . The ten blue arrows represent the decision management process activities and the white text within the green ring represents SE process elements. Interactions are represented by the small, dotted green or blue arrows. The decision analysis process is an iterative process. A hypothetical UAV decision problem is used to illustrate each of the activities in the following sections.

Framing and Tailoring the Decision

To ensure the decision team fully understands the decision context, the analyst should describe the system baseline, boundaries and interfaces. The decision context includes: the system definition, the life cycle stage, decision milestones, a list of decision makers and stakeholders, and available resources. The best practice is to identify a decision problem statement that defines the decision in terms of the system life cycle.

Developing Objectives and Measures

Defining how an important decision will be made is difficult. As Keeney (2002) puts it:

Most important decisions involve multiple objectives, and usually with multiple-objective decisions, you can't have it all. You will have to accept less achievement in terms of some objectives in order to achieve more on other objectives. But how much less would you accept to achieve how much more?

The first step is to develop objectives and measures using interviews and focus groups with subject matter experts (SMEs) and stakeholders. For systems engineering trade-off analyses, stakeholder value often includes competing objectives of performance, development schedule, unit cost, support costs, and growth potential. For corporate decisions, shareholder value would also be added to this list. For performance, a functional decomposition can help generate a thorough set of potential objectives. Test this initial list of fundamental objectives by checking that each fundamental objective is essential and controllable and that the set of objectives is complete, non-redundant, concise, specific, and understandable (Edwards et al. 2007). Figure 2 provides an example of an objectives hierarchy.

Figure 2. Fundamental Objectives Hierarchy (INCOSE DAWG 2013). Permission granted by Matthew Cilli who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.

For each objective, a measure must be defined to assess the value of each alternative for that objective. A measure (attribute, criterion, and metric) must be unambiguous, comprehensive, direct, operational, and understandable (Keeney & Gregory 2005). A defining feature of multi-objective decision analysis is the transformation from measure space to value space. This transformation is performed by a value function which shows returns to scale on the measure range. When creating a value function, the walk-away point on the measure scale (x-axis) must be ascertained and mapped to a 0 value on the value scale (y-axis). A walk-away point is the measure score where regardless of how well an alternative performs in other measures, the decision maker will walk away from the alternative. He or she does this through working with the user, finding the measure score beyond, at which point an alternative provides no additional value, and labeling it "stretch goal" (ideal) and then mapping it to 100 (or 1 and 10) on the value scale (y-axis). Figure 3 provides the most common value curve shapes. The rationale for the shape of the value functions should be documented for traceability and defensibility (Parnell et al. 2011).

Figure 3. Value Function Examples (INCOSE DAWG 2013). Permission granted by Matthew Cilli who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.

The mathematics of multiple objective decision analysis (MODA) requires that the weights depend on importance of the measure and the range of the measure (walk away to stretch goal). A useful tool for determining priority weighting is the swing weight matrix (Parnell et al. 2011). For each measure, consider its importance through determining whether the measure corresponds to a defining, critical, or enabling function and consider the gap between the current capability and the desired capability; finally, put the name of the measure in the appropriate cell of the matrix (Figure 4). The highest priority weighting is placed in the upper-left corner and assigned an unnormalized weight of 100. The unnormalized weights are monotonically decreasing to the right and down the matrix. Swing weights are then assessed by comparing them to the most important value measure or another assessed measure. The swing weights are normalized to sum to one for the additive value model used to calculate value in a subsequent section.

Figure 4. Swing Weight Matrix (INCOSE DAWG 2013). Permission granted by Gregory Parnell who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.

Generating Creative Alternatives

To help generate a creative and comprehensive set of alternatives that span the decision space, consider developing an alternative generation table (also called a morphological box) (Buede, 2009; Parnell et al. 2011). It is a best practice to establish a meaningful product structure for the system and to be reported in all decision presentations (Figure 5).

Figure 5. Descriptions of Alternatives (INCOSE DAWG 2013). Permission granted by Matthew Cilli who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.

===Assessing Alternatives via Deterministic Analysis===

With objectives and measures established and alternatives defined, the decision team engages SMEs to score each alternative against each measure, drawing on operational data, test data, simulations, models, and expert knowledge. Scores are best captured on scoring sheets for each alternative/measure combination, which document the source and rationale for each score. Figure 6 provides a summary of the scores.

Figure 6. Alternative Scores (INCOSE DAWG 2013). Permission granted by Richard Swanson who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.

Note that in addition to identified alternatives, the score matrix includes a row for the ideal alternative. The ideal is a tool for value-focused thinking, which will be covered later.
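One common construction of the ideal alternative, sketched below, takes the best achievable score on each measure across the identified alternatives; alternatively, the ideal can be set at each measure's stretch goal. The scores shown are hypothetical.

<syntaxhighlight lang="python">
# Hypothetical score matrix: alternative -> {measure: raw score}.
scores = {
    "Alt A": {"range_km": 350, "unit_cost_M": 4.2},
    "Alt B": {"range_km": 420, "unit_cost_M": 5.1},
    "Alt C": {"range_km": 300, "unit_cost_M": 3.8},
}
# Preferred direction for each measure.
higher_is_better = {"range_km": True, "unit_cost_M": False}

ideal = {
    m: (max if higher_is_better[m] else min)(alt[m] for alt in scores.values())
    for m in higher_is_better
}
print(ideal)  # {'range_km': 420, 'unit_cost_M': 3.8}
</syntaxhighlight>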

===Synthesizing Results===

Next, the scores are transformed into a value table using the value functions developed previously. A color heat map can be useful for visualizing value tradeoffs between alternatives and for identifying where alternatives need improvement (Figure 7).

Figure 7. Value Scorecard with Heat Map (INCOSE DAWG 2013). Permission granted by Richard Swanson who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.

The additive value model uses the following equation to calculate each alternative’s value:

<math>v(x) = \sum_{i=1}^{n} w_i v_i(x_i)</math>

where

:<math>n</math> = the number of value measures;
:<math>x_i</math> = an alternative's score on the ''i''th value measure;
:<math>v_i(x_i)</math> = the single-dimensional value of that score; and
:<math>w_i</math> = the swing weight of the ''i''th value measure, with <math>\textstyle\sum_{i=1}^{n} w_i = 1</math>.
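The following sketch applies Equation 1 to a small, hypothetical example; the weights, value functions, and scores are illustrative only.

<syntaxhighlight lang="python">
# Hypothetical normalized swing weights (summing to one) and
# single-dimensional value functions, each returning a 0-100 value.
weights = {"range_km": 0.5, "unit_cost_M": 0.3, "reliability": 0.2}
value_functions = {
    "range_km":    lambda x: max(0.0, min(100.0, (x - 200) / 3)),      # 200 km -> 0, 500 km -> 100
    "unit_cost_M": lambda x: max(0.0, min(100.0, (6.0 - x) * 50)),     # $6M -> 0, $4M -> 100
    "reliability": lambda x: max(0.0, min(100.0, (x - 0.90) * 1000)),  # 0.90 -> 0, 1.00 -> 100
}

def total_value(scores):
    """Additive value model: v(x) = sum_i w_i * v_i(x_i)."""
    return sum(weights[m] * value_functions[m](scores[m]) for m in weights)

alternatives = {
    "Alt A": {"range_km": 350, "unit_cost_M": 4.2, "reliability": 0.97},
    "Alt B": {"range_km": 420, "unit_cost_M": 5.1, "reliability": 0.95},
}
for name, s in alternatives.items():
    print(name, round(total_value(s), 1))  # Alt A 66.0, Alt B 60.2
</syntaxhighlight>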

The value component chart (Figure 8) shows the total value and the weighted value measure contribution of each alternative (Parnell et al. 2011).

Figure 8. Value Component Graph (INCOSE DAWG 2013). Permission granted by Richard Swanson who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.

The heart of a decision management process for systems engineering trade-off analysis is the ability to assess all dimensions of shareholder and stakeholder value. The stakeholder value scatter plot in Figure 9 shows five dimensions for all alternatives: unit cost, performance, development risk, growth potential, and operation and support costs.

Figure 9. Example of a Stakeholder Value Scatterplot (INCOSE DAWG 2013). Permission granted by Richard Swanson who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.

Each system alternative is represented by a marker in the scatter plot (Figure 9). An alternative's unit cost and performance value are indicated by its x and y positions, respectively. Development risk is indicated by the color of the marker (green = low, yellow = medium, red = high), while growth potential is shown as the number of hats above the circular marker (1 hat = low, 2 hats = moderate, 3 hats = high).
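A stakeholder value scatter plot of this kind can be produced with standard plotting libraries. The sketch below uses hypothetical alternatives, encodes only four of the five dimensions (operation and support cost is omitted for brevity), and substitutes "^" characters for the hat symbols used in Figure 9.

<syntaxhighlight lang="python">
import matplotlib.pyplot as plt

# Hypothetical alternatives: (unit cost $M, performance value, development risk, growth potential).
alternatives = {
    "Alt A": (4.2, 66, "low", 1),
    "Alt B": (5.1, 74, "medium", 3),
    "Alt C": (3.8, 58, "high", 2),
}
risk_color = {"low": "green", "medium": "gold", "high": "red"}

fig, ax = plt.subplots()
for name, (cost, perf, risk, growth) in alternatives.items():
    ax.scatter(cost, perf, s=120, color=risk_color[risk])
    ax.annotate(f"{name} {'^' * growth}", (cost, perf),
                textcoords="offset points", xytext=(6, 6))
ax.set_xlabel("Unit cost ($M)")
ax.set_ylabel("Performance value")
ax.set_title("Stakeholder value scatter plot (illustrative)")
plt.show()
</syntaxhighlight>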

===Identifying Uncertainty and Conducting Probabilistic Analysis===

As part of the assessment, the SMEs should identify the potential uncertainty in the independent variables, i.e., the variables that drive one or more scores. In many cases the SME can assess a lower, nominal, and upper bound by assuming low, moderate, and high performance. Using these assessments, a Monte Carlo simulation summarizes the impact of the uncertainties and can identify the uncertainties that have the most impact on the decision.
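The sketch below illustrates how SME-elicited low/nominal/high bounds can drive such a Monte Carlo simulation; the triangular distributions, weights, and value functions are hypothetical.

<syntaxhighlight lang="python">
import random

# Hypothetical SME assessments: (low, nominal, high) bounds per measure.
uncertain_scores = {
    "range_km":    (320, 350, 400),
    "reliability": (0.94, 0.97, 0.99),
}
weights = {"range_km": 0.6, "reliability": 0.4}
value_functions = {
    "range_km":    lambda x: max(0.0, min(100.0, (x - 200) / 3)),
    "reliability": lambda x: max(0.0, min(100.0, (x - 0.90) * 1000)),
}

def sampled_value(rng):
    """Draw one Monte Carlo sample of an alternative's performance value."""
    return sum(
        weights[m] * value_functions[m](rng.triangular(low, high, nominal))
        for m, (low, nominal, high) in uncertain_scores.items()
    )

rng = random.Random(1)  # fixed seed for reproducibility
samples = sorted(sampled_value(rng) for _ in range(10_000))
print("mean value:", sum(samples) / len(samples))
print("10th / 90th percentile:", samples[1000], samples[9000])
</syntaxhighlight>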

===Assessing the Impact of Uncertainty - Analyzing Risk and Sensitivity===

Decision analysis uses many forms of sensitivity analysis, including line diagrams, tornado diagrams, and waterfall diagrams, and several uncertainty analyses, including Monte Carlo simulation, decision trees, and influence diagrams (Parnell et al. 2013). A line diagram shows the sensitivity of the results to a swing weight judgment (Parnell et al. 2011). Figure 10 shows the results of a Monte Carlo simulation of performance value.

Figure 10. Uncertainty on Performance Value from Monte Carlo Simulation (INCOSE DAWG 2013). Permission granted by Matthew Cilli who prepared image for the INCOSE Decision Analysis Working Group (DAWG). All other rights are reserved by the copyright owner.
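A line diagram can be approximated by sweeping one swing weight and recording which alternative is preferred at each setting, as the following hypothetical sketch shows.

<syntaxhighlight lang="python">
# Hypothetical single-dimensional values (0-100) already computed per measure.
alt_values = {
    "Alt A": {"performance": 70, "unit_cost": 90},
    "Alt B": {"performance": 85, "unit_cost": 55},
}

def total_value(values, w_perf):
    """Two-measure additive model; the cost weight is 1 - w_perf."""
    return w_perf * values["performance"] + (1 - w_perf) * values["unit_cost"]

# Sweep the performance weight and watch the preferred alternative change;
# the crossover point is what a line diagram displays.
for w in [i / 10 for i in range(11)]:
    best = max(alt_values, key=lambda a: total_value(alt_values[a], w))
    print(f"performance weight {w:.1f}: best alternative = {best}")
</syntaxhighlight>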

===Improving Alternatives===

Mining the data generated for the alternatives will likely reveal opportunities to modify some design choices to claim untapped value and/or reduce risk. Taking advantage of initial findings to generate new and creative alternatives starts the process of transforming the decision process from "alternative-focused thinking" to "value-focused thinking" (Keeney 1993).

===Communicating Tradeoffs===

This is the point in the process where the decision analysis team identifies key observations about tradeoffs and the important uncertainties and risks.

===Presenting Recommendations and Implementing Action Plan===

It is often helpful to describe the recommendation(s) as a clearly worded, actionable task list to increase the likelihood of implementation. Reports are important for historical traceability and future decisions. Take the time and effort to create a comprehensive, high-quality report detailing the study findings and supporting rationale. Consider static paper reports augmented with dynamic, hyperlinked e-reports.

==The Cognitive Bias Effect on Decisions==

Research by Kahneman (2011) and Thaler and Sunstein (2008) has concluded that cognitive bias can seriously distort decisions made by any decision maker; both Kahneman and Thaler were awarded the Nobel Prize for their work. Such distorted decisions have contributed to major catastrophes, such as the Challenger and Columbia accidents. Other sources attributing major catastrophes to cognitive bias include Murata, Nakamura, and Karwowski (2015) and Murata (2017).

Kahneman (2011) and Thaler and Sunstein (2008) have identified a large number of individual biases, the best known of which is the confirmation bias: the tendency to interpret new evidence as confirmation of one's existing beliefs or theories. Regarding mitigation of these biases, there is general agreement that self-mitigation by the decision maker is not feasible for most of them. Thaler and Sunstein (2008) provide methods to influence the mitigation of most biases, which they refer to as "nudges."

The consideration of cognitive biases in a systems engineering context is discussed by Jackson (2017), Jackson and Harel (2017), and Jackson (2018). The primary theme of these references is that rational decisions are rarely possible and that cognitive bias must be taken into account.

===Decisions with Cognitive Bias===

According to INCOSE (2015), ideal decisions are made while “objectively identifying, characterizing, and evaluating a set of alternatives for a decision…” Research in the field of behavioral economics has shown that these decisions can be distorted by a phenomenon known as cognitive bias; furthermore, most decision makers are unaware of these biases. The literature also provides methods for mitigating them.

According to Haselton, Nettle, and Andrews (2005, p. 2), a cognitive bias represents a situation in which “human cognition reliably produces representations that are systematically distorted compared to some aspect of objective reality.” Cognitive biases are typically triggered by emotion and prior belief. The literature identifies a large number of cognitive biases, of which the following three are typical:

# The rankism bias. According to Fuller (2011), rankism is the idea that persons of higher rank in an organization are able to assert their authority over persons of lower rank regardless of the decision involved. Rankism frequently occurs in aircraft cockpits; according to McCreary et al. (1998), it was a factor in the famous Tenerife disaster.
# The complacency bias. According to Leveson (1995, pp. 54-55), complacency is the disregard for safety and the belief that current safety measures are adequate; it played a role in the Three Mile Island and Bhopal disasters.
# The optimism bias. According to Leveson (1995, pp. 54-55), physicist Richard Feynman observed that NASA “exaggerates the reliability of the system,” an example of the optimism bias.

===Mitigation of Cognitive Bias===

Various sources have suggested methods to mitigate the effects of cognitive bias. Following are some of the major ones.

# Independent Review. The idea of independent review is that advice on decisions should come from an outside body, which the Columbia Accident Investigation Board (CAIB) (NASA 2003, p. 227) calls the Independent Technical Authority (ITA). This authority must be both organizationally and financially independent of the program in question; that is, the ITA cannot be subordinate to the program manager.
# Crew Resource Management. Following a period of high accident rates, several airlines adopted the crew resource management (CRM) method. Its primary purposes are, first, to assure that all crew members do their jobs properly and, second, that they communicate effectively with the pilot when they have a concern. The impetus for this method was the judgment that many pilots were subject to the rankism bias or were preoccupied with other tasks and simply did not understand the concerns of the other crew members. This strategy has been successful, and the accident rate has fallen.
# The Premortem. Kahneman (2011, pp. 264-265) suggests this method of nudging in an organizational context. Like other methods, it requires a certain willingness on the part of the decision maker to participate. It calls for decision makers to surround themselves with trusted experts in advance of major decisions; according to Kahneman, the primary job of these experts is to present the negative argument against any proposed decision, for example, that the decision maker should not authorize a launch now but perhaps later.

==References==

===Works Cited===

Buede, D.M. 2009. The engineering design of systems: Models and methods. 2nd ed. Hoboken, NJ: John Wiley & Sons Inc.

Edwards, W., R.F. Miles Jr., and D. Von Winterfeldt. 2007. Advances In Decision Analysis: From Foundations to Applications. New York, NY: Cambridge University Press.

Fuller, R.W. 2011. "What is Rankism and Why do We "Do" It?" Psychology Today. 25 May 2011. https://www.psychologytoday.com/us/blog/somebodies-and-nobodies/201002/what-is-rankism-and-why-do-we-do-it

Haselton, M.G., D. Nettle, and P.W. Andrews. 2005. "The Evolution of Cognitive Bias." In The Handbook of Evolutionary Psychology, edited by D.M. Buss. Hoboken, NJ: John Wiley & Sons.

INCOSE. 2015. Systems Engineering Handbook, 4th Ed. Edited by D.D. Walden, G.J. Roedler, K.J. Forsberg, R.D. Hamelin, and T.M. Shortell. San Diego, CA: International Council on Systems Engineering (INCOSE).

ISO/IEC/IEEE. 2015. Systems and Software Engineering -- System Life Cycle Processes. Geneva, Switzerland: International Organization for Standardization / International Electrotechnical Commission / Institute of Electrical and Electronics Engineers. ISO/IEC/IEEE 15288:2015.

Kahneman, D. 2011. Thinking, Fast and Slow. New York, NY: Farrar, Straus and Giroux.

Keeney, R.L. and H. Raiffa. 1976. Decisions with Multiple Objectives - Preferences and Value Tradeoffs. New York, NY: Wiley.

Keeney, R.L. 1992. Value-Focused Thinking: A Path to Creative Decision-Making. Cambridge, MA: Harvard University Press.

Keeney, R.L. 1993. "Creativity in MS/OR: Value-focused thinking—Creativity directed toward decision making." Interfaces, 23(3), p.62–67.

Leveson, N. 1995. Safeware: System Safety and Computers. Reading, MA: Addison Wesley.

McCreary, J., M. Pollard, K. Stevenson, and M.B. Wilson. 1998. "Human Factors: Tenerife Revisited." Journal of Air Transportation World Wide. 3(1).

Murata, A. 2017. "Cultural Difference and Cognitive Biases as a Trigger of Critical Crashes or Disasters - Evidence from Case Studies of Human Factors Analysis." Journal of Behavioral and Brain Science. 7:399-415.

Murata, A., T. Nakamura, and W. Karwowski. 2015. "Influences of Cognitive Biases in Distorting Decision Making and Leading to Critical Unfavorable Incidents." Safety. 1:44-58.

Parnell, G.S. 2009. "Decision Analysis in One Chart," Decision Line, Newsletter of the Decision Sciences Institute. May 2009.

Parnell, G.S., P.J. Driscoll, and D.L Henderson (eds). 2011. Decision Making for Systems Engineering and Management, 2nd ed. Wiley Series in Systems Engineering. Hoboken, NJ: Wiley & Sons Inc.

Parnell, G.S., T. Bresnick, S. Tani, and E. Johnson. 2013. Handbook of Decision Analysis. Hoboken, NJ: Wiley & Sons.

Thaler, R.H. and C.R. Sunstein. 2008. Nudge: Improving Decisions About Health, Wealth, and Happiness. New York, NY: Penguin Books.

===Primary References===

Buede, D.M. 2004. "On Trade Studies." Proceedings of the 14th Annual International Council on Systems Engineering International Symposium, 20-24 June, 2004, Toulouse, France.

Keeney, R.L. 2004. "Making Better Decision Makers." Decision Analysis, 1(4), pp.193–204.

Keeney, R.L. & R.S. Gregory. 2005. "Selecting Attributes to Measure the Achievement of Objectives". Operations Research, 53(1), pp.1–11.

Kirkwood, C.W. 1996. Strategic Decision Making: Multiobjective Decision Analysis with Spreadsheets. Belmont, California: Duxbury Press.

===Additional References===

Buede, D.M. and R.W. Choisser. 1992. "Providing an Analytic Structure for Key System Design Choices." Journal of Multi-Criteria Decision Analysis, 1(1), pp.17–27.

Felix, A. 2004. "Standard Approach to Trade Studies." Proceedings of the International Council on Systems Engineering (INCOSE) Mid-Atlantic Regional Conference, November 2-4 2004, Arlington, VA.

Felix, A. 2005. "How the Pro-Active Program (Project) Manager Uses a Systems Engineer’s Trade Study as a Management Tool, and not just a Decision Making Process." Proceedings of the International Council on Systems Engineering (INCOSE) International Symposium, July 10-15, 2005, Rochester, NY.

Jackson, S. 2017. "Irrationality in Decision Making: A Systems Engineering Perspective." INCOSE Insight, 74.

Jackson, S. 2018. "Cognitive Bias: A Game-Changer for Decision Management?" INCOSE Insight, 41-42.

Jackson, S. and A. Harel. 2017. "Systems Engineering Decision Analysis can benefit from Added Consideration of Cognitive Sciences." Systems Engineering. 55, 19 July.

Miller, G.A. 1956. "The Magical Number Seven, Plus or Minus Two: Some Limits on Our Capacity for Processing Information." Psychological Review, 63(2), p.81.

Ross, A.M. and D.E. Hastings. 2005. "Tradespace Exploration Paradigm." Proceedings of the International Council on Systems Engineering (INCOSE) International Symposium, July 10-15, 2005, Rochester, NY.

Sproles, N. 2002. "Formulating Measures of Effectiveness." Systems Engineering, 5(4), pp. 253-263.

Sillitto, H. 2005. "Some Really Useful Principles: A new look at the scope and boundaries of systems engineering." Proceedings of the International Council on Systems Engineering (INCOSE) International Symposium, July 10-15, 2005, Rochester, NY.

Ullman, D.G. and B.P. Spiegel. 2006. "Trade Studies with Uncertain Information." Proceedings of the International Council on Systems Engineering (INCOSE) International Symposium, July 9-13, 2006, Orlando, FL.


SEBoK v. 2.9, released 20 November 2023