Decision Management

The purpose of decision management is to analyze possible decisions using a formal evaluation process that evaluates identified alternatives against established decision criteria. It involves establishing guidelines to determine which issues should be subjected to a formal evaluation process, and then applying formal evaluation processes to these issues.

Making decisions is one of the most important processes practiced by systems engineers, project managers, and all team members. Sound decisions are based on good judgment and experience. There are concepts, methods, processes, and tools that can assist in the process of decision making, especially in making comparisons of decision alternatives. These tools can also assist in building team consensus in selecting and supporting the decision made and in defending it to others.

The technical discipline of decision analysis is concerned with identifying the best option among a set of alternatives under uncertainty. Many analysis methods are used in the multidimensional tradeoffs of systems engineering under varying degrees of uncertainty. These range from non-probabilistic decision rules that ignore the likelihood of chance outcomes, through expected value rules, to more general utility approaches. Decision judgment and analysis methods are described after the overview of organizational processes.

Process Overview

The best practices of decision management for evaluating alternatives are described below, grouped by specific practice (SEI 2007).

  • Establish Guidelines for Decision Analysis. Guidelines should be established to determine which issues are subject to a formal evaluation process, as not every decision is significant enough to warrant formality. Whether a decision is significant or not is dependent on the project and circumstances, and is determined by the established guidelines.
  • Establish Evaluation Criteria. Criteria for evaluating alternatives and the relative ranking of these criteria are established. The evaluation criteria provide the basis for evaluating alternative solutions. The criteria are ranked so that the highest ranked criteria exert the most influence. There are many contexts in which a formal evaluation process can be used by systems engineering (SE), so the criteria may have already been defined as part of another process.
  • Identify Alternative Solutions. Identify alternative solutions to address program issues. A wider range of alternatives can surface by soliciting more stakeholders with diverse skills and backgrounds to identify and address assumptions, constraints, and biases. Brainstorming sessions may stimulate innovative alternatives through rapid interaction and feedback. The initial set of candidate solutions may not be sufficient for analysis; as the analysis proceeds, other alternatives should be added to the list of potential candidates. The generation and consideration of multiple alternatives early in the decision process increases the likelihood that an acceptable decision will be made and that the consequences of the decision will be understood.
  • Select Evaluation Methods. Methods for evaluating alternative solutions against established criteria can range from simulations to the use of probabilistic models and decision theory. These methods need to be carefully selected. The level of detail of a method should be commensurate with cost, schedule, performance, and associated risk. Typical analysis evaluation methods include the following:
    • Modeling and simulation.
    • Analysis studies on business opportunities, engineering, manufacturing, cost, etc.
    • Surveys and user reviews.
    • Extrapolations based on field experience and prototypes.
    • Testing.
    • Judgment provided by an expert or group of experts (e.g., Delphi Method).
  • Evaluate Alternatives. Evaluate alternative solutions using the established criteria and methods. Evaluating alternative solutions involves analysis, discussion, and review. Iterative cycles of analysis are sometimes necessary. Supporting analyses, experimentation, prototyping, piloting, or simulations may be needed to substantiate scoring and conclusions.
  • Select Solutions. Select solutions from the alternatives based on the evaluation criteria. Selecting solutions involves weighing the results from the evaluation of alternatives and the corresponding risks.

Decision Judgment Methods

Common methods for judging among alternatives range from emotion-based to probability-based judgment. Methods that the practitioner should be aware of include:

  • Emotion-based judgment. Once a decision is made public, the decision-makers will vigorously defend their choice, even in the face of contrary evidence, because it is easy to become emotionally tied to the decision. Another phenomenon is that people often need “permission” to support an action or idea, as explained by Cialdini (2006); this inherent human trait also suggests why teams often resist new ideas.
  • Intuition-based judgment. Intuition plays a key role in leading development teams to creative solutions. Gladwell (2005) argues that we intuitively see the powerful benefits or fatal flaws inherent in a newly proposed solution. Intuition can be an excellent guide when based on relevant past experience, but it may blind the practitioner to as-yet undiscovered concepts. Ideas generated from intuition should be taken seriously, but should be treated as the output of a brainstorming session and evaluated using one of the next three approaches. Also, see Skunk Works (Rich and Janos 1996) for more on intuitive decisions.
  • Expert-based judgment. For certain problems, especially ones involving technical expertise outside the team's own field, calling in experts is a cost-effective approach. The decision-making challenge is to establish perceptive criteria for selecting the right experts.
  • Fact-based judgment. This is the most common situation and is discussed in more detail below.
  • Probability-based judgment. SE methods are used to deal with uncertainty; this topic is also elaborated below.

Fact Based Decision Making

Informed decision-making requires a clear statement of objectives, a clear understanding of the value of the outcome, a gathering of relevant information, an appropriate assessment of alternatives, and a logical process to make a selection.

Regardless of the method, the starting point is to identify an appropriate (usually small) team to frame and challenge the decision statement. The decision statement should be concise, but the decision maker and the team should iterate until they have considered positive and negative consequences of the way they have expressed their objective.

Once the decision maker and team accept the decision statement, the next step is to define the decision criteria. As shown in Figure 1, the criteria fall into two categories: “Musts” and “Wants.” Any candidate solution that does not satisfy a “Must” should be rejected, no matter how attractive all other aspects of the candidate are.

If a candidate solution that seems promising fails the “Must” requirements, there is nothing wrong with challenging the requirements, as long as this is done with open awareness to avoid bias. This is a judgmental process, and the resulting matrix is a decision support guide, not a mandatory theoretical constraint. A sample flowchart to assist in fact-based judgment from Visualizing Project Management is shown below in Figure 1 (Forsberg, Mooz, and Cotterman 2005, 154-155).

Figure 1. Decision Selection Flowchart (Forsberg, Mooz, Cotterman 2005). Reprinted with permission of John Wiley & Sons.
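
The “Must”/“Want” screening described above can be sketched in a few lines of code. This is a minimal illustration only; the candidates, attributes, and threshold values below are hypothetical assumptions, not drawn from the source.

```python
# Minimal sketch of "Must"/"Want" screening with hypothetical candidates.
# Any candidate failing a single "Must" is rejected outright; survivors
# would then move on to weighted scoring of the "Wants".

candidates = {
    "Candidate 1": {"meets_safety_reg": True,  "unit_cost": 90},
    "Candidate 2": {"meets_safety_reg": True,  "unit_cost": 130},
    "Candidate 3": {"meets_safety_reg": False, "unit_cost": 60},
}

# "Must" criteria expressed as pass/fail predicates (hypothetical thresholds).
musts = [
    ("Meets safety regulation", lambda c: c["meets_safety_reg"]),
    ("Unit cost <= $120", lambda c: c["unit_cost"] <= 120),
]

survivors = []
for name, attrs in candidates.items():
    failed = [label for label, test in musts if not test(attrs)]
    if failed:
        print(f"{name}: rejected ({'; '.join(failed)})")
    else:
        survivors.append(name)

print("Candidates passing all Musts:", survivors)  # -> ['Candidate 1']
```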

The next step is to define the desirable characteristics of the solution and develop a relative weighting. If no weighting is used, all criteria are implicitly of equal importance. One fatal flaw is creating too many criteria (15 or 20 or more), since this tends to obscure important differences among the candidate solutions.

There are a number of approaches, starting with Pugh's method, in which each candidate solution is compared to a reference standard and rated as equal, better, or inferior for each of the criteria. A more perceptive approach is the Kepner-Tregoe decision matrix described by Forsberg, Mooz, and Cotterman (2005, 154-155).
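
A minimal sketch of Pugh's method is shown below: each candidate is rated better (+1), equal (0), or worse (-1) than a reference (datum) design for every criterion, and the ratings are summed. The candidates, criteria, and scores are illustrative assumptions.

```python
# Pugh matrix sketch with hypothetical criteria and candidates.
# The datum (reference) design would score 0 on every criterion.

criteria = ["Performance", "Cost", "Schedule", "Maintainability"]

pugh_ratings = {
    "Candidate A": {"Performance": +1, "Cost": -1, "Schedule": 0,  "Maintainability": +1},
    "Candidate B": {"Performance": 0,  "Cost": +1, "Schedule": -1, "Maintainability": 0},
}

for candidate, ratings in pugh_ratings.items():
    total = sum(ratings[c] for c in criteria)
    print(f"{candidate}: net score {total:+d} relative to the datum")
# Candidate A: +1, Candidate B: 0 -> A looks more promising than the datum.
```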

When the relative comparison is completed, scores within five percent of each other are essentially equal. An opportunity and risk assessment should be performed on the best candidates, as well as a sensitivity analysis on the scores and weights to ensure that the robustness (or fragility) of the decision is known.

In the preceding example, all criteria are compared to the highest-value criterion. Another approach is to create a full matrix of pair-wise comparisons of all criteria against each other; from that matrix, the Analytic Hierarchy Process (AHP) calculates the relative importance of each criterion. The process is still judgmental, and the results are no more accurate, but several computerized decision support tools have made effective use of this process (Saaty 2008).
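
The sketch below shows how AHP derives criterion weights from a pairwise comparison matrix. The comparison values are hypothetical, and the row geometric mean is used as a common approximation to Saaty's principal-eigenvector calculation.

```python
import math

# Hypothetical pairwise comparison matrix for three criteria
# (Better, Faster, Cheaper) on Saaty's 1-9 scale. Entry [i][j] is how much
# more important criterion i is than criterion j; the matrix is reciprocal.
criteria = ["Better", "Faster", "Cheaper"]
A = [
    [1.0,   2.0,   3.0],   # Better vs (Better, Faster, Cheaper)
    [1/2.0, 1.0,   2.0],   # Faster
    [1/3.0, 1/2.0, 1.0],   # Cheaper
]

# Row geometric means approximate the principal eigenvector of A.
geo_means = [math.prod(row) ** (1.0 / len(row)) for row in A]
total = sum(geo_means)
weights = [g / total for g in geo_means]

for name, w in zip(criteria, weights):
    print(f"{name}: weight {w:.2f}")
# Approximately 0.54, 0.30, 0.16 -- comparable to the 0.5/0.3/0.2 weights
# used in Table 1 below.
```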

Probability Based Decision Analysis

Probability-based decisions are made when there is uncertainty. Decision management techniques and tools for decisions under uncertainty include probability theory, utility functions, decision tree analysis, modeling, and simulation. For understanding decision trees and probability analysis, a classic, mathematically oriented reference is Raiffa (1997). Another classic introduction, with more of an applied focus, is presented by Schlaifer (1969). Modeling and simulation are covered in the popular textbook Simulation Modeling and Analysis (Law 2007), which also has good coverage of Monte Carlo analysis. Some of the more commonly used and fundamental methods are overviewed below.

Decision trees and influence diagrams are visual analytical decision support tools where the expected values (or expected utility) of competing alternatives are calculated. A decision tree uses a tree-like graph or model of decisions and their possible consequences, including chance event outcomes, resource costs, and utility. Influence diagrams are used for decision models as alternate, more compact graphical representations of decision trees.

The figure below demonstrates a simplified make vs. buy decision analysis tree and the associated calculations. Suppose making a product costs $200K more than buying an alternative off the shelf, reflected as a difference in the net payoffs in the figure. The custom development is also expected to yield a better product, with a correspondingly larger probability of high sales: 80% vs. 50% for the bought alternative. With these assumptions, the monetary expected value of the make alternative is (0.8 × $2.0M) + (0.2 × $0.5M) = $1.7M, and that of the buy alternative is (0.5 × $2.2M) + (0.5 × $0.7M) = $1.45M.

Figure 2. Decision Tree Example. (SEBoK Original)
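
The expected value arithmetic in Figure 2 can be reproduced with a short script; the probabilities and net payoffs below are taken directly from the example above.

```python
# Expected-value calculation for the make-vs-buy decision tree in Figure 2.
# Payoffs are net of the $200K extra development cost for the "make" branch.

alternatives = {
    "Make": [(0.8, 2.0), (0.2, 0.5)],   # (probability, net payoff in $M)
    "Buy":  [(0.5, 2.2), (0.5, 0.7)],
}

expected_values = {
    name: sum(p * payoff for p, payoff in outcomes)
    for name, outcomes in alternatives.items()
}

for name, ev in expected_values.items():
    print(f"{name}: expected value ${ev:.2f}M")

best = max(expected_values, key=expected_values.get)
print("Highest expected value:", best)   # -> Make ($1.70M vs $1.45M)
```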

Influence diagrams focus attention on the issues and the relationships between events. They are generalizations of Bayesian networks in which maximum expected utility criteria can be modeled. Detwarasiti and Shachter (2005, 207-228) provide a good reference for using influence diagrams in team decision analysis.

Expected utility is more general than expected value. Utility is a measure of relative satisfaction that takes into account the decision maker's preference function, which may be nonlinear. Expected utility theory deals with the analysis of choices with multidimensional outcomes. The analyst should determine the decision maker's utility for money and select the alternative course of action that yields the highest expected utility, rather than the highest expected monetary value. A classic reference on applying multiple-objective methods, utility functions, and allied techniques is Decisions with Multiple Objectives: Preferences and Value Tradeoffs (Keeney and Raiffa 1976). References with applied examples of decision tree analysis and utility functions include Managerial Decision Analysis (Samson 1988) and Introduction to Decision Analysis (Skinner 1999).
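
To illustrate how a nonlinear preference function enters the calculation, the sketch below applies an exponential (risk-averse) utility function to the same make-vs-buy lotteries and compares the alternatives by certain equivalent rather than expected value. The exponential form and the risk-tolerance value are hypothetical assumptions for illustration.

```python
import math

# Same make-vs-buy lotteries as in Figure 2 (payoffs in $M).
alternatives = {
    "Make": [(0.8, 2.0), (0.2, 0.5)],
    "Buy":  [(0.5, 2.2), (0.5, 0.7)],
}

RISK_TOLERANCE = 1.0  # hypothetical risk tolerance in $M; smaller = more risk-averse

def utility(x, r=RISK_TOLERANCE):
    # Exponential utility, a common model of a risk-averse preference function.
    return 1.0 - math.exp(-x / r)

for name, outcomes in alternatives.items():
    ev = sum(p * x for p, x in outcomes)
    eu = sum(p * utility(x) for p, x in outcomes)
    certain_equiv = -RISK_TOLERANCE * math.log(1.0 - eu)
    print(f"{name}: EV ${ev:.2f}M, expected utility {eu:.3f}, "
          f"certain equivalent ${certain_equiv:.2f}M")
```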

Systems engineers often have to consider many different criteria when making choices about system tradeoffs. Taken individually, these criteria could lead to very different choices. A weighted objectives analysis is a rational basis for incorporating the multiple criteria into the decision.



Each criterion is weighted according to its importance relative to the others. Table 1 below shows an example of deciding between two alternatives using criteria weights, ratings, weighted ratings, and weighted totals.

Table 1. Weighted Criteria Example. (SEBoK Original)

                                Alternative A               Alternative B
Criteria       Weight     Rating   Weight * Rating    Rating   Weight * Rating
Better          0.5          4           2.0             10          5.0
Faster          0.3          8           2.4              5          1.5
Cheaper         0.2          5           1.0              3          0.6
Total Weighted Score                     5.4                          7.1
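
The totals in Table 1 can be reproduced with a short weighted-objectives calculation, sketched below using the weights and ratings from the table.

```python
# Weighted-objectives calculation reproducing Table 1.
weights = {"Better": 0.5, "Faster": 0.3, "Cheaper": 0.2}

ratings = {
    "Alternative A": {"Better": 4,  "Faster": 8, "Cheaper": 5},
    "Alternative B": {"Better": 10, "Faster": 5, "Cheaper": 3},
}

for alt, scores in ratings.items():
    weighted = {c: weights[c] * scores[c] for c in weights}
    total = sum(weighted.values())
    detail = ", ".join(f"{c}: {v:.1f}" for c, v in weighted.items())
    print(f"{alt}: {detail}, total {total:.1f}")
# Alternative A: 2.0 + 2.4 + 1.0 = 5.4
# Alternative B: 5.0 + 1.5 + 0.6 = 7.1  -> B has the higher weighted score.
```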


There are numerous other methods used in decision analysis. One of the simplest is sensitivity analysis, which examines the relationships between outcomes and their probabilities to determine how sensitive a decision is to changes in its inputs. Value-of-information methods assess whether additional data gathering and modeling would improve the expected value of the optimal decision enough to justify the effort. Multi-Attribute Utility Analysis (MAUA) is a method that develops equivalencies between dissimilar units of measure.
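
As a small illustration of sensitivity analysis, the sketch below sweeps the probability of high sales for the "make" branch of the Figure 2 example to find the point at which the preference would flip to "buy"; the 1% sweep granularity is an arbitrary choice.

```python
# Sensitivity of the make-vs-buy decision (Figure 2) to the probability of
# high sales for the "make" alternative. The "buy" expected value stays fixed.

BUY_EV = 0.5 * 2.2 + 0.5 * 0.7   # $1.45M

def make_ev(p_high):
    return p_high * 2.0 + (1.0 - p_high) * 0.5

# Sweep the probability in 1% steps and report where the preference flips.
breakeven = next(p / 100 for p in range(0, 101) if make_ev(p / 100) >= BUY_EV)
print(f"'Make' is preferred once P(high sales) >= {breakeven:.2f}")
# -> about 0.64; well below the 0.80 assumed in the example, so the
#    decision is fairly robust to that estimate.
```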

Systems Engineering Management (Blanchard 2004) shows a variety of these decision analysis methods in many technical decision scenarios. A comprehensive reference demonstrating decision analysis methods for software-intensive systems is "Software Risk Management: Principles and Practices" (Boehm 1981, 32-41), a major treatment of multiple-goal decision analysis that deals with uncertainties, risks, and the value of information.

Facets of a decision situation that cannot be explained by a quantitative model should be reserved for the intuition and judgment of the decision maker. Sometimes outside parties are also called upon. One method to canvass experts is the Delphi technique, a group decision-making and forecasting procedure that successively collates the judgments of experts about future outcomes or parameter values. A variant, the Wideband Delphi technique, is described by Boehm (1981, 32-41) as improving upon the standard Delphi with more rigorous iterations of statistical analysis and feedback forms.
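
The facilitator's feedback step in a Delphi round can be sketched as follows. The expert estimates are hypothetical, and the statistics fed back (median and interquartile range) reflect common Delphi practice rather than any specific variant described in the references.

```python
import statistics

# One Delphi round: anonymous expert estimates of some parameter
# (hypothetical values, e.g. development effort in person-months).
round_1 = [34, 40, 28, 55, 38, 42, 31]

median = statistics.median(round_1)
q1, _, q3 = statistics.quantiles(round_1, n=4)

# The facilitator feeds the summary (not individual names) back to the panel,
# asks outliers to justify or revise their estimates, and repeats until the
# spread stabilizes.
print(f"Round 1: median {median}, interquartile range {q1}-{q3}")
```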

General tools, such as spreadsheets and simulation packages, can be used with these methods. There are also tools targeted specifically for aspects of decision analysis such as decision trees, evaluation of probabilities, Bayesian influence networks, and others. The INCOSE website for the tools database (INCOSE 2010, 1) has an extensive list of analysis tools.

Linkages to Other Systems Engineering Management Topics

Decision management is used in many other process areas because a formal evaluation process can be applied in numerous contexts, both technical and managerial. It is closely coupled with other management areas. Risk Management in particular uses decision analysis methods for risk evaluation and mitigation decisions, and a formal evaluation process is applied to medium and high risks. The Measurement process describes how to derive quantitative indicators as input to decisions. Project Assessment and Control uses decision results to control the project. Refer to the Planning process area for more information about incorporating decision results into project plans.

Practical Considerations

Key pitfalls and good practices related to decision analysis are described below.

Pitfalls

Some of the key pitfalls are below in Table 2 (SEBoK Original).

False Confidence
  • Placing false confidence in the accuracy of the values used in decisions.
No External Validation
  • Not engaging experts and not holding peer reviews. The decision-maker should engage experts to validate decision values.
Errors and False Assumptions
  • Prime sources of error in risky decision-making include false assumptions, inaccurate estimates of probabilities, over-reliance on expectations, difficulties in measuring the utility function, and forecast errors.
Impractical Application
  • The analytic hierarchy process may not handle real-life situations well, given the theoretical difficulties in using eigenvectors.

Good Practices

Some good practices are below in Table 3 (SEBoK Original).

Progressive Decision Modeling
  • Use progressive model building. Detail and sophistication can be added as confidence in the model is built up.
Necessary Measurements
  • Measurements need to be tied to the information needs of the decision makers.
Define Selection Criteria
  • Define the selection criteria and process (and success criteria) before identifying trade alternatives.

Additional good practices can be found in ISO/IEC/IEEE (2009, Clause 6.3) and INCOSE (2010, Section 5.3.1.5). Parnell, Driscoll, and Henderson (2008) provide a thorough overview.

References

Works Cited

Blanchard, B.S. 2004. Systems Engineering Management. 3rd ed. New York, NY, USA: John Wiley & Sons.

Boehm, B. 1981. "Software Risk Management: Principles and Practices." IEEE Software 8(1) (January 1991): 32-41.

Cialdini, R.B. 2006. Influence: The Psychology of Persuasion. New York, NY, USA: Collins Business Essentials.

Detwarasiti, A., and R. D. Shachter. 2005. "Influence diagrams for team decision analysis." Decision Analysis 2 (4): 207-28.

Gladwell, M. 2005. Blink: the Power of Thinking without Thinking. Boston, MA, USA: Little, Brown & Co.

Keeney, R.L., and H. Raiffa. 1976. Decisions with Multiple Objectives: Preferences and Value Tradeoffs. New York, NY, USA: John Wiley & Sons.

INCOSE. 2011. Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities. Version 3.2.1. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.1.

Law, A. 2007. Simulation Modeling and Analysis. 4th ed. New York, NY, USA: McGraw Hill.

Rich, B. and L. Janos. 1996. Skunk Works. Boston, MA, USA: Little, Brown & Company.

Saaty, T.L. 2008. Decision Making for Leaders: The Analytic Hierarchy Process for Decisions in a Complex World. Pittsburgh, PA, USA: RWS Publications. ISBN 0-9620317-8-X.

Raiffa, H. 1997. Decision Analysis: Introductory Lectures on Choices under Uncertainty. New York, NY, USA: McGraw-Hill.

Samson, D. 1988. Managerial Decision Analysis. New York, NY, USA: Richard D. Irwin, Inc.

Schlaifer, R. 1969. Analysis of Decisions under Uncertainty. New York, NY, USA: McGraw-Hill Book Company.

Skinner, D. 1999. Introduction to Decision Analysis. 2nd ed. Sugar Land, TX, USA: Probabilistic Publishing.

Wikipedia contributors. "Decision making software." Wikipedia, The Free Encyclopedia. Accessed September 13, 2011. Available at: http://en.wikipedia.org/w/index.php?title=Decision_making_software&oldid=448914757.

Primary References

Forsberg, K., H. Mooz, and H. Cotterman. 2005. Visualizing Project Management. 3rd ed. Hoboken, NJ, USA: John Wiley and Sons. p. 154-155.

Law, A. 2007. Simulation Modeling and Analysis. 4th ed. New York, NY, USA: McGraw Hill.

Raiffa, H. 1997. Decision Analysis: Introductory Lectures on Choices under Uncertainty. New York, NY, USA: McGraw-Hill.

Saaty, T.L. 2008. Decision Making for Leaders: The Analytic Hierarchy Process for Decisions in a Complex World. Pittsburgh, PA, USA: RWS Publications. ISBN 0-9620317-8-X.

Samson, D. 1988. Managerial Decision Analysis. New York, NY, USA: Richard D. Irwin, Inc.

Schlaifer, R. 1969. Analysis of Decisions under Uncertainty. New York, NY, USA: McGraw-Hill.

Additional References

Blanchard, B. S. 2004. Systems Engineering Management. 3rd ed. New York, NY, USA: John Wiley & Sons.

Boehm, B. 1981. "Software Risk Management: Principles and Practices." IEEE Software 8(1) (January 1991): 32-41.

Detwarasiti, A. and R.D. Shachter. 2005. "Influence Diagrams for Team Decision Analysis." Decision Analysis 2(4): 207-28.

INCOSE. 2011. Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities. Version 3.2.1. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.1.

Keeney, R.L., and H. Raiffa. 1976. Decisions with Multiple Objectives: Preferences and Value Tradeoffs. New York, NY, USA: John Wiley & Sons.

Parnell, G.S., P.J. Driscoll, and D.L. Henderson. 2010. Decision Making in Systems Engineering and Management. New York, NY, USA: John Wiley & Sons.

Rich, B. and L. Janos. 1996. Skunk Works. Boston, MA, USA: Little, Brown & Company.

Skinner, D. 1999. Introduction to Decision Analysis. 2nd ed. Sugar Land, TX, USA: Probabilistic Publishing.

