Assessment and Control
The purpose of Systems Engineering Assessment and Control (SEAC) is to provide adequate visibility into the project's actual technical progress and risks with respect to the technical plans (i.e., the Systems Engineering Management Plan (SEMP) and its subordinate plans). This visibility allows the project team to take timely preventive action when adverse trends are recognized, or corrective action when performance deviates beyond established thresholds or expected values. SEAC includes preparing for and conducting reviews and audits to monitor performance. The results of the reviews and measurement analyses are used to identify and record findings/discrepancies, and may lead to causal analysis and corrective/preventive action plans. Action plans are implemented, tracked, and monitored to closure. (NASA 2007, Section 6.7; Caltrans and USDOT 2005, Sections 3.9.3 and 3.9.10; INCOSE 2011, Clause 6.2; SEI 2007)
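The threshold logic at the core of SEAC can be made concrete with a short sketch. The following is a minimal illustration in Python, assuming a hypothetical TechnicalMeasure record; the measure name, the numbers, and the 80% trend band are invented for the example and are not taken from any cited source.

```python
# Minimal sketch of threshold-based assessment: compare an actual technical
# measure against its planned value and classify the deviation.
# The TechnicalMeasure record and the 80% trend band are assumptions made
# for illustration, not constructs from the cited references.
from dataclasses import dataclass

@dataclass
class TechnicalMeasure:
    name: str
    planned: float    # value the technical plans call for at this point
    actual: float     # value observed through measurement
    threshold: float  # allowed deviation before corrective action is required

    def deviation(self) -> float:
        return abs(self.actual - self.planned)

    def status(self) -> str:
        """Corrective action when the threshold is breached; preventive
        action when an adverse trend approaches it; otherwise on plan."""
        if self.deviation() > self.threshold:
            return "corrective action required"
        if self.deviation() > 0.8 * self.threshold:
            return "adverse trend - consider preventive action"
        return "on plan"

# Example: vehicle mass tracked as a technical performance measure.
mass = TechnicalMeasure("vehicle mass (kg)", planned=950.0, actual=995.0, threshold=50.0)
print(mass.name, "->", mass.status())  # adverse trend - consider preventive action
```

In practice the planned values and thresholds come from the SEMP and its subordinate plans, and the resulting status feeds the reviews and analyses described below.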
SE Assessment and Control Process Overview
The Systems Engineering Assessment and Control process includes determining and initiating appropriate handling strategies and actions for findings and/or discrepancies uncovered in the enterprise, infrastructure, or life cycle activities associated with the project. Analysis of the causes of the findings/discrepancies aids in the determination of appropriate handling strategies. Implementation of approved preventive, corrective, or improvement actions ensures satisfactory completion of the project within planned technical, schedule, and cost objectives. Potential action plans for findings and/or discrepancies are reviewed in the context of the overall set of actions and priorities in order to optimize the benefits to the project and/or organization. Interrelated items are analyzed together to obtain a consistent and cost-effective resolution.
The SE assessment and control process includes the following activities:
- Monitor and review technical performance and resource usage against plan
- Monitor technical risk, escalate significant risks to the project risk register, and seek project funding to execute risk mitigation plans (a sketch of a simple escalation rule follows this list)
- Hold technical reviews and report outcomes at the project reviews
- Analyze issues and determine appropriate actions
- Manage actions to closure
- Hold a Post Delivery Assessment (also known as a Post Project Review) to capture knowledge associated with the project (this may be a separate technical assessment or it may be conducted as part of the Project Assessment and Control process).
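To make the escalation step concrete, the sketch below assumes a hypothetical 5x5 likelihood/consequence scoring scheme; the scales, the escalation cutoff of 12, and the example risks are illustrative assumptions rather than material from the cited references.

```python
# Minimal sketch of escalating significant technical risks to the project
# risk register. The 1-5 scales and the cutoff score of 12 are assumed
# purely for illustration.
from dataclasses import dataclass

@dataclass
class Risk:
    title: str
    likelihood: int   # 1 (rare) .. 5 (near certain), assumed scale
    consequence: int  # 1 (negligible) .. 5 (severe), assumed scale

    @property
    def score(self) -> int:
        return self.likelihood * self.consequence

technical_risks = [
    Risk("Battery vendor slips qualification", likelihood=4, consequence=4),
    Risk("Minor GUI rework needed", likelihood=2, consequence=1),
]

# Escalation rule (assumed): risks scoring 12 or more are significant
# enough to surface at project level and to justify mitigation funding.
project_risk_register = [r for r in technical_risks if r.score >= 12]
print([r.title for r in project_risk_register])  # ['Battery vendor slips qualification']
```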
The following activities are normally conducted as part of a Project Assessment and Control process:
- Authorization, release, and closure of work
- Monitor project performance and resource usage against plan
- Monitor project risk and authorize expenditure of project funds to execute risk mitigation plans
- Hold Project reviews
- Analyze issues and determine appropriate actions
- Manage actions to closure (a sketch of a simple action register follows this list)
- Hold a Post Delivery Assessment (also known as a Post Project Review) to capture knowledge associated with the Project
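Managing actions to closure implies keeping a register in which every action has an owner, a due date, and a status that is driven to closed. Here is a minimal sketch, assuming a hypothetical ActionItem record and an in-memory register; the field names and example data are illustrative.

```python
# Minimal sketch of an action register: actions are opened, tracked, and
# driven to closure; overdue open actions are surfaced for escalation.
from dataclasses import dataclass
from datetime import date

@dataclass
class ActionItem:
    item_id: int
    description: str
    owner: str
    due: date
    status: str = "open"  # open -> closed

class ActionRegister:
    def __init__(self) -> None:
        self._items: dict[int, ActionItem] = {}

    def open_action(self, item: ActionItem) -> None:
        self._items[item.item_id] = item

    def close_action(self, item_id: int) -> None:
        self._items[item_id].status = "closed"

    def overdue(self, today: date) -> list[ActionItem]:
        """Open actions past their due date, to be raised at the next review."""
        return [a for a in self._items.values()
                if a.status != "closed" and a.due < today]

register = ActionRegister()
register.open_action(ActionItem(1, "Update ICD per SRR finding", "lead engineer", date(2011, 10, 1)))
print([a.description for a in register.overdue(date(2011, 11, 1))])
register.close_action(1)
```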
Examples of major technical reviews used in SEAC, drawn from (DAU 2010), are listed in Table 1.
Table 1. Major Technical Reviews (DAU 2010)

| Name | Description |
|---|---|
| Alternative Systems Review | A multi-disciplined review to ensure the resulting set of requirements agrees with the customers' needs and expectations. |
| Critical Design Review (CDR) | A multi-disciplined review establishing the initial product baseline to ensure that the system under review has a reasonable expectation of satisfying the requirements of the Capability Development Document within the currently allocated budget and schedule. |
| Functional Configuration Audit | A formal examination of the as-tested characteristics of a configuration item (hardware and software) with the objective of verifying that actual performance complies with design and interface requirements in the functional baseline. |
| In-Service Review | A multi-disciplined product and process assessment to ensure that the system under review is operationally employed with well-understood and managed risk. |
| Initial Technical Review | A multi-disciplined review to support a program's initial Program Objective Memorandum submission. |
| Integrated Baseline Review | A joint assessment conducted by the government program manager and the contractor to establish the Performance Measurement Baseline. |
| Operational Test Readiness Review | A multi-disciplined product and process assessment to ensure that the system can proceed into Initial Operational Test and Evaluation with a high probability of success, and that the system is effective and suitable for service introduction. |
| Production Readiness Review (PRR) | Examines a program to determine if the design is ready for production and if the prime contractor and major subcontractors have accomplished adequate production planning without incurring unacceptable risks that will breach thresholds of schedule, performance, cost, or other established criteria. |
| Physical Configuration Audit | Examines the actual configuration of an item being produced around the time of the Full-Rate Production Decision. |
| Preliminary Design Review (PDR) | A technical assessment establishing the physically allocated baseline to ensure that the system under review has a reasonable expectation of being judged operationally effective and suitable. |
| System Functional Review (SFR) | A multi-disciplined review to ensure that the system's functional baseline is established and has a reasonable expectation of satisfying the requirements of the Initial Capabilities Document or draft Capability Development Document within the currently allocated budget and schedule. |
| System Requirements Review (SRR) | A multi-disciplined review to ensure that the system under review can proceed into initial systems development, and that all system requirements and performance requirements derived from the Initial Capabilities Document or draft Capability Development Document are defined and testable, and are consistent with cost, schedule, risk, technology readiness, and other system constraints. |
| System Verification Review (SVR) | A multi-disciplined product and process assessment to ensure the system under review can proceed into Low-Rate Initial Production and full-rate production within cost (program budget), schedule (program schedule), risk, and other system constraints. |
| Technology Readiness Assessment | A systematic, metrics-based process that assesses the maturity of critical technology elements, including sustainment drivers. |
| Test Readiness Review (TRR) | A multi-disciplined review designed to ensure that the subsystem or system under review is ready to proceed into formal test. |
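Most of the reviews in Table 1 act as gates with defined entry and exit criteria. The sketch below shows one way a team might script an entry-criteria check before convening a review; the Test Readiness Review criteria listed are invented examples, not the criteria defined in (DAU 2010).

```python
# Minimal sketch of gating a technical review on its entry criteria.
# The criteria names and their states are invented for illustration.
trr_entry_criteria = {
    "test procedures approved": True,
    "test environment validated": True,
    "no open critical defects": False,
}

def ready_for_review(criteria: dict[str, bool]) -> bool:
    """A review proceeds only when every entry criterion is satisfied."""
    return all(criteria.values())

if not ready_for_review(trr_entry_criteria):
    unmet = [name for name, met in trr_entry_criteria.items() if not met]
    print("TRR not ready; unmet entry criteria:", unmet)
```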
Linkages to Other Systems Engineering Management Topics
The Systems Engineering Assessment and Control process is closely coupled with the Measurement, Planning, Decision Management, and Risk Management processes. Measurement provides the indicators used to compare actuals against plans. Planning provides the estimates, milestones, and project plan, together with the measures used to monitor progress. Decision Management uses the results of project monitoring as criteria for making control decisions.
Practical Considerations
Key pitfalls and good practices related to SEAC are described in the next two sections.
Pitfalls
Some of the key pitfalls encountered in planning and performing SE Assessment and Control are:
- No Measurement
- "Something in Time" Culture
- No Teeth
- Too Early Baselining
Good Practices
Some good practices gathered from the references are:
- Independence
- Peer Reviews
- Accept Uncertainty
- Risk Mitigation Plans
- Just-In-Time Baselining
- Communication
- Full Visibility
- Leverage Previous Root Cause Analysis
- Concurrent Management
- Lessons Learned and Post-Mortems
Additional good practices can be found in (INCOSE 2011, Clause 6.2 and Section 5.2.1.5), (Caltrans and USDOT 2005, Sections 3.9.3 and 3.9.10), and (NASA 2007, Section 6.7).
References
Citations
Caltrans and USDOT. 2005. Systems Engineering Guidebook for Intelligent Transportation Systems (ITS). Version 1.1. Sacramento, CA, USA: California Department of Transportation (Caltrans) Division of Research & Innovation/U.S. Department of Transportation (USDOT), SEG for ITS 1.1.
DAU. 2010. Defense Acquisition Guidebook (DAG). Ft. Belvoir, VA, USA: Defense Acquisition University (DAU)/U.S. Department of Defense (DoD). February 19, 2010.
INCOSE. 2011. Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities. Version 3.2.1. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.1.
NASA. 2007. Systems Engineering Handbook. Washington, DC, USA: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105.
SEI. 2007. "Measurement and Analysis Process Area," in Capability Maturity Model Integration (CMMI) for Development. Version 1.2. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).
Primary References
Caltrans and USDOT. 2005. Systems Engineering Guidebook for Intelligent Transportation Systems (ITS). Version 1.1. Sacramento, CA, USA: California Department of Transportation (Caltrans) Division of Research & Innovation/U.S. Department of Transportation (USDOT), SEG for ITS 1.1.
DAU. 2010. Defense Acquisition Guidebook (DAG). Ft. Belvoir, VA, USA: Defense Acquisition University (DAU)/U.S. Department of Defense (DoD). February 19, 2010.
INCOSE. 2011. Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities. Version 3.2.1. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.1.
NASA. 2007. Systems Engineering Handbook. Washington, DC, USA: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105.
SEI. 2007. "Measurement and Analysis Process Area," in Capability Maturity Model Integration (CMMI) for Development. Version 1.2. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).
Additional References
ISO/IEC/IEEE. 2009. Systems and Software Engineering - Life Cycle Processes - Project Management. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE), ISO/IEC/IEEE 16326:2009(E).