System Verification


Verification is a set of actions used to check the correctness of any element, such as a system element, a system, a document, a service, a task, a requirement, etc. These actions are planned and carried out throughout the life cycle of the system. Verification is a generic term that needs to be instantiated within the context in which it occurs.

Understood as a process, verification is a transverse activity that spans every life cycle stage of the system. During the development cycle in particular, the Verification Process is performed in parallel with the System Definition and System Realization processes and applies to every activity and to every product resulting from an activity. The activities of each life cycle process and those of the Verification Process interlock; the Integration Process, in particular, makes intensive use of the Verification Process.

Definition and Purpose

Definition of Verification: Confirmation, through the provision of objective evidence, that specified requirements have been fulfilled.

A note added in ISO/IEC 15288 states: verification is a set of activities that compares a system or system element against the required characteristics; this may include, but is not limited to, specified requirements, design description, and the system itself.

The purpose of Verification, as a generic action, is to identify the faults/defects introduced at the time of any transformation of inputs into outputs. Verification is used to prove that the transformation was performed according to the selected and appropriate methods, techniques, standards, or rules.

Verification is based on tangible evidence; that is, on information whose veracity can be demonstrated by factual results obtained through techniques such as inspection, measurement, test, analysis, or calculation. Verifying a system (product, service, or enterprise) therefore consists of comparing the realized characteristics or properties of the product, service, or enterprise against its expected design properties.

Principles and Concepts

Concept of Verification Action

Why verify? – In the context of human endeavors, error is part of thought and of human activity, and engineering activities are no exception. Studies in human reliability have shown that people trained in a specific operation make around 10^-3 errors per hour in the best case. In any activity, or in the outcome of an activity, the search for potential errors should not be neglected on the assumption that they will not happen or should not happen; the consequences of errors can be extremely significant failures or threats.

Verification Action – A Verification Action is first defined and then performed, as illustrated in Figure 1.

Figure 1. Definition and Performance of a Verification and Validation Action (Faisandier 2011) Reprinted with permission of © Alain Faisandier

Definition of a Verification Action applied to an engineering element includes:

  • Identification of the element on which the Verification Action will be performed;
  • Identification of the reference used to define the expected result of the Verification Action.


Performance of the Verification Action includes (a minimal sketch follows this list):

  • obtaining a result by performing the Verification Action on the submitted element,
  • comparing the obtained result with the expected result,
  • deducing the degree of correctness of the element.
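
The comparison step lends itself to a simple illustration. The following minimal Python sketch shows one possible way to represent a Verification Action as a record of the submitted element, the reference, and the expected result, and to perform it by obtaining a result and deducing a binary verdict; the class, requirement identifier, and values are hypothetical and only serve to illustrate the principle.

    from dataclasses import dataclass
    from typing import Any, Callable

    @dataclass
    class VerificationAction:
        """Record of a Verification Action: the element it is performed on
        and the reference that defines the expected result."""
        element_id: str                    # element submitted to verification
        reference: str                     # requirement or design property used as the reference
        expected_result: Any               # result predicted from the reference
        obtain_result: Callable[[], Any]   # technique used to obtain the actual result (test, measurement, ...)

    def perform(action: VerificationAction) -> bool:
        """Perform the action: obtain a result, compare it with the expected
        result, and deduce the correctness of the element."""
        obtained = action.obtain_result()
        return obtained == action.expected_result

    # Hypothetical example: verifying a design property of a power-supply element.
    action = VerificationAction(
        element_id="PSU-01",
        reference="SysReq-042: nominal output voltage shall be 28 V",
        expected_result=28,
        obtain_result=lambda: 28,          # stand-in for an actual measurement
    )
    print(perform(action))                 # True -> element judged correct against this reference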

What to verify? – Any engineering element can be verified using a specific reference for comparison: Stakeholder Requirement, System Requirement, Function, System Element, Document, etc. Examples are provided in Table 1.

Table 1. Examples of Verified and Validated Items (Table Developed for BKCASE)

Verification versus Validation

The term Verification is often associated with the term Validation and understood as a single concept of V & V. Validation is used to ensure that “one is working the right problem” whereas Verification is used to ensure that “one has solved the problem right”. (Martin 1997)

Source of the terms – Etymologically, the term verification comes from the Latin verus (truth) and facere (to make or perform); verification therefore means to prove that something is true or correct (a property, a characteristic, etc.). The term validation comes from the Latin valere (to become strong) and has the same root as value; validation therefore means to prove that something has the right features to produce the expected effects. This plain-English reading of verification and validation follows Lake (INCOSE 1999).

Process similarities and differences - The main differences between the Verification Process and the Validation Process concern the reference used to check the correctness of an element and the acceptability of that correctness.

  • Within verification, the comparison between the expected result and the obtained result is generally binary, whereas within validation the comparison may require a value judgment about whether to accept the obtained result relative to a threshold or limit (see the sketch after this list).
  • Verification relates mainly to one element, whereas validation relates to a set of elements and considers this set as a whole.
  • Validation presupposes that Verification Actions have already been performed.
  • The techniques used to define and perform Verification Actions and those used for Validation Actions are very similar.
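
To make the first difference concrete, the following minimal Python sketch contrasts a binary verification comparison with a validation comparison that accepts a result within an agreed limit; the quantity, values, and tolerance are invented for illustration only.

    def verify(obtained: float, expected: float) -> bool:
        """Verification: the comparison is generally binary (the obtained
        result matches the expected result, or it does not)."""
        return obtained == expected

    def validate(obtained: float, expected: float, tolerance: float) -> bool:
        """Validation: a value judgment may be needed, expressed here as
        acceptance within a threshold or limit."""
        return abs(obtained - expected) <= tolerance

    measured_range_km = 98.7                           # obtained on the integrated system
    print(verify(measured_range_km, 100.0))            # False: not an exact match
    print(validate(measured_range_km, 100.0, 2.0))     # True: accepted within the agreed limit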

Integration, Verification, Validation of the system

There is sometimes a misconception that Verification occurs after Integration and before Validation. In most cases, it is more appropriate to begin verification activities during development and to continue them into deployment and use.

Once the System Elements have been realized, they are integrated to form the complete system. Integration consists of assembling the elements and performing Verification Actions as stated in the Integration Process. A final validation activity generally occurs once the system is integrated, but a certain number of Validation Actions are also performed in parallel with System Integration in order to reduce, as far as possible, the total number of Verification Actions and Validation Actions while controlling the risks that would arise if some checks were dropped. Integration, verification, and validation are therefore intimately processed together, because the verification and validation strategy and the integration strategy must be optimized jointly.

Process Approach

Purpose and Principle of the Approach

The purpose of the Verification Process is to confirm that the specified design requirements are fulfilled by the system. This process provides the information required to effect the remedial actions that correct non-conformances in the realized system or the processes that act on it. (ISO/IEC 15288:2008)

Each system element and the complete system should be compared against its own design references (specified requirements). As stated by Dennis Buede, “verification is the matching of [configuration items], components, sub-systems, and the system to corresponding requirements to ensure that each has been built right.” (Buede 2009) This means that the Verification Process is instantiated as many times as necessary during the global development of the system.

Because of its generic nature, the Verification Process can be applied to any engineering element that has contributed to the definition and realization of the system elements and of the system itself.

Given the huge number of potential Verification Actions that a systematic approach may generate, the verification strategy must be optimized. The strategy balances what must be verified against constraints such as time, cost, and feasibility of testing, which naturally limit the number of Verification Actions, and against the risks one accepts by dropping some Verification Actions.

Several approaches exist that may be used for defining the Verification Process. INCOSE defines two main steps: plan and perform the Verification Actions (INCOSE 2011). NASA has a slightly more detailed approach that includes five main steps: prepare verification, perform verification, analyze outcomes, produce report, and capture work products (NASA 2007, 102).

Any approach may be used, provided that it is appropriate to the scope of the system, the constraints of the project, includes the activities listed above in some way, and is appropriately coordinated with other activities.

Generic inputs are the baseline references of the submitted element. If the element is a system, the inputs are the logical and physical architecture elements as described in a System Design Document, the design description of the interfaces internal to the system, the interface requirements external to the system, and, by extension, the System Requirements.

Generic outputs are the Verification Plan (which includes the verification strategy), the selected Verification Actions, Verification Procedures, Verification Tools, the verified element or system, Verification Reports, issue/trouble reports, and change requests on the design.

Activities of the Process

  • Establish the verification strategy, drafted in a Verification Plan (this activity is carried out concurrently with System Definition activities):
  1. Identify the verification scope by listing, as exhaustively as possible, the characteristics or properties that should be checked; the number of Verification Actions can be high.
  2. Identify the constraints according to their origin (technical feasibility; management constraints such as cost, time, and availability of verification means or qualified personnel; contractual constraints such as criticality of the mission) that potentially limit the Verification Actions.
  3. Define the appropriate verification techniques to be applied (inspection, analysis, simulation, peer review, testing, etc.), depending on the project step best suited to perform each Verification Action given the constraints.
  4. Trade off what should be verified (the scope) against all constraints or limits and deduce what can be verified; the selection of Verification Actions is made according to the type of system, the objectives of the project, the acceptable risks, and the constraints (a minimal selection sketch follows this list).
  5. Optimize the verification strategy by defining the most appropriate verification technique for each Verification Action, defining the necessary verification means (tools, test benches, personnel, location, facilities) according to the selected technique, scheduling the execution of Verification Actions within project steps or milestones, and defining the configuration of the elements submitted to Verification Actions (mainly for testing on physical elements).
  • Performing Verification Actions includes the following tasks:
  1. Detail each Verification Action, in particular the expected results, the verification technique to be applied, and the corresponding means (equipment, resources, and qualified personnel).
  2. Acquire the verification means used during the system definition steps (qualified personnel, modeling tools, mock-ups, simulators, facilities); then acquire those used during the integration step (qualified personnel, Verification Tools, measuring equipment, facilities, Verification Procedures, etc.).
  3. Carry out the Verification Procedures at the right time, in the expected environment, with the expected means, tools, and techniques.
  4. Capture and record the results obtained when performing the Verification Actions using the Verification Procedures and means.
  • Analyze the obtained results and compare them to the expected results; record the status as compliant or not; generate Verification Reports and, as necessary, issue/trouble reports and change requests on the design.
  • Controlling the process includes the following tasks:
  1. Update the Verification Plan according to the progress of the project; in particular, planned Verification Actions may be redefined because of unexpected events.
  2. Coordinate verification activities with the project manager for schedule, acquisition of means, personnel, and resources; with the designers for issue/trouble/non-conformance reports; and with the configuration manager for versions of physical elements, design baselines, etc.
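
As a concrete, simplified illustration of the trade-off in item 4 of the strategy tasks above, the following Python sketch retains the candidate Verification Actions whose omission is judged riskiest per unit of cost, until a verification budget is exhausted; the residual risk of the dropped actions would then have to be explicitly accepted. The scoring scheme, action names, and numbers are hypothetical and shown only to illustrate one possible way of making the selection.

    from dataclasses import dataclass

    @dataclass
    class CandidateAction:
        name: str
        cost: float             # effort needed to perform the Verification Action
        risk_if_dropped: float  # assumed risk score if the action is not performed

    def select_actions(candidates, budget):
        """Greedy sketch of the trade-off: keep the actions whose omission is
        riskiest per unit of cost until the verification budget is exhausted."""
        ranked = sorted(candidates, key=lambda a: a.risk_if_dropped / a.cost, reverse=True)
        selected, spent = [], 0.0
        for action in ranked:
            if spent + action.cost <= budget:
                selected.append(action)
                spent += action.cost
        return selected

    candidates = [
        CandidateAction("Full environmental test", cost=50.0, risk_if_dropped=9.0),
        CandidateAction("Interface inspection",    cost=5.0,  risk_if_dropped=6.0),
        CandidateAction("Timing analysis",         cost=10.0, risk_if_dropped=7.0),
    ]
    for a in select_actions(candidates, budget=20.0):
        print(a.name)   # actions retained within the budget; the rest become accepted risks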

Artifacts and Ontology Elements

This process may create several artifacts such as:

  • Verification Plan (contains the verification strategy)
  • Verification Matrix (contains, for each Verification Action, the submitted element, the applied technique, the step of execution, the system block concerned, the expected result, the obtained result, etc.)
  • Verification Procedures (describe the Verification Actions to be performed, the Verification Tools needed, the Verification Configuration, resources, personnel, schedule, etc.)
  • Verification Reports
  • Verification Tools
  • Verified element
  • Issue / Non-Conformance / Trouble Reports
  • Change Requests on design

This process handles the ontology elements listed in Table 2 below.

Process Approach

Introduction

Several approaches exist for defining the verification and validation processes. INCOSE defines two main steps: plan and perform the verification actions (INCOSE 2011).

NASA has a slightly more detailed approach that includes five main steps: prepare verification, perform verification, analyze outcomes, produce report, and capture work products (NASA 2007, 102).

Any approach may be used, provided that it is appropriate to the scope of the system, the constraints of the project, includes the activities listed above in some way, and is appropriately coordinated with other activities (including system definition, system realization, and extension to the rest of the life cycle).

Verification Process

Major activities and tasks performed during this process include:

  1. Establish a verification strategy, drafted in a verification and validation plan (this activity is carried out concurrently with system definition activities), accomplished through the following tasks:
    1. Identify the verification scope by listing, as exhaustively as possible, the characteristics or properties that should be checked. The number of verification and validation actions can be very high.
    2. Identify the constraints according to their origin (e.g., technical feasibility; management constraints such as cost, time, and availability of verification and validation means or qualified personnel; contractual constraints such as criticality of the mission) that potentially limit the verification and validation actions.
    3. Define the appropriate verification and validation techniques to be applied, such as inspection, analysis, simulation, peer review, testing, etc., depending on the project step best suited to perform every verification and validation action according to the constraints.
    4. Trade off what should be verified (scope), taking into account all the constraints or limits, and deduce what can be verified. The selection of verification and validation actions should be made according to the type of system, the objectives of the project, the importance of the requirement, the acceptable risks, and the constraints.
    5. Develop the verification and validation strategy by defining the most appropriate verification technique for every verification and validation action, defining the necessary verification and validation means (tools, test benches, personnel, location, facilities) according to the selected verification technique, scheduling the execution of the verification and validation actions in the project steps or milestones, and defining the configuration of the elements submitted to verification and validation actions (mainly about testing on physical elements).
  2. Performance of the verification and validation actions includes the following tasks:
    1. Detail each verification and validation action, in particular the expected results, the verification technique to be applied, and the corresponding means (equipment, resources, and qualified personnel).
    2. Acquire the verification and validation means used during the system definition steps (qualified personnel, modeling tools, mock-ups, simulators, facilities). Then acquire the verification and validation means used during the integration step (qualified personnel, verification and validation tools, measuring equipment, facilities, verification and validation procedures, etc.).
    3. Carry out the verification and validation procedures at the right time, in the expected environment, with the expected means, tools, and techniques.
    4. Capture and record the results obtained when performing the verification and validation actions using verification and validation procedures and means.
  3. Analyze obtained results and compare them to the expected results, record the status as compliant or not, generate verification reports and potential issue/trouble reports, and change requests on the design as necessary.
  4. Controlling the process includes the following tasks:
    1. Update the verification and validation plan according to the progress of the project. In particular, the planned verification and validation actions can be redefined because of unexpected events (addition, deletion, or modification of actions).
    2. Coordinate the verification activities with the project manager for schedule, acquisition of means, and review of personnel and resources, as well as with the designers for issues/trouble/non-conformance reports and with configuration managers for versions of physical elements, design baselines, etc.

Validation Process

Major activities and tasks performed during this process include:

  1. Establish a validation strategy in a verification and validation plan (this activity is carried out concurrently with system definition activities), including the following tasks:
    1. Identify the validation scope, which is represented by the system and/or stakeholder requirements; normally, every requirement should be checked, so the number of verification and validation actions can be high.
    2. Identify the constraints according to their origin: (same as verification process above).
    3. Define the appropriate verification/validation techniques to be applied: (same as verification process above).
    4. Trade off of what should be validated (scope): (same as verification process above).
    5. Optimize the verification and validation strategy: (same as verification process above).
  2. Performance of the verification and validation actions includes the following tasks: (same as verification process above).
  3. Analyze obtained results and compare them to the expected results. Decide about the acceptability of the conformance/compliance, record the decision and the status as compliant or not, generate validation reports and potential issue/trouble reports, and change requests on the system or stakeholder requirements as necessary.
  4. Controlling the process includes the following tasks: (same as verification process above).

Artifacts and Ontology Elements

These processes may create several artifacts such as:

  1. Verification and validation plan (contains in particular the verification and validation strategy with objectives, constraints, the list of the selected verification and validation actions, etc.)
  2. Verification and validation matrix (contains for each verification and validation action, the submitted element, the applied technique/method, the step of execution, the system block concerned, the expected result, the obtained result, etc.)
  3. Verification and validation procedures (describe the verification and validation actions to be performed, the verification and validation tools needed, the verification and validation configuration, resources, personnel, schedule, etc.)
  4. Verification and validation reports
  5. Verification and validation tools
  6. Verified and validated element (system, system element, etc.)
  7. Issue / Non Conformance / Trouble Reports
  8. Change requests on requirement, product, service, enterprise

These processes handle the ontology elements of Table 2.

Table 2. Main Ontology Elements As Handled Within Verification and Validation (Table Developed for BKCASE)

Methods and Techniques

Verification and Validation techniques

There are several verification and validation techniques/methods for checking that an element or a system complies with its system and/or stakeholder requirements. These techniques are common to verification and validation, but the purposes differ: verification is used to detect faults/defects, whereas validation is used to prove the satisfaction of the system and/or stakeholder requirements. Table 3 below provides summary descriptions of some of these techniques.

Table 3. Verification and Validation Techniques (Table Developed for BKCASE)

Validation/Traceability Matrix

The System Definition knowledge area introduced the traceability matrix. It may also be extended and used to record data such as the list of verification and validation actions, the verification/validation technique selected to verify/validate the implementation of each engineering element (in particular stakeholder and system requirements), the expected results, and the results obtained when the verification and validation action has been performed. Such a matrix enables the development team to ensure that the selected stakeholder and system requirements have been verified and to evaluate the percentage of verification and validation actions completed.
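
The following minimal Python sketch illustrates this use of an extended traceability matrix: one row per verification and validation action, with the columns described above, from which the completion percentage and the open requirements can be derived. The requirement identifiers, values, and column names are invented for illustration.

    # Simplified verification and validation matrix: one row per action.
    matrix = [
        {"requirement": "SysReq-001", "action": "VA-01", "technique": "test",
         "expected": "range >= 100 km",   "obtained": "102 km", "status": "compliant"},
        {"requirement": "SysReq-002", "action": "VA-02", "technique": "analysis",
         "expected": "mass <= 950 kg",    "obtained": "940 kg", "status": "compliant"},
        {"requirement": "SysReq-003", "action": "VA-03", "technique": "inspection",
         "expected": "label per drawing", "obtained": None,     "status": "pending"},
    ]

    completed = [row for row in matrix if row["obtained"] is not None]
    print(f"{100 * len(completed) / len(matrix):.0f}% of verification and validation actions completed")

    open_requirements = [row["requirement"] for row in matrix if row["status"] != "compliant"]
    print("Requirements not yet verified:", open_requirements)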

Practical Considerations

Major pitfalls encountered with System verification and validation are presented in Table 4.

Table 4. Major pitfalls with System Verification and Validation (Table Developed for BKCASE)

Major proven practices encountered with system verification and validation are presented in Table 5.

Table 5. Proven Practices with System Verification and Validation (Table Developed for BKCASE)

A good survey with examples of famous failures is provided by Bahill and Henderson (2005).

References

Citations

Bahill, A.T. and S.J. Henderson. 2005. "Requirements Development, Verification, and Validation Exhibited in Famous Failures." Systems Engineering 8(1): 1-14.

Buede, D.M. 2009. The Engineering Design of Systems: Models and Methods. 2nd ed. Hoboken, NJ, USA: John Wiley & Sons Inc.

INCOSE. 2011. INCOSE Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities, version 3.2.1. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.1.

ISO/IEC/IEEE. 2008. Systems and Software Engineering -- System Life Cycle Processes. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE). ISO/IEC/IEEE 15288:2008 (E).

NASA. 2007. Systems Engineering Handbook. Washington, DC, USA: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105.

Primary References

INCOSE. 2011. INCOSE Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities, version 3.2.1. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.1.

ISO/IEC/IEEE. 2008. Systems and Software Engineering -- System Life Cycle Processes. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE). ISO/IEC/IEEE 15288:2008 (E).

NASA. 2007. Systems Engineering Handbook. Washington, D.C.: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105.

Additional References

Bahill, A.T. and S.J. Henderson. 2005. "Requirements Development, Verification, and Validation Exhibited in Famous Failures." Systems Engineering 8(1): 1-14.

Buede, D.M. 2009. The Engineering Design of Systems: Models and Methods. 2nd ed. Hoboken, NJ, USA: John Wiley & Sons Inc.

DAU. 2010. Defense Acquisition Guidebook (DAG). Ft. Belvoir, VA, USA: Defense Acquisition University (DAU)/U.S. Department of Defense (DoD). February 19, 2010.

ECSS. 2009. Systems Engineering General Requirements. Noordwijk, Netherlands: Requirements and Standards Division, European Cooperation for Space Standardization (ECSS), 6 March 2009. ECSS-E-ST-10C.

SAE International. 1996. Certification Considerations for Highly-Integrated or Complex Aircraft Systems. Warrendale, PA, USA: SAE International, ARP4754.

SEI. 2007. "Measurement and Analysis Process Area" in Capability Maturity Model Integrated (CMMI) for Development, version 1.2. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).



