System Validation

From SEBoK

Validation is a set of actions used to check the compliance of any element (a system element, a system, a document, a service, a task, a System Requirement, etc.) with its purpose. These actions are planned and carried out throughout the life cycle of the system. Validation is a generic term that needs to be instantiated within the context in which it occurs.

Validation, understood as a process, is an activity that cuts across every life cycle stage of the system. In particular, during the development cycle of the system, the Validation Process is performed in parallel with the System Definition and System Realization processes and applies to every activity and to the product resulting from that activity. The Validation Process generally occurs at the end of a set of life cycle tasks or activities, and at least at every milestone of a development project.

The Validation Process is not limited to a phase at the end of the development of the system. It may be performed iteratively on every engineering element produced during development and may begin with the validation of the expressed Stakeholder Requirements. The Validation Process applied to the completely integrated system is often called Final Validation.

Definition and Purpose

Definition of Validation: Confirmation, through the provision of objective evidence, that the requirements for a specific intended use or application have been fulfilled.

A note added in ISO 9000:2005 states: validation is the set of activities ensuring and gaining confidence that a system is able to accomplish its intended use, goals, and objectives (i.e., meet stakeholder requirements) in the intended operational environment.

The purpose of Validation, as a generic action, is to establish the compliance of any activity's output against the inputs of that activity. Validation is used to prove that the transformation of inputs produced the expected, "right" result.

Validation is based on tangible evidence; that is, on information whose veracity can be demonstrated by factual results obtained through techniques or methods such as inspection, measurement, test, analysis, or calculation.

Thus, validating a system (product, service, or enterprise) consists of demonstrating that it satisfies its System Requirements and, depending on contractual practices, its Stakeholder Requirements. From a global point of view, validating a system consists of acquiring confidence in its ability to achieve its intended mission or use under specific operational conditions.

Principles

Concept of Validation Action

Validation Action – A Validation Action is first defined and then performed.

Definition of a Validation Action applied to an engineering element includes:

  • Identification of the element on which the Validation Action will be performed,
  • Identification of the reference used to define the expected result of the Validation Action.

Performance of the Validation Action includes:

  • Obtaining a result by performing the Validation Action on the submitted element,
  • Comparing the obtained result with the expected result,
  • Deducing the degree of compliance of the element,
  • Deciding on the acceptability of this compliance; sometimes the result of the comparison requires a value judgment about relevance in the context of use in order to accept or reject the obtained result.

Note: If there is uncertainty about compliance, the cause may be ambiguity in the requirements.
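The definition and performance steps above can be sketched as a small data structure. This is a minimal illustration, not part of any standard: the element name, the requirement identifier, and the pass criterion below are hypothetical.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class ValidationAction:
    """A Validation Action: a submitted element checked against a reference."""
    element: str                    # element on which the action is performed
    reference: str                  # reference defining the expected result
    check: Callable[[Any], bool]    # encodes the expected result as a pass criterion

    def perform(self, obtained: Any) -> dict:
        """Compare the obtained result with the expected one and deduce compliance."""
        return {"element": self.element,
                "reference": self.reference,
                "obtained": obtained,
                "compliant": self.check(obtained)}

# Hypothetical example: a response-time requirement on a service
action = ValidationAction(
    element="login service",
    reference="STK-REQ-042: response time <= 2.0 s",
    check=lambda seconds: seconds <= 2.0)
report = action.perform(obtained=1.7)   # compliant: True
```

Note that the final acceptability decision may still require human judgment (the last bullet above); the boolean criterion here only covers the mechanical comparison.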

Placeholder for Figure 1 - Definition and usage of a Validation Action

What to validate? – Any engineering element can be validated against a specific reference used for comparison: a Stakeholder Requirement, System Requirement, Function, System Element, Document, etc. Examples are provided in Table 1:

Placeholder for Table 1 - Examples of validated items

Validation versus Verification

The "Verification versus Validation" section of System Verification gives fundamental differences between the two concepts and associated processes. Following table provides synthetic information to help understanding differences.

Placeholder for Table 2 - Verification and validation differences

Validation, Final Validation, Operational Validation

System Validation concerns the global system seen as a whole and is based on the totality of requirements (System Requirements and Stakeholder Requirements). It is achieved gradually throughout the development stage by pursuing three non-exclusive approaches:

  • Accumulating the results of Verification Actions and Validation Actions obtained by applying the corresponding processes to every engineering element.
  • Performing final Validation Actions on the completely integrated system in an industrial environment (as close as possible to the operational environment).
  • Performing operational Validation Actions on the complete system in its operational environment (context of use).

Verification and Validation Level per Level

It is impossible to carry out only a single global Validation of a completely integrated complex system. The sources of faults/defects would be too numerous, and it would be impossible to determine the causes of non-conformances raised during such a global check. Generally, the System-of-Interest has been decomposed during design into a set of layers of systems; thus every system and system element is verified, validated, and possibly corrected before being integrated into the parent system of the higher level, as shown in Figure 2.

Figure 2 - Verification and Validation level per level (Faisandier, Roussel, 2011)


As necessary, systems and system elements are partially integrated into subsets in order to limit the number of properties to be verified within a single step. For each level, a set of final Validation Actions must confirm that the properties established at the preceding level have not been degraded. Moreover, a compliant result obtained in a given environment can become non-compliant if the environment changes. So, as long as the system is not completely integrated and/or does not operate in its real operational environment, no result should be regarded as definitive.
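The level-per-level approach can be pictured as a recursive walk over the system decomposition: every child is validated (and corrected if necessary) before being integrated into its parent, and only then is the parent itself validated. The node names and the single boolean per-level check below are hypothetical simplifications of the real sets of Verification and Validation Actions.

```python
from dataclasses import dataclass, field

@dataclass
class SystemNode:
    """One level of the system decomposition."""
    name: str
    children: list = field(default_factory=list)
    level_check_passes: bool = True   # stand-in for this level's V&V Actions

def validate_level_per_level(node: SystemNode, log: list) -> bool:
    """Validate every child before integrating it into the parent,
    then validate the (partially) integrated parent itself."""
    for child in node.children:
        if not validate_level_per_level(child, log):
            log.append(f"stop: correct {child.name} before integrating into {node.name}")
            return False
    ok = node.level_check_passes
    log.append(f"{node.name}: {'validated' if ok else 'non-compliant'}")
    return ok

# Hypothetical two-layer decomposition
soi = SystemNode("system-of-interest",
                 children=[SystemNode("subsystem A"), SystemNode("subsystem B")])
log = []
validate_level_per_level(soi, log)   # children first, then the parent
```

The early return on a failing child mirrors the practice of correcting an element before integrating it upward, so the causes of a non-conformance stay localized to one level.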

Verification Actions and Validation Actions within and across Levels

Inside each level of system decomposition, Verification Actions and Validation Actions are performed during System Definition and System Realization, as represented in Figure 3 for the upper levels and in Figure 4 for the lower levels. Stakeholder Requirements Definition and Operational Validation make the link between two levels of the system decomposition.

Figure 3 - Verification and Validation Actions in Upper Levels of System Decomposition (Faisandier 2011) Reprinted with permission of © Alain Faisandier

System Elements Requirements and Products Operational Validation make the link between the two lower levels of the decomposition – see Figure 4.

Figure 4 - Verification and Validation Actions in Lower Levels of System Decomposition (Faisandier 2011) Reprinted with permission of © Alain Faisandier

Note: The last level of the system decomposition is dedicated to the realization of System Elements, so the vocabulary and the number of activities shown in Figure 4 may differ.

Verification and Validation Strategy

The distinction between verification and validation is especially useful for elaborating the Integration strategy, the Verification strategy, and the Validation strategy. In fact, the efficiency of System Realization is gained by optimizing the three strategies together to form what is often called the Verification and Validation (V&V) strategy. The optimization consists of defining and performing the minimum set of Verification and Validation Actions that detects the maximum number of errors/faults/defects and provides the maximum confidence. The optimization takes into account the risks potentially generated if Verification Actions or Validation Actions are dropped.
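One simple way to picture this optimization is a greedy selection that keeps the actions with the best risk coverage per unit cost until an effort budget is exhausted. The action names, costs, and risk figures below are purely illustrative; a real strategy weighs many more factors (schedule, availability of means, contractual criticality).

```python
def select_actions(candidates, budget):
    """Greedy sketch: keep the actions with the highest
    risk-if-dropped per unit cost that still fit the budget."""
    ranked = sorted(candidates, key=lambda a: a["risk"] / a["cost"], reverse=True)
    selected, remaining = [], budget
    for action in ranked:
        if action["cost"] <= remaining:
            selected.append(action["name"])
            remaining -= action["cost"]
    return selected

# Illustrative numbers only
candidates = [
    {"name": "final test campaign", "cost": 50, "risk": 90},
    {"name": "requirements inspection", "cost": 5, "risk": 30},
    {"name": "mission-profile simulation", "cost": 20, "risk": 40},
]
chosen = select_actions(candidates, budget=60)
# -> ["requirements inspection", "mission-profile simulation"]
```

The point of the sketch is the trade-off itself: within the given budget, the cheap inspection and the simulation together cover more risk than the single expensive test campaign.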

Process Approach

Purpose and Principle of the approach

The purpose of the Validation Process is to provide objective evidence that the services provided by a system when in use comply with stakeholder requirements, achieving its intended use in its intended operational environment. (ISO/IEC 2008)

This process performs a comparative assessment and confirms that the stakeholder requirements are correctly defined. Where variances are identified, these are recorded and guide corrective actions. System validation is ratified by stakeholders. (ISO/IEC 2008)

The validation process demonstrates that the realized end product satisfies its stakeholders' (customers and other interested parties) expectations within the intended operational environments, with validation performed by anticipated operators and/or users. (NASA December 2007, 1-360).

Each system element, system, and the complete system-of-interest are compared against their own applicable requirements (System Requirements, Stakeholder Requirements). This means that the Validation Process is instantiated as many times as necessary during the global development of the system.

In order to ensure that validation is feasible, the implementation of requirements must be verifiable on the submitted element. Ensuring that requirements are properly written, i.e., quantifiable, measurable, unambiguous, etc., is essential. In addition, verification/validation requirements are often written in conjunction with the Stakeholder and System Requirements and provide the method for demonstrating the implementation of each System Requirement or Stakeholder Requirement.

Generic inputs are the reference requirements applicable to the submitted element. If the element is a system, the inputs are its System Requirements and Stakeholder Requirements.

Generic outputs are the Validation Plan (which includes the validation strategy), the selected Validation Actions, Validation Procedures, Validation Tools, the validated element or system, Validation Reports, issue/trouble reports, and change requests on the requirements or on the system.

Activities of the Process

Major activities and tasks performed during this process include:

  • Establish a validation strategy, drafted in a Validation Plan (this activity is carried out concurrently with System Definition activities):
  1. Identify the validation scope, represented by the (System and/or Stakeholder) requirements; normally, every requirement should be checked, so the number of Validation Actions can be high.
  2. Identify the constraints, according to their origin (technical feasibility; management constraints such as cost, time, and availability of validation means or qualified personnel; contractual constraints such as criticality of the mission), that potentially limit or increase the number of Validation Actions.
  3. Define the appropriate verification/validation techniques to be applied, such as inspection, analysis, simulation, review, and testing, and the most suitable project step at which to perform every Validation Action given the constraints.
  4. Trade off what should be validated (the scope), taking into account all constraints or limits, and deduce what can be validated objectively; the selection of Validation Actions is made according to the type of system, the objectives of the project, the acceptable risks, and the constraints.
  5. Optimize the validation strategy by defining the most appropriate validation technique for every Validation Action, defining the necessary validation means (tools, test benches, personnel, location, facilities) according to the selected validation technique, scheduling the execution of Validation Actions within the project steps or milestones, and defining the configuration of the elements submitted to Validation Actions (mainly for testing on physical elements).
  • Perform Validation Actions, which includes the following tasks:
  1. Detail each Validation Action, in particular the expected results, the validation technique to be applied, and the corresponding means (equipment, resources, and qualified personnel).
  2. Acquire the validation means used during the system definition steps (qualified personnel, modeling tools, mock-ups, simulators, facilities), then those used during the integration, final, and operational validation steps (qualified personnel, Validation Tools, measuring equipment, facilities, Validation Procedures, etc.).
  3. Carry out the Validation Procedures at the right time, in the expected environment, with the expected means, tools, and techniques.
  4. Capture and record the results obtained when performing the Validation Actions using the Validation Procedures and means.
  • Analyze the obtained results and compare them to the expected results; decide on the acceptability of the compliance; record the decision and the compliance status; and generate validation reports and, as necessary, issue/trouble reports and change requests on the (System or Stakeholder) Requirements.
  • Control the process, which includes the following tasks:
  1. Update the Validation Plan according to the progress of the project; in particular, planned Validation Actions may be redefined because of unexpected events.
  2. Coordinate the validation activities: with the project manager for the schedule and the acquisition of means, personnel, and resources; with the designers for issue/trouble/non-conformance reports; and with the configuration manager for the versions of physical elements, design baselines, etc.

Artifacts and Ontology Elements

This process may create several artifacts such as:

  • Validation Plan (contains the validation strategy)
  • Validation Matrix (contains for each Validation Action, submitted element, applied technique, step of execution, system block concerned, expected result, obtained result, etc.)
  • Validation Procedures (describe Validation Actions to be performed, Validation Tools needed, Validation Configuration, resources, personnel, schedule, etc.)
  • Validation Reports
  • Validation Tools
  • Validated element
  • Issue / Non-Conformance / Trouble Reports
  • Change Requests on requirement, product, service, enterprise

This process handles the ontology elements of Table x.

Placeholder for Table x - table to be created/inserted

Methods and Techniques

The validation techniques are the same as those used for verification, but their purposes differ: verification is used to detect faults/defects, whereas validation is used to prove the satisfaction of the (System and/or Stakeholder) Requirements.

Validation Traceability Matrix – The traceability matrix is introduced in the Stakeholder Requirements Definition topic. It may be extended and used to record data such as the list of Validation Actions, the validation technique selected to validate the implementation of every engineering element (in particular each Stakeholder and System Requirement), the expected results, and the results obtained once a Validation Action has been performed. Such a matrix enables the development team to ensure that the selected Stakeholder and System Requirements have been validated, and to evaluate the percentage of Validation Actions completed.
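A minimal sketch of such a matrix, assuming a simple requirement-to-action mapping; the requirement identifiers, action names, expected results, and statuses below are hypothetical examples, not prescribed fields.

```python
# Each row ties a requirement to its Validation Action and current status
# (status None means the action has not been performed yet).
matrix = {
    "STK-001": {"action": "operational demonstration",
                "expected": "mission completed", "status": "passed"},
    "SYS-010": {"action": "integration test",
                "expected": "throughput >= 100 req/s", "status": "failed"},
    "SYS-011": {"action": "mass analysis",
                "expected": "mass <= 12 kg", "status": None},
}

performed = [row for row in matrix.values() if row["status"] is not None]
completion = 100 * len(performed) / len(matrix)       # % of actions performed
open_items = [req for req, row in matrix.items() if row["status"] != "passed"]
```

Queries like `completion` and `open_items` are exactly the progress measures the text mentions: the percentage of Validation Actions completed, and the requirements not yet shown to be satisfied.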

Practical Considerations

Major pitfalls encountered with System Validation are presented in Table Y.

Placeholder for Table Y - table to be created/inserted

Proven practices:

Major proven practices encountered with System Validation are presented in Table Z.

Placeholder for Table Z - table to be created/inserted


The following are elements that should be considered when practicing any of the activities discussed as part of system realization:

  • Mixing up verification and validation is a common issue. Validation demonstrates that the product, service, or enterprise as provided fulfills its intended use, whereas verification addresses whether a work product properly reflects its specified requirements. In other words, verification ensures that "one built the system right," whereas validation ensures that "one built the right system." Validation Actions use the same techniques as Verification Actions (e.g., test, analysis, inspection, demonstration, or simulation).
  • State who the witnesses will be for the purpose of collecting the evidence of success, what general steps will be followed, and what special resources are needed, such as instrumentation, special test equipment or facilities, simulators, specific data gathering, or rigorous analysis of demonstration results.
  • Identify the test facility, test equipment, any unique resource needs and environmental conditions, required qualifications and test personnel, general steps that will be followed, specific data to be collected, criteria for repeatability of collected data, and methods for analyzing the results.

References

Citations

Buede, D. M. 2009. The engineering design of systems: Models and methods. 2nd ed. Hoboken, NJ: John Wiley & Sons Inc.

INCOSE. 2011. INCOSE Systems Engineering Handbook, version 3.2.1. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.

ISO/IEC. 2008. Systems and software engineering - system life cycle processes. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC), ISO/IEC 15288:2008 (E).

NASA. 2007. Systems Engineering Handbook. Washington, D.C.: National Aeronautics and Space Administration (NASA), NASA/SP-2007-6105, December 2007.

Primary References

INCOSE. 2011. INCOSE Systems Engineering Handbook, version 3.2.1. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.

ISO/IEC. 2008. Systems and software engineering - system life cycle processes. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC), ISO/IEC 15288:2008 (E).

NASA. 2007. Systems Engineering Handbook. Washington, D.C.: National Aeronautics and Space Administration (NASA), NASA/SP-2007-6105, December 2007.

Additional References

Buede, D. M. 2009. The engineering design of systems: Models and methods. 2nd ed. Hoboken, NJ: John Wiley & Sons Inc.

DAU. February 19, 2010. Defense acquisition guidebook (DAG). Ft. Belvoir, VA, USA: Defense Acquisition University (DAU)/U.S. Department of Defense.

ECSS. 6 March 2009. Systems engineering general requirements. Noordwijk, Netherlands: Requirements and Standards Division, European Cooperation for Space Standardization (ECSS), ECSS-E-ST-10C.

SAE International. 1996. Certification considerations for highly-integrated or complex aircraft systems. Warrendale, PA, USA: SAE International, ARP475

SEI. 2007. Capability maturity model integrated (CMMI) for development, version 1.2, measurement and analysis process area. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).


SEBoK v. 1.9.1 released 30 September 2018
