System Verification
'''''Lead Authors:''''' ''John Snoderly, Alan Faisandier''
----
==Overview==
{{Term|Verification (glossary)|System Verification}} is a set of actions used to check the ''correctness'' of any element, such as a {{Term|System Element (glossary)|system element}}, a {{Term|System (glossary)|system}}, a document, a {{Term|Service (glossary)|service}}, a task, a {{Term|Requirement (glossary)|requirement}}, etc. These types of actions are planned and carried out throughout the {{Term|Life Cycle (glossary)|life cycle}} of the system. Verification is a generic term that needs to be instantiated within the context it occurs. As a process, verification is a transverse activity to every life cycle stage of the system. In particular, during the development cycle of the system, the verification process is performed in parallel with the {{Term|System Definition (glossary)|system definition}} and {{Term|System Realization (glossary)|system realization}} processes and applies to any activity and any product resulting from the activity. The activities of every life cycle process and those of the verification process can work together. For example, the {{Term|Integration (glossary)|integration}} process frequently uses the verification process. It is important to remember that verification, while separate from [[System Validation|validation]], is intended to be performed in conjunction with validation.
 
 
The Validation Process generally occurs at the end of a set of life cycle tasks or activities, and at least at the end of every milestone of a development project. The Validation Process applied to the system once it is completely integrated is often called Final Validation – see Figure 1 in [[System Integration]].
 
 
 
'''The purpose of Verification''' is to show that the element meets the requirements.
 
 
 
'''The purpose of Validation''' is to show that the requirements were correct.
 
 
 
 
 
 
'''Remarks about definitions''':
 
 
 
There are several books and standards that provide different definitions of verification and validation. The most widely accepted definitions can be found in ISO/IEC 12207:2008, ISO/IEC 15288:2008, ISO 25000:2005, and ISO 9000:2005:
 
 
 
'''Verification''': confirmation, through the provision of objective evidence, that specified requirements have been fulfilled. A note added in ISO/IEC 15288 states that verification is a set of activities that compares a system or system element against the required characteristics; this may include, but is not limited to, specified requirements, design description, and the system itself.
 
 
 
'''Validation''': confirmation, through the provision of objective evidence, that the requirements for a specific intended use or application have been fulfilled. A note added in ISO 9000:2005 states that validation is the set of activities that ensure and gain confidence that a system is able to accomplish its intended use, goals, and objectives (i.e., meet stakeholder requirements) in the intended operational environment.
 
 
 
==Principles==
 
 
 
===Concept of Verification and Validation Action===
 
The performance of the Verification and Validation Action includes:

*obtaining a result from the performance of the Verification and Validation Action on the submitted element;

*comparing the obtained result with the expected result;

*deducing the degree of correctness and of conformance/compliance of the submitted element;

*deciding on the acceptability of this conformance/compliance, because the result of the comparison may sometimes require a value judgment, relevant to the context of use, about whether to accept the obtained result (generally by analyzing it against a threshold or limit).
 
 
 
Note: If there is uncertainty about the conformance/compliance, the cause may be ambiguity in the requirements; a typical example is a measure of effectiveness expressed without a limit of acceptance (above or below the threshold, the measure is declared unfulfilled).
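The four steps above, including the limit-of-acceptance judgment discussed in the note, can be sketched in code. This is an illustrative sketch only; the class, field, and function names are invented here and are not part of any SEBoK or ISO vocabulary:

```python
from dataclasses import dataclass

@dataclass
class VVAction:
    """One Verification and Validation Action (names are illustrative only)."""
    element: str     # the submitted element (e.g., "range requirement")
    expected: float  # expected result, defined from the reference
    tolerance: float # limit of acceptance; without it, conformance is ambiguous

def perform_vv_action(action: VVAction, obtain_result) -> dict:
    # Step 1: obtain a result by performing the action on the submitted element
    obtained = obtain_result(action.element)
    # Step 2: compare the obtained result with the expected result
    deviation = abs(obtained - action.expected)
    # Step 3: deduce a degree of correctness (here, deviation normalized
    # by the limit of acceptance)
    degree = deviation / action.tolerance if action.tolerance else float("inf")
    # Step 4: decide acceptability against the limit of acceptance
    return {
        "element": action.element,
        "obtained": obtained,
        "compliant": deviation <= action.tolerance,
        "degree": degree,
    }

# Example: a measure of effectiveness with an explicit limit of acceptance
action = VVAction(element="range requirement", expected=100.0, tolerance=2.0)
result = perform_vv_action(action, lambda _: 101.2)
print(result["compliant"])  # True: the deviation is within the 2.0 limit
```

Without the `tolerance` field (the "limit of acceptance" of the note), the final decision step could not be computed, which is exactly the ambiguity the note warns against.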
 
 
 
 
 
 
  
==Definition and Purpose==
'''What to verify and validate?''' – Any engineering element can be verified and validated using a specific reference for comparison: stakeholder requirement, system requirement, function, system element, document, etc. Examples are provided in Table 1 below.

{{Term|Verification (glossary)|Verification}} is the confirmation, through the provision of objective evidence, that specified requirements have been fulfilled. With a note added in ISO/IEC/IEEE 15288, the scope of verification includes a set of activities that compares a system or system element against the requirements, architecture and design characteristics, and other properties to be verified (ISO/IEC/IEEE 2015). This may include, but is not limited to, specified requirements, design description, and the system itself.
The purpose of verification, as a generic action, is to identify the faults/defects introduced at the time of any transformation of inputs into outputs. Verification is used to provide information and evidence that the transformation was made according to the selected and appropriate methods, techniques, standards, or rules.  
  

Verification is based on tangible evidence; i.e., it is based on information whose veracity can be demonstrated by factual results obtained from techniques such as inspection, measurement, testing, analysis, calculation, etc. Thus, the process of verifying a {{Term|System (glossary)|system}} ({{Term|Product (glossary)|product}}, {{Term|Service (glossary)|service}}, {{Term|Enterprise (glossary)|enterprise}}, or {{Term|System of Systems (SoS) (glossary)|system of systems}} (SoS)) consists of comparing the realized characteristics or properties of the product, service, or enterprise against its expected design properties.
==Principles and Concepts==
  
===Integration, Verification, Validation, Final Validation, Operational Validation===
Once the System Elements have been realized, they begin integration into the complete system; this integration includes Verification and Validation activities, as stated in [[System Integration]]. These activities are done in parallel, reducing the tasks left for Final Validation.

System Validation concerns the global system (Product, Service, or Enterprise) seen as a whole and is based on the totality of the requirements (System Requirements, Stakeholder Requirements). Three methods are used:

*First, the V&V results are aggregated from the application of the Verification Process and the Validation Process to every definition element and to every integration element;

*Second, Final Validation is performed on the complete integrated system in an industrial environment (as close as possible to the operational environment);

*Third, Operational Validation is performed on the complete system in its operational environment (context of use – the next higher-level system).

Operational Validation relates to the operational mission of the system and shows whether the system is ready for use or for production.

===Concept of Verification Action===
Verification activities begin during development (definition and realization) and continue into deployment and use.

====Why Verify?====
In the context of human realization, any human thought is susceptible to error, and this is also the case with any engineering activity. Studies in human reliability have shown that people trained to perform a specific operation make around 1-3 errors per hour in best-case scenarios. In any activity, or in the resulting outcome of an activity, the search for potential errors should not be neglected, regardless of whether one thinks they will happen or believes they should not happen; the consequences of errors can be extremely significant failures or threats.

A '''verification action''' is defined, and then performed, as shown in Figure 1.

[[File:Definition_and_usage_of_a_Verification_Action.png|thumb|400px|center|'''Figure 1. Definition and Usage of a Verification Action.''' (SEBoK Original)]]
The definition of a verification action applied to an engineering element includes the following:

* Identification of the element on which the verification action will be performed
* Identification of the reference used to define the expected result of the verification action (see examples of references in Table 1)
  
The performance of a verification action includes the following:

* Obtaining a result by performing the verification action on the submitted element
* Comparing the obtained result with the expected result
* Deducing the degree of correctness of the element

It is impossible to carry out only a single, global validation of a completely integrated complex system. The number of potential sources of faults/defects would be too high, and it would be impossible to determine the causes of a non-conformance raised during such a global check. Since the System of Interest has generally been decomposed during design into a set of blocks and layers of systems and System Elements, every system and System Element is verified, validated, and possibly corrected before being integrated into the parent system block of the next higher level, as shown in Figure 2.

[[File:SEBoKv05_KA-SystRealiz_Verification_and_Validation_level_per_level_V4.png|600px|center|Verification and Validation Level by Level]]

Figure 2 – Verification and Validation level per level (Faisandier & Roussel, 2011)

====What to Verify?====
Any engineering element can be verified using a specific reference for comparison: stakeholder requirement, system requirement, function, system element, document, etc. Examples are provided in Table 1 below.
{|
|+'''Table 1. Examples of Verified Items.''' (SEBoK Original)
!Items
!Explanation for Verification
|-
|'''Document'''
|To verify a document is to check the application of drafting rules.
|-
|'''Stakeholder Requirement and System Requirement'''
|To verify a stakeholder requirement or a system requirement is to check the application of syntactic and grammatical rules, as well as the characteristics defined in the stakeholder requirements definition process and the system requirements definition process, such as necessity, implementation independence, unambiguity, consistency, completeness, singularity, feasibility, traceability, and verifiability.
|-
|'''Design'''
|To verify the design of a system is to check its logical and physical architecture elements against the characteristics of the outcomes of the design processes.
|-
|'''System'''
|To verify a system (product, service, or enterprise) is to check its realized characteristics or properties against its expected design characteristics.
|-
|'''Aggregate'''
|To verify an aggregate for integration is to check every interface and interaction between implemented elements.
|-
|'''Verification Procedure'''
|To verify a verification procedure is to check the application of a predefined template and drafting rules.
|}
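As an illustration of the "Stakeholder Requirement and System Requirement" row above, a few of the listed characteristics (singularity, unambiguity, verifiability) can be checked mechanically. The rules below are deliberately simplified stand-ins invented for this sketch; a real check would follow the full characteristics from the requirements definition processes:

```python
# Toy requirement-statement checks (heuristics invented for illustration only).
AMBIGUOUS_WORDS = {"fast", "user-friendly", "adequate", "approximately", "etc."}

def verify_requirement(text: str) -> dict:
    words = {w.strip(".,").lower() for w in text.split()}
    return {
        # singular: a single "shall" clause, not several chained together
        "singular": text.lower().count(" shall ") <= 1,
        # unambiguous: no vague qualifiers from the (toy) blacklist
        "unambiguous": not (words & AMBIGUOUS_WORDS),
        # verifiable: contains a measurable quantity (a digit somewhere)
        "verifiable": any(ch.isdigit() for ch in text),
    }

good = verify_requirement("The pump shall deliver 30 L/min at 2 bar.")
bad = verify_requirement("The system shall be fast and shall be user-friendly.")
print(good)  # all checks pass
print(bad)   # fails all three checks
```

The value of such a mechanical pass is that it turns part of the "verify a requirement" row of Table 1 into a repeatable, binary comparison against a reference, which is exactly the character of a verification action.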
  
===Verification versus Validation===
The term ''verification'' is often associated with the term ''validation'' and understood as a single concept of ''V&V''. Validation is used to ensure that ''one is working the right problem'', whereas verification is used to ensure that ''one has solved the problem right'' (Martin 1997). Etymologically, the term verification comes from the Latin ''verus'', meaning truth, and ''facere'', meaning to make or perform; thus, to verify is to prove that something is ''true'' or correct (a property, a characteristic, etc.). The term validation comes from the Latin ''valere'', meaning to become strong, and shares its etymological root with the word ''value''; thus, to validate is to prove that something has the right features to produce the expected effects. (Adapted from "Verification and Validation in Plain English" (Lake, INCOSE 1999).)
  
The main differences between the verification process and the validation process concern the references used to check the correctness of an element and the acceptability of the effective correctness:

* Within verification, the comparison between the expected result and the obtained result is generally binary, whereas within validation, the result of the comparison may require a judgment of value regarding whether or not to accept the obtained result compared to a threshold or limit.

* Verification relates more to one element, whereas validation relates more to a set of elements and considers this set as a whole.

* Validation presupposes that verification actions have already been performed.

* The techniques used to define and perform verification actions and those used for validation actions are very similar.

As necessary, systems and System Elements are partially integrated into subsets in order to limit the number of properties/characteristics to be verified within a single step – see [[System Integration]]. For each level, it is necessary to ensure, by a set of Verification and Validation Actions, that the features stated at the preceding level are not damaged. Moreover, a compliant result obtained in a given environment (for example, the final validation environment) can become non-compliant if the environment changes (for example, in the operational validation environment). Therefore, as long as a subsystem is not completely integrated and/or does not operate in the real operational environment, no result should be regarded as definitive.

During modifications made to a system, the temptation is to focus on the newly adapted configuration, forgetting the environment and the other configurations. However, a modification can have significant consequences for other configurations. Thus, any modification requires regression Verification and Validation Actions (often called regression testing).
  
===Verification Actions and Validation Actions inside and transverse to levels===
Inside each level of system decomposition, Verification and Validation Actions are performed during System Definition and System Realization, as represented in Figure 3 for the upper levels and in Figure 4 for the lower levels. The Stakeholder Requirements Definition and the Operational Validation make the link between two levels of the system decomposition.
 
 
[[File:SEBoKv05_KA-SystRealiz_V&V_Actions_upper_levels.png|500px|center|Verification and Validation Actions in Upper Levels of System Decomposition]]
 
 
 
Figure 3 – Verification and Validation Actions in upper levels of system decomposition (Faisandier, 2011)
 
 
 
 
 
The System Elements Requirements and the End products Operational Validation make the link between the two lower levels of the decomposition – see Figure 4.
 
 
 
[[File:SEBoKv05_KA-SystRealiz_V&V_Actions_lower_levels.png|500px|center|Verification and Validation Actions in Lower Levels of System Decomposition]]
 
 
 
Figure 4 - Verification and Validation Actions in lower levels of system decomposition (Faisandier, 2011)
 
 
 
 
 
Note 1: The two figures above show an ideal allocation of verification and validation activities on the right side, using the corresponding references provided by the System Definition processes on the left side. Sometimes, in real practice, the outputs of the Stakeholder Requirements Definition Process are not sufficiently formalized or do not contain sufficient operational scenarios, and so cannot serve as a reference for defining the Verification and Validation Actions to be performed in the operational environment. In this case, the outputs of the System Requirements Definition Process may be used instead.
 
 
 
Note 2: The last level of the system decomposition is dedicated to the realization of the System Elements; the vocabulary and the number of activities shown in Figure 4 may differ – see [[System Implementation]].
 
 
 
===Verification and Validation strategy===
 
The difference between verification and validation is especially useful for elaborating the Integration strategy, the Verification strategy, and the Validation strategy. In fact, the efficiency of System Realization is gained by optimizing the three strategies together, to form what is often called the Verification and Validation (V&V) strategy. The optimization consists of defining and performing the minimum number of Verification and Validation Actions while detecting the maximum number of errors/faults/defects and obtaining the maximum confidence in the use of the Product, Service, or Enterprise. Of course, the optimization takes into account the risks potentially generated if Verification and Validation Actions are dropped.
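The tradeoff described above (performing the fewest V&V actions while covering the most risk within the available budget) can be sketched as a simple cost/risk selection. This is a deliberately naive greedy sketch; the action names and numbers are invented for illustration, and a real strategy would weigh many more constraints:

```python
# Each candidate V&V action: (name, cost, risk exposure removed if performed).
# All values are illustrative only.
candidates = [
    ("inspect interfaces", 2.0, 5.0),
    ("full thermal test", 8.0, 6.0),
    ("software peer review", 1.0, 4.0),
    ("vibration test", 5.0, 7.0),
]

def plan_vv_actions(actions, budget):
    """Greedy selection: best risk reduction per unit cost until budget runs out."""
    chosen, remaining = [], budget
    for name, cost, risk in sorted(actions, key=lambda a: a[2] / a[1], reverse=True):
        if cost <= remaining:
            chosen.append(name)
            remaining -= cost
    return chosen  # actions dropped here represent accepted residual risk

selected = plan_vv_actions(candidates, budget=8.0)
print(selected)  # ['software peer review', 'inspect interfaces', 'vibration test']
```

The actions left unselected make the accepted risk explicit, which mirrors the text's point that dropping a V&V action is itself a risk decision, not merely a cost saving.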
 
  

Once the system elements have been realized, they are integrated to form the complete system. Integration consists of assembling the system and performing verification actions, as stated in the integration process. A final validation activity generally occurs when the system is integrated, but a certain number of validation actions are also performed in parallel with system integration in order to reduce the number of verification actions and validation actions, while controlling the risks that could be generated if some checks are excluded. Integration, verification, and validation are intimately processed together due to the necessity of optimizing the strategy of verification and validation, as well as the strategy of integration.
  
 
==Process Approach==
===Purpose and Principle of the Approach===
As explained above, there are two processes: the Verification Process and the Validation Process.

The purpose of the verification process is to confirm that the system fulfills the specified design requirements. This process provides the information required to effect the remedial actions that correct non-conformances in the realized system or the processes that act on it - see [[ISO/IEC/IEEE 15288]] (ISO/IEC/IEEE 2015).
  
Each system element and the complete system itself should be compared against its own design references (specified requirements). As stated by Dennis Buede, ''verification is the matching of [configuration items], components, sub-systems, and the system to corresponding requirements to ensure that each has been built right'' (Buede 2009). This means that the verification process is instantiated as many times as necessary during the global development of the system. Because of the generic nature of a process, the verification process can be applied to any engineering element that has contributed to the definition and realization of the system elements and the system itself.
  
Facing the huge number of potential verification actions that may be generated by this approach, it is necessary to optimize the verification strategy. This strategy is based on the balance between what must be verified, the constraints (such as time, cost, and the feasibility of testing) that naturally limit the number of verification actions, and the risks one accepts when excluding some verification actions.
  
Several approaches exist that may be used for defining the verification process. The International Council on Systems Engineering (INCOSE) dictates that two main steps are necessary for verification: planning and performing verification actions (INCOSE 2012). NASA has a slightly more detailed approach that includes five main steps: prepare verification, perform verification, analyze outcomes, produce a report, and capture work products (NASA December 2007, 1-360, p. 102). Any approach may be used, provided that it is appropriate to the scope of the system, the constraints of the project, includes the activities of the process listed below in some way, and is appropriately coordinated with other activities.
  
'''Generic inputs''' are baseline references of the submitted element. If the element is a system, inputs are the logical and physical architecture elements as described in a system design document, the design description of internal interfaces to the system and interfaces requirements external to the system, and by extension, the system requirements.
'''Generic outputs''' are the elements of the verification plan, which includes the verification strategy, selected verification actions, verification procedures, verification tools, the verified element or system, verification reports, issue/trouble reports, and change requests on design.
 
  

===Activities of the Process===
To establish the verification strategy, drafted in a verification plan (this activity is carried out concurrently with system definition activities), the following steps are necessary:

* Identify the verification scope by listing as many characteristics or properties as possible that should be checked. The number of potential verification actions can be extremely high.
* Identify constraints according to their origin (technical feasibility; management constraints such as cost, time, and the availability of verification means or qualified personnel; and contractual constraints such as the criticality of the mission) that limit potential verification actions.
* Define the appropriate verification techniques to be applied, such as inspection, analysis, simulation, peer review, testing, etc., based on the best step of the project at which to perform every verification action, given the constraints.
* Perform a tradeoff of what should be verified (the scope), taking into account all constraints or limits, and deduce what can be verified; the selection of verification actions is made according to the type of system, the objectives of the project, the acceptable risks, and the constraints.
* Optimize the verification strategy by defining the most appropriate verification technique for every verification action, while defining the necessary verification means (tools, test benches, personnel, location, and facilities) according to the selected verification technique.
* Schedule the execution of verification actions within the project steps or milestones, and define the configuration of the elements submitted to verification actions (this mainly involves testing on physical elements).
  
Performing verification actions includes the following tasks:

* Detail each verification action; in particular, note the expected results, the verification techniques to be applied, and the corresponding means required (equipment, resources, and qualified personnel).
* Acquire the verification means used during the system definition steps (qualified personnel, modeling tools, mock-ups, simulators, and facilities), and then those used during the integration step (qualified personnel, verification tools, measuring equipment, facilities, verification procedures, etc.).
* Carry out verification procedures at the right time, in the expected environment, with the expected means, tools, and techniques.
* Capture and record the results obtained when performing verification actions using verification procedures and means.

The obtained results must be analyzed and compared to the expected results so that the status may be recorded as either ''compliant'' or ''non-compliant''. {{Term|Systems Engineering (glossary)|Systems engineering}} (SE) practitioners will likely need to generate verification reports, as well as potential issue/trouble reports and change requests on design, as necessary.
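The analysis and recording step just described might look like the following schematic sketch; the record fields and requirement identifiers are invented here and do not represent a standard report format:

```python
# Schematic verification records: (action, expected limit, obtained value).
# Identifiers and values are invented for illustration.
records = [
    ("REQ-001 response time <= 200 ms", 200, 180),
    ("REQ-002 mass <= 50 kg", 50, 52),
]

def analyze(records):
    """Compare obtained results to expected limits; record compliant/non-compliant."""
    report, issues = [], []
    for action, expected, obtained in records:
        status = "compliant" if obtained <= expected else "non-compliant"
        report.append((action, status))
        if status == "non-compliant":
            # A non-compliance feeds an issue/trouble report and possibly
            # a change request on the design.
            issues.append(f"Issue: {action}: obtained {obtained}, expected <= {expected}")
    return report, issues

report, issues = analyze(records)
print(report)   # REQ-001 compliant, REQ-002 non-compliant
print(issues)   # one issue raised for the non-compliance
```

The point of keeping both outputs is traceability: the report records every comparison, while the issue list drives the remedial actions mentioned in the purpose of the process.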
 
 
Controlling the process includes the following tasks:

* Update the verification plan according to the progress of the project; in particular, the planned verification actions may be redefined because of unexpected events (the addition, deletion, or modification of actions).
* Coordinate the verification activities with the project manager regarding the schedule and the acquisition of means, personnel, and resources; with the designers regarding issue/trouble/non-conformance reports; and with the configuration manager regarding versions of physical elements, design baselines, etc.
 
===Validation Process===

====Purpose of the Validation Process====
The purpose of the [System] Validation Process is to provide objective evidence that the services provided by a system when in use comply with stakeholder requirements, achieving its intended use in its intended operational environment (ISO/IEC 2008).

This process performs a comparative assessment and confirms that the stakeholders' requirements are correctly defined. Where variances are identified, these are recorded and guide corrective actions. System validation is ratified by stakeholders (ISO/IEC 2008).

The validation process demonstrates that the realized end product satisfies its stakeholders' (customers' and other interested parties') expectations within the intended operational environments, with validation performed by anticipated operators and/or users (NASA 2007).

It is possible to generalize the process using an extended purpose as follows: the purpose of the Validation Process applied to any element is to demonstrate or prove that this element complies with its applicable requirements, achieving its intended use in its intended operational environment.

Each system element, each system, and the complete system of interest are compared against their own applicable requirements (System Requirements, Stakeholder Requirements). This means that the Validation Process is instantiated as many times as necessary during the global development of the system: it occurs at every level of the system decomposition and, as necessary, throughout system development. Because of the generic nature of a process, the Validation Process can be applied to any engineering element that has contributed to the definition and realization of the system elements, the systems, and the system of interest.

To ensure that validation is feasible, the implementation of requirements must be verifiable on the submitted element. Ensuring that requirements are properly written (i.e., quantifiable, measurable, unambiguous, etc.) is essential. In addition, verification/validation requirements are often written in conjunction with Stakeholder and System Requirements and provide the method for demonstrating the implementation of each System Requirement or Stakeholder Requirement.

'''The generic inputs''' are the baseline references of the requirements applicable to the submitted element. If the element is a system, the inputs are the System Requirements and Stakeholder Requirements.

'''The generic outputs''' are elements of the Verification and Validation Plan.

====Activities of the Validation Process====
Major activities and tasks performed during this process include the following:

#'''Establish a validation strategy''', drafted in a Verification and Validation Plan (this activity is carried out concurrently with system definition activities) and obtained through the following tasks:
##Identify the validation scope, which is represented by the [System and/or Stakeholder] Requirements; normally, every requirement should be checked, so the number of Verification and Validation Actions can be high.
##Identify the constraints according to their origin: same as for the Verification Process.
##Define the appropriate verification/validation techniques to be applied: same as for the Verification Process.
##Trade off what should be validated (scope): same as for the Verification Process.
##Optimize the Verification and Validation strategy: same as for the Verification Process.
#'''Perform the Verification and Validation Actions''': the tasks are the same as for the Verification Process.
#'''Analyze the obtained results''' and compare them to the expected results; decide on the acceptability of the conformance/compliance; record the decision and the compliance status; and generate validation reports and, as necessary, issue/trouble reports and change requests on the [System or Stakeholder] Requirements.
#'''Control the process''': the tasks are the same as for the Verification Process.
 
  
  
 
===Artifacts and Ontology Elements===
This process may create several artifacts, such as:

* verification plans (contain the verification strategy)
* verification matrices (contain the verification action, submitted element, applied technique, step of execution, system block concerned, expected result, obtained result, etc.)
* verification procedures (describe the verification actions to be performed, the verification tools needed, the verification configuration, the resources and personnel needed, the schedule, etc.)
* verification reports
* verification tools
* verified elements
* issue/non-conformance/trouble reports
* change requests to the design

This process utilizes the ontology elements displayed in Table 2 below.

{|
|+'''Table 2. Main Ontology Elements as Handled within Verification.''' (SEBoK Original)
!Element
!Definition
----
Attributes (examples)
|-
|'''Verification Action'''
|A verification action describes what must be verified (the element as reference), on which element, the expected result, the verification technique to apply, and the level of decomposition concerned.
----
Identifier, name, description
|-
|'''Verification Procedure'''
|A verification procedure groups a set of verification actions performed together (as a scenario of tests) in a given verification configuration.
----
Identifier, name, description, duration, unit of time
|-
|'''Verification Tool'''
|A verification tool is a device or physical tool used to perform verification procedures (test bench, simulator, cap/stub, launcher, etc.).
----
Identifier, name, description
|-
|'''Verification Configuration'''
|A verification configuration groups all of the physical elements (aggregates and verification tools) necessary to perform a verification procedure.
----
Identifier, name, description
|-
|'''Risk'''
|An event with a probability of occurrence and a degree of gravity in its consequences for the system mission or for other characteristics (used for technical risks in engineering). A risk is the combination of a vulnerability and a danger or threat.
|-
|'''Rationale'''
|An argument that provides the justification for the selection of an engineering element.
----
Identifier, name, description (rationale, reasons for defining a verification action or a verification procedure, for using a verification tool, etc.)
|}

The main relationships between ontology elements are presented in Figure 5.

[[File:SEBoKv05_KA-SystRealiz_V&V_relationships.png|600px|center|Verification and Validation Elements Relationships with other Engineering Elements]]

Figure 5. Verification and Validation elements relationships with other engineering elements (Faisandier 2011)

Note: "Design Reference" is a generic term; its instances depend on the type of submitted engineering element, for example: specified requirements, descriptions of design characteristics or properties, drafting rules, standards, regulations, etc.

===Checking and Correctness of Verification and Validation===
The main items to be checked during the verification and validation processes concern the items produced by these processes:

*The Verification and Validation Plan, the Verification and Validation Actions, the Verification and Validation Procedures, and the Verification and Validation Reports respect their corresponding templates.
*Every verification and validation activity has been planned, performed, and recorded, and has generated outcomes as defined in the process descriptions above.
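The ontology elements of Table 2 can be given a rough data-model sketch showing how they relate (a procedure groups actions and runs in a configuration that groups tools). This is illustrative only; the class and attribute names are hypothetical, chosen to mirror the table:

```python
from dataclasses import dataclass, field

@dataclass
class VerificationAction:
    identifier: str
    name: str
    description: str   # what is verified, expected result, technique, level

@dataclass
class VerificationTool:
    identifier: str
    name: str
    description: str   # e.g. test bench, simulator, cap/stub, launcher

@dataclass
class VerificationConfiguration:
    """Groups all physical elements needed to perform a procedure."""
    identifier: str
    name: str
    tools: list[VerificationTool] = field(default_factory=list)

@dataclass
class VerificationProcedure:
    """Groups verification actions performed together in a given configuration."""
    identifier: str
    name: str
    duration_hours: float
    configuration: VerificationConfiguration
    actions: list[VerificationAction] = field(default_factory=list)

bench = VerificationTool("VT-01", "bench", "thermal test bench")
config = VerificationConfiguration("VC-01", "thermal", tools=[bench])
proc = VerificationProcedure(
    "VP-01", "thermal cycle", 8.0, config,
    actions=[VerificationAction("VA-10", "temp range", "verify -20 to +60 C")],
)
print(len(proc.actions), proc.configuration.tools[0].name)  # 1 bench
```

A project database or requirements-management tool would normally hold these records, with the attributes listed in the table (identifier, name, description, etc.).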
  
 
===Methods and Techniques===

====Verification and Validation Techniques====
There are several verification techniques available to check that an element or a system conforms to its design references or its specified requirements. These techniques are almost the same as those used for [[System Validation|validation]], though the application of the techniques may differ slightly. In particular, the purposes are different: verification is used to detect faults/defects, whereas validation is used to provide evidence for the satisfaction of (system and/or stakeholder) requirements. Table 3 below provides descriptions of some techniques for verification.

{|
|+'''Table 3. Verification Techniques.''' (SEBoK Original)
!Verification Technique
!Description
|-
|'''Inspection'''
|A technique based on visual or dimensional examination of an element; the verification relies on the human senses or uses simple methods of measurement and handling. Inspection is generally non-destructive and typically includes the use of sight, hearing, smell, touch, and taste; simple physical manipulation; mechanical and electrical gauging; and measurement. No stimuli (tests) are necessary. The technique is used to check properties or characteristics that are best determined by observation (e.g., paint color, weight, documentation, listing of code, etc.).
|-
|'''Analysis'''
|A technique based on analytical evidence obtained without any intervention on the submitted element, using mathematical or probabilistic calculation, logical reasoning (including the theory of predicates), or modeling and/or simulation under defined conditions to show theoretical compliance. It is mainly used where testing under realistic conditions cannot be achieved or is not cost-effective.
|-
|'''Analogy or Similarity'''
|A technique based on evidence from elements similar to the submitted element or on experience feedback. It is absolutely necessary to show by prediction that the context is invariant and that the outcomes are transposable (models, investigations, experience feedback, etc.). Similarity can only be used if the submitted element is similar in design, manufacture, and use; if equivalent or more stringent verification actions were used for the similar element; and if the intended operational environment is identical to, or less rigorous than, that of the similar element.
|-
|'''Demonstration'''
|A technique used to demonstrate correct operation of the submitted element against operational and observable characteristics without using physical measurements (no or minimal instrumentation or test equipment). Demonstration is sometimes called 'field testing'. It generally consists of a set of tests selected by the supplier to show that the element's response to stimuli is suitable, or to show that operators can perform their assigned tasks when using the element. Observations are made and compared with predetermined/expected responses. Demonstration may be appropriate when requirements or specifications are given in statistical terms (e.g., mean time to repair, average power consumption, etc.).
|-
|'''Test'''
|A technique performed on the submitted element by which functional, measurable characteristics, operability, supportability, or performance capability is quantitatively verified under controlled conditions that are real or simulated. Testing often uses special test equipment or instrumentation to obtain accurate quantitative data for analysis.
|-
|'''Sampling'''
|A technique based on the verification of characteristics using samples. The number of samples, the tolerances, and the other characteristics must be specified to be in agreement with experience feedback.
|}

Note: Demonstration and testing can be functional or structural. Functional demonstration and testing are designed to ensure that correct outputs are produced given specific inputs. For structural demonstration and testing, there are performance, recovery, interface, and stress considerations; these considerations determine the system's ability to perform and survive under expected conditions.

====Validation/Traceability Matrix====
The importance of traceability is introduced in every topic of the [[System Definition]] KA through the use of a traceability matrix. Such a matrix may also be extended to record data such as the list of Verification and Validation Actions, the verification/validation technique selected to verify or validate the implementation of each engineering element (in particular, each Stakeholder Requirement and System Requirement), the expected results, and the results obtained when each Verification and Validation Action has been performed. The use of such a matrix enables the development team to ensure that the selected Stakeholder and System Requirements have been verified, and to evaluate the percentage of Verification and Validation Actions completed. In addition, the matrix helps to check the performed verification and validation activities against the activities planned in the Verification and Validation Plan, and finally to ensure that system validation has been appropriately conducted.

===Application to Product Systems, Service Systems, and Enterprise Systems===
Because of the generic nature of the process, it is applied as defined above regardless of the type of system. The main difference resides in the detailed implementation of the verification techniques described above for each type of system.

[[File:SEBoKv05_KA-SystRealiz_Verification_techniques_and_types_of_system.png|600px|center|Verification techniques and types of system]]
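The verification/validation traceability matrix described above, and the coverage figures it supports (percentage of requirements covered and percentage of actions completed), can be sketched as a simple mapping from requirements to their V&V actions. All identifiers and numbers here are hypothetical:

```python
# Hypothetical traceability matrix: each requirement maps to its V&V actions,
# recorded as (action id, technique, status) tuples.
matrix = {
    "SR-001": [("VA-001", "test", "compliant")],
    "SR-002": [("VA-002", "inspection", "pending"), ("VA-003", "analysis", "compliant")],
    "SR-003": [],  # no action defined yet: a gap in the strategy
}

def coverage(matrix: dict) -> tuple[float, float]:
    """Return (% of requirements with at least one action, % of actions completed)."""
    reqs_covered = sum(1 for acts in matrix.values() if acts) / len(matrix)
    all_acts = [a for acts in matrix.values() for a in acts]
    done = sum(1 for _, _, status in all_acts if status != "pending")
    return 100 * reqs_covered, 100 * done / len(all_acts)

req_pct, act_pct = coverage(matrix)
print(f"{req_pct:.0f}% requirements covered, {act_pct:.0f}% actions completed")
```

Gaps such as `SR-003` above are exactly what the matrix is meant to expose before validation is declared complete.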
  
 
==Practical Considerations==
Key pitfalls and proven practices related to system verification and validation are described in the next two sections.

===Pitfalls===
Some of the key pitfalls encountered in planning and performing system verification are provided in Table 4.

{|
|+'''Table 4. Major Pitfalls with System Verification.''' (SEBoK Original)
!Pitfall
!Description
|-
|Confusion between verification and validation
|Confusion between verification and validation causes developers to take the wrong reference/baseline when defining verification and validation actions and/or to address the wrong level of granularity (the detail level for verification, the global level for validation).
|-
|No verification strategy
|Verification actions are overlooked. Because it is impossible to check every characteristic or property of all system elements, and of the system itself, in every combination of operational conditions and scenarios, a strategy (a justified selection of verification actions against risks) must be established.
|-
|Skipping verification to save time
|Verification activities are skipped in an attempt to save time; defects are then discovered later, when they are more expensive to correct.
|-
|Using only testing
|Testing is used as the only verification technique. Testing can check products and services only once they are implemented; other techniques should be considered earlier, during design. Analysis and inspection are cost-effective and allow potential errors, faults, or failures to be discovered early.
|-
|Stopping verification when funding is exhausted
|Verification actions are stopped when the budget and/or time are consumed. Prefer ending the verification activity based on criteria such as coverage rates.
|}

===Proven Practices===
Some proven practices gathered from the references are provided in Table 5.

{|
|+'''Table 5. Proven Practices with System Verification.''' (SEBoK Original)
!Practice
!Description
|-
|Start verification early in the development
|The earlier the characteristics of an element are verified in the project, the easier corrections are to make and the smaller the consequences for schedule and cost.
|-
|Define criteria for ending verification
|Carrying out verification actions without limits creates a risk of cost and schedule overruns. Modifying and verifying in a non-stop cycle until a perfect system is reached is the best way to never deliver the system. It is therefore necessary to set limits on cost, time, and the maximum number of modification loops for each type of verification action, as well as ending criteria (percentage of success, number of errors detected, coverage rate obtained, etc.).
|-
|Involve those responsible for design in verification
|Include the person responsible for verification in the design team, or include some designers in the verification team.
|}
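The practice of defining criteria for ending verification (and the pitfall of stopping simply because funding runs out) can be expressed as an explicit stop rule. A minimal sketch with hypothetical threshold values:

```python
# Hypothetical stop rule for the "define ending criteria" practice:
# stop verifying when success criteria are met, or when a modification-loop
# limit is reached, rather than when the budget happens to run out.

def should_stop(coverage_rate: float, open_defects: int, loops_done: int,
                min_coverage: float = 0.95, max_open_defects: int = 0,
                max_loops: int = 3) -> bool:
    if coverage_rate >= min_coverage and open_defects <= max_open_defects:
        return True               # success criteria met
    return loops_done >= max_loops  # hard limit on modification loops; escalate

print(should_stop(0.97, 0, 1))   # True: coverage and defect criteria met
print(should_stop(0.80, 4, 1))   # False: keep verifying and correcting
print(should_stop(0.80, 4, 3))   # True: loop limit reached; decide, don't drift
```

The thresholds themselves (coverage rate, defect count, loop count) would be set in the Verification and Validation Plan, not hard-coded as here.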
  
==References==

===Works Cited===
Buede, D.M. 2009. ''The Engineering Design of Systems: Models and Methods,'' 2nd ed. Hoboken, NJ, USA: John Wiley & Sons Inc.

INCOSE. 2012. ''INCOSE Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities,'' version 3.2.2. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.2.

ISO/IEC. 2008. ''Systems and Software Engineering - System Life Cycle Processes.'' Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC), ISO/IEC 15288:2008 (E).

ISO/IEC/IEEE. 2015. ''Systems and Software Engineering - System Life Cycle Processes.'' Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE). ISO/IEC/IEEE 15288:2015.

Lake, J. 1999. "V & V in Plain English." International Council on Systems Engineering (INCOSE) 9th Annual International Symposium, Brighton, UK, 6-10 June 1999.

NASA. 2007. ''Systems Engineering Handbook.'' Washington, DC, USA: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105.
 
  
 
===Primary References===
INCOSE. 2012. ''[[INCOSE Systems Engineering Handbook]]: A Guide for System Life Cycle Processes and Activities,'' version 3.2.2. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.2.

ISO/IEC/IEEE. 2015. ''[[ISO/IEC/IEEE 15288|Systems and Software Engineering - System Life Cycle Processes]].'' Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE). [[ISO/IEC/IEEE 15288]]:2015.

NASA. 2007. ''[[NASA Systems Engineering Handbook|Systems Engineering Handbook]].'' Washington, DC, USA: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105.
  
 
===Additional References===
Buede, D.M. 2009. ''The Engineering Design of Systems: Models and Methods,'' 2nd ed. Hoboken, NJ, USA: John Wiley & Sons Inc.

DAU. 2010. ''Defense Acquisition Guidebook (DAG).'' Ft. Belvoir, VA, USA: Defense Acquisition University (DAU)/U.S. Department of Defense (DoD), February 19, 2010.

ECSS. 2009. ''Systems Engineering General Requirements.'' Noordwijk, Netherlands: Requirements and Standards Division, European Cooperation for Space Standardization (ECSS), 6 March 2009. ECSS-E-ST-10C.

MITRE. 2011. "Verification and Validation," in ''Systems Engineering Guide.'' Accessed 11 March 2012 at [http://mitre.org/work/systems_engineering/guide/se_lifecycle_building_blocks/test_evaluation/verification_validation.html].

SAE International. 1996. ''Certification Considerations for Highly-Integrated or Complex Aircraft Systems.'' Warrendale, PA, USA: SAE International, ARP4754.

SEI. 2007. "Measurement and Analysis Process Area," in ''Capability Maturity Model Integration (CMMI) for Development,'' version 1.2. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).

===Relevant Videos===
*[https://www.youtube.com/watch?v=pcrkmaAx_QA Systems Engineering - Test, Evaluation, and Validation]
  
 
----
<center>[[System Integration|< Previous Article]] | [[System Realization|Parent Article]] | [[System Transition|Next Article >]]</center>

<center>'''SEBoK v. 2.9, released 20 November 2023'''</center>

[[Category: Part 3]][[Category:Topic]]
[[Category:System Realization]]

Latest revision as of 22:01, 18 November 2023



==Definition and Purpose==
Verification is the confirmation, through the provision of objective evidence, that specified requirements have been fulfilled. Per a note added in ISO/IEC/IEEE 15288, the scope of verification includes a set of activities that compares a system or system element against the requirements, architecture and design characteristics, and other properties to be verified (ISO/IEC/IEEE 2015). This may include, but is not limited to, specified requirements, the design description, and the system itself.

The purpose of verification, as a generic action, is to identify the faults/defects introduced at the time of any transformation of inputs into outputs. Verification is used to provide information and evidence that the transformation was made according to the selected and appropriate methods, techniques, standards, or rules.

Verification is based on tangible evidence; i.e., it is based on information whose veracity can be demonstrated by factual results obtained from techniques such as inspection, measurement, testing, analysis, calculation, etc. Thus, the process of verifying a system (product, service, enterprise, or system of systems (SoS)) consists of comparing the realized characteristics or properties of the product, service, or enterprise against its expected design properties.

==Principles and Concepts==

===Concept of Verification Action===

====Why Verify?====
In the context of human realization, any human thought is susceptible to error; this is also the case with any engineering activity. Studies in human reliability have shown that even people trained to perform a specific operation make, in the best cases, on the order of one to three errors per hour. In any activity, or in the resulting outcome of an activity, the search for potential errors should not be neglected, regardless of whether one thinks they will happen or believes they should not happen; the consequences of errors can cause extremely significant failures or threats.

A verification action is defined, and then performed, as shown in Figure 1.

Figure 1. Definition and Usage of a Verification Action. (SEBoK Original)

The definition of a verification action applied to an engineering element includes the following:

* identification of the element on which the verification action will be performed
* identification of the reference used to define the expected result of the verification action (see examples of references in Table 1)

The performance of a verification action includes the following:

* obtaining a result by performing the verification action on the submitted element
* comparing the obtained result with the expected result
* deducing the degree of correctness of the element

====What to Verify?====
Any engineering element can be verified using a specific reference for comparison: a stakeholder requirement, a system requirement, a function, a system element, a document, etc. Examples are provided in Table 1.

{|
|+'''Table 1. Examples of Verified Items.''' (SEBoK Original)
!Items
!Explanation for Verification
|-
|Document
|To verify a document is to check the application of drafting rules.
|-
|Stakeholder Requirement and System Requirement
|To verify a stakeholder requirement or a system requirement is to check the application of syntactic and grammatical rules and of the characteristics defined in the stakeholder requirements definition process and the system requirements definition process, such as: necessity, implementation free, unambiguous, consistent, complete, singular, feasible, traceable, and verifiable.
|-
|Design
|To verify the design of a system is to check its logical and physical architecture elements against the characteristics of the outcomes of the design processes.
|-
|System
|To verify a system (product, service, or enterprise) is to check its realized characteristics or properties against its expected design characteristics.
|-
|Aggregate
|To verify an aggregate for integration is to check every interface and interaction between implemented elements.
|-
|Verification Procedure
|To verify a verification procedure is to check the application of a predefined template and drafting rules.
|}

Verification versus Validation

The term verification is often associated with the term validation and understood as a single concept of V&V. Validation is used to ensure that one is working the right problem, whereas verification is used to ensure that one has solved the problem right (Martin 1997). From an actual and etymological meaning, the term verification comes from the Latin verus, which means truth, and facere, which means to make/perform. Thus, verification means to prove that something is true or correct (a property, a characteristic, etc.). The term validation comes from the Latin valere, which means to become strong, and has the same etymological root as the word value. Thus, validation means to prove that something has the right features to produce the expected effects. (Adapted from "Verification and Validation in plain English" (Lake INCOSE 1999).)

The main differences between the verification process and the validation process concern the references used to check the correctness of an element, and the acceptability of the effective correctness.

  • Within verification, comparison between the expected result and the obtained result is generally binary, whereas within validation, the result of the comparison may require a judgment of value regarding whether or not to accept the obtained result compared to a threshold or limit.
  • Verification relates more to one element, whereas validation relates more to a set of elements and considers this set as a whole.
  • Validation presupposes that verification actions have already been performed.
  • The techniques used to define and perform the verification actions and those for validation actions are very similar.
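The contrast between the binary comparison of verification and the threshold judgment of validation can be sketched in code. This is an illustrative example only, not from the SEBoK text; the function names, the response-time characteristic, and the numeric values are all hypothetical:

```python
def verify(expected, obtained):
    # Verification: binary comparison of the obtained result
    # against the design reference -- any deviation is non-compliant.
    return obtained == expected

def validate(obtained, limit):
    # Validation: a judgment of value against an agreed threshold
    # or limit, not an exact match.
    return obtained <= limit

# A response-time characteristic specified at 100 ms, measured at 103 ms.
measured_response_ms = 103
print(verify(expected=100, obtained=measured_response_ms))  # False: deviates from the reference
print(validate(measured_response_ms, limit=150))            # True: acceptable for its purpose
```

The same obtained result can thus fail verification against its design reference while passing validation against the stakeholder-accepted limit, which is exactly why the two processes use different references.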

Integration, Verification, and Validation of the System

There is sometimes a misconception that verification occurs after integration and before validation. In most cases, it is more appropriate to begin verification activities during development or implementation and to continue them into deployment and use.

Once the system elements have been realized, they are integrated to form the complete system. Integration consists of assembling the elements and performing verification actions as stated in the integration process. A final validation activity generally occurs once the system is integrated, but a certain number of validation actions are also performed in parallel with system integration in order to reduce the total number of verification and validation actions, while controlling the risks incurred if some checks are excluded. Integration, verification, and validation are closely intertwined, because the verification and validation strategies must be optimized together with the integration strategy.

Process Approach

Purpose and Principle of the Approach

The purpose of the verification process is to confirm that the system fulfills the specified design requirements. This process provides the information required to effect the remedial actions that correct non-conformances in the realized system or the processes that act on it - see ISO/IEC/IEEE 15288 (ISO/IEC/IEEE 2015).

Each system element and the complete system itself should be compared against its own design references (specified requirements). As stated by Dennis Buede, verification is the matching of [configuration items], components, sub-systems, and the system to corresponding requirements to ensure that each has been built right (Buede 2009). This means that the verification process is instantiated as many times as necessary during the global development of the system. Because of the generic nature of a process, the verification process can be applied to any engineering element that has contributed to the definition and realization of the system elements and the system itself.

Given the huge number of potential verification actions that this approach may generate, it is necessary to optimize the verification strategy. The strategy balances what must be verified against constraints, such as time, cost, and feasibility of testing, which naturally limit the number of verification actions, and against the risks one accepts when excluding some verification actions.

Several approaches exist that may be used for defining the verification process. The International Council on Systems Engineering (INCOSE) dictates that two main steps are necessary for verification: planning and performing verification actions (INCOSE 2012). NASA has a slightly more detailed approach that includes five main steps: prepare verification, perform verification, analyze outcomes, produce a report, and capture work products (NASA December 2007, 1-360, p. 102). Any approach may be used, provided that it is appropriate to the scope of the system and the constraints of the project, includes the activities of the process listed below in some way, and is appropriately coordinated with other activities.

Generic inputs are baseline references of the submitted element. If the element is a system, inputs are the logical and physical architecture elements as described in a system design document, the design description of internal interfaces to the system and interfaces requirements external to the system, and by extension, the system requirements. Generic outputs define the verification plan that includes verification strategy, selected verification actions, verification procedures, verification tools, the verified element or system, verification reports, issue/trouble reports, and change requests on design.

Activities of the Process

To establish the verification strategy drafted in a verification plan (this activity is carried out concurrently to system definition activities), the following steps are necessary:

  • Identify verification scope by listing as many characteristics or properties as possible that should be checked. The number of verification actions can be extremely high.
  • Identify constraints according to their origin (technical feasibility, management constraints such as cost, time, and availability of verification means or qualified personnel, and contractual constraints that are critical to the mission) that limit potential verification actions.
  • Define appropriate verification techniques to be applied, such as inspection, analysis, simulation, peer review, and testing, and determine the most suitable project step for performing every verification action given the constraints.
  • Trade off what should be verified (the scope) against all constraints or limits to deduce what can be verified; verification actions are selected according to the type of system, the objectives of the project, the acceptable risks, and the constraints.
  • Optimize the verification strategy by defining the most appropriate verification technique for every verification action while defining necessary verification means (tools, test-benches, personnel, location, and facilities) according to the selected verification technique.
  • Schedule the execution of verification actions in the project steps or milestones and define the configuration of elements submitted to verification actions (this mainly involves testing on physical elements).
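The selection step above, in which candidate verification actions are traded off against constraints and the excluded actions become accepted risks, can be sketched as a simple data structure and greedy selection. This is a hypothetical illustration under an assumed single budget constraint; the class, field names, candidate actions, and costs are all invented for the example:

```python
from dataclasses import dataclass

@dataclass
class VerificationAction:
    characteristic: str  # what must be checked
    technique: str       # inspection, analysis, demonstration, test, ...
    cost: int            # relative cost of performing the action
    milestone: str       # project step where it is scheduled

candidates = [
    VerificationAction("paint colour", "inspection", cost=1, milestone="CDR"),
    VerificationAction("interface timing", "test", cost=8, milestone="integration"),
    VerificationAction("mean time to repair", "demonstration", cost=5, milestone="qualification"),
    VerificationAction("structural margin", "analysis", cost=3, milestone="PDR"),
]

BUDGET = 12  # management constraint limiting what can be verified

# Trade-off: keep the cheapest actions until the budget is exhausted;
# excluded actions become documented, accepted risks.
selected, excluded, spent = [], [], 0
for action in sorted(candidates, key=lambda a: a.cost):
    if spent + action.cost <= BUDGET:
        selected.append(action)
        spent += action.cost
    else:
        excluded.append(action)

print([a.characteristic for a in selected])
# ['paint colour', 'structural margin', 'mean time to repair']
print([a.characteristic for a in excluded])
# ['interface timing'] -- residual risk to be justified in the plan
```

In practice the trade-off weighs risk and criticality as well as cost, but the principle is the same: the strategy is an explicit, justified subset of all possible verification actions.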

Performing verification actions includes the following tasks:

  • Detail each verification action; in particular, note the expected results, the verification techniques to be applied, and the corresponding means required (equipment, resources, and qualified personnel).
  • Acquire verification means used during system definition steps (qualified personnel, modeling tools, mocks-up, simulators, and facilities), and then those used during the integration step (qualified personnel, verification tools, measuring equipment, facilities, verification procedures, etc.).
  • Carry out verification procedures at the right time, in the expected environment, with the expected means, tools, and techniques.
  • Capture and record the results obtained when performing verification actions using verification procedures and means.

The obtained results must be analyzed and compared to the expected results so that the status may be recorded as either compliant or non-compliant. Systems engineering (SE) practitioners will likely need to generate verification reports, as well as potential issue/trouble reports and change requests on design, as necessary.
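The capture-and-compare step can be sketched as follows. This is an illustrative example, not a prescribed format: the function, the record fields, and the two sample verification actions are all hypothetical:

```python
def record_results(actions):
    """Compare obtained results with expected results for each performed
    verification action and log a compliant/non-compliant status.

    actions: iterable of (name, expected, obtained) tuples.
    """
    report = []
    for name, expected, obtained in actions:
        status = "compliant" if obtained == expected else "non-compliant"
        report.append({"action": name, "expected": expected,
                       "obtained": obtained, "status": status})
    return report

performed = [
    ("connector pin-out check", "per ICD rev B", "per ICD rev B"),
    ("power-on self test", "all channels OK", "channel 3 fault"),
]
for entry in record_results(performed):
    print(entry["action"], "->", entry["status"])
# connector pin-out check -> compliant
# power-on self test -> non-compliant
```

Non-compliant entries are what feed the issue/trouble reports and, where the fault traces back to the design baseline, change requests on the design.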

Controlling the process includes the following tasks:

  • Update the verification plan according to the progress of the project; in particular, planned verification actions can be redefined because of unexpected events.
  • Coordinate verification activities with the project manager: review the schedule and the acquisition of means, personnel, and resources. Coordinate with designers for issues/trouble/non-conformance reports and with the configuration manager for versions of the physical elements, design baselines, etc.

Artifacts and Ontology Elements

This process may create several artifacts such as:

  • verification plans (contain the verification strategy)
  • verification matrices (contain the verification action, submitted element, applied technique, step of execution, system block concerned, expected result, obtained result, etc.)
  • verification procedures (describe verification actions to be performed, verification tools needed, the verification configuration, resources and personnel needed, the schedule, etc.)
  • verification reports
  • verification tools
  • verified elements
  • issue / non-conformance / trouble reports
  • change requests to the design

This process utilizes the ontology elements displayed in Table 2 below.

Table 2. Main Ontology Elements as Handled within Verification. (SEBoK Original)
Element Definition

Attributes (examples)

Verification Action A verification action describes what must be verified (with the element as reference), on which element it is performed, the expected result, the verification technique to apply, and the level of decomposition at which it applies.

Identifier, name, description

Verification Procedure A verification procedure groups a set of verification actions performed together (as a test scenario) in a given verification configuration.

Identifier, name, description, duration, unit of time

Verification Tool A verification tool is a device or physical tool used to perform verification procedures (test bench, simulator, cap/stub, launcher, etc.).

Identifier, name, description

Verification Configuration A verification configuration groups all physical elements (aggregates and verification tools) necessary to perform a verification procedure.

Identifier, name, description

Risk An event having a probability of occurrence and a degree of gravity of its consequence for the system mission or for other characteristics (used for technical risk in engineering). A risk is the combination of a vulnerability and a danger or threat.
Rationale An argument that provides the justification for the selection of an engineering element.

Identifier, name, description (rationale, reasons for defining a verification action, a verification procedure, for using a verification tool, etc.)

Methods and Techniques

There are several verification techniques to check that an element or a system conforms to its design references or its specified requirements. These techniques are almost the same as those used for validation, though the application of the techniques may differ slightly. In particular, the purposes are different; verification is used to detect faults/defects, whereas validation is used to provide evidence for the satisfaction of (system and/or stakeholder) requirements. Table 3 below provides descriptions of some techniques for verification.

Table 3. Verification Techniques. (SEBoK Original)
Verification Technique Description
Inspection Technique based on visual or dimensional examination of an element; the verification relies on the human senses or uses simple methods of measurement and handling. Inspection is generally non-destructive, and typically includes the use of sight, hearing, smell, touch, and taste, simple physical manipulation, mechanical and electrical gauging, and measurement. No stimuli (tests) are necessary. The technique is used to check properties or characteristics best determined by observation (e.g. paint color, weight, documentation, listing of code, etc.).
Analysis Technique based on analytical evidence obtained without any intervention on the submitted element using mathematical or probabilistic calculation, logical reasoning (including the theory of predicates), modeling and/or simulation under defined conditions to show theoretical compliance. Mainly used where testing to realistic conditions cannot be achieved or is not cost-effective.
Analogy or Similarity Technique based on evidence from elements similar to the submitted element or on experience feedback. It is absolutely necessary to show by prediction that the context is invariant and that the outcomes are transposable (models, investigations, experience feedback, etc.). Similarity can only be used if the submitted element is similar in design, manufacture, and use; equivalent or more stringent verification actions were used for the similar element; and the intended operational environment is identical to or less rigorous than that of the similar element.
Demonstration Technique used to demonstrate correct operation of the submitted element against operational and observable characteristics without using physical measurements (no or minimal instrumentation or test equipment). Demonstration is sometimes called 'field testing'. It generally consists of a set of tests selected by the supplier to show that the element response to stimuli is suitable or to show that operators can perform their assigned tasks when using the element. Observations are made and compared with predetermined/expected responses. Demonstration may be appropriate when requirements or specification are given in statistical terms (e.g. mean time to repair, average power consumption, etc.).
Test Technique performed on the submitted element by which functional, measurable characteristics, operability, supportability, or performance capability is quantitatively verified under controlled conditions that are real or simulated. Testing often uses special test equipment or instrumentation to obtain accurate quantitative data for analysis.
Sampling Technique based on verification of characteristics using samples. The number, tolerance, and other characteristics must be specified to be in agreement with the experience feedback.
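As a concrete illustration of the sampling technique in Table 3, the sketch below verifies a characteristic on a pre-specified sample of units against a tolerance and an acceptance rate. All values, the resistance characteristic, and the 95% acceptance criterion are hypothetical assumptions made for the example:

```python
def verify_by_sampling(measurements, nominal, tolerance, min_pass_rate=0.95):
    """Check how many sampled units fall within the specified tolerance
    and compare the pass rate against the pre-agreed acceptance rate."""
    in_tolerance = [abs(m - nominal) <= tolerance for m in measurements]
    pass_rate = sum(in_tolerance) / len(in_tolerance)
    return pass_rate >= min_pass_rate, pass_rate

# Resistance (ohms) measured on a sample drawn from a production batch;
# sample size and tolerance were specified in advance, as the table requires.
sample = [99.8, 100.1, 100.0, 99.9, 100.3, 100.0, 99.7, 100.2, 100.1, 99.9]
ok, rate = verify_by_sampling(sample, nominal=100.0, tolerance=0.5)
print(ok, rate)  # True 1.0 -- all 10 sampled units within ±0.5 ohm
```

The essential point of the table row survives in the code: the sample size, tolerance, and acceptance criterion must be fixed beforehand, in agreement with experience feedback, rather than chosen after the measurements are seen.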

Practical Considerations

Key pitfalls and good practices related to this topic are described in the next two sections.

Pitfalls

Some of the key pitfalls encountered in planning and performing System Verification are provided in Table 4.

Table 4. Major Pitfalls with System Verification (SEBoK Original)
Pitfall Description
Confusion between verification and validation Confusion between verification and validation causes developers to take the wrong reference/baseline to define verification and validation actions and/or to address the wrong level of granularity (detail level for verification, global level for validation).
No verification strategy Verification actions are overlooked because it is impossible to check every characteristic or property of all system elements and of the system in every combination of operational conditions and scenarios. A strategy (a justified selection of verification actions against risks) must be established.
Save or spend time Skip verification activity to save time.
Use only testing Using only testing as a verification technique. Testing can only check products and services once they are implemented. Consider other techniques earlier, during design; analysis and inspection are cost-effective and allow early discovery of potential errors, faults, or failures.
Stop verifications when funding is diminished Stopping the performance of verification actions when budget and/or time are consumed. Prefer using criteria such as coverage rates to end verification activity.

Proven Practices

Some proven practices gathered from the references are provided in Table 5.

Table 5. Proven Practices with System Verification. (SEBoK Original)
Practice Description
Start verifications early in the development The earlier the characteristics of an element are verified in the project, the easier corrections are to make and the smaller the consequences for schedule and cost.
Define criteria ending verifications Carrying out verification actions without limits creates a risk of cost and schedule drift. Modifying and verifying in a non-stop cycle until a perfect system is reached is the best way to never deliver the system. Thus, it is necessary to set limits on cost and time, a maximum number of modification loops for each verification action type, and ending criteria (percentage of success, number of errors detected, coverage rate obtained, etc.).
Involve design responsible with verification Include those responsible for verification in the design team, or include some designers in the verification team.
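The "define criteria ending verifications" practice above can be made concrete with a small sketch: verification stops when a target coverage rate is reached or a modification-loop limit is hit, rather than when budget or schedule happens to run out. The function, the 90% target, and the loop limit are hypothetical figures chosen for illustration:

```python
def verification_complete(actions_passed, actions_planned,
                          target_coverage=0.9, loops_done=0, max_loops=3):
    """Ending criterion: stop when coverage meets the agreed target,
    or when the maximum number of modification loops has been reached."""
    coverage = actions_passed / actions_planned
    done = coverage >= target_coverage or loops_done >= max_loops
    return done, coverage

done, cov = verification_complete(actions_passed=47, actions_planned=50)
print(done, cov)  # True 0.94 -- 94% coverage meets the 90% ending criterion
```

Agreeing on such criteria before verification starts is what allows the activity to end on evidence (coverage achieved, loop limit reached) instead of on exhaustion of funds or time.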

References

Works Cited

Buede, D.M. 2009. The Engineering Design of Systems: Models and Methods. 2nd ed. Hoboken, NJ, USA: John Wiley & Sons Inc.

INCOSE. 2012. INCOSE Systems Engineering Handbook, version 3.2.2. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.2.

ISO/IEC/IEEE. 2015. Systems and Software Engineering - System Life Cycle Processes. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/Institute of Electrical and Electronics Engineers (IEEE). ISO/IEC/IEEE 15288:2015.

Lake, J. 1999. "V & V in Plain English." International Council on Systems Engineering (INCOSE) 9th Annual International Symposium, Brighton, UK, 6-10 June 1999.

NASA. 2007. Systems Engineering Handbook. Washington, DC, USA: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105.

Primary References

INCOSE. 2012. INCOSE Systems Engineering Handbook: A Guide for System Life Cycle Processes and Activities, version 3.2.2. San Diego, CA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-2003-002-03.2.2.

ISO/IEC/IEEE. 2015. Systems and Software Engineering - System Life Cycle Processes. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC)/ Institute of Electrical and Electronics Engineers. ISO/IEC/IEEE 15288:2015.

NASA. 2007. Systems Engineering Handbook. Washington, D.C.: National Aeronautics and Space Administration (NASA), December 2007. NASA/SP-2007-6105.

Additional References

Buede, D.M. 2009. The Engineering Design of Systems: Models and Methods, 2nd ed. Hoboken, NJ, USA: John Wiley & Sons Inc.

DAU. 2010. Defense Acquisition Guidebook (DAG). Ft. Belvoir, VA, USA: Defense Acquisition University (DAU)/U.S. Department of Defense (DoD). February 19, 2010.

ECSS. 2009. Systems Engineering General Requirements. Noordwijk, Netherlands: Requirements and Standards Division, European Cooperation for Space Standardization (ECSS), 6 March 2009. ECSS-E-ST-10C.

MITRE. 2011. "Verification and Validation." In Systems Engineering Guide. Accessed 11 March 2012.

SAE International. 1996. Certification Considerations for Highly-Integrated or Complex Aircraft Systems. Warrendale, PA, USA: SAE International, ARP4754.

SEI. 2007. "Measurement and Analysis Process Area" in Capability Maturity Model Integrated (CMMI) for Development, version 1.2. Pittsburgh, PA, USA: Software Engineering Institute (SEI)/Carnegie Mellon University (CMU).

SEBoK v. 2.9, released 20 November 2023