Why Model?

System models can be used for many purposes. This section highlights some of those purposes in support of model-based systems engineering (MBSE), and provides indicators of an effective model.

Purpose of a Model

Models are representations that can aid in defining, analyzing, and communicating a set of concepts. System models are specifically developed to support analysis, specification, design, and verification of a system, and communication of this information. One of the first principles of modeling is to clearly define the purpose of the model. Some of the purposes that models can serve throughout the system life cycle are highlighted below.

  • Characterizing an existing system. Many existing systems may be poorly documented, and modeling the system can provide a concise way to capture the existing system design. This information can then be used to facilitate maintaining the system, or to assess the system with the goal of improving it. This is analogous to creating an architectural model of an old building with overlays for electrical, plumbing, and structure, before proceeding to upgrade it to withstand new earthquake standards.
  • Mission and system concept formulation and evaluation. Models can be applied early in the system life cycle to synthesize and evaluate alternative mission and system concepts. This includes clearly and unambiguously defining the system's mission and the value it is expected to deliver to its beneficiaries. Models can be used to explore a trade-space by modeling alternative system designs and assessing the impact of critical system parameters, such as weight, speed, accuracy, reliability, and cost, on the overall measures of merit (a minimal scoring sketch follows this list). In addition to bounding the system design parameters, the models can also be used to validate that the system requirements meet the stakeholder needs, before proceeding with synthesizing the detailed system design.
  • System design synthesis and requirements flowdown. Models can be applied to support architecting system solutions, and flow the mission and system requirements down to the components of the system. Different models may be required to specify different aspects of the system design and address the broad range of system requirements. This may include models to specify functional, interface, performance, and physical requirements, and other non-functional requirements such as reliability, maintainability, safety, and security.
  • Support for system integration and verification. Models can be applied to support integration of the hardware and software components into a system, and to verify that the system satisfies its requirements. This often involves integrating lower-level hardware and software design models with the system-level design models to verify that the system requirements are satisfied. System integration and verification may also include replacing selected hardware and software design models with actual hardware and software products, to incrementally verify that the system requirements are satisfied; this is referred to as hardware-in-the-loop and software-in-the-loop testing. The models can also be used to define the test cases and other aspects of the test program, to assist in test planning and execution.
  • Support for training. Models can be used to simulate various aspects of the system to help train users on how to interact with it. The users may be operators, maintainers, or other stakeholders. The models may serve as a basis for developing a simulator of the system that provides a high-fidelity representation of user interaction in different usage scenarios.
  • Knowledge capture and system design evolution. Models can provide an effective means for capturing knowledge about the system and retaining it as part of organizational knowledge. This knowledge, which can be reused and evolved, provides a basis for supporting the evolution of the system, including changing requirements in the face of emerging relevant technologies, new applications, and new customers.
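
To make the trade-space exploration described above more concrete, the sketch below (in Python) scores a few hypothetical concept alternatives against weighted measures of merit. The concepts, parameters, and weights are invented assumptions for illustration only; they are not part of the SEBoK material.

    # Hypothetical trade study: the concepts, parameters, and weights below are
    # illustrative assumptions, not SEBoK-defined values.
    concepts = {
        "Concept A": {"weight_kg": 1200, "speed_kmh": 800, "cost_musd": 45},
        "Concept B": {"weight_kg": 1500, "speed_kmh": 950, "cost_musd": 60},
        "Concept C": {"weight_kg": 1000, "speed_kmh": 700, "cost_musd": 38},
    }

    # Stakeholder-derived weights per measure of merit (negative = lower is better).
    weights = {"weight_kg": -0.3, "speed_kmh": 0.5, "cost_musd": -0.2}

    # Normalize each parameter by its largest value across the concepts so the
    # weighted sum compares like with like.
    maxima = {k: max(c[k] for c in concepts.values()) for k in weights}

    def merit(params):
        """Weighted sum of normalized parameters as a simple overall measure of merit."""
        return sum(weights[k] * params[k] / maxima[k] for k in weights)

    for name, params in sorted(concepts.items(), key=lambda kv: merit(kv[1]), reverse=True):
        print(f"{name}: merit = {merit(params):+.3f}")

In practice such scoring would be driven by validated parametric or simulation models rather than fixed numbers, but even a simple weighted-score model helps bound the design parameters early in the life cycle.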

Indicators of an Effective Model

A model is intended to support one or more purposes. The value of a model can be assessed in terms of how effectively it supports those purposes. The following are some indicators of an effective model, taken in part from A Practical Guide to SysML (Friedenthal, Moore, and Steiner 2009).

Model Scope

The model must be scoped to address its intended purpose. In particular, the types of models and associated modeling languages must be selected to support the specific needs. For example, a system architecture model may be required to describe the interconnections among the airplane's parts, a trajectory analysis model to analyze the airplane's trajectory, and a fault tree analysis model to assess the potential causes of airplane failure.

For each type of model, the appropriate breadth, depth, and fidelity of the model should be determined to address the intended purpose. The model breadth reflects the system requirements coverage in terms of the degree to which the model must address the functional, interface, performance, and physical requirements, as well as other non-functional requirements such as reliability, maintainability, and safety. For an airplane functional model, the model breadth may be required to address some or all of the functional requirements to power up, take off, fly, land, power down, and maintain the aircraft environment.

The model depth indicates the coverage of the system decomposition, from the system context down to the system components. For the airplane example, the model may be required to span the system context (which includes the aircraft, the control tower, and the physical environment), down to the navigation subsystem and its components, such as the inertial measurement unit, and perhaps down to lower-level parts of the inertial measurement unit.
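
As a sketch of how model depth might be captured, the following Python fragment represents the airplane decomposition as a simple tree and counts its levels. The element names and the dictionary structure are assumptions for illustration, not a prescribed SEBoK schema.

    # Illustrative decomposition tree from the system context down to lower-level
    # parts; element names follow the airplane example but are assumptions.
    decomposition = {
        "System context": {
            "Aircraft": {
                "Navigation subsystem": {
                    "Inertial measurement unit": {
                        "Gyroscope": {},
                        "Accelerometer": {},
                    },
                },
            },
            "Control tower": {},
            "Physical environment": {},
        },
    }

    def depth(tree):
        """Number of decomposition levels captured by the model."""
        if not tree:
            return 0
        return 1 + max(depth(children) for children in tree.values())

    print("Model depth:", depth(decomposition))  # 5 levels in this sketch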

The model fidelity indicates the level of detail the model must represent for any given part of the model. For example, a model that specifies the system interfaces may be fairly abstract and represent only the logical information content, such as aircraft status data, or it may be much more detailed to support higher fidelity information that includes the encoding of a message in terms of bits, bytes, and signal characteristics. Fidelity can also refer to the precision of a computational model, such as the time step required for a simulation.
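
The effect of the time step on computational fidelity can be illustrated with a minimal numerical example. The dynamics (simple exponential decay integrated with the forward Euler method) and the step sizes below are assumptions chosen for brevity, not drawn from the SEBoK text.

    import math

    def simulate_decay(x0, rate, dt, t_end):
        """Forward-Euler integration of dx/dt = -rate * x from t = 0 to t_end."""
        x = x0
        for _ in range(round(t_end / dt)):
            x += -rate * x * dt
        return x

    exact = 100.0 * math.exp(-0.5 * 2.0)  # analytic solution at t = 2 s
    for dt in (0.5, 0.1, 0.01):
        approx = simulate_decay(100.0, 0.5, dt, 2.0)
        print(f"dt = {dt:5.2f}:  x(2) = {approx:7.3f}   error = {abs(approx - exact):.3f}")

Smaller time steps reproduce the analytic solution more closely, at the cost of more computation; choosing the step size is exactly the kind of fidelity decision discussed above.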

Indicators of Model Quality

The quality of a model should not be confused with the quality of the design that the model represents. For example, one may have a high-quality computer-aided design model of a chair, which accurately represents the design of the chair. However, the design may be flawed, such that when one sits in the chair, it falls apart. A high quality model should assist the design team in assessing the quality of the design and uncovering design issues.

Model quality is often assessed in terms of the adherence of the model to modeling guidelines and the degree to which the model addresses its intended purpose. Typical examples of modeling guidelines include naming conventions, application of appropriate model annotations, proper use of modeling constructs, and attention to model reuse considerations. Specific guidelines differ for different types of models. For example, the guidelines for developing a geometric model using a computer-aided design tool may include conventions for defining coordinate systems, dimensioning, and tolerances.

Model-based Metrics

Models can provide a wealth of information that can be used for both technical and management metrics to assess the modeling effort and, in some cases, the overall systems engineering effort. Different types of models provide different types of information. In general, models can provide information that enables one to perform the following:

  • Assess progress
  • Estimate effort and cost
  • Assess technical quality and risk
  • Assess model quality

The models can be used to capture metrics similar to those captured in a traditional document-based approach to systems engineering, but potentially with greater precision, since models are more formal and structured than documents. Traditional systems engineering metrics are described in the Metrics Guidebook for Integrated Systems and Product Development (Wilbur et al. 2005).

Progress can be assessed in terms of the completeness of the modeling effort relative to the defined scope of the model. Models may also be used to assess progress in terms of the extent to which requirements have been satisfied by the design or verified through test. When augmented with productivity metrics, the model can be used to estimate the cost to perform the required systems engineering effort to deliver the system.
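
As a sketch of how such progress metrics might be computed directly from model content, the Python fragment below tallies design and verification coverage over a handful of hypothetical requirement records; the identifiers and status fields are assumptions, not a SEBoK-defined schema.

    # Hypothetical requirement records exported from a system model.
    requirements = [
        {"id": "REQ-001", "satisfied_by_design": True,  "verified_by_test": True},
        {"id": "REQ-002", "satisfied_by_design": True,  "verified_by_test": False},
        {"id": "REQ-003", "satisfied_by_design": False, "verified_by_test": False},
        {"id": "REQ-004", "satisfied_by_design": True,  "verified_by_test": True},
    ]

    total = len(requirements)
    satisfied = sum(r["satisfied_by_design"] for r in requirements)
    verified = sum(r["verified_by_test"] for r in requirements)

    print(f"Design coverage:       {satisfied}/{total} ({100 * satisfied / total:.0f}%)")
    print(f"Verification coverage: {verified}/{total} ({100 * verified / total:.0f}%)")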

The models can be used to identify critical system parameters and assess technical risk in terms of uncertainty in those parameters. The models may also be used to provide additional metrics associated with their purpose. For example, when the modeling purpose is to support mission and system concept formulation and evaluation, a key metric may be the number of alternative concepts explored over a specified period of time.
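
One common way to quantify such risk is to propagate assumed uncertainty in the critical parameters through the model, for example with a Monte Carlo simulation. In the sketch below, the parameter distributions, the mass budget relationship, and the allowable limit are all assumptions for illustration.

    import random

    random.seed(0)
    N = 10_000
    exceedances = 0
    for _ in range(N):
        # Uncertain critical parameters, sampled from assumed normal distributions.
        structure_kg = random.gauss(600, 40)
        payload_kg = random.gauss(300, 25)
        fuel_kg = random.gauss(350, 30)
        if structure_kg + payload_kg + fuel_kg > 1300:  # assumed mass limit
            exceedances += 1

    print(f"Estimated probability of exceeding the mass limit: {exceedances / N:.1%}")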

References


Primary References

Friedenthal, S., A. Moore, and R. Steiner. 2009. A Practical Guide to SysML: The Systems Modeling Language. Needham, MA, USA: OMG Press/Morgan Kaufmann. (Chapter 2)

Wilbur, A., G. Towers, T. Sherman, D. Yasukawa, and S. Shreve. 2005. Metrics Guidebook for Integrated Systems and Product Development. Seattle, WA, USA: International Council on Systems Engineering (INCOSE), INCOSE-TP-1995-002-01. Available at http://www.incose.org/ProductsPubs/products/metricsguidebook.aspx



