Virginia Class Submarine

Overview

The Virginia-class attack submarine is a weapon system designed to combat enemy submarines and surface ships in both the open ocean and the littorals, with the ability to launch cruise missiles and to provide improved surveillance and special operations support in the littorals (GAO, 2008). The Virginia class was originally designed to facilitate technology insertion over the life of the ship (GD Electric Boat, 2002).

Application Domain: Sonar Systems, Combat Systems, COTS

Vignette Description

Prior to the Virginia class, submarine sonar systems were composed of proprietary components and interfaces. The Virginia-class system design, however, reflected a global change in the architectural approach adopted by the United States government in the mid-1990s, which included accepting commercial off-the-shelf (COTS) products as a cost-saving measure. Through standardization, the lead ship of the program, Virginia, reduced the number of historically procured parts for nuclear submarines by sixty percent. While previous sonar system architectures provided similar functionality, when historical sonar systems are compared on the basis of modularity, commonality, standards, and reliability, maintainability, and testability (RMT), the Virginia-class sonar system architecture comes out ahead in every respect.
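
Such a comparison across metric families is often made concrete with a simple weighted scoring matrix. The short Python sketch below is illustrative only and is not drawn from the vignette: the candidate names, weights, and ordinal ratings are hypothetical assumptions chosen to show the mechanics of scoring architectures against the four metric families, not actual program data.

    # Illustrative trade-study sketch: score two candidate sonar architectures
    # against the four metric families discussed above (commonality, modularity,
    # standards-based openness, RMT). All names, weights, and ratings are
    # hypothetical, chosen only to show the mechanics of a weighted comparison.

    FAMILIES = ["commonality", "modularity", "standards_openness", "rmt"]

    # Hypothetical weights reflecting one evaluator's priorities (sum to 1.0).
    WEIGHTS = {"commonality": 0.25, "modularity": 0.30,
               "standards_openness": 0.25, "rmt": 0.20}

    # Hypothetical ordinal ratings (1 = poor, 5 = excellent).
    CANDIDATES = {
        "legacy proprietary sonar": {"commonality": 2, "modularity": 2,
                                     "standards_openness": 1, "rmt": 3},
        "COTS-based open sonar":    {"commonality": 4, "modularity": 4,
                                     "standards_openness": 5, "rmt": 4},
    }

    def weighted_score(ratings):
        """Weighted sum of a candidate's ordinal ratings across the families."""
        return sum(WEIGHTS[f] * ratings[f] for f in FAMILIES)

    for name, ratings in CANDIDATES.items():
        print(f"{name}: {weighted_score(ratings):.2f} (max 5.00)")

In a real evaluation, the ordinal ratings would be derived from measured metric data such as that listed in the Summary below, rather than from judgment alone.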

Summary

Based on lessons learned from system architecture experts on this and other related programs, the initial set of system architecture evaluation metrics listed below was developed. A short sketch showing how several of the ratio-style metrics might be computed follows the list:

  • Commonality
    • Physical Commonality (Within the system)
      • HW Commonality (# of unique line replaceable units, fasteners, cables, unique standards implemented)
      • SW Commonality (# of unique SW packages implemented, languages, compilers, average SW instantiations, unique standards implemented)
    • Physical Familiarity (From other Systems)
      • % Vendors, Subcontractors Known
      • % HW, SW Technology Known
    • Operational Commonality
      • % of Operational Functions Automated
      • Number of Unique Skill Codes Required
      • Estimated Operational Training Time - Initial, Refresh from Previous System
      • Estimated Maintenance Training Time - Initial, Refresh from Previous System
  • Modularity
    • Physical Modularity (Ease of system element, operating system upgrade)
    • Functional Modularity (Ease of adding new functionality, upgrading existing functionality)
    • Orthogonality
      • Are functional requirements fragmented across multiple processing elements and interfaces?
      • Are there throughput requirements across interfaces?
      • Are common specifications identified?
    • Abstraction (Does the system architecture provide an option for information hiding?)
    • Interfaces
      • # of Unique Interfaces per System Element
      • # of Different Networking Protocols
      • Explicit versus Implicit Interfaces (Does the architecture involve implicit interfaces?)
      • # of Cables in the System
  • Standards Based - Openness
    • Interface Standards
      • # of Interface Standards/# of Interfaces
      • Multiple Vendors Exist for Products Based on Standards
      • Multiple Business Domains Apply/Use Standard (Aerospace, Medical, Telecommunications)
      • Standard Maturity
    • Hardware Standards
      • # of Form Factors/# of LRUs
      • Multiple Vendors Exist for Products Based on Standards
      • Multiple Business Domains Apply/Use Standard (Aerospace, Medical, Telecommunications)
      • Standard Maturity
    • Software Standards
      • # of Proprietary and Unique Operating Systems
      • # of Non-Standard Databases
      • # of Proprietary Middleware
      • # of Non-Standard Languages
    • Consistency Orientation
      • Common Guidelines for Implementing Diagnostics and Performance Monitoring/Fault Localization (PM/FL)
      • Common Guidelines for Implementing HMI
  • Reliability, Maintainability, Testability
    • Reliability (Fault Tolerance)
    • Critical Points of Fragility (system loading: % of processor, memory, and network loading)
    • Maintainability (expected mean time to repair (MTTR), maximum fault group size, Is the system operational during maintenance?)
    • Accessibility (Are there space restrictions, special tool requirements, or special skill requirements?)
    • Testability
      • # of LRUs Covered by BIT (BIT Coverage)
      • Reproducibility of Errors
      • Logging/Recording Capability
      • Can the system state at the time of failure be recreated?
      • Online Testing (Is the system operational during external testing? Is there easy access to external test points?)
      • Automated Input/Stimulation Insertion
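
Several of the metrics above are simple ratios that can be computed directly from counts collected during an architecture evaluation. The following Python sketch is a minimal illustration, not part of the SEBoK vignette; the class name, fields, and counts are hypothetical assumptions.

    # Illustrative sketch: computing three ratio-style metrics from the list
    # above for one candidate architecture. All names and counts are
    # hypothetical, for demonstration only.

    from dataclasses import dataclass

    @dataclass
    class ArchitectureCounts:
        """Raw counts an evaluator might collect for a candidate architecture."""
        interfaces: int           # total interfaces between system elements
        interface_standards: int  # distinct interface standards implemented
        lrus: int                 # line replaceable units (LRUs)
        form_factors: int         # distinct hardware form factors across the LRUs
        lrus_with_bit: int        # LRUs covered by built-in test (BIT)

    def interface_standardization(c: ArchitectureCounts) -> float:
        """'# of Interface Standards/# of Interfaces': a lower ratio means
        fewer distinct standards reused across many interfaces."""
        return c.interface_standards / c.interfaces

    def form_factor_ratio(c: ArchitectureCounts) -> float:
        """'# of Form Factors/# of LRUs': a lower ratio means more hardware
        commonality."""
        return c.form_factors / c.lrus

    def bit_coverage(c: ArchitectureCounts) -> float:
        """Fraction of LRUs covered by BIT (a testability metric)."""
        return c.lrus_with_bit / c.lrus

    if __name__ == "__main__":
        # Hypothetical counts for a COTS-based candidate; illustrative only.
        candidate = ArchitectureCounts(interfaces=40, interface_standards=5,
                                       lrus=120, form_factors=6,
                                       lrus_with_bit=108)
        print(f"Interface standardization: {interface_standardization(candidate):.2f}")
        print(f"Form factor ratio:         {form_factor_ratio(candidate):.2f}")
        print(f"BIT coverage:              {bit_coverage(candidate):.0%}")

Lower values of the first two ratios indicate greater interface standardization and hardware commonality, while higher BIT coverage indicates better testability.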

References

  • GAO. 2008. Defense Acquisitions: Assessments of Selected Weapon Programs. GAO-08-467SP. Washington, DC: United States Government Accountability Office, March 2008.
  • General Dynamics Electric Boat. 2002. The Virginia Class Submarine Program: A Case Study. Groton, CT: General Dynamics Electric Boat, February 2002.
