Virginia Class Submarine

From SEBoK

Prior to the Virginia class submarine, sonar systems were built from proprietary components and interfaces. In the mid-1990s, however, the United States government transitioned to commercial off-the-shelf (COTS) products to curb the escalating costs of proprietary research and development. The Virginia class submarine system design represented a transition to COTS-based parts and initiated a global change in the architectural approaches adopted by the sonar community. Through standardization, the lead ship of the program, Virginia, reduced the number of parts historically procured for nuclear submarines by 60%. The Virginia class submarine sonar system architecture improved modularity, commonality, standardization, and reliability, maintainability, and testability (RMT) over historical sonar systems.

Architectural Approach: Standardization

Based on the new architectural approach and the success of the transition, system architecture experts developed an initial set of architecture evaluation metrics. These metrics included:

  • Commonality
    • Physical commonality (within the system)
      • Hardware (HW) commonality (e.g., the number of unique line replaceable units, fasteners, cables, and unique standards implemented)
      • Software (SW) commonality (e.g., the number of unique SW packages implemented, languages, compilers, average SW instantiations, and unique standards implemented)
    • Physical familiarity (with other systems)
      • Percentage of vendors and subcontractors known
      • Percentage of HW and SW technology known
    • Operational commonality
      • Percentage of operational functions which are automated
      • Number of unique skill codes required
      • Estimated operational training time (e.g., initial and refresh from previous system)
      • Estimated maintenance training time (e.g., initial and refresh from previous system)
  • Modularity
    • Physical modularity (e.g., ease of system element or operating system upgrade)
    • Functional modularity (e.g., ease of adding new functionality or upgrading existing functionality)
    • Orthogonality
      • Level to which functional requirements are fragmented across multiple processing elements and interfaces
      • Level to which throughput requirements span across interfaces
      • Level to which common specifications are identified
    • Abstraction (i.e., the level to which the system architecture provides an option for information hiding)
    • Interfaces
      • Number of unique interfaces per system element
      • Number of different networking protocols
      • Explicit versus implicit interfaces
      • Level to which the architecture includes implicit interfaces
      • Number of cables in the system
  • Standards-based openness
    • Interface standards
      • Ratio of the number of interface standards to the number of interfaces
      • Number of vendors for products based on standards
      • Number of business domains that apply/use the standard (e.g., aerospace, medical, and telecommunications)
      • Standard maturity
    • Hardware standards
      • Ratio of the number of form factors to the number of line replaceable units (LRUs)
      • Number of vendors for products based on standards
      • Standard maturity
    • Software standards
      • Number of proprietary and unique operating systems
      • Number of non-standard databases
      • Number of proprietary middleware packages
      • Number of non-standard languages
    • Consistency orientation
      • Common guidelines for implementing diagnostics and performance monitor/fault location (PM/FL)
      • Common guidelines for implementing human-machine interface (HMI)
  • Reliability, maintainability, and testability
    • Reliability (fault tolerance)
    • Critical points of fragility (e.g., system loading comprised of percent of processor, memory, and network loading)
    • Maintainability (e.g., expected mean time to repair (MTTR), maximum fault group size, whether the system can be operational during maintenance)
    • Accessibility (e.g., space restrictions, special tool requirements, special skill requirements)
    • Testability
      • Number of line replaceable units (LRUs) covered by built-in tests (BIT) (i.e., BIT coverage)
      • Reproducibility of errors
      • Logging/recording capability
      • Whether the system state at time of system failure can be recreated
      • Online testing (e.g., whether the system is operational during external testing and the ease of access to external testpoints)
      • Automated input/stimulation insertion
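Several of the metrics above are simple ratios that can be tallied directly from an architecture inventory. The sketch below is a hypothetical illustration only (it is not part of the SEBoK article or the Virginia class program); the class name, fields, and sample values are all assumptions chosen to show how three of the ratio-style metrics might be computed.

```python
from dataclasses import dataclass


@dataclass
class ArchitectureMetrics:
    """Hypothetical inventory counts for one candidate sonar architecture."""
    interface_standards: int  # distinct interface standards used
    interfaces: int           # total interfaces in the system
    form_factors: int         # distinct hardware form factors
    lrus: int                 # total line replaceable units (LRUs)
    lrus_with_bit: int        # LRUs covered by built-in tests (BIT)

    def interface_standards_ratio(self) -> float:
        # Standards-based openness: interface standards per interface
        # (lower means fewer unique standards spread across more interfaces)
        return self.interface_standards / self.interfaces

    def form_factor_ratio(self) -> float:
        # Hardware standards: form factors per LRU
        # (lower means more LRUs share a common form factor)
        return self.form_factors / self.lrus

    def bit_coverage(self) -> float:
        # Testability: fraction of LRUs covered by built-in tests
        return self.lrus_with_bit / self.lrus


# Illustrative values only -- not actual Virginia class data.
m = ArchitectureMetrics(interface_standards=4, interfaces=20,
                        form_factors=3, lrus=30, lrus_with_bit=27)
print(m.interface_standards_ratio())  # 0.2
print(m.form_factor_ratio())          # 0.1
print(m.bit_coverage())               # 0.9
```

A metrics container like this makes it straightforward to compare candidate architectures side by side, since each ratio is normalized and can be trended across design iterations.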

Summary

In summary, the work on the Virginia class submarine prompted a change in the traditional architectural approach used by the sonar community to design submarine sonar, and it validated the savings in both research and development costs and component costs when transitioning from proprietary interfaces to industry-standard interfaces. The identification of a list of feasible architecture evaluation metrics was an added benefit of the effort.


References

Works Cited

GAO. 2009. Defense Acquisitions: Assessments of Selected Weapon Programs. Washington, DC, USA: U.S. Government Accountability Office (GAO). March 2009. GAO-09-326SP.

GD Electric Boat Division. 2002. The Virginia Class Submarine Program: A Case Study. Groton, CT: General Dynamics. February, 2002.

Primary References

No primary references have been identified for version 0.75.

Additional References

No additional references have been identified for version 0.75.





SEBoK v. 1.9.1 released 30 September 2018
