System Security

From SEBoK
----
'''''Lead Author:''''' Mark Winstead, '''''Contributing Authors:''''' Terri Chan and Keith Willett
----
Security is freedom from those conditions that may lead to loss of assets (anything of value) with undesired consequences (Ross, Winstead, & McEvilley 2022). Systems Security specifically is concerned with systems delivering capability (an asset) with intended and only intended behaviors and outcomes (no unacceptable consequences such as loss of asset integrity) in contested operational environments (cyberspace or physical).  
  
The term System Security Engineering (SSE) is used to denote this specialty engineering field, and the US Department of Defense defines it as: ''"an element of system engineering that applies scientific and engineering principles to identify security vulnerabilities and minimize or contain risks associated with these vulnerabilities"'' (DODI5200.44, 12).
Restated, Systems Security is about engineering for intended and authorized system behavior and outcomes despite anticipated and unanticipated adversity, conditions that may cause loss (e.g., threats, attacks, hazards, disruptions, exposures). (McEvilley & Winstead 2022)
  
Please note that not all of the generic sections below have mature content at this time. Anyone wishing to offer content suggestions should contact the [[BKCASE Governance and Editorial Board|SEBoK Editors]] in the usual ways.
'''''Note:''''' This is a completely new article inserted in SEBoK 2.9, replacing the previous version by Richard Fairley, Alice Squires, and Keith Willett which originally appeared in SEBoK 1.0 and was periodically updated through SEBoK 2.8.  
  
 
==Overview==
Secure systems ideally have three essential characteristics (Ross, Winstead, & McEvilley 2022):
* Enable required system capability delivery despite intentional and unintentional forms of adversity.
* Enforce constraints to ensure only the desired behaviors and outcomes associated with the required capability are realized while realizing the first characteristic.
* Enforce constraints based on a set of rules defining the only authorized interactions and operations allowed to occur while satisfying the second characteristic.

Desired behaviors and outcomes are those reflecting the delivery of the desired system capabilities and features without experiencing loss with undesired consequences, such as loss of information privacy.

While these characteristics are to be achieved to the extent practicable, gaps will occur between the ideal and what can be dependably achieved. A system should be as secure as reasonably practical (ASARP) while meeting minimum stakeholder expectations for security and optimized among other performance objectives and constraints, informed by the principle of commensurate trustworthiness – trustworthy to a level commensurate with the most significant adverse effect resulting from loss or failure. (Hild, McEvilley, & Winstead 2021)
[[File:SystemSecurityandPerformance.png|thumb|center|600px|'''Figure 1.''' System Security and Cost/Schedule/other Performance (Ross, Winstead, & McEvilley, 2022 - Public Domain)]]
  
To understand the optimization of security among other performance objectives, stakeholders need to be aligned and respectful of each other’s needs.  As loss and loss effects or consequences are easily understood, a collaborative understanding of loss tolerances provides a means to alignment as well as forms a basis for metrics across the system lifecycle. (Dove, et al. 2023)
  
==Why Security?==
Security, one of 10 acknowledged areas of growing stakeholder expectations in INCOSE's Systems Engineering Vision 2035, is recognized as needing to become a foundational perspective for system design (INCOSE 2021). This growing expectation is motivated by increasing cyberspace-based attacks, as evidenced by reports such as Security Magazine's finding that, for computers with Internet access, an attack occurred every 39 seconds in 2017, a rate that has since increased (Security 2017); by observations from the conflict in Ukraine, including attacks on civilian targets such as critical infrastructure (Carnegie Endowment for International Peace 2023); and by many well-publicized exploitations of vulnerable systems. (Agile IT 2023)
  
==Scope==
Adversity, conditions that can cause a loss of assets (e.g., threats, attacks, vulnerabilities, hazards, disruptions, and exposures), occurs throughout the system lifecycle. Such conditions are internal and external to the system, with internal conditions often the result of faults and defects (e.g., missing requirements, implementation errors). Consequently, system security’s scope matches systems engineering’s scope, and every systems engineering process and activity has security considerations. (Ross, Winstead, & McEvilley 2022)
  
“Unless security is [engineered] into a system from its inception, there is little chance that it can be made secure by retrofit”. (Anderson 1972) Consequently, security must be a foundational perspective from concept exploration.  
  
==Assets==
An asset is an item of value to a stakeholder. Ross, Winstead, & McEvilley (2022) identified broad asset classes, summarized in Table 1.
  
{| align="center"
|+ '''Table 1.''' Common Asset Classes (Ross, Winstead, and McEvilley, 2022 - Public Domain)
!Class
!Description
!Loss Protection Criteria
|-
|'''Material Resources and Infrastructure'''
|Includes:
* physical property (e.g., buildings, facilities, equipment)
* physical resources (e.g., water, fuel)
* basic physical and organizational structures and facilities (i.e., infrastructure) needed for an activity or the operation of an enterprise or society.
An infrastructure is commonly comprised of assets, such as a nation's national airspace infrastructure, which includes the nation's airports.
|''Material resources'' are protected from loss if they are not stolen, damaged, or destroyed and are able to function or be used as intended, as needed, and when needed.

''Infrastructure'' is protected from loss if it meets performance expectations while delivering only the authorized and intended capability and producing only the authorized and intended outcomes.
|-
|'''System Capability'''
|The set of capabilities and services provided by a system
|''System capability'' is protected from loss when it meets its performance expectations while delivering only the authorized and intended capability and producing only the authorized and intended outcomes.
|-
|'''Human Resources'''
|Personnel who are part of the system and personnel affected by the system
|''Human resources'' are protected from loss if they are not injured, do not suffer illness, and are not killed.
|-
|'''Intellectual Property'''
|Trade secrets, recipes, technology, and other items that constitute an advantage over competitors
|''Intellectual property'' is protected from loss if it is not stolen, corrupted, destroyed, copied, substituted in an unauthorized manner, or reverse-engineered in an unauthorized manner.
|-
|'''Data and Information'''
|Includes all types of data and information and all encodings and representations of data and information
|''Data and information'' are protected from loss due to unauthorized alteration, exfiltration, infiltration, and destruction.
|-
|'''Derivative Non-Tangible'''
|Includes image, reputation, and trust. Such assets are affected by the success or failure to protect other assets
|''Non-tangible assets'' are protected from loss by ensuring the adequate protection of assets in the other classes.
|}
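Loss-oriented analysis typically begins by enumerating the system's assets against classes such as those in Table 1. As a minimal sketch (Python; the inventory, field names, and helper function below are illustrative assumptions, not part of the source), assets and their loss protection criteria can be recorded so they are queryable during analysis:

```python
from dataclasses import dataclass
from enum import Enum, auto

class AssetClass(Enum):
    """Asset classes summarized in Table 1."""
    MATERIAL_RESOURCES_AND_INFRASTRUCTURE = auto()
    SYSTEM_CAPABILITY = auto()
    HUMAN_RESOURCES = auto()
    INTELLECTUAL_PROPERTY = auto()
    DATA_AND_INFORMATION = auto()
    DERIVATIVE_NON_TANGIBLE = auto()

@dataclass
class Asset:
    name: str
    asset_class: AssetClass
    stakeholder: str               # the party for whom the item has value
    loss_protection_criteria: str  # condition under which the asset counts as protected

# Hypothetical inventory for an illustrative flight-operations system
inventory = [
    Asset("navigation data feed", AssetClass.DATA_AND_INFORMATION, "operator",
          "no unauthorized alteration, exfiltration, infiltration, or destruction"),
    Asset("route-planning service", AssetClass.SYSTEM_CAPABILITY, "dispatcher",
          "meets performance expectations; only authorized and intended behavior and outcomes"),
]

def assets_in_class(assets, asset_class):
    """Filter an asset inventory by Table 1 class."""
    return [a for a in assets if a.asset_class is asset_class]

print([a.name for a in assets_in_class(inventory, AssetClass.DATA_AND_INFORMATION)])
# → ['navigation data feed']
```

Recording an inventory this way makes it straightforward to ask, per asset class, which loss protection criteria must hold.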
  
==Systems Thinking==
Systems thinking is the practice of thinking holistically about systems: relating system behaviors and concepts to principles based on patterns. It is flexible, conceptual, and strategic in nature, and hence generally adopted in systems architecture work. Systems thinking focuses not only on immediate cause and effect but also examines the dynamics of a system to identify secondary effects on behaviors and choices. (Goodman 2018)
  
Security, especially cybersecurity, has suffered from being treated as a tactics problem, focusing on threat defense and incident response. Security has also become a forensics exercise, steeped in root cause analysis that adds to the known threat defense. Systems thinking recognizes that the greater objective is assuring a system's ability to produce the capability (functions and services) that users and business and mission needs depend upon, and to produce that capability in an acceptable manner (i.e., no harm to stakeholder assets). The need is to focus less on defending against adversarial action beyond the control of the systems engineer and more on assuring that the system performs and protects stakeholder assets, controlling the system against the effects of loss, including avoiding vulnerabilities that lead to loss when practical. (Young & Leveson 2013) Systems thinking brings security back into the conception, design, and implementation of system capabilities.
  
Consequently, the need is to focus on a system's trustworthiness rather than singularly on risk. (Dove, et al. 2021) Quality evidence, such as that generated by verification and validation activities, feeds assurance arguments that establish merited trustworthiness, providing a basis for trust by stakeholders. Without such assurance, security functionality is a form of veneer security (Saydjari 2018), providing an unmerited sense of trustworthiness.
  
==Loss==
Loss, the experience of having an asset taken away or destroyed, or the failure to keep or to continue to have an asset in a desired state or form (Ross, Winstead, & McEvilley 2022), provides language understood by all stakeholders (Dove, et al. 2023). Stakeholder concern is typically with the effects of loss caused by adversity, not the adversity itself, and stakeholder priorities are driven by consequences (e.g., impact to mission). Their needs and requirements can thus be expressed in terms of loss, loss scenarios, loss tolerance, and acceptable loss.
  
Addressing loss must consider that loss results from combinations of adverse events or conditions that cause or lead to unacceptable ramifications, consequences, or impacts. Due to uncertainty (including uncertainty about adversity), guaranteeing that a loss will not occur is not possible. The focus must be on controlling loss effects, including cascading or ripple events (e.g., an effect that causes additional losses to occur). (Ross, Winstead, & McEvilley 2022)
  
Loss control objectives frame how loss is addressed. Loss can be addressed by using historically informed practices (known good things to do) and by assessing specific loss scenarios for opportunities to address both the enabling conditions and the potential loss itself.
  
{| align="center"
|+ '''Table 2.''' Loss Control Objectives (Ross, Winstead, and McEvilley, 2022 - Public Domain)
!Loss Control Objective
!Discussion
|-
|'''Prevent the Loss from Occurring'''
|
* Loss is avoided. Despite the presence of adversity:
** The system provides only the intended behavior and produces only the intended outcomes.
** Desired properties of the system and assets are retained.
* Achieved by combinations of:
** Preventing or removing the event or events that cause the loss
** Preventing or removing the condition or conditions that allow the loss to occur
** Not suffering an adverse effect despite the events or conditions (e.g., fault tolerance)
|-
|'''Limit the Extent of Loss'''
|
* Loss can occur or has occurred. The extent of the loss effect is to be limited.
* Achieved by combinations of:
** Limiting dispersion (e.g., propagation, ripple, or cascading effects)
** Limiting duration (e.g., milliseconds, minutes, days)
** Limiting capacity (e.g., diminished service or capability)
** Limiting volume (e.g., bits or bytes of data/information)
* Decisions to limit loss extent may require prioritizing what constitutes acceptable loss across a set of losses (i.e., limiting the loss of one asset may require accepting the loss of some other asset).
* Loss recovery and loss delay are two means to limit loss:
** Loss Recovery: Action is taken by the system or enabled by the system to recover (or allow the recovery of) some or all of its ability to function and to recover assets used by the system. Asset restoration can limit the dispersion, duration, capacity, or volume of the loss.
** Loss Delay: The loss event is delayed until the adverse effect is lessened or until the delay enables a more robust response or quicker recovery.
|}
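The two objectives in Table 2 can serve as a checklist when proposing controls. The sketch below (Python; the control names and their mappings are hypothetical examples, not from the source) records Table 2's prevent/limit strategies and verifies that each candidate control maps to a recognized strategy:

```python
# Strategy labels mirror Table 2; the candidate controls are illustrative only.
LOSS_CONTROL_STRATEGIES = {
    "prevent": ["remove causal event", "remove enabling condition",
                "tolerate without adverse effect"],
    "limit":   ["limit dispersion", "limit duration", "limit capacity",
                "limit volume", "loss recovery", "loss delay"],
}

candidate_controls = [
    ("input validation",          "prevent", "remove enabling condition"),
    ("triple modular redundancy", "prevent", "tolerate without adverse effect"),
    ("network segmentation",      "limit",   "limit dispersion"),
    ("automated failover",        "limit",   "loss recovery"),
]

def check(controls):
    """Verify each proposed control cites a recognized Table 2 strategy,
    then group control names by loss control objective."""
    for name, objective, strategy in controls:
        assert strategy in LOSS_CONTROL_STRATEGIES[objective], name
    return {obj: [n for n, o, _ in controls if o == obj]
            for obj in LOSS_CONTROL_STRATEGIES}

print(check(candidate_controls))
# → {'prevent': ['input validation', 'triple modular redundancy'],
#    'limit': ['network segmentation', 'automated failover']}
```

A grouping like this makes gaps visible, for example a control set that limits loss extent but does nothing to prevent the loss from occurring.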
  
===Loss Scenarios===
Loss scenarios describe the events and conditions that lead to unacceptable outcomes. These scenarios do not necessarily trace back to "root causes" in each case, but must capture the internal system conditions (e.g., system states, internal faults) and the external environmental conditions (e.g., loss of power, presence of a malicious insider) that may lead to a loss, including unauthorized system use (e.g., loss of control). See Figure 2.
  
[[File:HierarchyofUnacceptableConditions.png|thumb|center|800px|'''Figure 2.''' Hierarchy of Unacceptable Conditions (OUSD(R&E) 2022 - Public Domain)]]
  
Loss scenarios may be analyzed to inform requirements definition and derivation, to analyze design alternatives, and to inform tailoring of historically informed practices.
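One way to make this concrete is to capture each loss scenario as a structured record and draft a candidate requirement from it. The sketch below (Python; the schema, field names, and example scenario are illustrative assumptions, not a standard notation) shows the idea:

```python
from dataclasses import dataclass

@dataclass
class LossScenario:
    """One loss scenario: the conditions and events that lead to an
    unacceptable outcome. Field names are illustrative, not a standard schema."""
    asset: str
    internal_conditions: list  # e.g., system states, internal faults
    external_conditions: list  # e.g., loss of power, malicious insider
    loss: str                  # the unacceptable outcome
    tolerance: str             # stakeholder-stated acceptable loss

def derive_requirement(s: LossScenario) -> str:
    """Draft a candidate requirement statement from a scenario,
    to be refined during requirements definition."""
    conditions = ", ".join(s.internal_conditions + s.external_conditions)
    return (f"The system shall control loss of {s.asset} ({s.loss}) "
            f"to within '{s.tolerance}' under conditions: {conditions}.")

scenario = LossScenario(
    asset="mission data store",
    internal_conditions=["stale access-control rules"],
    external_conditions=["credential theft"],
    loss="unauthorized exfiltration",
    tolerance="no more than one record, detected within minutes",
)
print(derive_requirement(scenario))
```

Expressing scenarios this way keeps the stakeholder's loss tolerance attached to the condition set that threatens it, which supports the traceability from loss to requirement described above.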
  
==Enterprise Relationships==
Most systems are part of a larger system of systems or enterprises. Adversity often comes through or from these connected systems.
  
Systems engineering must balance a system's self-protection capability against opportunities for mutual collaborative protection with trustworthy systems within an enterprise (Dove, et al. 2021). Enterprises may have dedicated systems for protection, often within network operations and security centers (NOSCs). (Knerler, Parker, & Zimmerman 2022) Systems may also collaborate by sharing situational awareness, with one system alerting others to suspicious activity that may indicate malicious actions the others may experience.
  
 
==Discipline Relationships==
Systems Security has commonality and synergy with many other disciplines, such as safety; quality management; reliability, availability, and maintainability (RAM); survivability; operational risk management; and resilience. Overlapping concerns exist in the assets, losses, and adversities considered; in requirements; and in various engineering processes and analyses, which present similarities and opportunities, e.g., System Theoretic Process Analysis, or STPA (Young & Leveson 2013). Some considerations for pursuing these commonalities and synergies are explored in Brtis (2020).
  
==Personnel Considerations==
Systems security engineering is a sub-discipline of systems engineering, defined in Ross, Winstead, & McEvilley (2022) as a transdisciplinary and integrative approach to enable the successful secure realization, use, and retirement of engineered systems, using systems, security, and other principles and concepts, as well as scientific, technological, and management methods. Systems security engineers are part of systems engineering teams.
  
But as security is demonstrated in a system's behaviors and outcomes, a systems engineering responsibility, the systems engineer is ultimately responsible for a system's security. (Thomas 2013) However, reflecting the maxim "Security is everyone's job", all engineering disciplines have security responsibilities (Dove, et al. 2021), not just the systems engineer and the various specialists in areas such as supply chain assurance, hardware assurance, software assurance, cybersecurity, and physical security.
  
==Security in the Future of Systems Engineering==
The INCOSE SE Vision 2035 sets an aim that security "will be as foundational a perspective in systems design as system performance and safety are today" (INCOSE 2021). Toward that aim and the Vision's other security aims, the INCOSE Systems Security Engineering Working Group has set objectives and roadmap concepts (Dove 2022), summarized in Table 3.
  
{| align="center"
|+ '''Table 3.''' Synopsis of FuSE Security Concepts (Dove 2022)
!Roadmap Concept
!General Needs to Fill
|-
|Security Proficiency in the Systems Engineering Team
|System security and its evolution effectively enabled by systems engineering activity.
|-
|Education and Competency Development
|Education at all levels focused on security of cyber-physical systems.
|-
|Stakeholder Alignment
|Common security vision and knowledge among all stakeholders.
|-
|Loss-Driven Engineering
|Standard metrics and abstractions relevant to all system lifecycle phases.
|-
|Architectural Agility
|Readily composable and re-composable security with feature variants.
|-
|Operational Agility
|Ability for cyber-relevant response to attack and potential threat; resilience in security system.
|-
|Capability-Based Security Engineering
|Top-down approach to security starting with desired results/value.
|-
|Security as a Functional Requirement
|Systems engineering responsibility for the security of systems.
|-
|Modeled Trustworthiness
|Reinvigorate formal modeling of system trust as a core aspect of system security engineering; address issues of scale with model-based tools and automation.
|-
|Security Orchestration
|Tightly coupled coordinated system defense in cyber-relevant time.
|-
|Collaborative Mutual Protection
|Augmented detection and mitigation of known and unknown attacks with components collaborating for mutual protection.
|}
  
==Challenges==
The challenging problems for system security are numerous. Some of this results from neglecting to address security early in the system life cycle, as recognized by the Systems Engineering Vision's call for security to be a foundational perspective; as Carl Landwehr wrote, "This whole economic boom in [security] seems largely to be a consequence of poor engineering" (Landwehr 2015).
  
For example, system of systems security faces challenges due in part to not knowing or trusting the component systems, which may not have been engineered with security in mind, or at least lack documented evidence that informs trustworthiness. Another system of systems challenge arises as formerly isolated cyber-physical systems (CPSs) are increasingly connected to form larger systems of systems, negating security-relevant assumptions, if security was considered at all. Additionally, legacy CPSs were not built with the need to update software in mind (i.e., sustainable security (Rosser 2023)), which creates a challenge in upgrading software to reflect revised assumptions and security models for the CPS in question.
  
Another challenge comes with the advancing use of Artificial Intelligence (AI). Any new technology presents security challenges until its usage matures, but AI also introduces complexity and decreases predictability (INCOSE 2021). Managing complexity and uncertainty is necessary for security (Ross, Winstead, & McEvilley 2022), and AI's increase of both compounds the problem space for systems engineering security.
 
 
 
  
 
==References==

===Works Cited===
Agile IT. n.d.. The Top 10 Biggest Cyberattacks of 2022. Accessed September 15, 2023. Available at https://www.agileit.com/news/biggest-cyberattacks-2022/.  
  
CNSS. 2010. ''National Information Assurance Glossary", Committee on National Security Systems Instruction (CNSSI) no. 4009". Fort Meade, MD, USA: The Committee on National Security Systems. ''
+
Anderson, J. 1972. Computer Security Technology Planning Study, Technical Report ESD-TR-73-51. October 1, 1972. Hanscom AFB: Air Force Electronic Systems Division. Accessed September 15, 2023. Available at https://apps.dtic.mil/sti/citations/AD0758206.  
  
DHS. 2010. ''Build Security In.'' Washington, DC, USA: US Department of Homeland Security (DHS). Accessed September 11, 2011. Available: https://buildsecurityin.us-cert.gov.
+
Brtis, J. (ed). 2020. Loss-Driven Systems Engineering. INCOSE Insight, 23(4):7-8. Wiley. Accessed September 15, 2023. Available at https://incose.onlinelibrary.wiley.com/doi/abs/10.1002/inst.12312.  
  
DODI5200.44, United States Department of Defense, ''Protection of Mission Critical Functions to Achieve Trusted Systems and Networks'', Department of Defense Instruction Number 5200.44, November 2012, Accessed 3 November 2014 at Defense Technical Information Center [http://www.dtic.mil/whs/directives/corres/pdf/520044p.pdf http://www.dtic.mil/whs/directives/corres/pdf/520044p.pdf].
+
Carnegie Endowment for International Peace. n.d.. Cyber Conflict in the Russia-Ukraine War. Accessed September 15, 2023. Available at https://carnegieendowment.org/programs/technology/cyberconflictintherussiaukrainewar/.
  
Goertzel, K., et al. 2008. ''Enhancing the Development Life Cycle to Produce Secure Software: A Reference Guidebook on Software Assurance.'' Washington, DC, USA: Data and Analysis Center for Software (DACS)/US Department of Homeland Security (DHS).  
+
Dove, R. 2022. “Setting Current Context for Security in the Future of Systems Engineering”. INCOSE Insight, 25(2):8-10.
  
NATO. 2010. ''Engineering for System Assurance in NATO programs.'' Washington, DC, USA: NATO Standardization Agency. DoD 5220.22M-NISPOM-NATO-AEP-67.
+
Dove, R., K. Willett, T. McDermott, H. Dunlap, H, D.P. MacNamara, and C. Ocker. 2021. “Security in the Future of Systems Engineering (FuSE), a Roadmap of Foundational Concepts”. Proceedings of the 31st INCOSE International Symposium, July 17-22. Virtual only,
  
Tipton, H.F. (ed.). 2006. ''Official (ISC)2 guide to the CISSP CBK,'' 1st ed. Boston, MA, USA: Auerbach Publications.
+
Dove, R., M. Winstead, H. Dunlap, M. Hause, A. Scalco, A. Scalco, A. Williams, and B. Wilson. 2023. “Democratizing Systems Security”. Proceedings of the 33rd INCOSE International Symposium. July 15-20. Honolulu, Hawaii: Wiley.
  
===Primary References===
+
Goodman, M. 2018. Systems Thinking: What, Why, When, Where, and How? Accessed September 15, 2023. Available at https://thesystemsthinker.com/systems-thinking-what-why-when-where-and-how/
Anderson, R.J. 2008. ''[[Security Engineering: A Guide to Building Dependable Distributed Systems]],'' 2nd Ed. New York, NY, USA: John Wiley & Sons. Accessed October 24, 2014 at http://www.cl.cam.ac.uk/~rja14/book.html
+
 
 +
Hild, D., M. McEvilley, and M. Winstead. 2021. Principles for Trustworthy Design of Cyber-Physical Systems. MITRE Technical Report, MTR210263.  
 +
 
 +
INCOSE. 2021. INCOSE Systems Engineering Vision 2035. Accessed September 15, 2023. Available at https://www.incose.org/about-systems-engineering/se-vision-2035
 +
 
 +
Knerler, K., I. Parker, and C. Zimmerman. 2022. 11 Strategies of a World-Class Cybersecurity Operations Center (2nd ed.). McLean, VA: MITRE. Accessed September 15, 2023. Available at https://www.mitre.org/sites/default/files/2022-04/11-strategies-of-a-world-class-cybersecurity-operations-center.pdf
 +
 
 +
Landwehr, C. 2015. “We Need a Building Code for Building Code”. Communcations of the ACM, 58(2):24-26. Accessed September 15, 2023. Available at doi:https://doi.org/10.1145/2700341
 +
 
 +
McEvilley, M., and M. Winstead. 2022. “Functionally Interpreting Security”. INCOSE Insight, 25(2):15-17. Wiley. Accessed September 15, 2023. Available at https://incose.onlinelibrary.wiley.com/doi/abs/10.1002/inst.12380
 +
 
 +
OUSD(R&E). 2022. Security and Resilience Interpretation 1.0. Prepared by MITRE. Accessed September 2023. Available at https://www.crws-bok.org/asset/83a0d3528e6e27272a52b7e2c3758facc16773fd
 +
 
 +
Ross, R., M. Winstead, M., M. McEvilley. 2022. Engineering Trustworthy Secure Systems. NIST SP 800-160 Volume 1 Revision 1. Gaithersburg, MD: NIST. Accessed September 15, 2023. Available at https://csrc.nist.gov/pubs/sp/800/160/v1/r1/final
 +
 
 +
Rosser, L.A. 2023. “Applying Agility for Sustainable Security”. INCOSE Insight, 26(2):45-52. Wiley. Accessed September 15, 2023. Available at https://incose.onlinelibrary.wiley.com/doi/abs/10.1002/inst.12445
 +
 
 +
Saydjari, O.S. (ed.). 2018. Engineering Trustworthy Systems: Get Cybersecurity Design Right the First Time. New York: McGraw Hill. Accessed September 15, 2023. Available at https://www.amazon.com/Engineering-Trustworthy-Systems-Cybersecurity-Design/dp/1260118177
 +
 
 +
Security. 2017. Hackers Attack Every 39 Seconds. Accessed September 15, 2023. Available at https://www.securitymagazine.com/articles/87787-hackers-attack-every-39-seconds
  
DAU. 2012. "[[Defense Acquisition Guidebook (DAG)]]: Chapter 13 -- Program Protection." Ft. Belvoir, VA, USA: Defense Acquisition University (DAU)/U.S. Department of Defense (DoD). November 8, 2012. Accessed October 24, 2014 at [https://dag.dau.mil/ https://dag.dau.mil/]
+
Thomas, J.A. 2013. “Critical System Behaviors of The Future”. INCOSE Insight, 16(2):3-5. Wiley.
  
ISO. 2008. "[[ISO/IEC 21827|Information technology -- Security techniques -- Systems Security Engineering -- Capability Maturity Model® (SSE-CMM®)]]," Second Edition. Geneva, Switzerland: International Organization for Standardization (ISO), [[ISO/IEC 21827]]:2008.
+
Young, W. and N. Leveson. 2013. “Systems Thinking for Safety and Security”. Proceedings of the 29th Annual Computer Security Applications Conference (ACSAC '13), pp. 31-35. Accessed September 15, 2023. Available at https://dspace.mit.edu/handle/1721.1/96965
  
ISO/IEC. 2013. "[[ISO/IEC 27001|Information technology — Security techniques — Information security management systems — Requirements]]," Second Edition. Geneva, Switzerland: International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC), [[ISO/IEC 27001]]:2013.
+
===Primary References===
 +
Anderson, R. 2020. Security Engineering: A Guide to Building Dependable Distributed Systems (3rd Edition). Wiley. Accessed September 15, 2023. Available at https://www.amazon.com/Security-Engineering-Building-Dependable-Distributed/dp/1119642787
  
Kissel, R., K. Stine, M. Scholl, H. Rossman, J. Fahlsing, J. Gulick. 2008. "[[NIST 800-64|Security Considerations in the System Development Life Cycle]]," Revision 2. Gaithersburg, MD. National Institute of Standard and Technology (NIST), [[NIST 800-64]] Revision 2:2008. Accessed October 24, 2014 at the Computer Security Resource Center [http://csrc.nist.gov/publications/nistpubs/800-64-Rev2/SP800-64-Revision2.pdf]
+
Neumann P. 2004. Principled Assuredly Trustworthy Composable Architectures, CDRL A001 Final Report, SRI International, Menlo Park, CA. Accessed September 15, 2023. Available at https://ieeexplore.ieee.org/document/1335465
  
Ross, R., J.C. Oren, M. McEvilley. 2014. "[[NIST SP 800-160|Systems Security Engineering: An Integrated Approach to Building Trustworthy Resilient Systems]]." Gaithersburg, MD. National Institute of Standard and Technology (NIST) Special Publication (SP), [[NIST SP 800-160]]:2014 (Initial Public Draft). Accessed October 24, 2014 at the Computer Security Resource Center [http://csrc.nist.gov/publications/drafts/800-160/sp800_160_draft.pdf http://csrc.nist.gov/publications/drafts/800-160/sp800_160_draft.pdf]
+
Ross, R., M. Winstead, M., and M. McEvilley. 2022. Engineering Trustworthy Secure Systems. NIST SP 800-160 Volume 1 Revision 1. Gaithersburg, MD: NIST. Accessed September 15, 2023. Available at https://csrc.nist.gov/pubs/sp/800/160/v1/r1/final
  
 
===Additional References===
 
===Additional References===
Allen, Julia; Barnum, Sean; Ellison, Robert;  McGraw, Gary; and Mead, Nancy. 2008. ''Software security engineering: a guide for project managers.''  New York, NY, USA: Addison Wesley Professional.
+
Avizienis A., J. Laprie, B. Randell. and C. Landwehr C. “Basic Concepts and Taxonomy of Dependable and Secure Computing”. IEEE Transactions on Dependable and Secure Computing 1(1):11-33. Accessed September 15, 2023. Available at http://www.csl.sri.com/users/neumann/chats4.pdf
  
ISO. 2005. ''Information technology -- Security techniques -- Code of practice for information security management.''  Geneva, Switzerland: International Organization for Standardization (ISO). ISO/IEC 27002:2005.
+
DoD. 1983. DoD Standard 5200.28-STD Trusted Computer System Evaluation Criteria. Accessed September 15, 2023. Available at http://www.csl.sri.com/users/neumann/chats4.pdf
  
Jurjens, J. 2005"Sound Methods and effective tools for model-based security engineering with UML."  ''Proceedings of the 2005 International Conference on Software Engineering.'' Munich, GE: ICSE, 15-21 May.
+
National Security Agency. 2002. Information Assurance Technical Framework (IATF), Release 3.1. Accessed September 15, 2023. Available at https://ntrl.ntis.gov/NTRL/dashboard/searchResults/titleDetail/ADA606355.xhtml
  
MITRE. 2012. "Systems Engineering for Mission Assurance."  In ''Systems Engineering Guide.''  Accessed 19 June 2012 at MITRE [http://www.mitre.org/work/systems_engineering/guide/enterprise_engineering/se_for_mission_assurance/ http://www.mitre.org/work/systems_engineering/guide/enterprise_engineering/se_for_mission_assurance/].
+
Schroeder M.D., D.D. Clark, and J.H. Saltzer. 1977. “The Multics Kernel Design Project”. Proceedings of Sixth ACM Symposium on Operating Systems Principles. Accessed September 15, 2023. Available at https://dl.acm.org/doi/pdf/10.1145/800214.806546
  
NIST SP 800-160. ''Systems Security Engineering - An Integrated Approach to Building Trustworthy Resilient Systems''. National Institute of Standards and Technology, U.S. Department of Commerce, Special Publication 800-160. Accessed October 24, 2014 at the Computer Security Resource Center [http://csrc.nist.gov/publications/drafts/800-160/sp800_160_draft.pdf http://csrc.nist.gov/publications/drafts/800-160/sp800_160_draft.pdf].
+
===Videos===
 +
Keith Willett previously authored a series of videos that explain aspects of systems security. They are not referenced in the above article, but remain quite relevant and interesting viewing.
  
 +
<gallery>
 +
File:Arch the Future of Security.mp4|thumb|Architecture for the Future of Society. By Keith Willett. Used with Permission. Uploaded 17 May 2021.
 +
File:SEBoK F Security Concepts Part 1.mp4|thumb|SEBoK F Security Concepts Part 1. By Keith Willett. Used with Permission. Uploaded 17 May 2021.
 +
File:SEBoK F Security Concepts Part 2.mp4|thumb|SEBoK F Security Concepts Part 2. By Keith Willett. Used with Permission. Uploaded 17 May 2021.
 +
File:SEBoK L Sys Think Security.mp4|thumb|SEBoK L Systems Thinking and Security. By Keith Willett. Used with Permission. Uploaded 17 May 2021.
 +
File:SEBoK N Sec Ops Workflow.mp4|thumb|SEBoK N Security Operations Workflow. By Keith Willett. Used with Permission. Uploaded 17 May 2021.
 +
File:SEBoK O1 LDSE Intro.mp4|thumb|SEBoK O1 LDSE Intro. By Keith Willett. Used with Permission. Uploaded 17 May 2021.
 +
File:SEBoK O2 ODSE Intro.mp4|thumb|SEBoK O2 OSDE Intro. By Keith Willett. Used with Permission. Uploaded 17 May 2021.
 +
 +
</gallery>
 
----
 
----
<center>[[Safety Engineering|< Previous Article]] | [[Systems Engineering and Specialty Engineering|Parent Article]] | [[Electromagnetic Interference/Electromagnetic Compatibility|Next Article >]]</center>
+
----
 
+
<center>[[System Safety|< Previous Article]] | [[Systems Engineering and Quality Attributes|Parent Article]] | [[Systems Engineering Implementation Examples|Next Article (Part 7) >]]</center>
  
 +
<center>'''SEBoK v. 2.10, released 06 May 2024'''</center>
  
[[Category: Part 6]][[Category:Topic]]
+
[[Category: Part 6]]
[[Category:Systems Engineering and Specialty Engineering]]
+
[[Category:Topic]]
{{DISQUS}}
+
[[Category:Systems Engineering and Quality Attributes]]

Latest revision as of 22:47, 2 May 2024


Lead Author: Mark Winstead, Contributing Authors: Terri Chan and Keith Willett


Security is freedom from those conditions that may lead to loss of assets (anything of value) with undesired consequences (Ross, Winstead, & McEvilley 2022). Systems Security specifically is concerned with systems delivering capability (an asset) with intended and only intended behaviors and outcomes (no unacceptable consequences such as loss of asset integrity) in contested operational environments (cyberspace or physical).

Restated, Systems Security is about engineering for intended and authorized system behavior and outcomes despite anticipated and unanticipated adversity, conditions that may cause loss (e.g., threats, attacks, hazards, disruptions, exposures). (McEvilley & Winstead 2022)

Note: This is a completely new article inserted in SEBoK 2.9, replacing the previous version by Richard Fairley, Alice Squires, and Keith Willett which originally appeared in SEBoK 1.0 and was periodically updated through SEBoK 2.8.

Overview

Secure systems ideally have three essential characteristics (Ross, Winstead, & McEvilley 2022):

  • Enable required system capability delivery despite intentional and unintentional forms of adversity.
  • Enforce constraints to ensure only the desired behaviors and outcomes associated with the required capability are realized while realizing the first characteristic.
  • Enforce constraints based on a set of rules defining the only authorized interactions and operations allowed to occur while satisfying the second characteristic.

Desired behaviors and outcomes are those reflecting the delivery of the desired system capabilities and features without experiencing loss with undesired consequences, such as loss of information privacy.

While these characteristics are to be achieved to the extent practicable, gaps will occur between the ideal and what can be dependably achieved. A system should be as secure as reasonably practical (ASARP) while meeting minimum stakeholder expectations for security and optimized among other performance objectives and constraints, informed by the principle of commensurate trustworthiness – trustworthy to a level commensurate with the most significant adverse effect resulting from loss or failure. (Hild, McEvilley, & Winstead 2021)
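The second and third characteristics amount to constraining a system to a defined set of authorized behaviors. As an illustrative sketch only (not drawn from the cited sources), a default-deny mediation layer shows the idea: capability is delivered only through interactions a rule set explicitly authorizes, and everything else is refused. The subjects, operations, and resources below are hypothetical.

```python
# Illustrative sketch: default-deny mediation of system interactions.
# The rule set is hypothetical; real systems derive rules from
# stakeholder requirements and security policy.

# Rules name the only authorized (subject, operation, resource) triples.
AUTHORIZED = {
    ("operator", "read", "telemetry"),
    ("operator", "command", "actuator"),
    ("maintainer", "update", "firmware"),
}

def mediate(subject: str, operation: str, resource: str) -> bool:
    """Permit an interaction only if a rule explicitly authorizes it
    (third characteristic); everything else is denied by default, so
    only intended behaviors can be realized (second characteristic)."""
    return (subject, operation, resource) in AUTHORIZED

def command_actuator(subject: str) -> str:
    # Capability delivery (first characteristic) flows through the mediator.
    if not mediate(subject, "command", "actuator"):
        return "denied"
    return "actuated"
```

The default-deny stance is the key design choice: an interaction absent from the rules is treated as unauthorized rather than silently permitted.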


Figure 1. System Security and Cost/Schedule/other Performance (Ross, Winstead, & McEvilley, 2022 - Public Domain)

To understand the optimization of security among other performance objectives, stakeholders need to be aligned and respectful of each other’s needs. As loss and loss effects or consequences are easily understood, a collaborative understanding of loss tolerances provides a means of alignment and forms a basis for metrics across the system lifecycle. (Dove, et al. 2023)

Why Security?

Security, one of 10 acknowledged areas of growing stakeholder expectations in INCOSE’s Systems Engineering Vision 2035, is recognized as needing to become a foundational perspective for system design. (INCOSE 2021) This growing expectation is motivated by increasing cyberspace-based attacks, as evidenced by reports such as Security Magazine’s finding that a computer with Internet access was attacked every 39 seconds in 2017, a rate that has since increased (Security 2017); by observations from the conflict in Ukraine, including attacks on civilian targets such as critical infrastructure (Carnegie Endowment for International Peace 2023); and by many well-publicized exploitations of vulnerable systems. (Agile IT 2023)

Scope

Adversity, conditions that can cause a loss of assets (e.g., threats, attacks, vulnerabilities, hazards, disruptions, and exposures), occurs throughout the system lifecycle. Such conditions are internal and external to the system, with internal conditions often the result of faults and defects (e.g., missing requirements, implementation errors). Consequently, system security’s scope matches systems engineering’s scope, and every systems engineering process and activity has security considerations. (Ross, Winstead, & McEvilley 2022)

“Unless security is [engineered] into a system from its inception, there is little chance that it can be made secure by retrofit”. (Anderson 1972) Consequently, security must be a foundational perspective from concept exploration.

Assets

An asset is an item of value to a stakeholder. Ross, Winstead, & McEvilley (2022) identified broad asset classes, summarized in Table 1.

Table 1. Common Asset Classes (Ross, Winstead, and McEvilley, 2022 - Public Domain)

Material Resources and Infrastructure
  • Description: Includes physical property (e.g., buildings, facilities, equipment); physical resources (e.g., water, fuel); and the basic physical and organizational structures and facilities (i.e., infrastructure) needed for an activity or the operation of an enterprise or society. An infrastructure is commonly comprised of assets, such as a nation’s national airspace infrastructure, which includes the nation’s airports.
  • Loss Protection Criteria: Material resources are protected from loss if they are not stolen, damaged, or destroyed or are able to function or be used as intended, as needed, and when needed. Infrastructure is protected from loss if it meets performance expectations while delivering only the authorized and intended capability and producing only the authorized and intended outcomes.

System Capability
  • Description: The set of capabilities and services provided by a system.
  • Loss Protection Criteria: System capability is protected from loss when it meets its performance expectations while delivering only the authorized and intended capability and producing only the authorized and intended outcomes.

Human Resources
  • Description: Personnel who are part of the system and personnel affected by the system.
  • Loss Protection Criteria: Human resources are protected from loss if they do not suffer injury, illness, or death.

Intellectual Property
  • Description: Trade secrets, recipes, technology, and other items that constitute an advantage over competitors.
  • Loss Protection Criteria: Intellectual property is protected from loss if it is not stolen, corrupted, destroyed, copied, substituted in an unauthorized manner, or reverse-engineered in an unauthorized manner.

Data and Information
  • Description: Includes all types of data and information and all encodings and representations of data and information.
  • Loss Protection Criteria: Data and information are protected from loss due to unauthorized alteration, exfiltration, infiltration, and destruction.

Derivative Non-Tangible
  • Description: Includes image, reputation, and trust. Such assets are affected by the success or failure to protect other assets.
  • Loss Protection Criteria: Non-tangible assets are protected from loss by ensuring the adequate protection of assets in the other classes.
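These classes can serve as the starting vocabulary for an asset inventory that loss analysis then operates over. A minimal sketch follows; the class names mirror Table 1, while the sample assets and stakeholder names are hypothetical.

```python
from dataclasses import dataclass
from enum import Enum

class AssetClass(Enum):
    # Mirrors the asset classes of Table 1.
    MATERIAL = "Material Resources and Infrastructure"
    CAPABILITY = "System Capability"
    HUMAN = "Human Resources"
    IP = "Intellectual Property"
    DATA = "Data and Information"
    DERIVATIVE = "Derivative Non-Tangible"

@dataclass
class Asset:
    name: str
    asset_class: AssetClass
    stakeholder: str

def by_class(inventory, asset_class):
    """Group assets so a class's loss protection criteria can be
    applied uniformly to every asset in that class."""
    return [a for a in inventory if a.asset_class is asset_class]

# Hypothetical inventory entries for illustration.
inventory = [
    Asset("flight data", AssetClass.DATA, "operator"),
    Asset("navigation service", AssetClass.CAPABILITY, "operator"),
    Asset("reputation", AssetClass.DERIVATIVE, "enterprise"),
]
```

Grouping by class lets each set of loss protection criteria from Table 1 be checked against the concrete assets it governs.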

Systems Thinking

Systems thinking is the practice of thinking holistically about systems: relating system behaviors and concepts to principles based on patterns. It is flexible, conceptual, and strategic in nature, and hence generally adopted in systems architecture work. Systems thinking focuses not only on immediate cause and effect but also examines the dynamics of a system to identify secondary effects on behaviors and choices. (Goodman 2018)

Security, especially cybersecurity, has suffered from being treated as a tactics problem, focusing on threat defense and incident response. Security has also become a forensics exercise, steeped in root cause analysis that adds to the known threat defense. Systems thinking recognizes that the greater objective is assuring a system’s ability to produce the capability (functions and services) that users depend on the system for, contributing to business and mission needs, and to produce that capability in an acceptable manner (i.e., with no harm to stakeholder assets). The need is to focus less on defending against adversarial action beyond the control of the systems engineer and more on assuring that the system performs and protects stakeholder assets, controlling the system against the effects of loss, including avoiding vulnerabilities that lead to loss when practical. (Young & Leveson 2013) Systems thinking brings security back into the conception, design, and implementation of system capabilities.

Consequently, the need is to focus on a system’s trustworthiness rather than singularly on risk. (Dove, et al. 2021) Quality evidence, such as that generated by verification and validation activities, feeds assurance arguments that merit trustworthiness, providing a basis for trust by stakeholders. Without such assurance, security functionality is a form of veneer security (Saydjari 2018), providing an unmerited sense of trustworthiness.
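The distinction between merited trustworthiness and veneer security can be illustrated with a small sketch (not from the cited sources): an assurance argument links security claims to verification and validation evidence, and claims without evidence are flagged. The claims and evidence items below are hypothetical.

```python
# Illustrative sketch: an assurance argument maps each security claim
# to the evidence items that back it. Claims and evidence are made up.
claims = {
    "access is mediated on every request": ["design review", "penetration test report"],
    "cryptographic module is validated": [],   # claimed, but no evidence offered
}

def unsupported(claims: dict) -> list:
    """Claims lacking verification/validation evidence confer only a
    veneer of security: an unmerited sense of trustworthiness."""
    return [claim for claim, evidence in claims.items() if not evidence]
```

Running `unsupported(claims)` surfaces the claim with no backing evidence, which is exactly the gap an assurance case is meant to expose.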

Loss

Loss, the experience of having an asset taken away or destroyed or the failure to keep or to continue to have an asset in a desired state or form (Ross, Winstead, & McEvilley 2022), provides language understood by all stakeholders (Dove, et al. 2023). Stakeholder concern is typically with the effects of loss caused by adversity, not the adversity itself, and stakeholder priorities are driven by consequences (e.g., impact to mission). Their needs and requirements can thus be expressed in terms of loss, loss scenarios, loss tolerance, and acceptable loss.

Addressing loss must consider that loss results from combinations of adverse events or conditions that cause or lead to unacceptable ramifications, consequences, or impacts. Due to uncertainty (including uncertainty about adversity), guaranteeing that a loss will not occur is not possible. The focus must be on controlling loss effects, including cascading or ripple events (e.g., an effect that causes additional losses to occur). (Ross, Winstead, & McEvilley 2022)

Loss control objectives frame addressing loss. Loss can be addressed by use of historically-informed practices (known good things to do) and assessing specific loss scenarios for opportunities to address conditions and the potential loss itself.

Table 2. Loss Control Objectives (Ross, Winstead, and McEvilley, 2022 - Public Domain)

Prevent the Loss from Occurring
  • Loss is avoided. Despite the presence of adversity:
    • The system provides only the intended behavior and produces only the intended outcomes.
    • Desired properties of the system and assets are retained.
  • Achieved by combinations of:
    • Preventing or removing the event or events that cause the loss
    • Preventing or removing the condition or conditions that allow the loss to occur
    • Not suffering an adverse effect despite the events or conditions (e.g., fault tolerance)

Limit the Extent of Loss
  • Loss can or has occurred. The extent of the loss effect is to be limited.
  • Achieved by combinations of:
    • Limiting dispersion (e.g., propagation, ripple, or cascading effects)
    • Limiting duration (e.g., milliseconds, minutes, days)
    • Limiting capacity (e.g., diminished service or capability)
    • Limiting volume (e.g., bits or bytes of data/information)
  • Decisions to limit loss extent may require prioritizing what constitutes acceptable loss across a set of losses (i.e., limiting the loss of one asset requires accepting loss of some other asset).
  • Loss recovery and loss delay are two means to limit loss:
    • Loss Recovery: Action is taken by the system or enabled by the system to recover (or allow the recovery of) some or all of its ability to function and to recover assets used by the system. Asset restoration can limit the dispersion, duration, capacity, or volume of the loss.
    • Loss Delay: The loss event is avoided until the adverse effect is lessened or until a delay enables a more robust response or quicker recovery.
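The "limit the extent of loss" objective, particularly limiting dispersion, can be made concrete with a small sketch: once a component exhibits loss behavior, it is isolated from its dependents so the effect does not cascade further. The component topology below is hypothetical, chosen only to illustrate the idea.

```python
# Illustrative sketch: limiting loss dispersion in a dependency graph.
# The components and dependencies are hypothetical.
DEPENDENTS = {
    "power": ["navigation", "comms"],
    "navigation": ["autopilot"],
    "comms": [],
    "autopilot": [],
}

def affected_by(failed: str) -> set:
    """All components a loss could cascade to if nothing intervenes."""
    reached, frontier = set(), [failed]
    while frontier:
        node = frontier.pop()
        for dep in DEPENDENTS.get(node, []):
            if dep not in reached:
                reached.add(dep)
                frontier.append(dep)
    return reached

def isolate(failed: str) -> set:
    """Limit dispersion: cut the failed component's outgoing links, so
    only its direct dependents (now running degraded) are affected."""
    return set(DEPENDENTS.get(failed, []))
```

Here isolating "power" confines the loss effect to its direct dependents, whereas doing nothing would let it propagate transitively to "autopilot" as well.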

Loss Scenarios

Loss scenarios describe the events and conditions that lead to unacceptable outcomes. These scenarios do not necessarily lead back to “root causes” in each case, but must capture the internal system (e.g., system states, internal faults) and the external environmental conditions (e.g., loss of power, presence of malicious insider threat) that may lead to a loss, including unauthorized system use (e.g., loss of control). See Figure 2.

Figure 2. Hierarchy of Unacceptable Conditions (OUSD(R&E) 2022 - Public Domain)

Loss scenarios may be analyzed to inform requirements definition and derivation and analyze design alternatives, as well as inform tailoring of historically-informed practice usage.
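One way to make this analysis concrete is to capture each loss scenario as structured data and compare its consequence severity against stakeholder loss tolerance; scenarios that exceed tolerance then drive requirements and design alternatives. This is an illustrative sketch only; the fields, severity scale, and sample scenarios are hypothetical, not a prescribed schema.

```python
from dataclasses import dataclass

@dataclass
class LossScenario:
    asset: str
    conditions: list   # internal system and environmental conditions
    outcome: str       # the unacceptable outcome if the scenario is realized
    severity: int      # consequence severity, e.g., 1 (low) to 5 (high)

def exceeds_tolerance(scenario: LossScenario, tolerance: int) -> bool:
    """Flag scenarios whose consequences exceed what stakeholders will
    tolerate; these are the ones that must inform requirements."""
    return scenario.severity > tolerance

scenarios = [
    LossScenario("system capability", ["loss of power", "no backup source"],
                 "capability unavailable during mission", severity=4),
    LossScenario("data", ["verbose error messages"],
                 "minor internal detail disclosed", severity=1),
]
to_address = [s for s in scenarios if exceeds_tolerance(s, tolerance=2)]
```

Only the first scenario exceeds the assumed tolerance, so it alone would be carried forward into requirements derivation and design-alternative analysis.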

Enterprise Relationships

Most systems are part of a larger system of systems or enterprises. Adversity often comes through or from these connected systems.

Systems engineering must balance a system’s self-protection capability with opportunities for mutual collaborative protection among trustworthy systems within an enterprise (Dove, et al. 2021). Enterprises may have dedicated systems for protection, often within network operations and security centers (NOSCs). (Knerler, Parker, & Zimmerman 2022) Systems may also collaborate by sharing situational awareness, with one system alerting others of suspicious activity that may indicate malicious actions the others may experience.
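Such mutual collaborative protection can be sketched as systems sharing indicators: when one system observes suspicious activity, it defends itself and alerts its peers, which pre-emptively raise their own posture. The peer topology and posture names below are hypothetical, intended only to show the pattern.

```python
# Illustrative sketch: mutual protection via shared situational awareness.
class System:
    def __init__(self, name: str):
        self.name = name
        self.posture = "normal"
        self.peers = []

    def observe_suspicious(self, indicator: str):
        """On detecting suspicious activity, defend locally and alert
        peers so they can prepare for the same adversity."""
        self.posture = "defending"
        for peer in self.peers:
            peer.receive_alert(indicator)

    def receive_alert(self, indicator: str):
        # A peer's warning raises this system's posture pre-emptively.
        if self.posture == "normal":
            self.posture = "heightened"

a, b, c = System("A"), System("B"), System("C")
a.peers = [b, c]
a.observe_suspicious("repeated failed logins")
```

After the alert, the detecting system is defending while its peers have moved to a heightened posture, illustrating one system's detection benefiting others before they are attacked.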

Discipline Relationships

Systems Security has commonality and synergy with many other disciplines, such as safety; quality management; reliability, availability, and maintainability (RAM); survivability; operational risk management; and resilience. Overlapping concerns exist in the assets, losses, and adversities considered; in requirements; and in similarities and opportunities across various engineering processes and analyses, e.g., System-Theoretic Process Analysis, or STPA (Young & Leveson 2013). Some considerations for pursuing these commonalities and synergies were explored in Brtis (2020).

Personnel Considerations

Systems security engineering is a sub-discipline of systems engineering, defined in Ross, Winstead, & McEvilley (2022) as a transdisciplinary and integrative approach to enable the successful secure realization, use, and retirement of engineered systems using systems, security, and other principles and concepts, as well as scientific, technological, and management methods. Systems security engineers are part of the systems engineering teams.

But as security is demonstrated in a system’s behaviors and outcomes, a systems engineering responsibility, the systems engineer is ultimately responsible for a system’s security. (Thomas 2013) However, reflecting the maxim “Security is everyone’s job”, all engineering disciplines have security responsibilities (Dove, et al. 2021), not just the systems engineer and the various specialists in areas such as supply chain assurance, hardware assurance, software assurance, cybersecurity, and physical security.

Security in the Future of Systems Engineering

The INCOSE SE Vision 2035 sets an aim that security “will be as foundational a perspective in systems design as system performance and safety are today”. (INCOSE 2021) To that end and other aims for security within the Vision, the INCOSE Systems Security Engineering Working Group has set objectives and roadmap concepts (Dove 2022).

The roadmap concepts, each paired with the general need it fills, are:

  • Security Proficiency in the Systems Engineering Team: System security and its evolution effectively enabled by systems engineering activity.
  • Education and Competency Development: Education at all levels focused on security of cyber-physical systems.
  • Stakeholder Alignment: Common security vision and knowledge among all stakeholders.
  • Loss-Driven Engineering: Standard metrics and abstractions relevant to all system lifecycle phases.
  • Architectural Agility: Readily composable and re-composable security with feature variants.
  • Operational Agility: Ability for cyber-relevant response to attack and potential threat; resilience in the security system.
  • Capability-Based Security Engineering: Top-down approach to security starting with desired results/value.
  • Security as a Functional Requirement: Systems engineering responsibility for the security of systems.
  • Modeled Trustworthiness: Reinvigorate formal modeling of system trust as a core aspect of systems security engineering; address issues of scale with model-based tools and automation.
  • Security Orchestration: Tightly coupled, coordinated system defense in cyber-relevant time.
  • Collaborative Mutual Protection: Augmented detection and mitigation of known and unknown attacks, with components collaborating for mutual protection.

Challenges

The challenging problems for systems security are numerous. Some of these result from neglecting to address security early in the system life cycle, a neglect recognized by the Systems Engineering Vision’s call for security to become a foundational perspective; as Carl Landwehr wrote, “This whole economic boom in [security] seems largely to be a consequence of poor engineering” (Landwehr 2015).

For example, system of systems engineering faces security challenges in part from not knowing or trusting the component systems, which may not have been engineered with security in mind, or at least were not accompanied by documented evidence that informs trustworthiness. Another system of systems challenge arises as formerly isolated cyber-physical systems (CPSs) are increasingly connected to form larger systems of systems, negating assumptions that affect security, if security was considered at all. Additionally, legacy CPSs were not built with the need to update software in mind (i.e., to use sustainable security (Rosser 2023)), which creates a challenge in upgrading software to reflect revised assumptions and security models for the CPS in question.

Another challenge comes with the advancing use of Artificial Intelligence (AI). Any new technology presents security challenges until its usage matures, but AI introduces complexity and decreases predictability (INCOSE 2021). Managing complexity and uncertainty is necessary for security (Ross, Winstead, & McEvilley 2022), and AI’s increase of both compounds the problem space for systems engineering security.

References

Works Cited

Agile IT. n.d. The Top 10 Biggest Cyberattacks of 2022. Accessed September 15, 2023. Available at https://www.agileit.com/news/biggest-cyberattacks-2022/.

Anderson, J. 1972. Computer Security Technology Planning Study, Technical Report ESD-TR-73-51. October 1, 1972. Hanscom AFB: Air Force Electronic Systems Division. Accessed September 15, 2023. Available at https://apps.dtic.mil/sti/citations/AD0758206.

Brtis, J. (ed). 2020. Loss-Driven Systems Engineering. INCOSE Insight, 23(4):7-8. Wiley. Accessed September 15, 2023. Available at https://incose.onlinelibrary.wiley.com/doi/abs/10.1002/inst.12312.

Carnegie Endowment for International Peace. n.d. Cyber Conflict in the Russia-Ukraine War. Accessed September 15, 2023. Available at https://carnegieendowment.org/programs/technology/cyberconflictintherussiaukrainewar/.

Dove, R. 2022. “Setting Current Context for Security in the Future of Systems Engineering”. INCOSE Insight, 25(2):8-10.

Dove, R., K. Willett, T. McDermott, H. Dunlap, D.P. MacNamara, and C. Ocker. 2021. “Security in the Future of Systems Engineering (FuSE), a Roadmap of Foundational Concepts”. Proceedings of the 31st INCOSE International Symposium, July 17-22. Virtual only.

Dove, R., M. Winstead, H. Dunlap, M. Hause, A. Scalco, A. Williams, and B. Wilson. 2023. “Democratizing Systems Security”. Proceedings of the 33rd INCOSE International Symposium. July 15-20. Honolulu, Hawaii: Wiley.

Goodman, M. 2018. Systems Thinking: What, Why, When, Where, and How? Accessed September 15, 2023. Available at https://thesystemsthinker.com/systems-thinking-what-why-when-where-and-how/

Hild, D., M. McEvilley, and M. Winstead. 2021. Principles for Trustworthy Design of Cyber-Physical Systems. MITRE Technical Report, MTR210263.

INCOSE. 2021. INCOSE Systems Engineering Vision 2035. Accessed September 15, 2023. Available at https://www.incose.org/about-systems-engineering/se-vision-2035

Knerler, K., I. Parker, and C. Zimmerman. 2022. 11 Strategies of a World-Class Cybersecurity Operations Center (2nd ed.). McLean, VA: MITRE. Accessed September 15, 2023. Available at https://www.mitre.org/sites/default/files/2022-04/11-strategies-of-a-world-class-cybersecurity-operations-center.pdf

Landwehr, C. 2015. “We Need a Building Code for Building Code”. Communications of the ACM, 58(2):24-26. Accessed September 15, 2023. Available at https://doi.org/10.1145/2700341.

McEvilley, M., and M. Winstead. 2022. “Functionally Interpreting Security”. INCOSE Insight, 25(2):15-17. Wiley. Accessed September 15, 2023. Available at https://incose.onlinelibrary.wiley.com/doi/abs/10.1002/inst.12380

OUSD(R&E). 2022. Security and Resilience Interpretation 1.0. Prepared by MITRE. Accessed September 2023. Available at https://www.crws-bok.org/asset/83a0d3528e6e27272a52b7e2c3758facc16773fd

Ross, R., M. Winstead, and M. McEvilley. 2022. Engineering Trustworthy Secure Systems. NIST SP 800-160 Volume 1 Revision 1. Gaithersburg, MD: NIST. Accessed September 15, 2023. Available at https://csrc.nist.gov/pubs/sp/800/160/v1/r1/final.

Rosser, L.A. 2023. “Applying Agility for Sustainable Security”. INCOSE Insight, 26(2):45-52. Wiley. Accessed September 15, 2023. Available at https://incose.onlinelibrary.wiley.com/doi/abs/10.1002/inst.12445

Saydjari, O.S. (ed.). 2018. Engineering Trustworthy Systems: Get Cybersecurity Design Right the First Time. New York: McGraw Hill. Accessed September 15, 2023. Available at https://www.amazon.com/Engineering-Trustworthy-Systems-Cybersecurity-Design/dp/1260118177

Security. 2017. Hackers Attack Every 39 Seconds. Accessed September 15, 2023. Available at https://www.securitymagazine.com/articles/87787-hackers-attack-every-39-seconds

Thomas, J.A. 2013. “Critical System Behaviors of The Future”. INCOSE Insight, 16(2):3-5. Wiley.

Young, W. and N. Leveson. 2013. “Systems Thinking for Safety and Security”. Proceedings of the 29th Annual Computer Security Applications Conference (ACSAC '13), pp. 31-35. Accessed September 15, 2023. Available at https://dspace.mit.edu/handle/1721.1/96965

Primary References

Anderson, R. 2020. Security Engineering: A Guide to Building Dependable Distributed Systems (3rd Edition). Wiley. Accessed September 15, 2023. Available at https://www.amazon.com/Security-Engineering-Building-Dependable-Distributed/dp/1119642787

Neumann P. 2004. Principled Assuredly Trustworthy Composable Architectures, CDRL A001 Final Report, SRI International, Menlo Park, CA. Accessed September 15, 2023. Available at https://ieeexplore.ieee.org/document/1335465

Ross, R., M. Winstead, and M. McEvilley. 2022. Engineering Trustworthy Secure Systems. NIST SP 800-160 Volume 1 Revision 1. Gaithersburg, MD: NIST. Accessed September 15, 2023. Available at https://csrc.nist.gov/pubs/sp/800/160/v1/r1/final.

Additional References

Avizienis, A., J. Laprie, B. Randell, and C. Landwehr. 2004. “Basic Concepts and Taxonomy of Dependable and Secure Computing”. IEEE Transactions on Dependable and Secure Computing, 1(1):11-33. Accessed September 15, 2023. Available at http://www.csl.sri.com/users/neumann/chats4.pdf.

DoD. 1983. DoD Standard 5200.28-STD, Trusted Computer System Evaluation Criteria. Accessed September 15, 2023.

National Security Agency. 2002. Information Assurance Technical Framework (IATF), Release 3.1. Accessed September 15, 2023. Available at https://ntrl.ntis.gov/NTRL/dashboard/searchResults/titleDetail/ADA606355.xhtml

Schroeder M.D., D.D. Clark, and J.H. Saltzer. 1977. “The Multics Kernel Design Project”. Proceedings of Sixth ACM Symposium on Operating Systems Principles. Accessed September 15, 2023. Available at https://dl.acm.org/doi/pdf/10.1145/800214.806546

Videos

Keith Willett previously authored a series of videos that explain aspects of systems security. Although they are not referenced in the article above, they remain relevant and worthwhile viewing.



< Previous Article | Parent Article | Next Article (Part 7) >
SEBoK v. 2.10, released 06 May 2024