Culture

Lead Authors: Scott Jackson, Hillary Sillitto, John Snoderly; Contributing Authors: Richard Turner, Art Pyster, Richard Beasley


Establishing and managing cultures, values, and behaviors is a critical aspect of systems engineering, especially in the context of deploying SE within an organization (Fasser and Brettner 2002). The Columbia Accident Investigation Report (NASA 2003, 101) defines culture as “the basic values, norms, beliefs, and practices that characterize the functioning of a particular institution.”

Stable safety and process cultures are key to effective SE, and can be damaged by an overly rapid pace of change, a high degree of churn (see the Nimrod Crash Report, Haddon-Cave 2009), or by change that engineers perceive as arbitrarily imposed by management (see Challenger, discussed below). On the other hand, a highly competitive, adversarial or “blame” culture can impede the free flow of information and disrupt synergies in the workplace.

In the multi-national, multi-business, multi-discipline collaborative projects becoming increasingly prevalent in SE, these factors take on greater importance.

Effective handling of cultural issues is a major factor in the success or failure of SE endeavors.

Systems Thinking and the Culture of the Learning Organization

Improving SE efficiency and effectiveness can be the goal of cultural change. This kind of culture change encourages people to learn to think and act in terms of systems, organizations, and their enterprises, and to take a systems approach as described in Overview of Systems Approaches in Part 2 and by Lawson (2010). See the knowledge area Systems Thinking.

Attaining a learning organization culture can be another goal of cultural change; once a learning organization exists, cultural change in general becomes easier to accomplish.

A learning organization aims to absorb, diffuse, generate, and exploit knowledge (Sprenger and Have 1996). Organizations need to manage formal information and facilitate the growth and exploitation of tacit knowledge. They should learn from experience and create a form of corporate memory – including process, problem domain and solution space knowledge, and information about existing products and services. Fasser and Brettner (2002, 122-124) suggest that shared mental models are a key aspect of corporate knowledge and culture.

A learning organization culture is enabled by disciplines such as

  • personal mastery where a person continually clarifies and deepens personal vision, focuses energy upon it, develops patience in seeking it so as to view reality in an increasingly objective way;
  • mental models where people appreciate that mental models do indeed occupy their minds and shape their actions;
  • shared vision where operating values and sense of purpose are shared to establish a basic level of mutuality; and
  • team learning where people’s thoughts align, creating a feeling that the team as a whole achieves something greater than the sum of what is achieved by its individual members.

Systems thinking supports these four disciplines, and in so doing becomes the fifth discipline and plays a critical role in promoting the learning organization (Senge et al. 1994).

Cultural Shortfalls and How to Change Them

Cultural shortfalls that are injurious to a system are described as negative paradigms by Jackson (2010) and others. For example, a cultural reluctance to identify true risks is the hallmark of the Risk Denial paradigm as seen in the Challenger and Columbia cases. When individuals believe a system is safe that in fact is not, that is the Titanic Effect paradigm, which is of course named for the ocean liner catastrophe of 1912.

Approaches to Change

Jackson and Erlick (Jackson 2010, 91-119) find little evidence that deliberate efforts to change a culture succeed. However, they do suggest the Community of Practice (Jackson 2010, 110-112), an approach founded on the principles of organizational psychology, and discuss the pros and cons of other approaches to culture change, including training, coaching, Socratic teaching, use of teams, independent reviews, standard processes, rewards and incentives, use of cost and schedule margins, reliance on a charismatic executive, and management selection. Shields (2006) provides a similarly comprehensive review.

The official reports on the Columbia accident (NASA 2003) and the Triangle fire (NYFIC 1912), among many others, call for cultural issues to be addressed through improved leadership, usually augmented by the more objective approach of auditing. One form of auditing is the Independent Technical Authority, which

  • is separate from the program organization;
  • addresses only technical issues, not managerial ones; and
  • has the right to take action to avoid failure, including by vetoing launch decisions.

An Independent Technical Authority cannot report to the program manager of the program in question, and it may be formulated within an entirely separate business or enterprise that can view the program objectively. The point of these stipulations is to ensure that the Independent Technical Authority is indeed independent.

Management and leadership experts have identified ways to lead cultural change in organizations, apart from specifically safety-related cultural change. For example, Gordon (1961), in his work on the use of analogical reasoning called synectics, is one of several who emphasize creative thinking. Kotter (1995) advocates a series of steps to transform an organization.

How Culture Manifests in Individuals and Groups

As a community’s physical, social, and religious environment changes over the generations, cultural beliefs, values, and customs evolve in response, albeit at a slower pace.

Helmreich and Merritt describe the effects of cultural factors in the context of aviation safety, and suggest implications for safety cultures in other domains such as medicine. See Helmreich and Merritt (2000) and other writings by the same authors.

We can describe the cultural orientation of an individual in terms of

  • national and/or ethnic culture;
  • professional culture; and
  • organizational culture.

Some particulars of these aspects of culture are sketched below.

National and/or Ethnic Culture

A product of factors such as heritage, history, religion, language, climate, population density, availability of resources, and politics, national culture is acquired in one's formative years and is difficult to change. National culture affects attitudes, behavior, and interactions with others.

National culture may help determine how a person handles or reacts to

  • rules and regulations;
  • uncertainty; and
  • display of emotion, including one’s own.

National culture may also play a role in whether a person

  • communicates in a direct and specific style, or the opposite;
  • provides leadership in a hierarchical manner, or a consultative one; and
  • accepts decisions handed down in superior–inferior relationships, or questions them.

Professional Culture

Professional culture acts as an overlay to ethnic or national culture, and usually manifests in a sense of community and in bonding based on a common identity (Helmreich and Merritt 2000). Well-known examples of professional cultures include those of medical doctors, airline pilots, teachers, and the military.

Elements of professional culture may include

  • a shared professional jargon
  • binding norms for behavior
  • common ethical values
  • self-regulation
  • barriers to entry like selectivity, competition and training
  • institutional and/or individual resistance to change
  • prestige and status, sometimes expressed in badges or uniforms
  • stereotyped notions about members of the profession, in general and/or based on gender

Particularly important elements of professional culture (for example, those that affect safety or survivability) need to be inculcated by extensive training and reinforced at appropriate intervals.

Organizational Culture

An organization's culture builds up cumulatively, determined by factors like its leadership, products and services, relationships with competitors, and role in society.

Organizational cultures are not standardized; what works in one organization seldom works in another. Even so, strength in the following elements normally engenders a strong organizational culture:

  • corporate identity;
  • leadership;
  • morale and trust;
  • teamwork and cooperation;
  • job security;
  • professional development and training;
  • empowerment of individuals; and
  • confidence, for example in quality and safety practices, or in management communication and feedback.

When the culture of the people in an organization is considered as a whole, organizational culture acts as a common layer shared by all. In spite of this, differing national cultures can produce differences in leadership styles, manager-subordinate relationships, and so on, especially in organizations with a high degree of multinational integration.

Because organizations have formal hierarchies of responsibility and authority, organizational culture is more amenable to carefully-planned change than are either professional or national cultures. If changes are made in a manner that is sympathetic to local national culture (as opposed to that of a distant group head office, for example), they can bring significant performance benefits. This is because organizational culture channels the effects of national and professional cultures into standard working practices.

There are many definitions of culture in the literature. The Columbia Accident Investigation Board (NASA 2003) provides a useful one for understanding culture and engineering.

Culture and Safety

Reason (1997, 191-220) describes a culture which focuses on safety as having four components:

  1. A reporting culture which encourages individuals to report errors and near misses, including their own.
  2. A just culture which provides “an atmosphere of trust in which people are encouraged, even rewarded, for providing essential safety-related information.”
  3. A flexible culture which abandons the traditional hierarchical reporting structure in favor of more direct team-to-team communications.
  4. A learning culture which is willing to draw the right conclusions from safety-related information and to implement reforms when necessary.

Weick and Sutcliffe (2001, 3) introduce the term high reliability organization (HRO). HROs have “fewer than their fair share of accidents” despite operating “under trying conditions” in domains subject to catastrophic events. Examples include “power grid dispatching centers, air traffic control systems, nuclear aircraft carriers, nuclear power generation plants, hospital emergency departments, and hostage negotiation teams.” There are five hallmarks of HROs (Weick and Sutcliffe 2001, 10):

  1. Preoccupation with Failure—HROs eschew complacency, learn from near misses, and do not ignore errors, large or small.
  2. Reluctance to Simplify Interpretations—HROs simplify less and see more. They “encourage skepticism towards received wisdom.”
  3. Sensitivity to Operations—HROs strive to detect “latent failures,” defined by James Reason (1997) as systemic deficiencies that amount to accidents waiting to happen. They have well-developed situational awareness and make continuous adjustments to keep errors from accumulating and enlarging.
  4. Commitment to Resilience—HROs keep errors small and improvise “workarounds that keep the system functioning.” They have a deep understanding of technology and constantly consider worst case scenarios in order to make corrections.
  5. Deference to Expertise—HROs “push decision making down.” Decisions are made “on the front line.” They avoid rigid hierarchies and go directly to the person with the expertise.

The US Nuclear Regulatory Commission (2011) focuses mainly on leadership and individual authority in its policy statement on safety culture.

Historical Catastrophes and Safety Culture

The cases described in the table below are some of the many in which official reports or authoritative experts cited culture as a factor in the catastrophic failure of the systems involved.

Table 1. Examples of Culture Discussion in Safety Critical Incidents. (SEBoK Original)

Apollo: According to Feynman (1988), Apollo was a successful program because of its culture of “common interest.” The “loss of common interest” over the next 20 years then caused “the deterioration in cooperation, which . . . produced a calamity.”

Challenger: Vaughan (1997) states that rather than taking risks seriously, NASA simply ignored them by calling them normal—what she terms “normalization of deviance,” whose result was that “flying with acceptable risks was normative in NASA culture.”

Columbia: The Columbia Accident Investigation Report (NASA 2003, 102) echoed Feynman’s view and declared that NASA had a “broken safety culture.” The board concluded that NASA had become a culture in which bureaucratic procedures took precedence over technical excellence.

Texas City - 2005: On March 23, 2005, a process accident at the BP refinery in Texas City, Texas, USA resulted in 15 deaths and more than 170 injuries. The Independent Safety Review Panel (2007) found that a corporate safety culture existed that “has not provided effective process safety leadership and has not adequately established process safety as a core value across all its five U.S. refineries.” The report recommended “an independent auditing function.”

The Triangle Fire: On March 25, 1911, a fire at the Triangle shirtwaist factory in New York City killed 145 people, mostly women (NYFIC 1912). The New York Factory Investigating Commission castigated the property owners for their lack of understanding of the “human factors” in the case and called for the establishment of standards to address this deficiency.

Nimrod: On September 2, 2006, a Nimrod British military aircraft caught fire and crashed, killing its entire crew of 14. The Haddon-Cave report (Haddon-Cave 2009) found that Royal Air Force culture had come to value staying within budget over airworthiness. Referencing the conclusions of the Columbia Accident Investigation Report, the Haddon-Cave report recommends creation of a system of detailed audits.

Relationship to Ethics

A business's culture has the potential to reinforce or undermine ethical behavior. For example, a culture that encourages open and transparent decision making makes it harder for unethical behavior to go undetected. Differences in culture around the world also produce different perspectives on what constitutes ethical behavior. This is often reflected in the difficulties that international companies face when doing business globally, sometimes leading to scandals, because behavior that is considered ethical in one country may be considered unethical in another. See Ethical Behavior for more information.

Implications for Systems Engineering

As SE increasingly seeks to work across national, ethnic, and organizational boundaries, systems engineers need to be aware of cultural issues and how they affect expectations and behavior in collaborative working environments. They need to present information in an order and a manner suited to the culture and personal style of the audience. This entails choices like whether to start with principles or practical examples, abstractions or concrete use cases, the big picture or the detailed view.

Sensitivity to cultural issues is a success factor in SE endeavors (Siemieniuch and Sinclair 2006).

References

Works Cited

Fasser, Y. and D. Brettner. 2002. Management for Quality in High-Technology Enterprises. New York, NY, USA: Wiley.

Feynman, R. 1988. "An Outsider's Inside View of the Challenger Inquiry." Physics Today. 41(2) (February 1988): 26-27.

Gordon, W.J.J. 1961. Synectics: The Development of Creative Capacity. New York, NY, USA: Harper and Row.

Haddon-Cave, C. 2009. An Independent Review into the Broader Issues Surrounding the Loss of the RAF Nimrod MR2 Aircraft XV230 in Afghanistan in 2006. London, UK: The House of Commons.

Helmreich, R.L., and A.C. Merritt. 2000. "Safety and Error Management: The Role of Crew Resource Management." In Aviation Resource Management, edited by B.J. Hayward and A.R. Lowe. Aldershot, UK: Ashgate. (UTHFRP Pub250). p. 107-119.

Independent Safety Review Panel. 2007. The Report of the BP U.S. Refineries Independent Safety Panel. Edited by J.A. Baker. Texas City, TX, USA.

Jackson, S. 2010. Architecting Resilient Systems: Accident Avoidance and Survival and Recovery from Disruptions. Hoboken, NJ, USA: John Wiley & Sons.

Kotter, J.P. 1995. "Leading Change: Why Transformation Efforts Fail." Harvard Business Review. (March-April): 59-67.

Lawson, H. 2010. A Journey Through the Systems Landscape. London, UK: College Publications, Kings College.

NASA. 2003. Columbia Accident Investigation Report. Washington, DC, USA: National Aeronautics and Space Administration (NASA). August 2003.

Nuclear Regulatory Commission. 2011. "NRC Issues Final Safety Culture Policy Statement." NRC News (14 June 2011). Available at: http://pbadupws.nrc.gov/docs/ML1116/ML11166A058.pdf.

NYFIC. 1912. Preliminary Report of the New York Factory Investigating Commission. R. F. Wagner (ed). New York, NY, USA: New York Factory Investigating Commission (NYFIC).

Reason, J. 1997. Managing the Risks of Organisational Accidents. Aldershot, UK: Ashgate Publishing Limited.

Senge, P.M., A. Kleiner, C. Roberts, R.B. Ross, and B.J. Smith. 1994. The Fifth Discipline Fieldbook: Strategies and Tools for Building a Learning Organization. New York, NY, USA: Currency Doubleday.

Shields, J.L. 2006. "Organization and Culture Change." In Enterprise Transformation, W.B. Rouse (ed.). Hoboken, NJ, USA: John Wiley & Sons.

Siemieniuch, C.E. and M.A. Sinclair. 2006. "Impact of Cultural Attributes on Decision Structures and Interfaces." Paper presented at the 11th ICCRTS Coalition Command and Control in the Networked Era. Cambridge, MA, USA. p. 1-20.

Sprenger, C. and S.T. Have. 1996. "4 Competencies of a Learning Organization." (Original title: "Kennismanagement als motor van de lerende organisatie.") Holland Management Review (Sept–Oct): 73–89.

Vaughan, D. 1997. The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago, IL, USA: University of Chicago Press.

Weick, K.E. and K.M. Sutcliffe. 2001. Managing the Unexpected: Assuring High Performance in an Age of Complexity. San Francisco, CA, USA: Jossey-Bass (Jossey-Bass acquired by Hoboken, NJ, USA: Wiley Periodicals, Inc.).

Primary References

Fasser, Y. and D. Brettner. 2002. Management for Quality in High-Technology Enterprises. New York, NY, USA: Wiley.

Helmreich, R.L., and A.C. Merritt. 2000. "Safety and Error Management: The Role of Crew Resource Management." In Aviation Resource Management, edited by B.J. Hayward and A.R. Lowe. Aldershot, UK: Ashgate. (UTHFRP Pub250). p. 107-119.

Hofstede, G. 1984. Culture’s Consequences: International Differences in Work-Related Values. London, UK: Sage Publications.

Jackson, S. 2010. Architecting Resilient Systems: Accident Avoidance and Survival and Recovery from Disruptions. Hoboken, NJ, USA: John Wiley & Sons.

NASA. 2003. Columbia Accident Investigation Report. Washington, DC, USA: National Aeronautics and Space Administration (NASA). August 2003.

Reason, J. 1997. Managing the Risks of Organisational Accidents. Aldershot, UK: Ashgate Publishing Limited.

Senge, P.M., A. Kleiner, C. Roberts, R.B. Ross, and B.J. Smith. 1994. The Fifth Discipline Fieldbook: Strategies and Tools for Building a Learning Organization. New York, NY, USA: Currency Doubleday.

Additional References

Hofstede, G. 2001. Culture's Consequences: Comparing Values, Behaviors, Institutions and Organizations Across Nations, Second Edition. Thousand Oaks, CA, USA: Sage Publications.

Hofstede, G. 2010. Cultures and Organizations: Software for the Mind, Third Edition. New York, NY, USA: McGraw-Hill.


