Culture

A critical issue in the effective organizational deployment of systems engineering is establishing and managing cultures, values, and behaviors (Fasser and Brettner 2002). A high degree of churn or imposed change can disrupt established cultures that are key to effective systems engineering. A safety or process culture can be damaged by too high a pace of change (see the Nimrod Crash Report, Haddon-Cave 2009) or by perceived management imperatives (e.g. Challenger, discussed below). A highly competitive, adversarial, or "blame" culture can impede the free flow of information and synergistic working. These factors particularly affect the multi-national, multi-business, multi-discipline collaborative projects that are increasingly prevalent in systems engineering. Effective handling of cultural issues is a major factor in the success, or otherwise, of systems engineering endeavours.

Cultural Perspective

The focus of this topic is culture within systems engineering. As defined in the Columbia Accident Investigation Report (NASA 2003, p.101), culture is “the basic values, norms, beliefs, and practices that characterize the functioning of a particular institution.”

Cultural change to improve systems engineering efficiency and effectiveness is possible through a systems approach, as described in Part 2 and by Lawson (2010), and by learning to think and act in terms of systems, organizations, and their enterprises.

A general culture-related perspective is characterized by Senge et al. (1994), who identify systems thinking as the "fifth discipline" that promotes a learning organization culture. The four disciplines supported by systems thinking are as follows:

  • Personal mastery - a person continually clarifies and deepens personal vision, focuses energy upon it, develops patience in seeking it, and in this way increasingly views reality in an objective manner.
  • Mental models - people are trained to appreciate that mental models do indeed occupy their minds and shape their actions.
  • Shared vision - shared operating values, a common sense of purpose, and a basic level of mutuality.
  • Team learning - alignment of people’s thoughts so that a common direction creates a feeling that the whole team achieves more than the sum of its members.

Paradigms

Many authorities, for example Jackson (2010), have found that cultural shortfalls can be summarized in a set of negative paradigms that are injurious to a system. Although there are many such paradigms, the following two are typical:

  • The Risk Denial Paradigm - This paradigm holds that many people are reluctant to identify true risks. It is apparent in the Challenger and Columbia cases described below.
  • The Titanic Effect - This paradigm holds that the individual believes the system is safe even when it is not. The name comes, of course, from the famous ocean liner catastrophe of 1912.

Approaches

Jackson and Erlick (Jackson 2010, 91-119) have found little evidence that a culture can be successfully changed. However, they do suggest an approach founded on the principles of organizational psychology, namely the Community of Practice (Jackson 2010, 110-112). The pros and cons of various other approaches are also discussed. These include training, the charismatic executive, Socratic teaching, teams, coaching, independent reviews, cost and schedule margins, standard processes, rewards and incentives, and management selection. Shields (2006) provides a comprehensive list of these and similar approaches.

Many official reports, such as those for the Columbia accident (NASA 2003) and the Triangle Fire (NYFIC 1912), call for an improvement in leadership to address cultural issues. However, this approach is usually accompanied by a more objective approach of auditing, such as the Independent Technical Authority. This authority has the following features:

  • Independent means that the authority is completely divorced from the program organization. It may be from another government organization with an objective view of the program in question. In short, the authority cannot report to that program's manager.
  • Technical means that the authority addresses only technical, as opposed to managerial, issues.
  • Authority means that the board has the power to take any action needed to avoid failure, including blocking a launch decision.

In addition to these safety-related cultural issues, many management and leadership experts have identified means of leading organizational cultural change. One example is the use of creative thinking promoted by, among others, Gordon (1961) in his work on the productive use of analogical reasoning, known as synectics. Another example is Kotter (1995), who identifies the steps needed to transform an organization.

Other cultural factors

Cultures evolve over generations in response to the community's environment (physical, social, religious). However, as the environment changes, cultural beliefs, values, and customs change more slowly. There are many definitions of culture, but the one cited by the Columbia Accident Investigation Board, above, is representative (NASA 2003).

It is now generally considered that there are three main sources of cultural influence:

  • National (or ethnic) culture,
  • Professional culture, and
  • Organizational culture.

These sources of culture, their effects on aviation safety, and their suggested implications for safety cultures in other domains, such as medicine, are described by Helmreich and Merritt (2000) and in other writings by these authors.

National (or ethnic) culture

National culture is a product of factors such as heritage, history, religion, language, climate, population density, availability of resources and politics. National culture is picked up at a formative age, and, once acquired, is difficult to change. National culture affects attitudes and behavior, and has a significant effect on interactions with others, for example:

  • Communication styles (direct and specific vs. indirect and non-specific),
  • Leadership styles (hierarchical vs. consultative),
  • Superior – inferior relationships (accepting vs. questioning decisions),
  • Attitudes to rules and regulations,
  • Attitudes to uncertainty, and
  • Displaying emotional reactions.

Professional culture

Medical doctors, airline pilots, the military, teachers and many others possess particular professional cultures that overlay their ethnic or national cultures. Professional culture is usually manifested in its members by a sense of community and by the bonds of a common identity (Helmreich and Merritt 2001). Features associated with professional culture typically include some or all of the following:

  • Selectivity, competition and training in order to gain entry to the profession
  • Member-specific expertise
  • A shared professional jargon
  • Prestige and status with badges or defining uniform
  • Binding norms for behaviour and common ethical values
  • Professional and gender stereotyping
  • Self-regulation
  • Institutional and individual resistance to change

Professional culture overlays a person’s national culture. If there are conflicts between the two cultures, in particular in threat situations, the professional culture may dominate, or the earlier-acquired national culture may rise to the fore. Elements of professional culture that are of particular importance (e.g. to safety or survivability) need to be inculcated by extensive training programs, and reinforced at appropriate intervals.

Organizational culture

Organizational culture arises out of the history of an organization, including its leadership, products, and services. Although there will be a common layer across the organization, significant differences will emerge in organizations with a high level of multinational integration due to differing national cultures. These will appear as differing leadership styles, manager-subordinate relationships, etc. Organizations have a formal hierarchy of responsibility and authority; therefore, organizational culture is more amenable to carefully planned change than are either professional or national cultures. Organizational culture channels the effects of the other two cultures into standard working practices; therefore, changes to it that are sympathetic to national culture (rather than to the culture of a distant group head office) can bring significant performance benefits.

Organizational culture is also unique; what works in one organization is unlikely to work in another. Some of the factors thought to influence or engender organizational culture include:

  • Strong corporate identity,
  • Effective leadership,
  • High morale and trust,
  • Cohesive team working and cooperation,
  • Job security,
  • Development & training,
  • Confidence, e.g. in quality and safety practices, management communication and feedback, and
  • High degree of empowerment.

Culture and Safety

Reason (1997, 191-220) identifies four components of a culture with a focus on safety:

  • A reporting culture encourages individuals to report errors and near misses, including their own.
  • A just culture is “an atmosphere of trust in which people are encouraged, even rewarded, for providing essential safety related information.” (Reason 1997)
  • A flexible culture abandons the traditional hierarchical reporting structure in favor of more direct means of team-to-team communications.
  • A learning culture is the willingness to draw the right conclusions from information and to implement reforms when necessary.

In addition, the US Nuclear Regulatory Commission (2011) has issued its final policy statement on safety culture. This statement focuses mainly on leadership and individual authority.

Historical Safety-Related Cases

Culture has been cited as a key factor in the success or failure of many systems. In all of the following cases, culture was cited in official reports or by authoritative experts as a factor in the success or failure of the systems involved.

Table 1. Examples of Culture Discussion in Safety-Critical Incidents. (Table Developed for BKCASE)

  • Apollo - According to Feynman (1988), Apollo was a successful program because it had a culture of “common interest.” Then, over the next 20 years, there was a “loss of common interest.” This loss is the “cause of the deterioration in cooperation, which . . . produced a calamity.”
  • Challenger - Vaughan (1997) captured what she called the “normalization of deviance.” She states that rather than taking risks seriously, NASA simply ignored them by calling them normal. She summarizes this idea by saying that “flying with acceptable risks was normative in NASA culture.”
  • Columbia - The Columbia Accident Investigation Report (NASA 2003, p. 102) echoed Feynman’s view and declared that NASA had a “broken safety culture.” The board concluded that NASA had become a culture in which bureaucratic procedures took precedence over technical excellence.
  • Texas City (2005) - On March 23, 2005, a process accident occurred at the BP refinery in Texas City, Texas, resulting in 15 deaths and more than 170 injuries. The Independent Safety Review Panel (2007) found that a corporate safety culture existed that “has not provided effective process safety leadership and has not adequately established process safety as a core value across all its five U.S. refineries.” The report recommended “an independent auditing function.”
  • The Triangle Fire - On March 25, 1911, a fire broke out at the Triangle Shirtwaist Factory in the borough of Manhattan in which 145 people, mostly women, died (NYFIC 1912). The New York State Commission castigated the property owners for their lack of understanding of the “human factors” in the case. The report called for the establishment of standards to address this deficiency.
  • Nimrod - On 2 September 2006, a British Nimrod military aircraft caught fire and crashed in Afghanistan, killing its entire crew of 14. The Haddon-Cave report (Haddon-Cave 2009) focused on the cultural aspects of the accident. This report specifically references the Columbia Accident Investigation Report and its conclusions. A system of detailed audits is recommended.

Implications for Systems Engineering

As systems engineering increasingly seeks to work across national, ethnic, and organizational boundaries, systems engineers need to be aware of cultural issues and how these affect expectations and behavior in collaborative working environments. Different cultures and personal styles make best use of information presented in different ways and in different orders (levels of abstraction, big picture first or detail first, principles first or practical examples first). Sensitivity to cultural issues will make a difference to the success of systems engineering endeavours (see, for example, Siemieniuch and Sinclair 2006).

References

Citations

Fasser, Y. and D. Brettner. 2002. Management for Quality in High-Technology Enterprises. New York, NY, USA: Wiley.

Feynman, R. 1988. "An Outsider's Inside View of the Challenger Inquiry". Physics Today, February, 26-37.

Gordon, W. J. J. 1961. Synectics: The Development of Creative Capacity. New York, NY, USA: Harper and Row.

Haddon-Cave, C. 2009. An Independent Review into the Broader Issues Surrounding the Loss of the RAF Nimrod MR2 Aircraft XV230 in Afghanistan in 2006. London, UK: The House of Commons.

Helmreich, R.L. and A.C. Merritt. 2000. "Safety and Error Management: The Role of Crew Resource Management". In B.J. Hayward and A.R. Lowe (eds.), Aviation Resource Management, 107-119. Aldershot, UK: Ashgate. (UTHFRP Pub250)

Independent Safety Review Panel. 2007. The Report of the BP U.S. Refineries Independent Safety Panel. Edited by J. A. Baker. Texas City, TX, USA.

Jackson, S. 2010. Architecting Resilient Systems: Accident Avoidance and Survival and Recovery from Disruptions. Edited by A. P. Sage, Wiley Series in Systems Engineering and Management. Hoboken, NJ, USA: John Wiley & Sons.

Kotter, J. P. 1995. “Leading Change: Why Transformation Efforts Fail”. Harvard Business Review (March-April): 59-67.

Lawson, H. 2010. A Journey Through the Systems Landscape. London, UK: College Publications, Kings College.

NASA. 2003. Columbia Accident Investigation Report. Washington, DC, USA: National Aeronautics and Space Administration (NASA), August 2003.

Nuclear Regulatory Commission (NRC). 2011. NRC Issues Final Safety Culture Policy Statement [PDF file]. NRC News, 14 June 2011 [cited 20 June 2011]. Available from http://www.nrc.gov/reading-rm/doc-collections/news/2011/11-104.pdf.

NYFIC. 1912. Preliminary Report of the New York Factory Investigating Commission. Edited by R. F. Wagner. New York, NY, USA: New York Factory Investigating Commission.

Reason, J. 1997. Managing the Risks of Organisational Accidents. Aldershot, UK: Ashgate Publishing Limited.

Senge, P. M., A. Kleiner, C. Roberts, R. B. Ross, and B. J. Smith. 1994. The Fifth Discipline Fieldbook: Strategies and Tools for Building a Learning Organization. New York, NY, USA: Currency Doubleday.

Shields, J.L. 2006. "Organization and Culture Change". In Enterprise Transformation, edited by W. B. Rouse. Hoboken, NJ, USA: John Wiley & Sons.

Siemieniuch, C.E. and M.A. Sinclair. 2006. "Impact of Cultural Attributes on Decision Structures and Interfaces". Proceedings of the 11th ICCRTS Coalition Command and Control in the Networked Era, Cambridge, MA, USA, 1-20.

Vaughan, D. 1997. The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago, IL, USA: University of Chicago Press. Original edition, 1996.

Primary References

Fasser, Y. and D. Brettner. 2002. Management for Quality in High-Technology Enterprises. New York, NY, USA: Wiley.

Helmreich, R.L. and A.C. Merritt. 2000. "Safety and Error Management: The Role of Crew Resource Management". In B.J. Hayward and A.R. Lowe (eds.), Aviation Resource Management, 107-119. Aldershot, UK: Ashgate. (UTHFRP Pub250)

Hofstede, G. 1984. Culture's Consequences: International Differences in Work-Related Values. London, UK: Sage.

Jackson, S. 2010. Architecting Resilient Systems: Accident Avoidance and Survival and Recovery from Disruptions. Edited by A. P. Sage, Wiley Series in Systems Engineering and Management. Hoboken, NJ, USA: John Wiley & Sons.

NASA. 2003. Columbia Accident Investigation Report. Washington, DC, USA: National Aeronautics and Space Administration (NASA), August 2003.

Reason, J. 1997. Managing the Risks of Organisational Accidents. Aldershot, UK: Ashgate Publishing Limited.

Senge, P. M., A. Kleiner, C. Roberts, R. B. Ross, and B. J. Smith. 1994. The Fifth Discipline Fieldbook: Strategies and Tools for Building a Learning Organization. New York, NY, USA: Currency Doubleday.

Additional References

Feynman, R. 1988. "An Outsider's Inside View of the Challenger Inquiry". Physics Today, February, 26-37.

Gordon, W. J. J. 1961. Synectics: The Development of Creative Capacity. New York, NY, USA: Harper and Row.

Haddon-Cave, C. 2009. An Independent Review into the Broader Issues Surrounding the Loss of the RAF Nimrod MR2 Aircraft XV230 in Afghanistan in 2006. London, UK: The House of Commons.

Hofstede, G. 2001. Culture's Consequences: Comparing Values, Behaviors, Institutions and Organizations Across Nations, 2nd ed. Sage Publications (hardback 2001; paperback 2003).

Hofstede, G. 2010. Cultures and Organizations: Software of the Mind: Intercultural Cooperation and Its Importance for Survival, 3rd ed. New York, NY, USA: McGraw-Hill.

Independent Safety Review Panel. 2007. The Report of the BP U.S. Refineries Independent Safety Panel. Edited by J. A. Baker. Texas City, TX, USA.

Kotter, J. P. 1995. Leading Change: Why Transformation Efforts Fail. Harvard Business Review (March-April):59-67.

Lawson, H. 2010. A Journey Through the Systems Landscape. London, UK: College Publications, Kings College.

NYFIC. 1912. Preliminary Report of the New York Factory Investigating Commission. Edited by R. F. Wagner. New York, NY, USA: New York Factory Investigating Commission.

Nuclear Regulatory Commission (NRC). 2011. NRC Issues Final Safety Culture Policy Statement [PDF file]. NRC News, 14 June 2011 [cited 20 June 2011]. Available from http://www.nrc.gov/reading-rm/doc-collections/news/2011/11-104.pdf.

Shields, J.L. 2006. "Organization and Culture Change". In Enterprise Transformation, edited by W. B. Rouse. Hoboken, NJ, USA: John Wiley & Sons.

Siemieniuch, C.E. and M.A. Sinclair. 2006. "Impact of Cultural Attributes on Decision Structures and Interfaces". Proceedings of the 11th ICCRTS Coalition Command and Control in the Networked Era, Cambridge, MA, USA, 1-20.

Vaughan, D. 1997. The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago, IL, USA: University of Chicago Press. Original edition, 1996.


