Culture

From SEBoK

Introduction

Although the word culture originally implied “the training and refinement of mind, taste, and manners” (OED 1973, p. 471), it has come to have the broader meaning defined in the Columbia Accident Investigation Report (NASA 2003, p. 101) of “the basic values, norms, beliefs, and practices that characterize the functioning of a particular institution.” It is the latter definition that is the focus of this topic. Since culture implies a collective set of beliefs and behaviors, it can pertain to organizations or enterprises of all sizes and to the individuals within them.

By taking a systems approach, as described in Part 2 and by (Lawson 2010), and learning to think and act in terms of systems, organizations and their enterprises can make a cultural change. This approach implies an environment that is open and in which communication, creative thinking, and mutual respect are present. In this article, cultural aspects are examined in general and, more specifically, in relationship to the critical issue of system safety.

Culture Overview

(Reason 1997, pp. 191-220) identifies four components of a culture with a focus on safety:

  • A reporting culture encourages individuals to report errors and near misses, including their own.
  • A just culture is “an atmosphere of trust in which people are encouraged, even rewarded, for providing essential safety related information.”
  • A flexible culture abandons the traditional hierarchical reporting structure in favor of more direct means of team-to-team communications.
  • A learning culture is the willingness to draw the right conclusions from information and to implement reforms when necessary.

From a more general cultural perspective, (Senge et al. 1994) identify systems thinking as the fifth discipline, one that promotes a learning organization culture. The four disciplines that are supported by systems thinking are as follows:

  • Personal mastery is likened to a journey whereby a person continually clarifies and deepens a personal vision, focuses energy upon it, and develops patience in seeking it, thereby increasingly viewing reality in an objective manner.
  • Mental models aims to train people to appreciate that mental models do indeed occupy their minds and shape their actions.
  • Shared vision refers to shared operating values, a common sense of purpose, and a basic level of mutuality.
  • Team learning to achieve alignment of people’s thoughts in which a common direction creates a feeling that the whole team achieves more than the sum of its team members.

Historical Safety Related Cases

Culture has been cited as a key factor in the success or failure of many systems. Unfortunately, the failures get more attention. Specific cases include, for example, the Texas City BP accident in 2005, the Apollo program, the Space Shuttle program, the 2006 Nimrod accident, and the New York Triangle fire of 1911. In all of the following cases culture was cited in official reports or by authoritative experts as a factor in the success or failure of the systems involved.

Apollo

Most people agree that the Apollo program was successful, resulting in the first lunar landing in 1969. Among the many systems that have been deployed and operated, the Apollo program is one in which culture was cited as a positive factor, notably by the physicist Richard Feynman (Feynman 1988). Feynman, who served on the board of inquiry for the Challenger accident, drew the following conclusions comparing the two programs: Apollo was a successful program because “they all [management and engineers] were trying to do the same thing.” It was a culture of “common interest.” Then, over the next 20 years, there was a “loss of common interest.” This loss is the “cause of the deterioration in cooperation, which . . . produced a calamity.” The Columbia Accident Investigation Report (NASA 2003, p. 102) echoed Feynman’s view of the Apollo culture when it said that the NASA culture during the Apollo era “held as its highest value attention to detail in order to lower the chances of failure.”

Challenger

Probably the most exhaustive account of the deterioration of culture on the Challenger program is (Vaughan 1997). What Vaughan captured was what eventually came to be known as “risk denial,” although she called it “normalization of deviance.” She states that rather than taking risks seriously, NASA simply ignored them by treating them as normal. She summarizes this idea by saying that “flying with acceptable risks was normative in NASA culture.”

Columbia

In the case of the Columbia accident the indictment of NASA culture came from a more authoritative source, the Columbia Accident Investigation Board (NASA 2003, p. 184). In short, the board declared that NASA had a “broken safety culture.” How the board recommended fixing the culture will be discussed below. The board concluded that NASA had become a culture in which bureaucratic procedures took precedence over technical excellence.

Texas City - 2005

On March 23, 2005, a process accident occurred at the BP refinery in Texas City, resulting in 15 deaths and more than 170 injuries. A panel was convened with former US statesman James Baker as chair. The report (Independent Safety Review Panel 2007) found that a corporate safety culture existed “that may have tolerated serious and long-standing deviations.” The panel found that BP “has not provided effective process safety leadership and has not adequately established process safety as a core value across all its five U.S. refineries.” Among other recommendations, the report called for “an independent auditing function” to determine whether BP was in compliance with safety standards.

The Triangle Fire

On March 25, 1911, a fire broke out in the borough of Manhattan in which 145 people, mostly women, died (NYFIC 1912). The New York Factory Investigating Commission was convened in response and castigated the property owners for their lack of understanding of the “human factors” in the case. The commission concluded that “the human factor is practically neglected in our industrial system.” The report called for the establishment of standards to address this deficiency.

Nimrod

On 2 September 2006, a British Nimrod military aircraft caught fire and crashed, killing its entire crew of 14. The Haddon-Cave report (Haddon-Cave 2009) focused on the cultural aspect of the accident, detailing a significant case against the British military establishment and its deficient safety culture. The report specifically references the Columbia Accident Investigation Report and its conclusions, and it recommends a system of detailed audits.

Paradigms

Many authorities, for example, (Jackson 2010), have found that cultural shortfalls can be summarized in a set of negative paradigms that are injurious to a system. Although there are many paradigms, the following two are typical:

The Risk Denial Paradigm

This paradigm holds that many people are reluctant to identify true risks. It is apparent in the Challenger and Columbia cases described above.

The Titanic Effect

This paradigm holds that individuals believe a system is safe even when it is not. The name of this paradigm comes, of course, from the famous ocean liner catastrophe of 1912.

Approaches

The obvious question at this point is: how can a culture be changed? Jackson and Erlick (Jackson 2010, pp. 91-119) have studied many solutions and have found a lack of evidence that attempts to change a culture succeed. However, Jackson and Erlick do suggest an approach founded on the principles of organizational psychology, namely, the Community of Practice (Jackson 2010, pp. 110-112). The pros and cons of various other approaches are also discussed, namely, training, the charismatic executive, Socratic teaching, teams, coaching, independent reviews, cost and schedule margins, standard processes, rewards and incentives, and management selection. (Shields 2006) provides a comprehensive list of these and similar approaches.

Many official reports call for an improvement in leadership to address the cultural issues. Both the Columbia Accident Investigation Report and the Triangle fire report use this approach. However, this approach is usually accompanied by a more objective approach of auditing, such as the Independent Technical Authority approach discussed below. The most generally agreed-to approach is an independent review as recommended by the Columbia Accident Investigation Board. The board recommends an Independent Technical Authority. This authority has the following features:

  • Independent means that the authority is completely divorced from the program organization. It may be from another government organization with an objective view of the program in question. In short, the authority cannot report to the program manager of the program in question.
  • Technical means that the authority will address only technical as opposed to managerial issues.
  • Authority means that the board has the authority to take any action necessary to avoid failure, including preventing launch decisions.

In addition to the specific safety-related cultural issues, many management and leadership experts have identified various means of leading cultural and organizational change. One example is the use of creative thinking promoted by, among others, (Gordon 1961) in his work on the productive use of analogical reasoning, called Synectics. Another example is the work of (Kotter 1995) in identifying the steps needed to transform an organization.

Primary References

Jackson, Scott. 2010. Architecting Resilient Systems: Accident Avoidance and Survival and Recovery from Disruptions. Edited by A. P. Sage, Wiley Series in Systems Engineering and Management. Hoboken, NJ, USA: John Wiley & Sons.

NASA. 2003. Columbia Accident Investigation Report. Washington, DC: National Aeronautics and Space Administration (NASA).

Reason, James. 1997. Managing the Risks of Organisational Accidents. Aldershot, UK: Ashgate Publishing Limited.

Senge, P. M., A. Kleiner, C. Roberts, R. B. Ross, and B. J. Smith. 1994. The Fifth Discipline Fieldbook: Strategies and Tools for Building a Learning Organization. New York: Currency Doubleday.

Other References

Feynman, Richard. 1988. An Outsider's Inside View of the Challenger Inquiry. Physics Today, February, 26-37.

Gordon, W. J. J. 1961. Synectics: The Development of Creative Capacity. New York: Harper and Row.

Haddon-Cave, Charles. 2009. An Independent Review into the Broader Issues Surrounding the Loss of the RAF Nimrod MR2 Aircraft XV230 in Afghanistan in 2006. London: The House of Commons.

Independent Safety Review Panel. 2007. The Report of the BP U.S. Refineries Independent Safety Panel. Edited by J. A. Baker. Texas City, TX.

Kotter, J. P. 1995. Leading Change: Why Transformation Efforts Fail. Harvard Business Review (March-April):59-67.

Lawson, Harold. 2010. A Journey Through the Systems Landscape. London: College Publications, Kings College.

NYFIC. 1912. Preliminary Report of the New York Factory Investigating Commission. Edited by R. F. Wagner. New York: New York Factory Investigating Commission.

OED. 1973. In The Shorter Oxford English Dictionary on Historical Principles, edited by C. T. Onions. Oxford: Oxford University Press. Original edition, 1933.

Shields, Joyce L. 2006. Organization and Culture Change. In Enterprise Transformation, edited by W. B. Rouse. Hoboken, NJ: John Wiley & Son.

Vaughan, Diane. 1997. The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago: University of Chicago Press. Original edition, 1996.

Glossary

Paradigm

Culture

Risk

Synectics


