A Critical Analysis of the Challenger and Columbia Accidents

Introduction

The sky is not the limit for the safety of space organizations. As the failures of Challenger and Columbia exemplify, NASA's operational requirements long outweighed its safety priorities. Although the organization believed its safety framework was satisfactory, retrospective analysis demonstrated that this framework did not translate into an effective safety culture. This essay will critically analyze the systemic factors that led to these two disasters and will then demonstrate how a modern Integrated Safety Management System (ISMS), if effectively implemented, could have prevented these losses. The analysis will first address the technical chain of each accident sequence, then proceed to the organizational environment that allowed it to happen. By comparing this organizational snapshot to that of an ideal generative safety culture, significant deficiencies will be identified which, had they been remedied, could have averted these tragedies.

Challenger

The Challenger accident occurred on January 28, 1986. A failure of the aft field joint of one of the shuttle's solid rocket boosters (SRBs) ultimately resulted in an explosion that destroyed both the booster assembly and the shuttle, killing all seven crew members. The launch vehicle had been exposed to three abnormally cold nights, which reduced the resilience of the joint's O-rings and left them brittle. In flight, this allowed hot gases to escape the joint, breach the external fuel tank, and cause the fatal explosion.
While this technical failure was the immediate cause of the accident, it was simply the end result of a long chain of organizational safety failures that allowed "a progressive descent into poor judgment, supported by a culture of high-risk technology." Vaughan argues that NASA's cultural environment was plagued by the normalization of deviance and evidence of managerial negligence. For example, SRB manufacturer Thiokol expressed concern the day before launch, specifically regarding the previously observed "significant blow-by" caused by low temperatures affecting the SRB O-rings. Although temperatures predicted for the morning launch were approximately twenty degrees lower than those at which problems had already been detected, Thiokol's position was challenged during two conference calls and ultimately withdrawn, on the grounds that there were no launch commit criteria for SRB joint temperature. This example was cited in the presidential report, and investigators found that no safety representative or reliability and quality assurance engineer was present on those calls. Not only was a clear indication of safety risk ignored; it was ignored without any perception of breaching a safety requirement or organizational policy. Of the five communication and organizational failures identified by Mr. Aldrich, director of the Space Shuttle program, four related directly to the safety program, including lax reporting requirements, inadequate trend analysis, and misunderstanding of critical safety requirements. In the case of the O-ring failure it was even hypothesized, before the disaster, that because nothing had happened previously the risk was already mitigated. Such glaring gaps in NASA's safety management point to a root cause that was systemic rather than technical, and to the conclusion that the Challenger accident was in fact "an organizational failure of tragic proportions."
Columbia

In a high-performance, high-risk organization, errors of a previously identified nature should not be expected to recur. In the case of Columbia, this held only barely in the technical realm and fell catastrophically flat in terms of safety management. Separated by only seventeen years, many of the lessons learned from Challenger appeared to have been forgotten. Just as with Challenger, the cause of this disaster was documented on numerous occasions before and during the mission, yet once again the risk was minimized to achieve operational results. Upon reentry, Columbia suffered a critical failure of left-wing integrity, resulting in rapid loss of control, wing failure, and the breakup of the orbiter. It was determined that a large piece of foam from the shuttle's external tank had broken off and struck the leading edge of the left wing during launch. Despite the foam's light weight, the speeds involved meant that the force of impact was sufficient to breach one of the reinforced carbon-carbon (RCC) panels covering this edge. Superheated air was therefore able to enter the wing during reentry and ultimately destroy the orbiter. The parallels between the two disasters were so striking that board member and astronaut Sally Ride declared, "I think I hear an echo here." At the time of this accident, NASA's broken safety culture exhibited chronic overconfidence, groupthink, and tolerance of anomalous events. Together with the intimidation of concerned engineers, this created and maintained a "silent safety culture." Despite cultural reforms promised after the harsh criticism that followed the Challenger accident, the organization strongly resisted change and fell back into its traditional cultural norms. It is this erosion of cultural improvement that likely allowed NASA to repeat the mistakes of the past.
Increasing budget cuts and diminishing political returns in the run-up to the Columbia accident served to amplify performance pressure, with the organization even adopting a "Faster, Better, Cheaper" philosophy. In an organization still buoyed by the successes of the Apollo missions and, for the most part, of the shuttle program itself, it is understandable how past success could foster this pseudo-safety culture. It remains troubling, however, that the danger of foam strikes could be dismissed so many times without further examination until catastrophe occurred.

Preventive Impact of an Effective ISMS

The introduction of an Integrated Safety Management System (ISMS) within NASA prior to either of these failures would likely have minimized the losses, if not prevented the accidents altogether. An effective ISMS seamlessly blends safety-focused work practices, beliefs, attitudes, and procedures into all aspects of daily operations. A common criticism is that "NASA's culture is one of dedication to the mission." This is not an inherently negative bias, provided the organization understands that safety is a key enabler of mission-critical capability. NASA's safety framework, while excellent on paper, was ignored in practice: NASA as an organization had simply stopped following and enforcing its own rules. The four key components of an SMS are safety policy and objectives, safety risk management, safety assurance, and safety promotion. NASA failed holistically in all of these aspects despite having extensive mechanisms that met many of the component requirements: a safety legislative framework, safety accountability and responsibility, accident investigations, enforcement policies, data collection and analysis, and safety performance oversight.
What NASA lacked at the most fundamental level was an effective safety culture, whose pillars are the flexible, learning, just, and informed subcultures. These subcultures are intrinsically linked, together creating a feedback loop that continually improves safety culture. As the Challenger and Columbia accidents demonstrate, poor safety culture was a significant contributor in both cases. The ability to adapt decision-making processes in unusual situations is the key feature of a flexible culture. NASA failed at adaptability, demonstrating rigidity in launch and reentry plans through the low-level dismissal of Thiokol's O-ring concerns and through rigid data requirements that prevented information on Columbia's debris hazard from being transmitted up the chain. Indicators of a learning culture are open communication, management support, employee empowerment, and collaboration. NASA demonstrated its lack of a learning culture through its failure to learn from Challenger's mistakes and through its ineffective reforms. A just culture discourages blaming individuals for honest mistakes, thereby encouraging employees to speak up about them. With respect to just culture, NASA arguably showed too much tolerance for mistakes and risky decisions; combined with an anti-reporting subculture that deterred engineers from raising concerns, this further degraded the organization's safety culture. As exemplified by the presumptive decision-making surrounding the foam strike on Columbia's RCC panels, NASA had an informed culture that was content to operate under uncertainty. This may have proved useful in the uncertain days of the Apollo missions, but it was a historical holdover with no place in the more contemporary shuttle program. Thus, despite a tolerable safety framework, the absence of these supporting subcultures crippled NASA's safety organization.
Conclusion

Had an effective ISMS been introduced into NASA prior to these accidents, it is likely that the chains of events leading to the loss of Challenger and Columbia would have been broken long before either vehicle reached its launch date. Under an ideal ISMS, concerns regarding both the O-rings and foam strikes would have been raised, acknowledged, investigated, and resolved years before either incident. Chien argues that, although difficult to accept, "once the foam hit Columbia's wing the crew was doomed," and that there was simply not enough information available in time to act. Only by embedding safety into every aspect of its operations could NASA have broken these chains before they ever began.