The Boeing 737 MAX: Lessons for Engineering Ethics

  • Original Research/Scholarship
  • Published: 10 July 2020
  • Volume 26, pages 2957–2974 (2020)

  • Joseph Herkert 1 ,
  • Jason Borenstein 2 &
  • Keith Miller 3  

The crash of two 737 MAX passenger aircraft in late 2018 and early 2019, and subsequent grounding of the entire fleet of 737 MAX jets, turned a global spotlight on Boeing’s practices and culture. Explanations for the crashes include: design flaws within the MAX’s new flight control software system designed to prevent stalls; internal pressure to keep pace with Boeing’s chief competitor, Airbus; Boeing’s lack of transparency about the new software; and the lack of adequate monitoring of Boeing by the FAA, especially during the certification of the MAX and following the first crash. While these and other factors have been the subject of numerous government reports and investigative journalism articles, little to date has been written on the ethical significance of the accidents, in particular the ethical responsibilities of the engineers at Boeing and the FAA involved in designing and certifying the MAX. Lessons learned from this case include the need to strengthen the voice of engineers within large organizations. There is also the need for greater involvement of professional engineering societies in ethics-related activities and for broader focus on moral courage in engineering ethics education.

Introduction

In October 2018 and March 2019, Boeing 737 MAX passenger jets crashed minutes after takeoff; these two accidents claimed nearly 350 lives. After the second incident, all 737 MAX planes were grounded worldwide. The 737 MAX was an updated version of the 737 workhorse that first began flying in the 1960s. The crashes were precipitated by a failure of an Angle of Attack (AOA) sensor and the subsequent activation of new flight control software, the Maneuvering Characteristics Augmentation System (MCAS). The MCAS software was intended to compensate for changes in the size and placement of the engines on the MAX as compared to prior versions of the 737. The existence of the software, designed to prevent a stall due to the reconfiguration of the engines, was not disclosed to pilots until after the first crash. Even after that tragic incident, pilots were not required to undergo simulation training on the 737 MAX.

In this paper, we examine several aspects of the case, including technical and other factors that led up to the crashes, especially Boeing’s design choices and organizational tensions internal to the company, and between Boeing and the U.S. Federal Aviation Administration (FAA). While the case is ongoing and at this writing, the 737 MAX has yet to be recertified for flight, our analysis is based on numerous government reports and detailed news accounts currently available. We conclude with a discussion of specific lessons for engineers and engineering educators regarding engineering ethics.

Overview of 737 MAX History and Crashes

In December 2010, Boeing’s primary competitor Airbus announced the A320neo family of jetliners, an update of their successful A320 narrow-body aircraft. The A320neo featured larger, more fuel-efficient engines. Boeing had been planning to introduce a totally new aircraft to replace its successful, but dated, 737 line of jets; yet to remain competitive with Airbus, Boeing instead announced in August 2011 the 737 MAX family, an update of the 737NG with similar engine upgrades to the A320neo and other improvements (Gelles et al. 2019 ). The 737 MAX, which entered service in May 2017, became Boeing’s fastest-selling airliner of all time with 5000 orders from over 100 airlines worldwide (Boeing n.d. a) (See Fig.  1 for timeline of 737 MAX key events).

Fig. 1: 737 MAX timeline showing key events from 2010 to 2019

The 737 MAX had been in operation for over a year when on October 29, 2018, Lion Air flight JT610 crashed into the Java Sea 13 minutes after takeoff from Jakarta, Indonesia; all 189 passengers and crew on board died. Data from the flight data recorder recovered from the wreckage indicated that MCAS, the software specifically designed for the MAX, forced the nose of the aircraft down 26 times in 10 minutes (Gates 2018). In October 2019, the Final Report of Indonesia’s Lion Air Accident Investigation was issued. The Report placed some of the blame on the pilots and maintenance crews but concluded that Boeing and the FAA were primarily responsible for the crash (Republic of Indonesia 2019).

MCAS was not identified in the original documentation/training for 737 MAX pilots (Glanz et al. 2019 ). But after the Lion Air crash, Boeing ( 2018 ) issued a Flight Crew Operations Manual Bulletin on November 6, 2018 containing procedures for responding to flight control problems due to possible erroneous AOA inputs. The next day the FAA ( 2018a ) issued an Emergency Airworthiness Directive on the same subject; however, the FAA did not ground the 737 MAX at that time. According to published reports, these notices were the first time that airline pilots learned of the existence of MCAS (e.g., Bushey 2019 ).

On March 10, 2019, less than five months after the Lion Air crash, Ethiopian Airlines Flight ET302 crashed 6 minutes after takeoff in a field 39 miles from Addis Ababa Airport. The accident caused the deaths of all 157 passengers and crew. The Preliminary Report of the Ethiopian Airlines Accident Investigation (Federal Democratic Republic of Ethiopia 2019), issued in April 2019, indicated that the pilots followed the checklist from the Boeing Flight Crew Operations Manual Bulletin posted after the Lion Air crash but could not control the plane (Ahmed et al. 2019). This was followed by an Interim Report (Federal Democratic Republic of Ethiopia 2020), issued in March 2020, that exonerated the pilots and airline and placed blame for the accident on design flaws in the MAX (Marks and Dahir 2020). Following the second crash, the 737 MAX was grounded worldwide, with the U.S., through the FAA, being the last country to act on March 13, 2019 (Kaplan et al. 2019).

Design Choices that Led to the Crashes

As noted above, believing that it had to keep pace with its main competitor, Airbus, Boeing elected to modify the latest generation of the 737 family, the 737NG, rather than design an entirely new aircraft. This choice posed a significant engineering challenge: mounting larger, more fuel-efficient engines, similar to those employed on the A320neo, on the existing 737 airframe was difficult because the 737 family sits closer to the ground than the Airbus A320. In order to provide adequate ground clearance, the larger engines had to be mounted higher and farther forward on the wings than on previous models of the 737 (see Fig. 2). This significantly changed the aerodynamics of the aircraft and created the possibility of a nose-up stall under certain flight conditions (Travis 2019; Glanz et al. 2019).

Fig. 2: Boeing 737 MAX (left) compared to Boeing 737NG (right), showing the larger 737 MAX engines mounted higher and farther forward on the wing (image source: https://www.norebbo.com)

Boeing’s attempt to solve this problem involved incorporating MCAS as a software fix for the potential stall condition. The 737 MAX was designed with two AOA sensors, one on each side of the aircraft, yet Boeing decided that MCAS would use input from only one of them. If that single AOA sensor indicated a dangerously high nose-up angle, MCAS would send a signal to the horizontal stabilizer located in the tail; movement of the stabilizer would then force the plane’s tail up and the nose down (Travis 2019). In both the Lion Air and Ethiopian Airlines crashes, the AOA sensor malfunctioned, repeatedly activating MCAS (Gates 2018; Ahmed et al. 2019). Since the two crashes, Boeing has made adjustments to MCAS, including having the system rely on input from both AOA sensors instead of just one. But still more problems with MCAS have been uncovered. For example, an indicator light that would alert pilots if the jet’s two AOA sensors disagreed, which Boeing believed was standard on all MAX aircraft, in fact operated only as part of an optional equipment package that neither airline involved in the crashes had purchased (Gelles and Kitroeff 2019a).
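
To make the fragility of that arrangement concrete, the following is a minimal, purely illustrative Python sketch, not Boeing’s flight-control code; the threshold value, the callback names (read_active_aoa_sensor, command_nose_down_trim), and the loop structure are hypothetical. It shows only the logical point made above: control logic that trusts a single, uncross-checked sensor will keep re-commanding nose-down trim whenever that sensor fails high.

```python
# Illustrative sketch only -- NOT Boeing's MCAS implementation.
# The threshold, units, and callback interface are hypothetical.

AOA_TRIGGER_DEG = 15.0  # hypothetical angle-of-attack threshold


def mcas_pass_single_sensor(read_active_aoa_sensor, command_nose_down_trim):
    """One control-loop pass that trusts a single AOA sensor with no cross-check."""
    aoa = read_active_aoa_sensor()   # only one of the two sensors is consulted
    if aoa > AOA_TRIGGER_DEG:        # a sensor stuck at a high value trips this...
        command_nose_down_trim()     # ...so trim is commanded again on every pass


if __name__ == "__main__":
    # A sensor that has failed high keeps re-triggering nose-down trim, pass after pass.
    commands = []
    for _ in range(10):              # ten consecutive control-loop passes
        mcas_pass_single_sensor(lambda: 22.4,  # faulty sensor reading, stuck high
                                lambda: commands.append("nose-down trim"))
    print(len(commands), "trim commands issued")  # prints: 10 trim commands issued
```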

Similar to its responses to previous accidents, Boeing has been reluctant to admit to a design flaw in its aircraft, instead blaming pilot error (Hall and Goelz 2019 ). In the 737 MAX case, the company pointed to the pilots’ alleged inability to control the planes under stall conditions (Economy 2019 ). Following the Ethiopian Airlines crash, Boeing acknowledged for the first time that MCAS played a primary role in the crashes, while continuing to highlight that other factors, such as pilot error, were also involved (Hall and Goelz 2019 ). For example, on April 29, 2019, more than a month after the second crash, then Boeing CEO Dennis Muilenburg defended MCAS by stating:

We've confirmed that [the MCAS system] was designed per our standards, certified per our standards, and we're confident in that process. So, it operated according to those design and certification standards. So, we haven't seen a technical slip or gap in terms of the fundamental design and certification of the approach. (Economy 2019 )

The view that MCAS was not primarily at fault was supported in an article by the noted journalist and pilot William Langewiesche (2019). While not denying that Boeing made serious mistakes, he placed ultimate blame on the two airlines’ use of inexperienced pilots. Langewiesche suggested that the accidents resulted from the cost-cutting practices of the airlines and the lax regulatory environments in which they operated. He argued that more experienced pilots, despite their lack of information on MCAS, should have been able to take corrective action to control the planes using customary stall prevention procedures. Langewiesche (2019) concluded:

What we had in the two downed airplanes was a textbook failure of airmanship. In broad daylight, these pilots couldn’t decipher a variant of a simple runaway trim, and they ended up flying too fast at low altitude, neglecting to throttle back and leading their passengers over an aerodynamic edge into oblivion. They were the deciding factor here — not the MCAS, not the Max.

Others have taken a more critical view of MCAS, Boeing, and the FAA. These critics prominently include Captain Chesley “Sully” Sullenberger, who famously crash-landed an A320 in the Hudson River after bird strikes had knocked out both of the plane’s engines. Sullenberger responded directly to Langewiesche in a letter to the Editor:

… Langewiesche draws the conclusion that the pilots are primarily to blame for the fatal crashes of Lion Air 610 and Ethiopian 302. In resurrecting this age-old aviation canard, Langewiesche minimizes the fatal design flaws and certification failures that precipitated those tragedies, and still pose a threat to the flying public. I have long stated, as he does note, that pilots must be capable of absolute mastery of the aircraft and the situation at all times, a concept pilots call airmanship. Inadequate pilot training and insufficient pilot experience are problems worldwide, but they do not excuse the fatally flawed design of the Maneuvering Characteristics Augmentation System (MCAS) that was a death trap.... (Sullenberger 2019 )

Noting that he is one of the few pilots to have encountered both accident sequences in a 737 MAX simulator, Sullenberger continued:

These emergencies did not present as a classic runaway stabilizer problem, but initially as ambiguous unreliable airspeed and altitude situations, masking MCAS. The MCAS design should never have been approved, not by Boeing, and not by the Federal Aviation Administration (FAA)…. (Sullenberger 2019 )

In June 2019, Sullenberger noted in Congressional Testimony that “These crashes are demonstrable evidence that our current system of aircraft design and certification has failed us. These accidents should never have happened” (Benning and DiFurio 2019 ).

Others have agreed with Sullenberger’s assessment. Software developer and pilot Gregory Travis (2019) argues that Boeing’s design for the 737 MAX violated industry norms and that the company unwisely used software to compensate for inadequacies in the hardware design. Travis also contends that the existence of MCAS was not disclosed to pilots in order to preserve the fiction that the 737 MAX was just an update of earlier 737 models, which served as a way to circumvent the more stringent FAA certification requirements for a new airplane. Reports from government agencies seem to support this assessment, emphasizing the chaotic cockpit conditions created by MCAS and poor certification practices. The U.S. National Transportation Safety Board (NTSB) (2019), in its September 2019 Safety Recommendations to the FAA, indicated that Boeing underestimated the effect an MCAS malfunction would have on the cockpit environment (Kitroeff 2019). The FAA Joint Authorities Technical Review (2019), which included international participation, issued its Final Report in October 2019; the Report faulted both Boeing and the FAA for the certification of MCAS (Koenig 2019).

Despite Boeing’s attempts to downplay the role of MCAS, the company began working on a fix for the system shortly after the Lion Air crash (Gates 2019). MCAS operation will now be based on inputs from both AOA sensors instead of just one, with a cockpit indicator light illuminating when the sensors disagree. In addition, MCAS will activate only once per AOA warning rather than repeatedly, meaning the system will attempt to prevent a stall a single time per event. MCAS’s authority to move the stabilizer will also be limited, and manual override by the pilot will always be possible (Bellamy 2019; Boeing n.d. b; Gates 2019). For over a year after the Lion Air crash, Boeing held that pilot simulator training would not be required for the redesigned MCAS. In January 2020, Boeing relented and recommended that simulator training be required when the 737 MAX returns to service (Pasztor et al. 2020).
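
As a contrast to the sketch given earlier, here is an equally hypothetical outline of the revised behavior described above: cross-check both AOA sensors, flag a disagreement instead of acting on it, activate at most once per event, cap the trim authority, and always yield to the pilot. The thresholds and callback names are assumptions made for illustration, not values taken from Boeing or the FAA.

```python
# Illustrative sketch of the revised behavior described above -- not Boeing's code.
# Thresholds, units, and the callback interface are hypothetical.

AOA_TRIGGER_DEG = 15.0     # hypothetical angle-of-attack threshold
AOA_DISAGREE_DEG = 5.5     # hypothetical allowable disagreement between the sensors
MAX_TRIM_INCREMENT = 1.0   # hypothetical cap on how far the stabilizer may be moved


def mcas_pass_revised(read_left_aoa, read_right_aoa, command_nose_down_trim,
                      set_disagree_light, pilot_overriding, already_activated):
    """One control-loop pass; returns whether MCAS has activated for this AOA event."""
    left, right = read_left_aoa(), read_right_aoa()

    if abs(left - right) > AOA_DISAGREE_DEG:
        set_disagree_light(True)     # alert the crew; do not act on suspect data
        return already_activated

    if pilot_overriding():
        return already_activated     # manual override always wins

    if min(left, right) > AOA_TRIGGER_DEG and not already_activated:
        command_nose_down_trim(MAX_TRIM_INCREMENT)  # bounded trim authority
        return True                  # record that this AOA event has been handled

    return already_activated
```

In this arrangement a single failed sensor lights the disagree indicator rather than repeatedly driving the stabilizer, which is the behavioral change the redesign is meant to deliver.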

Boeing and the FAA

There is mounting evidence that Boeing, and the FAA as well, had warnings about the inadequacy of MCAS’s design and about the lack of communication to pilots about its existence and functioning. In 2015, for example, an unnamed Boeing engineer raised in an email the issue of relying on a single AOA sensor (Bellamy 2019). In 2016, Mark Forkner, Boeing’s Chief Technical Pilot, flagged the erratic behavior of MCAS in a flight simulator in an email to a colleague, noting: “It’s running rampant” (Gelles and Kitroeff 2019c). Forkner subsequently came under federal investigation regarding whether he misled the FAA about MCAS (Kitroeff and Schmidt 2020).

In December 2018, following the Lion Air Crash, the FAA ( 2018b ) conducted a Risk Assessment that estimated that fifteen more 737 MAX crashes would occur in the expected fleet life of 45 years if the flight control issues were not addressed; this Risk Assessment was not publicly disclosed until Congressional hearings a year later in December 2019 (Arnold 2019 ). After the two crashes, a senior Boeing engineer, Curtis Ewbank, filed an internal ethics complaint in 2019 about management squelching of a system that might have uncovered errors in the AOA sensors. Ewbank has since publicly stated that “I was willing to stand up for safety and quality… Boeing management was more concerned with cost and schedule than safety or quality” (Kitroeff et al. 2019b ).

One factor in Boeing’s apparent reluctance to heed such warnings may be the transformation of the company’s engineering and safety culture over time into a finance-oriented one, beginning with Boeing’s merger with McDonnell Douglas in 1997 (Tkacik 2019; Useem 2019). Critical changes after the merger included replacing many of Boeing’s top managers, historically engineers, with business executives from McDonnell Douglas and moving the corporate headquarters to Chicago while leaving the engineering staff in Seattle (Useem 2019). According to Tkacik (2019), the new management even went so far as “maligning and marginalizing engineers as a class”.

Financial drivers thus began to place an inordinate amount of strain on Boeing employees, including engineers. During the development of the 737 MAX, significant production pressure to keep pace with the Airbus A320neo was ever-present. For example, Boeing management allegedly rejected any design changes that would prolong certification or require additional pilot training for the MAX (Gelles et al. 2019). As Adam Dickson, a former Boeing engineer, explained in a television documentary (BBC Panorama 2019): “There was a lot of interest and pressure on the certification and analysis engineers in particular, to look at any changes to the Max as minor changes”.

Production pressures were exacerbated by the “cozy relationship” between Boeing and the FAA (Kitroeff et al. 2019a ; see also Gelles and Kaplan 2019 ; Hall and Goelz 2019 ). Beginning in 2005, the FAA increased its reliance on manufacturers to certify their own planes. Self-certification became standard practice throughout the U.S. airline industry. By 2018, Boeing was certifying 96% of its own work (Kitroeff et al. 2019a ).

The serious drawbacks to self-certification became acutely apparent in this case. Of particular concern, the safety analysis for MCAS that the FAA delegated to Boeing was flawed in at least three respects: (1) the analysis underestimated the power of MCAS to move the plane’s horizontal tail and thus how difficult it would be for pilots to maintain control of the aircraft; (2) it did not account for the system deploying multiple times; and (3) it underestimated the risk level if MCAS failed, thus permitting a design feature (the single AOA sensor input to MCAS) that lacked built-in redundancy (Gates 2019). Related to these concerns, the ability of MCAS to move the horizontal tail was increased without properly updating the safety analysis or notifying the FAA about the change (Gates 2019). In addition, the FAA did not require pilot training for MCAS or simulator training for the 737 MAX (Gelles and Kaplan 2019). Since the MAX grounding, the FAA has become more independent in its assessments and certifications; for example, it will not use Boeing personnel when certifying individual new 737 MAX planes (Josephs 2019).

The role of the FAA has also been subject to political scrutiny. The report of a study of the FAA certification process commissioned by Secretary of Transportation Elaine Chao (DOT 2020 ), released January 16, 2020, concluded that the FAA certification process was “appropriate and effective,” and that certification of the MAX as a new airplane would not have made a difference in the plane’s safety. At the same time, the report recommended a number of measures to strengthen the process and augment FAA’s staff (Pasztor and Cameron 2020 ). In contrast, a report of preliminary investigative findings by the Democratic staff of the House Committee on Transportation and Infrastructure (House TI 2020 ), issued in March 2020, characterized FAA’s certification of the MAX as “grossly insufficient” and criticized Boeing’s design flaws and lack of transparency with the FAA, airlines, and pilots (Duncan and Laris 2020 ).

Boeing has incurred significant economic losses from the crashes and subsequent grounding of the MAX. In December 2019, Boeing CEO Dennis Muilenburg was fired and the corporation announced that 737 MAX production would be suspended in January 2020 (Rich 2019 ) (see Fig.  1 ). Boeing is facing numerous lawsuits and possible criminal investigations. Boeing estimates that its economic losses for the 737 MAX will exceed $18 billion (Gelles 2020 ). In addition to the need to fix MCAS, other issues have arisen in recertification of the aircraft, including wiring for controls of the tail stabilizer, possible weaknesses in the engine rotors, and vulnerabilities in lightning protection for the engines (Kitroeff and Gelles 2020 ). The FAA had planned to flight test the 737 MAX early in 2020, and it was supposed to return to service in summer 2020 (Gelles and Kitroeff 2020 ). Given the global impact of the COVID-19 pandemic and other factors, it is difficult to predict when MAX flights might resume. In addition, uncertainty of passenger demand has resulted in some airlines delaying or cancelling orders for the MAX (Bogaisky 2020 ). Even after obtaining flight approval, public resistance to flying in the 737 MAX will probably be considerable (Gelles 2019 ).

Lessons for Engineering Ethics

The 737 MAX case is still unfolding and will continue to do so for some time. Yet important lessons can already be learned (or relearned) from the case. Some of those lessons are straightforward, and others are more subtle. A key and clear lesson is that engineers may need reminders about prioritizing the public good, and more specifically, the public’s safety. A more subtle lesson pertains to the ways in which the problem of many hands may or may not apply here. Other lessons involve the need for corporations, engineering societies, and engineering educators to rise to the challenge of nurturing and supporting ethical behavior on the part of engineers, especially in light of the difficulties revealed in this case.

All contemporary codes of ethics promulgated by major engineering societies state that an engineer’s paramount responsibility is to protect the “safety, health, and welfare” of the public. The American Institute of Aeronautics and Astronautics Code of Ethics indicates that engineers must “[H]old paramount the safety, health, and welfare of the public in the performance of their duties” (AIAA 2013). The Institute of Electrical and Electronics Engineers (IEEE) Code of Ethics goes further, pledging its members: “…to hold paramount the safety, health, and welfare of the public, to strive to comply with ethical design and sustainable development practices, and to disclose promptly factors that might endanger the public or the environment” (IEEE 2017). The IEEE Computer Society (CS) cooperated with the Association for Computing Machinery (ACM) in developing a Software Engineering Code of Ethics (1997), which holds that software engineers shall: “Approve software only if they have a well-founded belief that it is safe, meets specifications, passes appropriate tests, and does not diminish quality of life, diminish privacy or harm the environment….” According to Gotterbarn and Miller (2009), the latter code is a useful guide when examining cases involving software design and underscores the fact that during design, as in all engineering practice, the well-being of the public should be the overriding concern. While engineering codes of ethics are plentiful, they differ in their sources of moral authority (i.e., organizational codes vs. professional codes), are often unenforceable through the law, and formally apply to different groups of engineers (e.g., based on discipline or organizational membership). However, the codes are generally recognized as a statement of the values inherent to engineering and its ethical commitments (Davis 2015).

An engineer’s ethical responsibility does not preclude consideration of factors such as cost and schedule (Pinkus et al. 1997 ). Engineers always have to grapple with constraints, including time and resource limitations. The engineers working at Boeing did have legitimate concerns about their company losing contracts to its competitor Airbus. But being an engineer means that public safety and welfare must be the highest priority (Davis 1991 ). The aforementioned software and other design errors in the development of the 737 MAX, which resulted in hundreds of deaths, would thus seem to be clear violations of engineering codes of ethics. In addition to pointing to engineering codes, Peterson ( 2019 ) argues that Boeing engineers and managers violated widely accepted ethical norms such as informed consent and the precautionary principle.

From an engineering perspective, the central ethical issue in the MAX case arguably revolves around the decision to use software (i.e., MCAS) to “mask” a questionable hardware design: the repositioning of the engines that disrupted the aerodynamics of the airframe (Travis 2019). As Johnston and Harris (2019) argue: “To meet the design goals and avoid an expensive hardware change, Boeing created the MCAS as a software Band-Aid.” Though software fixes of this kind are common, they place a safety burden on the software that it may not be able to bear, as illustrated by the case of the Therac-25 radiation therapy machine. In the Therac-25 case, hardware safety interlocks employed in earlier models of the machine were replaced by software safety controls. In addition, the user manual for the Therac-25 lacked information about how the software might malfunction; thus, when certain types of errors appeared on its interface, the machine’s operators did not know how to respond. Software flaws, among other factors, contributed to six patients receiving massive radiation overdoses, resulting in deaths and serious injuries (Leveson and Turner 1993). A more recent case involves problems with the embedded software guiding the electronic throttle in Toyota vehicles. In 2013, “…a jury found Toyota responsible for two unintended acceleration deaths, with expert witnesses citing bugs in the software and throttle fail safe defects” (Cummings and Britton 2020).

Boeing’s use of MCAS to mask the significant change in hardware configuration of the MAX was compounded by not providing redundancy for components prone to failure (i.e., the AOA sensors) (Campbell 2019 ), and by failing to notify pilots about the new software. In such cases, it is especially crucial that pilots receive clear documentation and relevant training so that they know how to manage the hand-off with an automated system properly (Johnston and Harris 2019 ). Part of the necessity for such training is related to trust calibration (Borenstein et al. 2020 ; Borenstein et al. 2018 ), a factor that has contributed to previous airplane accidents (e.g., Carr 2014 ). For example, if pilots do not place enough trust in an automated system, they may add risk by intervening in system operation. Conversely, if pilots trust an automated system too much, they may lack sufficient time to act once they identify a problem. This is further complicated in the MAX case because pilots were not fully aware, if at all, of MCAS’s existence and how the system functioned.

In addition to engineering decision-making that failed to prioritize public safety, questionable management decisions were also made at both Boeing and the FAA. As noted earlier, Boeing managerial leadership ignored numerous warning signs that the 737 MAX was not safe. Also, FAA’s shift to greater reliance on self-regulation by Boeing was ill-advised; that lesson appears to have been learned at the expense of hundreds of lives (Duncan and Aratani 2019 ).

The Problem of Many Hands Revisited

Actions, or inaction, by large, complex organizations, in this case corporate and government entities, suggest that the “problem of many hands” may be relevant to the 737 MAX case. At a high level of abstraction, the problem of many hands involves the idea that accountability is difficult to assign in the face of collective action, especially in a computerized society (Thompson 1980 ; Nissenbaum 1994 ). According to Nissenbaum ( 1996 , 29), “Where a mishap is the work of ‘many hands,’ it may not be obvious who is to blame because frequently its most salient and immediate causal antecedents do not converge with its locus of decision-making. The conditions for blame, therefore, are not satisfied in a way normally satisfied when a single individual is held blameworthy for a harm”.

However, there is an alternative understanding of the problem of many hands. In this version of the problem, the lack of accountability is not merely because multiple people and multiple decisions figure into a final outcome. Instead, in order to “qualify” as the problem of many hands, the component decisions should be benign, or at least far less harmful, if examined in isolation; only when the individual decisions are collectively combined do we see the most harmful result. In this understanding, the individual decision-makers should not have the same moral culpability as they would if they made all the decisions by themselves (Noorman 2020 ).

Both of these understandings of the problem of many hands could shed light on the 737 MAX case. Yet we focus on the first version of the problem. We admit the possibility that some of the isolated decisions about the 737 MAX may have been made in part because of ignorance of a broader picture. While we do not stake a claim on whether this is what actually happened in the MAX case, we acknowledge that it may be true in some circumstances. However, we think the more important point is that some of the 737 MAX decisions were so clearly misguided that a competent engineer should have seen the implications, even if the engineer was not aware of all of the broader context. The problem then is to identify responsibility for the questionable decisions in a way that discourages bad judgments in the future, a task made more challenging by the complexities of the decision-making. Legal proceedings about this case are likely to explore those complexities in detail and are outside the scope of this article. But such complexities must be examined carefully so as not to act as an insulator to accountability.

When many individuals are involved in the design of a computing device, for example, and a serious failure occurs, each person might try to absolve themselves of responsibility by indicating that “too many people” and “too many decisions” were involved for any individual person to know that the problem was going to happen. This is a common, and often dubious, excuse in the attempt to abdicate responsibility for a harm. While it can have different levels of magnitude and severity, the problem of many hands often arises in large scale ethical failures in engineering such as in the Deepwater Horizon oil spill (Thompson 2014 ).

Possible examples in the 737 MAX case of the difficulty of assigning moral responsibility due to the problem of many hands include:

1. The decision to reposition the engines;
2. The decision to mask the jet’s subsequent dynamic instability with MCAS;
3. The decision to rely on only one AOA sensor in designing MCAS; and
4. The decision not to inform or properly train pilots about the MCAS system.

While overall responsibility for each of these decisions may be difficult to allocate precisely, at least points 1–3 above arguably reflect fundamental errors in engineering judgment (Travis 2019). Boeing engineers and FAA engineers either participated in or were aware of these decisions (Kitroeff and Gelles 2019) and may have had opportunities to reconsider or redirect them. As Davis (2012) has noted, responsible engineering professionals make it their business to address problems even when they did not cause the problem, or, we would argue, did not solely cause it. As noted earlier, reports indicate that at least one Boeing engineer expressed reservations about the design of MCAS (Bellamy 2019). Since the two crashes, Boeing engineer Curtis Ewbank has filed an internal ethics complaint (Kitroeff et al. 2019b), and several current and former Boeing engineers and other employees have gone public with various concerns about the 737 MAX (Pasztor 2019). And yet, as is often the case, the flawed design went forward with tragic results.

Enabling Ethical Engineers

The MAX case is eerily reminiscent of other well-known engineering ethics case studies such as the Ford Pinto (Birsch and Fielder 1994 ), Space Shuttle Challenger (Werhane 1991 ), and GM ignition switch (Jennings and Trautman 2016 ). In the Pinto case, Ford engineers were aware of the unsafe placement of the fuel tank well before the car was released to the public and signed off on the design even though crash tests showed the tank was vulnerable to rupture during low-speed rear-end collisions (Baura 2006 ). In the case of the GM ignition switch, engineers knew for at least four years about the faulty design, a flaw that resulted in at least a dozen fatal accidents (Stephan 2016 ). In the case of the well-documented Challenger accident, engineer Roger Boisjoly warned his supervisors at Morton Thiokol of potentially catastrophic flaws in the shuttle’s solid rocket boosters a full six months before the accident. He, along with other engineers, unsuccessfully argued on the eve of launch for a delay due to the effect that freezing temperatures could have on the boosters’ O-ring seals. Boisjoly was also one of a handful of engineers to describe these warnings to the Presidential commission investigating the accident (Boisjoly et al. 1989 ).

Returning to the 737 MAX case, could Ewbank or others with concerns about the safety of the airplane have done more than filing ethics complaints or offering public testimony only after the Lion Air and Ethiopian Airlines crashes? One might argue that requiring professional registration by all engineers in the U.S. would result in more ethical conduct (for example, by giving state licensing boards greater oversight authority). Yet the well-entrenched “industry exemption” from registration for most engineers working in large corporations has undermined such calls (Kline 2001 ).

Boeing and other corporations could empower engineers with safety concerns by strengthening internal ethics processes, including sincere and meaningful responsiveness to anonymous complaint channels. Schwartz (2013) outlines three core components of an ethical corporate culture: strong core ethical values, a formal ethics program (including an ethics hotline), and capable ethical leadership. Schwartz points to Siemens’ creation of an ethics and compliance department following a bribery scandal as an example of a good solution. Boeing has had a compliance department for quite some time (Schnebel and Bienert 2004) and has made efforts in the past to evaluate its effectiveness (Boeing 2003). Yet it is clear that more robust measures are needed in response to ethics concerns and complaints. Since the MAX crashes, Boeing’s Board has implemented a number of changes, including establishing a corporate safety group and revising internal reporting procedures so that lead engineers report primarily to the chief engineer rather than to business managers (Gelles and Kitroeff 2019b; Boeing n.d. c). Whether these measures will be enough to restore Boeing’s former engineering-centered focus remains to be seen.

Professional engineering societies could play a stronger role in communicating and enforcing codes of ethics, in supporting ethical behavior of engineers, and by providing more educational opportunities for learning about ethics and about the ethical responsibilities of engineers. Some societies, including ACM and IEEE, have become increasingly engaged in ethics-related activities. Initially ethics engagement by the societies consisted primarily of a focus on macroethical issues such as sustainable development (Herkert 2004 ). Recently, however, the societies have also turned to a greater focus on microethical issues (the behavior of individuals). The 2017 revision to the IEEE Code of Ethics, for example, highlights the importance of “ethical design” (Adamson and Herkert 2020 ). This parallels IEEE activities in the area of design of autonomous and intelligent systems (e.g., IEEE 2018 ). A promising outcome of this emphasis is a move toward implementing “ethical design” frameworks (Peters et al. 2020 ).

In terms of engineering education, educators need to place a greater emphasis on fostering moral courage, that is, the courage to act on one’s moral convictions, including adherence to codes of ethics. This is of particular significance in large organizations such as Boeing and the FAA, where the agency of engineers may be limited by factors such as organizational culture (Watts and Buckley 2017). In a study of twenty-six ethics interventions in engineering programs, Hess and Fore (2018) found that only twenty-seven percent had a learning goal of developing “ethical courage, confidence or commitment”. This goal could be operationalized in a number of ways, for example through a focus on virtue ethics (Harris 2008) or professional identity (Hashemian and Loui 2010). This need should be addressed not only within the engineering curriculum but also during lifelong learning initiatives and other professional development opportunities (Miller 2019).

The circumstances surrounding the 737 MAX airplane could certainly serve as an informative case study for ethics or technical courses. The case can shed light on important lessons for engineers, including the complex interactions, and sometimes tensions, between engineering and managerial considerations. The case also tangibly displays that what seem to be relatively small-scale, and likely well-intended, decisions by individual engineers can combine to result in large-scale tragedy. No individual person wanted to do harm, but it happened nonetheless. Thus, the case can serve as a reminder to current and future generations of engineers that public safety must be the first and foremost priority. A particularly useful pedagogical method for considering this case is to assign students to the roles of engineers, managers, and regulators, as well as the flying public, airline personnel, and representatives of engineering societies (Herkert 1997). In addition to illuminating the perspectives and responsibilities of each stakeholder group, role-playing can also shed light on the “macroethical” issues raised by the case (Martin et al. 2019), such as airline safety standards and the proper role for engineers and engineering societies in the regulation of the industry.

Conclusions and Recommendations

The case of the Boeing 737 MAX provides valuable lessons for engineers and engineering educators concerning the ethical responsibilities of the profession. Safety is not cheap, but careless engineering design in the name of minimizing costs and adhering to a delivery schedule is a symptom of ethical blight. Using almost any standard ethical analysis or framework, Boeing’s actions regarding the safety of the 737 MAX, particularly decisions regarding MCAS, fall short.

Boeing failed in its obligations to protect the public. At a minimum, the company had an obligation to inform airlines and pilots of significant design changes, especially the role of MCAS in compensating for repositioning of engines in the MAX from prior versions of the 737. Clearly, it was a “significant” change because it had a direct, and unfortunately tragic, impact on the public’s safety. The Boeing and FAA interaction underscores the fact that conflicts of interest are a serious concern in regulatory actions within the airline industry.

Internal and external organizational factors may have interfered with Boeing and FAA engineers’ fulfillment of their professional ethical responsibilities; this is an all too common problem that merits serious attention from industry leaders, regulators, professional societies, and educators. The lessons to be learned in this case are not new. After large scale tragedies involving engineering decision-making, calls for change often emerge. But such lessons apparently must be retaught and relearned by each generation of engineers.

ACM/IEEE-CS Joint Task Force. (1997). Software Engineering Code of Ethics and Professional Practice, https://ethics.acm.org/code-of-ethics/software-engineering-code/ .

Adamson, G., & Herkert, J. (2020). Addressing intelligent systems and ethical design in the IEEE Code of Ethics. In Codes of ethics and ethical guidelines: Emerging technologies, changing fields . New York: Springer ( in press ).

Ahmed, H., Glanz, J., & Beech, H. (2019). Ethiopian airlines pilots followed Boeing’s safety procedures before crash, Report Shows. The New York Times, April 4, https://www.nytimes.com/2019/04/04/world/asia/ethiopia-crash-boeing.html .

AIAA. (2013). Code of Ethics, https://www.aiaa.org/about/Governance/Code-of-Ethics .

Arnold, K. (2019). FAA report predicted there could be 15 more 737 MAX crashes. The Dallas Morning News, December 11, https://www.dallasnews.com/business/airlines/2019/12/11/faa-chief-says-boeings-737-max-wont-be-approved-in-2019/

Baura, G. (2006). Engineering ethics: an industrial perspective . Amsterdam: Elsevier.


BBC News. (2019). Work on production line of Boeing 737 MAX ‘Not Adequately Funded’. July 29, https://www.bbc.com/news/business-49142761 .

Bellamy, W. (2019). Boeing CEO outlines 737 MAX MCAS software fix in congressional hearings. Aviation Today, November 2, https://www.aviationtoday.com/2019/11/02/boeing-ceo-outlines-mcas-updates-congressional-hearings/ .

Benning, T., & DiFurio, D. (2019). American Airlines Pilots Union boss prods lawmakers to solve 'Crisis of Trust' over Boeing 737 MAX. The Dallas Morning News, June 19, https://www.dallasnews.com/business/airlines/2019/06/19/american-airlines-pilots-union-boss-prods-lawmakers-to-solve-crisis-of-trust-over-boeing-737-max/ .

Birsch, D., & Fielder, J. (Eds.). (1994). The ford pinto case: A study in applied ethics, business, and technology . New York: The State University of New York Press.

Boeing. (2003). Boeing Releases Independent Reviews of Company Ethics Program. December 18, https://boeing.mediaroom.com/2003-12-18-Boeing-Releases-Independent-Reviews-of-Company-Ethics-Program .

Boeing. (2018). Flight crew operations manual bulletin for the Boeing company. November 6, https://www.avioesemusicas.com/wp-content/uploads/2018/10/TBC-19-Uncommanded-Nose-Down-Stab-Trim-Due-to-AOA.pdf .

Boeing. (n.d. a). About the Boeing 737 MAX. https://www.boeing.com/commercial/737max/ .

Boeing. (n.d. b). 737 MAX Updates. https://www.boeing.com/737-max-updates/ .

Boeing. (n.d. c). Initial actions: sharpening our focus on safety. https://www.boeing.com/737-max-updates/resources/ .

Bogaisky, J. (2020). Boeing stock plunges as coronavirus imperils quick ramp up in 737 MAX deliveries. Forbes, March 11, https://www.forbes.com/sites/jeremybogaisky/2020/03/11/boeing-coronavirus-737-max/#1b9eb8955b5a .

Boisjoly, R. P., Curtis, E. F., & Mellican, E. (1989). Roger Boisjoly and the challenger disaster: The ethical dimensions. Journal of Business Ethics, 8(4), 217–230.


Borenstein, J., Mahajan, H. P., Wagner, A. R., & Howard, A. (2020). Trust and pediatric exoskeletons: A comparative study of clinician and parental perspectives. IEEE Transactions on Technology and Society , 1 (2), 83–88.

Borenstein, J., Wagner, A. R., & Howard, A. (2018). Overtrust of pediatric health-care robots: A preliminary survey of parent perspectives. IEEE Robotics & Automation Magazine, 25(1), 46–54.

Bushey, C. (2019). The Tough Crowd Boeing Needs to Convince. Crain’s Chicago Business, October 25, https://www.chicagobusiness.com/manufacturing/tough-crowd-boeing-needs-convince .

Campbell, D. (2019). The many human errors that brought down the Boeing 737 MAX. The Verge, May 2, https://www.theverge.com/2019/5/2/18518176/boeing-737-max-crash-problems-human-error-mcas-faa .

Carr, N. (2014). The glass cage: Automation and us . Norton.

Cummings, M. L., & Britton, D. (2020). Regulating safety-critical autonomous systems: past, present, and future perspectives. In Living with robots (pp. 119–140). Academic Press, New York.

Davis, M. (1991). Thinking like an engineer: The place of a code of ethics in the practice of a profession. Philosophy & Public Affairs, 20(2), 150–167.

Davis, M. (2012). “Ain’t no one here but us social forces”: Constructing the professional responsibility of engineers. Science and Engineering Ethics, 18(1), 13–34.

Davis, M. (2015). Engineering as profession: Some methodological problems in its study. In Engineering identities, epistemologies and values (pp. 65–79). Springer, New York.

Department of Transportation (DOT). (2020). Official report of the special committee to review the Federal Aviation Administration’s Aircraft Certification Process, January 16. https://www.transportation.gov/sites/dot.gov/files/2020-01/scc-final-report.pdf .

Duncan, I., & Aratani, L. (2019). FAA flexes its authority in final stages of Boeing 737 MAX safety review. The Washington Post, November 27, https://www.washingtonpost.com/transportation/2019/11/27/faa-flexes-its-authority-final-stages-boeing-max-safety-review/ .

Duncan, I., & Laris, M. (2020). House report on 737 Max crashes faults Boeing’s ‘culture of concealment’ and labels FAA ‘grossly insufficient’. The Washington Post, March 6, https://www.washingtonpost.com/local/trafficandcommuting/house-report-on-737-max-crashes-faults-boeings-culture-of-concealment-and-labels-faa-grossly-insufficient/2020/03/06/9e336b9e-5fce-11ea-b014-4fafa866bb81_story.html .

Economy, P. (2019). Boeing CEO Puts Partial Blame on Pilots of Crashed 737 MAX Aircraft for Not 'Completely' Following Procedures. Inc., April 30, https://www.inc.com/peter-economy/boeing-ceo-puts-partial-blame-on-pilots-of-crashed-737-max-aircraft-for-not-completely-following-procedures.html .

Federal Aviation Administration (FAA). (2018a). Airworthiness directives; the Boeing company airplanes. FR Doc No: R1-2018-26365. https://rgl.faa.gov/Regulatory_and_Guidance_Library/rgad.nsf/0/fe8237743be9b8968625835b004fc051/$FILE/2018-23-51_Correction.pdf .

Federal Aviation Administration (FAA). (2018b). Quantitative Risk Assessment. https://www.documentcloud.org/documents/6573544-Risk-Assessment-for-Release-1.html#document/p1 .

Federal Aviation Administration (FAA). (2019). Joint authorities technical review: observations, findings, and recommendations. October 11, https://www.faa.gov/news/media/attachments/Final_JATR_Submittal_to_FAA_Oct_2019.pdf .

Federal Democratic Republic of Ethiopia. (2019). Aircraft accident investigation preliminary report. Report No. AI-01/19, April 4, https://leehamnews.com/wp-content/uploads/2019/04/Preliminary-Report-B737-800MAX-ET-AVJ.pdf .

Federal Democratic Republic of Ethiopia. (2020). Aircraft Accident Investigation Interim Report. Report No. AI-01/19, March 20, https://www.aib.gov.et/wp-content/uploads/2020/documents/accident/ET-302%2520%2520Interim%2520Investigation%2520%2520Report%2520March%25209%25202020.pdf .

Gates, D. (2018). Pilots struggled against Boeing's 737 MAX control system on doomed Lion Air flight. The Seattle Times, November 27, https://www.seattletimes.com/business/boeing-aerospace/black-box-data-reveals-lion-air-pilots-struggle-against-boeings-737-max-flight-control-system/ .

Gates, D. (2019). Flawed analysis, failed oversight: how Boeing, FAA Certified the Suspect 737 MAX Flight Control System. The Seattle Times, March 17, https://www.seattletimes.com/business/boeing-aerospace/failed-certification-faa-missed-safety-issues-in-the-737-max-system-implicated-in-the-lion-air-crash/ .

Gelles, D. (2019). Boeing can’t fly its 737 MAX, but it’s ready to sell its safety. The New York Times, December 24 (updated February 10, 2020), https://www.nytimes.com/2019/12/24/business/boeing-737-max-survey.html .

Gelles, D. (2020). Boeing expects 737 MAX costs will surpass $18 Billion. The New York Times, January 29, https://www.nytimes.com/2020/01/29/business/boeing-737-max-costs.html .

Gelles, D., & Kaplan, T. (2019). F.A.A. Approval of Boeing jet involved in two crashes comes under scrutiny. The New York Times, March 19, https://www.nytimes.com/2019/03/19/business/boeing-elaine-chao.html .

Gelles, D., & Kitroeff, N. (2019a). Boeing Believed a 737 MAX warning light was standard. It wasn’t. New York: The New York Times. https://www.nytimes.com/2019/05/05/business/boeing-737-max-warning-light.html .

Gelles, D., & Kitroeff, N. (2019b). Boeing board to call for safety changes after 737 MAX Crashes. The New York Times, September 15, (updated October 2), https://www.nytimes.com/2019/09/15/business/boeing-safety-737-max.html .

Gelles, D., & Kitroeff, N. (2019c). Boeing pilot complained of ‘Egregious’ issue with 737 MAX in 2016. The New York Times, October 18, https://www.nytimes.com/2019/10/18/business/boeing-flight-simulator-text-message.html .

Gelles, D., & Kitroeff, N. (2020). What needs to happen to get Boeing’s 737 MAX flying again?. The New York Times, February 10, https://www.nytimes.com/2020/02/10/business/boeing-737-max-fly-again.html .

Gelles, D., Kitroeff, N., Nicas, J., & Ruiz, R. R. (2019). Boeing was ‘Go, Go, Go’ to beat airbus with the 737 MAX. The New York Times, March 23, https://www.nytimes.com/2019/03/23/business/boeing-737-max-crash.html .

Glanz, J., Creswell, J., Kaplan, T., & Wichter, Z. (2019). After a Lion Air 737 MAX Crashed in October, Questions About the Plane Arose. The New York Times, February 3, https://www.nytimes.com/2019/02/03/world/asia/lion-air-plane-crash-pilots.html .

Gotterbarn, D., & Miller, K. W. (2009). The public is the priority: Making decisions using the software engineering code of ethics. Computer, 42 (6), 66–73.

Hall, J., & Goelz, P. (2019). The Boeing 737 MAX Crisis Is a Leadership Failure, The New York Times, July 17, https://www.nytimes.com/2019/07/17/opinion/boeing-737-max.html .

Harris, C. E. (2008). The good engineer: Giving virtue its due in engineering ethics. Science and Engineering Ethics, 14 (2), 153–164.

Hashemian, G., & Loui, M. C. (2010). Can instruction in engineering ethics change students’ feelings about professional responsibility? Science and Engineering Ethics, 16 (1), 201–215.

Herkert, J. R. (1997). Collaborative learning in engineering ethics. Science and Engineering Ethics, 3 (4), 447–462.

Herkert, J. R. (2004). Microethics, macroethics, and professional engineering societies. In Emerging technologies and ethical issues in engineering: papers from a workshop (pp. 107–114). National Academies Press, New York.

Hess, J. L., & Fore, G. (2018). A systematic literature review of US engineering ethics interventions. Science and Engineering Ethics, 24 (2), 551–583.

House Committee on Transportation and Infrastructure (House TI). (2020). The Boeing 737 MAX Aircraft: Costs, Consequences, and Lessons from its Design, Development, and Certification-Preliminary Investigative Findings, March. https://transportation.house.gov/imo/media/doc/TI%2520Preliminary%2520Investigative%2520Findings%2520Boeing%2520737%2520MAX%2520March%25202020.pdf .

IEEE. (2017). IEEE Code of Ethics. https://www.ieee.org/about/corporate/governance/p7-8.html .

IEEE. (2018). Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Autonomous and Intelligent Systems (version 2). https://standards.ieee.org/content/dam/ieee-standards/standards/web/documents/other/ead_v2.pdf .

Jennings, M., & Trautman, L. J. (2016). Ethical culture and legal liability: The GM switch crisis and lessons in governance. Boston University Journal of Science and Technology Law, 22 , 187.

Johnston, P., & Harris, R. (2019). The Boeing 737 MAX Saga: Lessons for software organizations. Software Quality Professional, 21 (3), 4–12.

Josephs, L. (2019). FAA tightens grip on Boeing with plan to individually review each new 737 MAX Jetliner. CNBC, November 27, https://www.cnbc.com/2019/11/27/faa-tightens-grip-on-boeing-with-plan-to-individually-inspect-max-jets.html .

Kaplan, T., Austen, I., & Gebrekidan, S. (2019). The New York Times, March 13. https://www.nytimes.com/2019/03/13/business/canada-737-max.html .

Kitroeff, N. (2019). Boeing underestimated cockpit chaos on 737 MAX, N.T.S.B. Says. The New York Times, September 26, https://www.nytimes.com/2019/09/26/business/boeing-737-max-ntsb-mcas.html .

Kitroeff, N., & Gelles, D. (2019). Legislators call on F.A.A. to say why it overruled its experts on 737 MAX. The New York Times, November 7 (updated December 11), https://www.nytimes.com/2019/11/07/business/boeing-737-max-faa.html .

Kitroeff, N., & Gelles, D. (2020). It’s not just software: New safety risks under scrutiny on Boeing’s 737 MAX. The New York Times, January 5, https://www.nytimes.com/2020/01/05/business/boeing-737-max.html .

Kitroeff, N., & Schmidt, M. S. (2020). Federal prosecutors investigating whether Boeing pilot lied to F.A.A. The New York Times, February 21, https://www.nytimes.com/2020/02/21/business/boeing-737-max-investigation.html .

Kitroeff, N., Gelles, D., & Nicas, J. (2019a). The roots of Boeing’s 737 MAX Crisis: A regulator relaxes its oversight. The New York Times, July 27, https://www.nytimes.com/2019/07/27/business/boeing-737-max-faa.html .

Kitroeff, N., Gelles, D., & Nicas, J. (2019b). Boeing 737 MAX safety system was vetoed, Engineer Says. The New York Times, October 2, https://www.nytimes.com/2019/10/02/business/boeing-737-max-crashes.html .

Kline, R. R. (2001). Using history and sociology to teach engineering ethics. IEEE Technology and Society Magazine, 20 (4), 13–20.

Koenig, D. (2019). Boeing, FAA both faulted in certification of the 737 MAX. AP, October 11, https://apnews.com/470abf326cdb4229bdc18c8ad8caa78a .

Langewiesche, W. (2019). What really brought down the Boeing 737 MAX? The New York Times, September 18, https://www.nytimes.com/2019/09/18/magazine/boeing-737-max-crashes.html .

Leveson, N. G., & Turner, C. S. (1993). An investigation of the Therac-25 accidents. Computer, 26 (7), 18–41.

Marks, S., & Dahir, A. L. (2020). Ethiopian report on 737 Max crash blames Boeing. The New York Times, March 9, https://www.nytimes.com/2020/03/09/world/africa/ethiopia-crash-boeing.html .

Martin, D. A., Conlon, E., & Bowe, B. (2019). The role of role-play in student awareness of the social dimension of the engineering profession. European Journal of Engineering Education, 44 (6), 882–905.

Miller, G. (2019). Toward lifelong excellence: navigating the engineering-business space. In The Engineering-Business Nexus (pp. 81–101). Springer, Cham.

National Transportation Safety Board (NTSB). (2019). Safety Recommendations Report, September 19, https://www.ntsb.gov/investigations/AccidentReports/Reports/ASR1901.pdf .

Nissenbaum, H. (1994). Computing and accountability. Communications of the ACM , January, https://dl.acm.org/doi/10.1145/175222.175228 .

Nissenbaum, H. (1996). Accountability in a computerized society. Science and Engineering Ethics, 2 (1), 25–42.

Noorman, M. (2020). Computing and moral responsibility. In Zalta, E. N. (Ed.). The Stanford Encyclopedia of Philosophy (Spring), https://plato.stanford.edu/archives/spr2020/entries/computing-responsibility .

Pasztor, A. (2019). More Whistleblower complaints emerge in Boeing 737 MAX Safety Inquiries. The Wall Street Journal, April 27, https://www.wsj.com/articles/more-whistleblower-complaints-emerge-in-boeing-737-max-safety-inquiries-11556418721 .

Pasztor, A., & Cameron, D. (2020). U.S. News: Panel Backs How FAA gave safety approval for 737 MAX. The Wall Street Journal, January 17, https://www.wsj.com/articles/panel-clears-737-maxs-safety-approval-process-at-faa-11579188086 .

Pasztor, A., Cameron.D., & Sider, A. (2020). Boeing backs MAX simulator training in reversal of stance. The Wall Street Journal, January 7, https://www.wsj.com/articles/boeing-recommends-fresh-max-simulator-training-11578423221 .

Peters, D., Vold, K., Robinson, D., & Calvo, R. A. (2020). Responsible AI—two frameworks for ethical design practice. IEEE Transactions on Technology and Society, 1 (1), 34–47.

Peterson, M. (2019). The ethical failures behind the Boeing disasters. Blog of the APA, April 8, https://blog.apaonline.org/2019/04/08/the-ethical-failures-behind-the-boeing-disasters/ .

Pinkus, R. L., Pinkus, R. L. B., Shuman, L. J., Hummon, N. P., & Wolfe, H. (1997). Engineering ethics: Balancing cost, schedule, and risk-lessons learned from the space shuttle . Cambridge: Cambridge University Press.

Republic of Indonesia. (2019). Final Aircraft Accident Investigation Report. KNKT.18.10.35.04, https://knkt.dephub.go.id/knkt/ntsc_aviation/baru/2018%2520-%2520035%2520-%2520PK-LQP%2520Final%2520Report.pdf .

Rich, G. (2019). Boeing 737 MAX should return in 2020 but the crisis won't be over. Investor's Business Daily, December 31, https://www.investors.com/news/boeing-737-max-service-return-2020-crisis-not-over/ .

Schnebel, E., & Bienert, M. A. (2004). Implementing ethics in business organizations. Journal of Business Ethics, 53 (1–2), 203–211.

Schwartz, M. S. (2013). Developing and sustaining an ethical corporate culture: The core elements. Business Horizons, 56 (1), 39–50.

Stephan, K. (2016). GM Ignition Switch Recall: Too Little Too Late? [Ethical Dilemmas]. IEEE Technology and Society Magazine, 35 (2), 34–35.

Sullenberger, S. (2019). My letter to the editor of New York Times Magazine, https://www.sullysullenberger.com/my-letter-to-the-editor-of-new-york-times-magazine/ .

Thompson, D. F. (1980). Moral responsibility of public officials: The problem of many hands. American Political Science Review, 74 (4), 905–916.

Thompson, D. F. (2014). Responsibility for failures of government: The problem of many hands. The American Review of Public Administration, 44 (3), 259–273.

Tkacik, M. (2019). Crash course: how Boeing’s managerial revolution created the 737 MAX Disaster. The New Republic, September 18, https://newrepublic.com/article/154944/boeing-737-max-investigation-indonesia-lion-air-ethiopian-airlines-managerial-revolution .

Travis, G. (2019). How the Boeing 737 MAX disaster looks to a software developer. IEEE Spectrum , April 18, https://spectrum.ieee.org/aerospace/aviation/how-the-boeing-737-max-disaster-looks-to-a-software-developer .

Useem, J. (2019). The long-forgotten flight that sent Boeing off course. The Atlantic, November 20, https://www.theatlantic.com/ideas/archive/2019/11/how-boeing-lost-its-bearings/602188/ .

Watts, L. L., & Buckley, M. R. (2017). A dual-processing model of moral whistleblowing in organizations. Journal of Business Ethics, 146 (3), 669–683.

Werhane, P. H. (1991). Engineers and management: The challenge of the Challenger incident. Journal of Business Ethics, 10 (8), 605–616.

Download references

Acknowledgement

The authors would like to thank the anonymous reviewers for their helpful comments.

Author information

Authors and Affiliations

North Carolina State University, Raleigh, NC, USA

Joseph Herkert

Georgia Institute of Technology, Atlanta, GA, USA

Jason Borenstein

University of Missouri – St. Louis, St. Louis, MO, USA

Keith Miller


Corresponding author

Correspondence to Joseph Herkert.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Herkert, J., Borenstein, J., & Miller, K. The Boeing 737 MAX: Lessons for Engineering Ethics. Sci Eng Ethics 26, 2957–2974 (2020). https://doi.org/10.1007/s11948-020-00252-y


Received: 26 March 2020

Accepted: 25 June 2020

Published: 10 July 2020

Issue Date: December 2020

DOI: https://doi.org/10.1007/s11948-020-00252-y


Keywords

  • Engineering ethics
  • Airline safety
  • Engineering design
  • Corporate culture
  • Software engineering

Ethics for Engineers

Case Studies

  • Engineering disasters
  • Covid vaccine distribution among wealthy and poor countries
  • A NYC skyscraper in danger of collapse
  • The Volkswagen emissions scandal
  • Addressing climate change
  • Racism and informed consent in clinical trials
  • Using CRISPR to manipulate animal and human genomes
  • Effects of social media on people’s lives and on society
  • The power, promise, and dangers of rapidly advancing AI capabilities

We work hard to find short readings that are interesting and thought-provoking.


Engineering Ethics Toolkit: Case studies


Welcome to the case studies pages of the EPC's Engineering Ethics Toolkit, produced in partnership with the Royal Academy of Engineering.

Case studies are one tool for addressing the context and impact of engineering ethics, and they have proven to be an effective teaching and learning method.

These case studies cover a variety of engineering disciplines, professional situations, and personal dilemmas, and focus on several areas of moral pedagogy. They were developed for use in higher education, but may also be of use in other settings.

To accommodate many educational levels, the case studies are divided into Beginner, Intermediate, and Advanced cases. They are written in parts so that educators have the flexibility to use them in the way best suited to their teaching content and environment, and all cases permit and encourage the integration of technical content. Along with learning and teaching notes, the cases contain suggested questions and activities as well as supplementary materials.

They are aligned to the Engineering Council and Royal Academy of Engineering’s Joint Statement of Ethical Principles and the expectations of the 4th Edition of the Engineering Council’s Accreditation of Higher Education Programmes and are therefore appropriate for UK teaching and learning contexts. They are, however, easily adapted for use in other countries.

Guidance articles are also available to help situate the case studies in an educational context and to signpost to additional research and resources on engineering ethics.

In developing the cases and articles in this resource, the authors and advisory group took into account recent scholarship on best practices in teaching engineering ethics through case studies. They also reviewed existing case study libraries in order to add to the growing body of material available on engineering ethics. 

Case studies

| Case study | Topic and disciplines | Issues | Level |
| --- | --- | --- | --- |
| Suitable technology for developing countries | Mechanical engineering, Electrical engineering, Energy | Sustainability, Honesty, Integrity, Public good | Advanced |
| Dealing with contracts or subcontracts with potential slave or forced labour | Manufacturing, Engineering business | Social responsibility, Human rights, Risk | Beginner |
| Maintenance of an offshore wind farm | Mechanical, Energy | Sustainability, Risk | Beginner |
| Participatory approaches for engaging with a local community about the development of risky technologies | Nuclear engineering, Chemical engineering, Energy | Corporate social responsibility, Risk, Accountability, Respect for the environment | Advanced |
| A country-wide energy transition plan | Energy, Electrical | Sustainability, Social responsibility, Risk | Beginner |
| Balancing personal values and professional conduct in the climate emergency | Civil engineering, Energy and environmental engineering, Energy | Respect for the environment, Justice, Accountability, Social responsibility, Risk, Sustainability, Health, Public good, Respect for the law, Future generations, Societal impact | Intermediate |
| Home heating in the energy transition | Civil engineering, Chemical engineering, Mechanical engineering, Energy | Sustainability, Social responsibility | Intermediate |
| Critical digital literacy | Computer science, Information systems, Biomedical engineering | Cultural context, Social responsibility, Privacy | Intermediate |
| Sustainable materials in construction | Civil engineering, Manufacturing, Construction | Sustainability, Respect for the environment, Future generations, Societal impact, Corporate social responsibility | Intermediate |
| Responsibility for micro- and nano-plastics in the environment and human bodies | Environmental engineering, Chemical engineering, Mechanical engineering, Materials engineering | Respect for the environment, Corporate social responsibility, Power, Safety | Intermediate |
| Data security of smart technologies | Electronics, Data, Biomedical engineering | Autonomy, Dignity, Privacy, Confidentiality | Advanced |
| Data security of industrial robots | Robotics, Data, Internet of things | Safety, Health, Privacy, Transparency | Intermediate |
| Materials sourcing and circularity | Materials engineering, Manufacturing, Environmental engineering, Construction | Respect for the environment, Risk | Intermediate |
| Ethical entrepreneurship in engineering industries | Mechanical engineering, Electrical and electronic engineering, Chemical engineering | Justice, Corporate social responsibility, Accountability | Beginner, Intermediate, Advanced |
| Development and use of a facial recognition system | Data, Electronics, Computer science, AI | Diversity, Bias, Privacy, Transparency | Advanced |
| Soil carbon sequestration and solar geoengineering | Chemical engineering, Energy, Environmental engineering | Respect for the environment, Social responsibility, Risk | Beginner |
| Safety of construction materials | Mechanical engineering, Materials | Safety, Communication, Whistleblowing, Power | Beginner |
| Low earth orbit satellites for internet provision | Electronics, Mechanical engineering | Respect for the environment, Public good, Future generations | Intermediate |
| Monitoring and resolving industrial pollution | Chemical engineering, Civil engineering, Manufacturing, Mechanical engineering | Environment, Health, Public good | Advanced |
| Alternative food production | Energy, Chemical engineering | Sustainability, Social responsibility | Advanced |
| Developing customised algorithms for student support | Computing, AI, Data | Bias, Social responsibility, Risk, Privacy | Beginner |
| Data centres' impact on sustainable water resources | Civil engineering, Electronic engineering | Sustainability, Respect for the environment, Future generations, Risk, Societal impact | Intermediate |
| Data security of smart technologies | Electronics, Data, Mechatronics | Autonomy, Dignity, Privacy, Confidentiality | Intermediate |
| Smart meters for responsible everyday energy use | Electrical engineering | Integrity, Transparency, Social responsibility, Respect for the environment, Respect for the law | Beginner |
| Trade-offs in the energy transition | Chemical engineering, Electrical engineering, Energy | Sustainability, Honesty, Respect for the environment, Public good | Intermediate |

Most case studies are also available as PDF documents on the RAEng website.

To ensure that everyone can use and adapt these cases in a way that best fits their teaching or purpose, this work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License. Under this licence, you are free to share and adapt this material, provided that you give appropriate credit and attribution to the original material and indicate if any changes were made.

Get involved:  These case studies were created as part of the EPC’s Engineering Ethics toolkit that is intended to evolve and grow over time. Further case studies are being developed and will be added in due course, along with additional teaching resources to support individual case studies. We are actively inviting experts to submit case studies for review and possible inclusion in this toolkit. For more information, see our Get involved  page.

National Academies Press: OpenBook

Emerging Technologies and Ethical Issues in Engineering: Papers from a Workshop (2004)

State of the Art in Engineering Ethics

Methodologies for Case Studies in Engineering Ethics

CHARLES E. (ED) HARRIS

Texas A&M University

The methodology presented in this paper has two aspects: analytical and problem-resolution. The analytical aspect suggests concepts for identifying the types of issues in a case: factual issues, conceptual issues, application issues, and moral issues. The problem-resolution aspect involves "bottom-up" techniques and "top-down" techniques. Bottom-up techniques rely on moral intuitions rather than moral theories; these methods include weighing, casuistry, and finding a creative middle way. Top-down methods appeal to a general moral theory and are sometimes useful in applied ethics; two such theories, familiar in Western philosophy, are utilitarianism and the ethics of respect for persons.

Most education in ethics and professional responsibility relies heavily on case studies. This is true of medical, legal, nursing, veterinary, dental, and business ethics. It is also true of engineering ethics. Students in my large classes in engineering ethics (approximately 600 each semester) often tell me that their favorite part of the course is the case studies, reflecting the practical orientation that characterizes all professionals. The ethical and professional concerns of people who defend clients in court, treat people who are sick, manage companies, fill teeth, operate on pets, and design bridges are best addressed through cases that focus on situations relevant to their day-to-day work.

I find it useful to divide cases into three categories: micro-cases, macro-cases, and exemplary cases. Broadly defined, micro-cases are cases in which an individual professional makes decisions involving ethical or professional concerns. These decisions may have a limited impact or a wide-ranging impact. For example, John must decide whether he will accept a rather large gift from a supplier. Alison must decide whether she is going to take part in a project that is environmentally destructive.

Macro-cases typically involve social policies, legislation, governmental administrative decisions, or the setting of policies for professional societies. In engineering, these policies usually have to do with the impact of technology on society. How should privacy be protected with respect to computers? How should computer crimes be treated? What kind of intellectual property rights should be granted to the creators of software? What policies should engineering societies adopt with respect to the environment? Should the cloning of human beings be pursued?

Exemplary cases involve situations in which professionals act in an admirable way in their professional capacities. Exemplary cases have two characteristics. First, decisions have already been made and a course of action already taken. In other words, no dilemma remains to be resolved. In exemplary cases, the dilemma has already been resolved in an exemplary way. Second, the behavior exhibited is praiseworthy, either because it is a paradigm of right action or because the action is taken in the face of adversity or because the action goes beyond what might be considered required under the circumstances. Exemplary cases can involve micro- or macro-issues.

Here is an example of a micro-case involving exemplary action. In the late 1930s, a group of General Electric engineers spent time outside their normal working hours to develop the sealed-beam headlight. Apparently, the prevailing consensus was that the headlight was not technically feasible. Nevertheless, the engineers accomplished their task. Sometimes, an engineer who simply performs what appears to be his or her professional duty can also exhibit exemplary action. Roger Boisjoly, an engineer who protested the launch of the Challenger at considerable risk to his career, exhibited exemplary action.

METHODS OF ANALYSIS

Methods of analysis can be used to identify the types of issues involved in a case: factual issues, conceptual issues, application issues, and moral issues.

Factual Issues

A factual issue has two characteristics: (1) it is a disagreement over a matter of fact, and (2) this matter of fact is crucial to resolving the problem. A fact, unlike a factual issue, is a matter that has already been settled and is uncontroversial. Factual issues arise, for example, in cases in which we do not know how much a certain modification in a design will cost or what the effects of a certain course of action will be or how accurate a given test is or what risks are involved in a certain technology.

In the real world, empirical research should be used to resolve a factual issue. Some factual issues, however, cannot be resolved by investigation. Some technological questions cannot be answered, such as questions about consequences that can only be answered in the future. In these cases, the most realistic approach is to leave the factual question unanswered and make a decision in the context of factual uncertainty. Especially in the classroom, it is not appropriate to make assumptions that resolve an issue in a way that could not be done in a real-world context.

Here is a case involving a factual issue. A new law requires that the lead content of drinking water be less than 1.0 part per billion (ppb). Melissa is a safety engineer who has tested her company’s drinking water by two methods. Method A gives a reading of 0.85 ppb; Method B gives a reading of 1.23 ppb. She must fill out a government report describing the quality of her company’s water. If the lead content exceeds 1.0 ppb, her company will be fined. She must decide whether to report the results of Method A or Method B. In this case, her decision is based primarily on the factual issue of which method is the most accurate.

It is important to keep in mind that many controversies that appear to be about moral issues are traceable primarily to disagreements over facts. Two people may disagree about the proper course of action because they disagree about the consequences of a given course of action. Two engineers may disagree about which of two designs is ethically more acceptable because they disagree about which one is safer. They may agree on the moral parameters of the case, namely that the safest design should be chosen, but they may disagree over which design is safer. Although such a disagreement might be called a moral or ethical disagreement, it is really a disagreement over factual issues, unless they disagree over the definition of "safe." Engineering students are often inclined to say that ethics is "soft" when such disagreements cannot be settled. It is important, therefore, to realize that sometimes, even though the moral parameters are agreed upon, there may be irresolvable disagreements over facts.

Conceptual Issues

A conceptual issue is (1) a disagreement over a definition of a concept that is (2) crucial to resolving a problem. Two engineers may differ over whether a design is safe because they have different definitions of (i.e., criteria for) “safe.” They may disagree about whether a given action is a conflict of interest because they may have different definitions of “conflict of interest.” They may disagree over whether something is a bribe because they have different conceptions of a bribe and how to distinguish one from extortion or “grease money.”

Here is an example of a case involving a conceptual issue. Sally is a mechanical engineer employed by General Motors to design automotive gas tanks. According to government safety standards, the automobile must be able to survive a "moderate impact" with no chance of the gas tank catching fire. In recent tests, in cars that crashed at 35 miles per hour (mph) the gas tanks did not catch fire, whereas in 20 percent of cars that crashed at 45 mph they did. She knows she must first determine how the government defines "moderate impact."

Probably the most effective way to come up with a definition is to derive one from paradigm, or standard, cases. A paradigm case of a bribe might be one in which an engineer accepts a large sum of money to specify a product that is not the most appropriate one for the design. From this standard case, we might derive a working definition of a bribe as an offer of something of value to induce a person to perform an action that is morally inappropriate to his or her office or role. If definitions differ, it may be possible to argue that one definition is more in accord with standard practice or paradigms or that one definition is more useful or easier to apply. If there are continuing differences over conceptual issues, the important thing is to be aware of the differences.

Another important consideration is whether a concept is “moralized” or “nonmoralized.” A moralized concept includes an implicit moral judgment that the action to which the concept refers is either morally acceptable or unacceptable. When we label something as a bribe, we make a presumptive judgment that it is wrong, because, as we have seen, we usually define bribery as giving something of value to induce a person to perform an action that is morally inappropriate to his or her office or role. Breaking confidentiality, for example, is prima facie morally wrong, because we define it as violating a commitment or breaking a rule that is morally justified.

Of course, the fact that an action is a bribe makes only a presumptive case that it is morally wrong. There might be a moral consideration that overrides the fact that we are giving a bribe. Bribing a Nazi guard to get your grandmother out of a concentration camp would be morally permissible, because the office of a concentration-camp guard is itself morally illegitimate. Breaking confidentiality is prima facie bad, but it may be justified when the safety of the public is at stake.

Some concepts, by contrast, appear to be morally neutral. We may call them nonmoralized concepts. In deciding whether computer software is a work of authorship (like a book) or an invention (like a machine), we must define “work of authorship” and “invention.” These definitions do not appear to involve moral judgments about the value of these two types of creative products.

Application Issues

An application issue is a question of whether or not a concept applies to a given situation. An application issue is (1) a disagreement over the application of a concept in a particular situation that is (2) crucial to resolving a problem. I just referred to the question of whether computer software should be classified as a work of authorship or an invention. This is an application issue, because the question is whether the concept of a work of authorship (once we have defined it) or the concept of invention (once we have defined it) best applies to software. Of course, neither of these concepts applies particularly well, and this is characteristic of application issues. An application issue is one in which we have trouble deciding whether a concept applies in a given situation. We have no trouble deciding whether killing a person by stabbing him in the back to get his money is murder, but we do argue over whether euthanasia is murder. Similarly, engineers might argue over whether attending a conference in Hawaii sponsored by a vendor is a bribe, or whether giving one client general information about another client's projects is a breach of confidentiality.

Here is an example of an application issue. Larry is an aerospace engineer who is a member of the Quaker religion, which is committed to nonviolence. Larry was hired by his firm to design passenger airplanes, but his boss has recently reassigned him to design military fighters. Larry must decide whether to accept the new assignment or quit and find a new job. He must decide whether his commitment to “nonviolence” requires not only that he refrain from operating military aircraft, but also that he refrain from designing them.

Application issues often arise in the law. The Constitution requires that citizens be given a “speedy” trial. If a citizen is kept in jail for two years without a trial, is this a denial of his constitutional right to a speedy trial? A city has a law against “vehicles” in the park, and a child rides a skateboard into the park. Is a skateboard a “vehicle”?

Moral Issues

A fourth type of issue is a genuine moral issue, usually a conflict between two or more values or obligations. Engineer Tom does not want to give the customs officer money, but he needs to get something through customs to complete a project that is important for the local economy as well as for his firm. Here Tom faces a conflict between his obligation not to pay bribes or grease money and his obligation to complete the project. Engineer Jane is not sure whether she should design a slightly safer product that will be considerably more expensive for consumers. Jane faces a conflict between her obligation to produce safe products and her obligation to produce inexpensive products.

Here is another example of a moral issue. Harry works for a large manufacturer in the town of Lake Pleasant. His company employs half of the people in the town, which is in an otherwise economically depressed part of the country. Harry discovers that his company is dumping chemicals into the local lake that may pose a health hazard. The lake is the town’s main source of drinking water. Harry is told that the company dumps these chemicals into the lake because disposing of them in any other way would be so expensive that the plant would have to close. Should Harry report his company’s practice to the local authorities? Harry faces a conflict between his obligation to the health of the citizens of Lake Pleasant and his obligation to the economic welfare of the citizens of Lake Pleasant.

BOTTOM-UP METHODS OF PROBLEM RESOLUTION

Sometimes moral conflicts remain even after all of the factual, conceptual, and application issues have been resolved. Therefore, we should consider some methods for resolving moral conflicts. Following a nomenclature often used in medical ethics, I find it useful to divide methods of resolving conflicts into bottom-up and top-down methods. Bottom-up methods start on a fairly concrete level, close to the details of the case, and work toward a solution. These methods adopt generally accepted, intuitively plausible moral concepts that are a part of the moral thinking of most people, at least in our society. They work on what R.M. Hare, a prominent moral philosopher, would call the intuitive level of moral thinking (Hare, 1981).

Weighing or Balancing

The simplest bottom-up method might be called balancing or weighing. Reasons for alternative evaluations are considered, or “weighed,” and the alternative with the most convincing reasons is selected. We examine the reasons for and against universal engineering registration and, all things considered, find one set of reasons more convincing than the others. If we find the reasons on both sides equally convincing, either option is morally permissible.

Engineer Jane, who owns a civil engineering design firm, has a chance to bid on part of the design work for a fertilizer plant in Country X. The plant will increase food production in a country where many people do not have sufficient food. Unfortunately, the plant will have some bad environmental effects, and correcting the problems will make the fertilizer more expensive, too expensive for farmers in Country X. Should she bid on the design? She may decide to list considerations in favor of submitting a bid and considerations against it. On the one hand, she will be contributing to the saving of many lives, the economic development of Country X, and the economic advancement of her firm. On the other hand, she will be contributing to the environmental degradation of Country X, and her firm may receive some negative publicity. She must attempt to balance these two sets of considerations and determine which has the greater moral “weight.” Balancing does not provide specific directions for comparing alternative courses of action, but sometimes such direction is not necessary.
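The weighing itself is a qualitative judgment, not an algorithm, but its bookkeeping can be made concrete. The following Python sketch attaches signed weights to the considerations in Jane's bid decision and sums them; the weights are illustrative assumptions only, not values given in the case.

```python
# A sketch of the weighing/balancing method applied to Jane's bid decision.
# The considerations come from the case; the numeric weights are illustrative
# assumptions -- in practice the "weighing" is a qualitative judgment.

considerations = {
    "contributes to saving lives through increased food production": +5,
    "supports the economic development of Country X": +3,
    "advances the economic interests of Jane's firm": +1,
    "contributes to environmental degradation in Country X": -4,
    "risks negative publicity for the firm": -1,
}

total = sum(considerations.values())

for reason, weight in considerations.items():
    print(f"{weight:+d}  {reason}")

verdict = ("bidding looks permissible" if total > 0
           else "declining looks preferable" if total < 0
           else "either option is morally permissible")
print(f"Net weight: {total:+d} -> {verdict}")
```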

The Method of Casuistry or Line Drawing

The second method is casuistry, or what I call line drawing. Although the method I have developed for students is more formal than would ordinarily be used in real-world situations, I believe the underlying ideas are what we might call moral common sense. Casuistry has a long history in the moral tradition of the West, going back at least to Cicero. Recently, casuistry has been used to make decisions in medical ethics. Congress established the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research in 1974. Deep religious and philosophical differences between members of the commission made progress difficult until the group decided to talk about specific examples of morally objectionable experiments ("paradigm cases"). The members found that they could agree on the characteristics ("features") of these experiments that made the experiments wrong. Some members of the commission recognized that they were using the ancient technique of casuistry, and the method subsequently came to be accepted in medical ethics cases.

In casuistry, a decision about what to do or believe in a problematic situation is made by comparing the problematic situation with a clear situation. The comparison—reasoning by analogy—is made by comparing the features of the test case with the features of a “positive paradigm case” and a “negative paradigm case.” A feature is a characteristic that distinguishes a paradigm case from the test case, the subject of the analysis. A negative paradigm is a clear or uncontroversial example of an action that is wrong or morally impermissible; a positive paradigm is a clear and uncontroversial example of an action that is right or morally permissible.

Casuistry, or line drawing, can be used to resolve two distinct kinds of questions. First, it can be used to resolve an application issue, for example, to determine whether an action really constitutes a bribe. Second, it can be used to resolve a moral issue, for example, once we have determined that an offer really is (or is not) a bribe, whether or not we should accept it or offer it. Of course, in most circumstances, a bribe should not be accepted or offered, but offering or accepting a bribe might be justifiable in a few cases. To cite an earlier example, during World War II, if I could have bribed a Nazi guard to get my grandmother out of a concentration camp, I might decide that offering a bribe is justifiable.

The following example illustrates how casuistry can be used to settle an application issue and to settle a moral issue. Denise is an engineer at a large construction firm. Her job requires that she specify rivets for the construction of a large apartment building. She has the power to make the decision by herself. After some research and testing, she decides to use ACME rivets for the job, because, indeed, they are the best product. The day after she orders the rivets, an ACME representative visits her and gives her a voucher for an all-expense paid trip to the ACME Technical Forum in Jamaica. The voucher is worth $5,000, and the four-day trip will include 18 hours of classroom instruction, time in the evening for sightseeing, and a day-long tour of the coastline. The time will be roughly divided between education and pleasure. Does this trip constitute a bribe? A line-drawing analysis might look like Table 1 .

TABLE 1 Line-Drawing Analysis for Resolving an Application Issue

| Features | Positive Paradigm | Test Case | Negative Paradigm |
| --- | --- | --- | --- |
| Gift size | $1.00 | _ _ _ _ _ _ _ X _ _ | $5,000 |
| Timing | After decision | X _ _ _ _ _ _ _ _ _ | Before decision |
| Reason | Education | _ _ _ _ _ X _ _ _ _ | Pleasure |
| Power to make decisions | With others | _ _ _ _ _ _ _ X _ _ | Alone |
| Quality of product | Best | _ X _ _ _ _ _ _ _ _ | Worst |

In a line-drawing analysis, one must decide not only where to place the "X's" on the spectrum, but also how much "weight" or importance to give each "X." Some features may be more important than others. For example, one might decide that because the offer was made after the decision to buy ACME rivets, the gift cannot be considered a bribe. It may be a bribe, however, to other engineers, who may believe that buying ACME products results in offers of nice trips. However, to Denise it is certainly not a paradigm bribe.

Line-drawing analysis can also be used to determine whether Denise should take the trip. Even if she decides the trip is not a bribe, she might still decide not to accept the offer. The features important to this decision may be different from the ones in the first analysis, although there may be some overlap. In the second analysis, it will be important to consider the influence of the gift on future decisions by Denise and other engineers, the company policy on accepting gifts, and the appearance of bribery if the gift is accepted. Some features from the first analysis, such as the educational value of the technical forum, would be relevant here too. Table 2 is a line-drawing analysis to resolve the moral question of whether Denise should accept the offer.

TABLE 2 Line-Drawing Analysis for Resolving a Moral Issue

| Features | Positive Paradigm | Test Case | Negative Paradigm |
| --- | --- | --- | --- |
| Influence on future decisions | None | _ _ _ _ _ X _ _ | Great |
| Company policy | May accept | _ X _ _ _ _ _ _ | May not accept |
| Appearance | No problem | _ _ _ _ _ X _ _ | Appearance of a bribe |
| Educational value | Great | _ _ X _ _ _ _ _ | Minimal |

According to the analysis in Table 2, the issue is not clear. However, the problems associated with accepting the gift are serious enough that Denise probably should not accept it. In the next section, I shall suggest conditions under which accepting the gift would probably be morally permissible.

But first, here are some concluding thoughts about the method of casuistry. In general, the more features that are included in an analysis, the better. For the sake of simplicity, I used only four or five, but the more features you include, the more helpful and accurate the analysis becomes.
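As a deliberately oversimplified illustration of that bookkeeping, the sketch below treats each feature in Table 2 as a position between the positive paradigm (0.0) and the negative paradigm (1.0), attaches a weight, and computes a weighted average. The positions approximate the X placements in Table 2, the weights are assumptions, and the numeric score is only a summary of judgments already made, not a substitute for them.

```python
# A sketch of line drawing as a weighted position score.
# Position 0.0 = positive paradigm, 1.0 = negative paradigm.
# Positions approximate the X placements in Table 2; the weights are assumptions.

features = [
    # (feature, position toward the negative paradigm, weight)
    ("influence on future decisions", 0.7, 3),
    ("company policy on gifts",       0.2, 2),
    ("appearance of a bribe",         0.7, 3),
    ("educational value of the trip", 0.3, 1),
]

score = sum(pos * w for _, pos, w in features) / sum(w for _, _, w in features)

print(f"Weighted score: {score:.2f} (0 = positive paradigm, 1 = negative paradigm)")
if score > 0.5:
    print("Leans toward the negative paradigm: Denise probably should not accept the trip.")
else:
    print("Leans toward the positive paradigm: accepting the trip looks permissible.")
```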

Casuistry is an inherently conservative method. In arriving at paradigm cases for comparison with test cases, we assume that our intuitive, common sense moral judgments are correct. This assumption is usually valid, but not always, particularly in areas where morality is changing or when the case involves a novel experience. It might be difficult to find uncontroversial paradigm cases for some issues in environmental ethics, for example.

For casuistry to work well in the context of a profession, the professional community must agree on paradigms of acceptable and unacceptable practice. Engineers must agree on paradigmatic examples of acceptable and unacceptable practice with respect to conflicts of interest, confidentiality, and other issues. In the area of medical ethics, for example, there is now widespread agreement about whether actions taken in certain publicized cases were moral or not. These agreed-upon benchmarks can then be compared to more controversial cases. I believe there has been less discussion of benchmark cases in engineering.

Creative Middle Ways

A third method of resolving a problem is finding a creative middle way. Suppose there is a conflict between two or more legitimate moral obligations and that two of them appear to be at loggerheads. Sometimes by creative thinking, it is possible to find a course of action that satisfies both, although perhaps not in the way that was originally supposed. For example, a plant might be emitting some dangerous pollutants that are environmentally harmful, but completely eliminating them would be so expensive that the plant would have to close, throwing many local inhabitants out of work. Assuming there is an obligation both to preserve jobs and to protect the environment, a creative middle way might be to eliminate the worst pollutants and forego a complete cleanup until more economical means of doing so can be found. This alternative would be particularly attractive if the remaining pollutants would not cause irreversible damage to the environment or to human health.

This solution, and most creative middle-way solutions, involves compromise. Environmentalists might not be completely satisfied with this solution because not all of the pollutants will be removed. Plant managers might not be completely satisfied because the solution will still involve considerable expenditures for pollution control. Nevertheless, environmentalists will accomplish something, and the plant owners can remain in the town and even build up a considerable amount of public good will.

In the line-drawing analysis presented in the previous section, there might also be a creative middle way. Suppose we take the two competing values: (1) the educational and recreational value of the trip; and (2) avoiding the appearance of bribery and undue influence on professional judgment. Denise's manager might suggest: (1) that she take the trip but that the company pay her expenses; and (2) that engineers who were not involved in the decision also be allowed to take the trip. Furthermore, it must be understood that company engineers will be allowed to attend the forum, at the company's expense, whether or not the company buys ACME products. This arrangement would only make sense, of course, if the forum is of very great technical value. This solution would allow Denise to honor competing obligations in a creative way.

Two limitations of this method come to mind. First, sometimes there is no creative middle way, even if it is desirable. In the example cited above, all of the pollutants may be so damaging to the environment that no half-way measures will work. Furthermore, there might not be a way to do the cleanup more economically. In that case, the plant might just have to close. In the line-drawing example, Denise’s company might not be able to pay her expenses. A second limitation is that sometimes the creative middle way is not morally appropriate. Sometimes one of the options is so morally repugnant that we must choose the other one. Still, a creative middle way is often a good solution to a complex, practical moral problem.

TOP-DOWN METHODS OF RESOLUTION

In some cases, the appeal to moral common sense may not be sufficient. In those cases, it may be useful to appeal to more fundamental moral ideas, such as those developed in philosophical theories. Although the role of moral theory in applied or practical ethics is controversial, I believe moral theorists have attempted to find fundamental moral ideals that can generate or explain all or most of our common-sense moral ideas. This goal has been only partially achieved, because there are at least two prevalent moral theories today, and neither one can explain the fundamental ideas of common morality in a completely satisfactory way. These two theories are utilitarianism, usually associated with Jeremy Bentham and John Stuart Mill, and the ethics of respect for persons, usually associated with Immanuel Kant. The main idea behind utilitarianism is to maximize overall human well-being; and the main idea behind the ethics of respect for persons is to respect the rights and moral agency of individuals.

Although the existence of two theories rather than one may be an embarrassment to theorists, practical ethicists can take a more positive attitude, because the conflict between the ideas behind these two theories often arises in real-world moral controversies. Common morality, at least in the West, may not be a seamless web. In fact, it may be composed of two strands: (1) considerations having to do with utility, or the well-being of the greatest number of people; and (2) considerations having to do with justice and the rights of individuals.

An understanding of moral theory could serve several functions in practical ethics. First, the two perspectives can often be helpful for identifying and sorting out different types of arguments and for recognizing that different types of arguments have deep moral roots. In arguments for and against strict protections for intellectual property, for example, knowing that some arguments are utilitarian can be helpful. From the utilitarian perspective, protecting intellectual property promotes the flourishing of technology and, thereby, the good of society. Utilitarian arguments can also be made that strong protections for intellectual property limit the sharing of new ideas in technology and are thereby detrimental to the general good. Arguments from the respect-for-persons perspective often focus on the individual's right to control, and reap the profits from, the fruits of his or her own labor, regardless of the impact on the larger society.

Second, understanding these fundamental, yet divergent, moral perspectives often enables an ethicist to anticipate a moral argument. Just thinking about the two theories and the kinds of arguments they would support could have led one to expect that some arguments regarding intellectual property would take the utilitarian approach and others would take the rights-of-ownership approach.

Third, familiarity with these two perspectives can sometimes help in determining whether there has been closure on a moral issue. If arguments from both perspectives lead to the same conclusions, we can be pretty confident that we have arrived at the right answer. If the arguments lead to different conclusions, the discussion is likely to continue. When different conclusions are reached, there is no algorithm, unfortunately, for deciding which moral perspective should prevail. In general, however, the Western emphasis on individual rights and respect for persons takes priority, unless harm to individuals is slight and the utility to society is very great. With these considerations in mind, we can now look at the two moral theories.

The Ethics of Utilitarianism

A principle of utilitarianism is that the right action will have the best consequences, and the best consequences are those that lead to the greatest happiness or well-being of everyone affected by the action. Consider the following case. Kevin is the engineering manager for the county road commission. He must decide what to do about Forest Drive, a local, narrow, two-lane road. Every year for the past seven years, at least one person has crashed a car into trees close to the road and been killed. Many other accidents have also occurred, causing serious injuries, wrecked cars, and damaged trees. Kevin is considering widening the road, which would require that 30 trees be cut down. Kevin is already receiving protests from local citizens who want to protect the beauty and ecological integrity of the area. Should Kevin widen the road?

In this case, the conflicting values are public health and safety on the one hand and the beauty and ecological integrity of the area on the other. Let us suppose that widening the road will save one life and prevent two serious injuries and five minor injuries a year. Not widening the road will preserve the beauty and ecological integrity of the area. Even though the preservation will increase the happiness of many people, the deaths and injuries are far more serious negative consequences for those who experience them. Therefore, the greatest total utility is probably served by widening the road.

Cost/benefit analysis is a form of utilitarianism. I sometimes refer to it as “utilitarianism with the numbers.” Instead of maximizing happiness, the focus is on balancing costs and benefits, both measured in money, and selecting the option that leads to the greatest net benefit, also measured in money. Consider an earlier case. ACME manufacturing has a plant in the small town of Springfield that employs about 10 percent of the community. As a consequence of some of its manufacturing procedures, the ACME plant releases bad-smelling fumes that annoy its neighbors, damage the local tourism trade, and have been linked to an increase in asthma in the area. The town of Springfield is considering issuing an ultimatum to ACME to clean up the plant or pay a million-dollar fine. ACME has responded that it will close the plant rather than pay the fine. What should Springfield do?

A cost/benefit analysis might show the costs and benefits of not levying the fine and keeping the plant open (Table 3) or of levying the fine and losing the plant (Table 4).

TABLE 3 Cost/Benefit Analysis of Not Levying the Fine

| Item | Amount |
| --- | --- |
| Costs: health expenses | $1,000,000 |
| Costs: nuisance odor | $50,000 |
| Costs: decline in housing values | $1,000,000 |
| Costs: decline in tourism | $50,000 |
| Benefits: wages | $10,000,000 |
| Benefits: taxes | $2,000,000 |
| Total | +$9,900,000 |

TABLE 4 Cost/Benefit Analysis of Levying the Fine

| Item | Amount |
| --- | --- |
| Costs: loss of wages | $10,000,000 |
| Costs: loss of tax revenue | $2,000,000 |
| Costs: decline in housing values | $2,000,000 |
| Benefits: fine | $1,000,000 |
| Benefits: increase in tourism | $50,000 |
| Benefits: health savings | $900,000 |
| Total | −$12,050,000 |

According to these analyses, the economic consequences of fining ACME would be much greater than the consequences of not fining ACME. Thus, the fine should not be levied.

There are two major problems with utilitarianism. One is that an accurate analysis requires a lot of factual information. This is especially evident in the cost/benefit analyses above. One must know the amounts to assign to the various costs and benefits. Even in an analysis that is not done in the cost/benefit way, the consequences of various courses of action must be known before the course of action that will have the greatest overall utility can be known. A second problem is that a utilitarian analysis can sometimes justify unjust consequences. For example, a decision not to force the plant to stop polluting will result in some people getting sick, even though overall utility will be maximized. These problems suggest that a complete analysis should also include the ethics of respect for persons.
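For reference, the arithmetic behind Tables 3 and 4 is simply net benefit equals total benefits minus total costs. The sketch below recomputes both totals from the figures in the tables; the helper function is only an illustration of that comparison, not a general cost/benefit tool.

```python
# Net benefit = total benefits - total costs, using the figures in Tables 3 and 4.

def net_benefit(costs, benefits):
    return sum(benefits.values()) - sum(costs.values())

no_fine = net_benefit(
    costs={"health expenses": 1_000_000, "nuisance odor": 50_000,
           "decline in housing values": 1_000_000, "decline in tourism": 50_000},
    benefits={"wages": 10_000_000, "taxes": 2_000_000},
)

fine = net_benefit(
    costs={"loss of wages": 10_000_000, "loss of tax revenue": 2_000_000,
           "decline in housing values": 2_000_000},
    benefits={"fine": 1_000_000, "increase in tourism": 50_000,
              "health savings": 900_000},
)

print(f"Not levying the fine: {no_fine:+,}")   # +9,900,000
print(f"Levying the fine:     {fine:+,}")      # -12,050,000
print("Cost/benefit choice:", "do not levy the fine" if no_fine > fine else "levy the fine")
```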

The Ethics of Respect for Persons

From the utilitarian point of view, harm to one person can be justified by a bigger benefit to someone else. In the ethics of respect for persons, there are some things you may not do to a person, even for the benefit of others. The fundamental idea in the ethics of respect for persons is that you must respect each person as a free and equal moral agent—that is, as a person who has goals and values and a right to pursue those values as long as he or she does not violate the similar rights of others.

As this formulation suggests, the ethics of respect for persons emphasizes the rights of individuals, which are formulated, among other places, in various United Nations documents. Individual rights include the right to life and to the security of one’s person, the right not to be held in slavery, the right to freedom of thought and expression, and so forth. The problem with this formulation is that it does not give any clear indication of which rights are most important. When rights conflict, it is important to know which ones are most important.

Alan Gewirth, a contemporary philosopher, has suggested that there are three levels of rights (Gewirth, 1978). Level I, the most important rights, includes the right to life, the right to bodily integrity, and the right to mental integrity. I would add to those the right to free and informed consent to actions that affect one. Level II includes the right not to be deceived, cheated, robbed, defamed, or lied to. It also includes the right to free speech. Level III includes the right to acquire property and the right to be free of discrimination. For Gewirth, Level I rights are the fundamental rights necessary for effective moral agency. Level II rights are necessary to preserving one's moral agency. Level III rights are necessary to increasing one's level of effective moral agency. Whether or not one accepts this arrangement, most of us would probably recognize that some rights are more important than others.

Consider the following case. Karen, who has been working as a design engineer under Andy, has learned that he is about to be offered a job as head safety inspector for all of the oil rigs the company owns in the region. Karen worries that Andy’s drinking may affect his ability to perform his new job and thereby endanger workers on the oil rigs. She asks Andy to turn down the new assignment, but he refuses. Should Karen take her concerns to management? In this case, Andy’s right to advance his career (by trying to acquire property), which is a Level III right, conflicts with the workers’ rights to life and bodily integrity, which are Level I rights. In this conflict, the rights of the workers are more important, and Karen should take her concern to management.
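Gewirth's ordering can be made explicit with a small lookup table: a lower level number marks a more fundamental right, and in a conflict the lower level presumptively prevails. The level assignments below follow the text; the strict numeric comparison is my own simplification, since, as noted next, real conflicts also turn on whether a right is violated or merely infringed.

```python
# A sketch of resolving a conflict of rights using Gewirth's three levels.
# Level 1 rights are the most fundamental; in a conflict, the lower level
# presumptively takes priority. Level assignments follow the text above.

RIGHT_LEVELS = {
    "life": 1,
    "bodily integrity": 1,
    "mental integrity": 1,
    "informed consent": 1,
    "not to be deceived": 2,
    "free speech": 2,
    "acquire property": 3,
    "freedom from discrimination": 3,
}

def presumptive_priority(right_a, right_b):
    """Return the right that presumptively prevails, or None if the levels are equal."""
    level_a, level_b = RIGHT_LEVELS[right_a], RIGHT_LEVELS[right_b]
    if level_a == level_b:
        return None
    return right_a if level_a < level_b else right_b

# Karen's case: Andy's Level III interest in advancing his career (acquiring
# property) conflicts with the rig workers' Level I rights to life and bodily
# integrity, so the workers' rights presumptively prevail.
print(presumptive_priority("acquire property", "life"))  # -> life
```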

In arbitrating conflicts between rights, two additional issues should be kept in mind. First, there is a distinction between violating and infringing a right. A right is violated if it is denied entirely. I violate your right to life if I kill you. A right is infringed if it is limited or diminished in some way. A plant infringes on my right to life if it emits a pollutant that increases my risk of dying of cancer. Second, rights can be forfeited by violating or perhaps infringing on the rights of others. I may forfeit my right to life if I kill someone else. I may forfeit some right (perhaps the right to free movement) if I steal from others and thus infringe on or violate their right not to be robbed.

Finally, the Golden Rule is also a principle associated with the ethics of respect for persons. Most cultures have a version of the Golden Rule. The Christian version requires that we treat others as we would have them treat us. In the Islamic version, no man is a true believer unless he desires for his brother that which he desires for himself. If we consider ourselves to be moral agents, the Golden Rule requires that we treat others as moral agents as well.

There are two primary problems with the ethics of respect for persons. First, the rights test and the Golden Rule are sometimes difficult to apply. We must determine when there is a conflict of rights, which rights are most important, and whether rights have been violated or merely infringed upon. With the Golden Rule, we must assume that others have the same values we do. If they do not, treating them as we would wish to be treated may be unfair. Second, it may be justifiable at times to allow considerations of utility to override considerations of the ethics of respect for persons, especially if the infringements of rights are relatively minor and the benefit to the general welfare is great.

I have presented a number of tools for analyzing and resolving ethical problems. The important thing to keep in mind, however, is that these tools cannot be used in a mechanical way. They are not algorithms. One must decide if the issue to be resolved is really factual or conceptual, for example. One must also decide when the line-drawing method or finding a creative middle way is most appropriate and when an issue can best be approached as a conflict between general human welfare (utility) and the rights of individuals (the ethics of respect for persons). When there is such a conflict, there is no mechanical way to determine which perspective should be considered most important. In the West, we accord great importance to individual rights, but they do not always take precedence. The techniques and methods I have described are helpful for thinking about ethical issues, but they are no substitute for moral insight and moral wisdom.

Gewirth, A. 1978. Reason and Morality. Chicago: University of Chicago Press.

Hare, R.M. 1981. Moral Thinking: Its Levels, Method, and Point. Oxford, U.K.: Oxford University Press.

Engineers and ethicists participated in a workshop to discuss the responsible development of new technologies. Presenters examined four areas of engineering—sustainability, nanotechnology, neurotechnology, and energy—in terms of the ethical issues they present to engineers in particular and society as a whole. Approaches to ethical issues include: analyzing the factual, conceptual, application, and moral aspects of an issue; evaluating the risks and responsibilities of a particular course of action; and using theories of ethics or codes of ethics developed by engineering societies as a basis for decision making. Ethics can be built into the education of engineering students and professionals, either as an aspect of courses already being taught or as a component of engineering projects to be examined along with research findings. Engineering practice workshops can also be effective, particularly when they include discussions with experienced engineers. This volume includes papers on all of these topics by experts in many fields. The consensus among workshop participants is that material on ethics should be an ongoing part of engineering education and engineering practice.




New Ethics Case Studies Published


NSPE's Board of Ethical Review has published six new case studies that provide engineering ethics guidance using fact-based scenarios. The cases cover the topics of plan stamping; gifts; the public health, safety, and welfare; conflicts of interest; responsible charge; and job qualifications. NSPE established the Board of Ethical Review in June 1954 in response to many requests by engineers, state societies, and chapters for interpretations of the Code of Ethics in specific circumstances. Since the publication of the first case in 1958, which involved questionable actions on a World Bank-financed hydroelectric project, the case catalog has grown to nearly 650.

Today, there are many real-world examples in which engineering ethics has a direct impact on the public, especially those related to technology advancement. For example, NSPE encourages policymakers to protect the public health, safety, and welfare when developing artificial intelligence and autonomous vehicles. In comments to the National Institute of Standards and Technology in August, NSPE called for the involvement of ethically accountable licensed professional engineers or duly certified individuals in the AI development process. The Society has also called on NIST to create AI technical standards that include an ethical framework that can be applied universally in the development of AI decision-making.

Each of the BER's just-released cases dives into subjects that practicing professional engineers and engineer interns can face on the job. In Case 20-4, a PE for a metropolitan water commission and a consulting engineer retained by the commission face ethical dilemmas surrounding the commission's consideration of a change in its water supply source, a change with public health, safety, and welfare implications. In another case (20-1), an engineer intern applies for a position at a consulting firm. The job requires the candidate to hold a PE license or to become licensed within 90 days. The firm offers the job to the engineer intern, but complications arise when the EI fails the PE exam and is found to have withheld information from the firm.


The Space Shuttle Challenger Disaster

A case study looking at the explosion of the Challenger Space Shuttle.

On January 28, 1986, seven astronauts were killed when the space shuttle they were piloting, the Challenger, exploded just over a minute into the flight. The failure of the solid rocket booster O-rings to seat properly allowed hot combustion gases to leak from the side of the booster and burn through the external fuel tank. The failure of the O-ring was attributed to several factors, including faulty design of the solid rocket boosters, insufficient low-temperature testing of the O-ring material and the joints that the O-ring sealed, and lack of proper communication between different levels of NASA management.

Instructor's Guide


Instructor Guidelines

Prior to class discussion, ask the students to read the student handout outside of class. In class, the details of the case can be reviewed with the aid of the overheads. Reserve about half of the class period for an open discussion of the issues. The issues covered in the student handout include the importance of an engineer's responsibility to public welfare, the need for this responsibility to take precedence over any other responsibilities the engineer might have, and the responsibilities of a manager/engineer. A final point is that no matter how far removed from the public an engineer may think she is, all of her actions have potential impact.

Essay #6, "Loyalty and Professional Rights," appended at the end of the case listings in this report, will be relevant for instructors preparing to lead class discussion on this case. In addition, essays #1 through #4, appended at the end of the cases in this report, provide relevant background information for the instructor preparing to lead classroom discussion. Their titles are, respectively: "Ethics and Professionalism in Engineering: Why the Interest in Engineering Ethics?," "Basic Concepts and Methods in Ethics," "Moral Concepts and Theories," and "Engineering Design: Literature on Social Responsibility Versus Legal Liability."

Questions for Class Discussion

1. What could NASA management have done differently?

2. What, if anything, could their subordinates have done differently?

3. What should Roger Boisjoly have done differently (if anything)? In answering this question, keep in mind that at his age, the prospect of finding a new job if he was fired was slim. He also had a family to support.

4. What do you (the students) see as your future engineering professional responsibilities in relation to both being loyal to management and protecting the public welfare?

5. How does the implied social contract of professionals apply to this case?

6. What professional responsibilities were neglected, if any?

7. Should NASA have done anything differently in its launch decision procedure?

Student Handout - Synopsis

On January 28, 1986, seven astronauts were killed when the space shuttle they were piloting, the Challenger, exploded just over a minute into flight. The failure of the solid rocket booster O-rings to seat properly allowed hot combustion gases to leak from the side of the booster and burn through the external fuel tank. The failure of the O-ring was attributed to several factors, including faulty design of the solid rocket boosters, insufficient low temperature testing of the O-ring material and the joints that the O-ring sealed, and lack of communication between different levels of NASA management.

Organization and People Involved

Marshall Space Flight Center - In charge of booster rocket development

Larry Mulloy - Challenged the engineers' decision not to launch

Morton Thiokol - Contracted by NASA to build the Solid Rocket Booster

Alan McDonald - Director of the Solid Rocket Motors Project

Bob Lund - Engineering Vice President

Robert Ebeling - Engineer who worked under McDonald

Roger Boisjoly - Engineer who worked under McDonald

Joe Kilminster - Engineer in a management position

Jerald Mason - Senior Executive who encouraged Lund to reassess his decision not to launch.

1974 - Morton-Thiokol awarded contract to build solid rocket boosters.

1976 - NASA accepts Morton-Thiokol's booster design.

1977 - Morton-Thiokol discovers joint rotation problem.

November 1981 - O-ring erosion discovered after second shuttle flight.

January 24, 1985 - shuttle flight 51-C exhibited the worst O-ring blow-by to date.

July 1985 - Thiokol orders new steel billets for new field joint design.

August 19, 1985 - NASA Level I management briefed on booster problem.

January 27, 1986 - night teleconference to discuss effects of cold temperature on booster performance.

January 28, 1986 - Challenger explodes 72 seconds after liftoff.

NASA managers were anxious to launch the Challenger for several reasons, including economic considerations, political pressures, and scheduling backlogs. Unforeseen competition from the European Space Agency put NASA in a position where it would have to fly the shuttle dependably on a very ambitious schedule in order to prove the Space Transportation System's cost effectiveness and potential for commercialization. This prompted NASA to schedule a record number of missions in 1986 to make a case for its budget requests. The shuttle mission just prior to the Challenger had been delayed a record number of times due to inclement weather and mechanical factors.

NASA wanted to launch the Challenger without any delays so the launch pad could be refurbished in time for the next mission, which would be carrying a probe to examine Halley's Comet. If launched on time, this probe would have collected data a few days before a similar Russian probe was launched. There was probably also pressure to launch the Challenger so it could be in space when President Reagan gave his State of the Union address. Reagan's main topic was to be education, and he was expected to mention the shuttle and the first teacher in space, Christa McAuliffe.

The shuttle solid rocket boosters (SRBs) are key elements in the operation of the shuttle. Without the boosters, the shuttle cannot produce enough thrust to overcome the earth's gravitational pull and achieve orbit.

There is an SRB attached to each side of the external fuel tank. Each booster is 149 feet long and 12 feet in diameter. Before ignition, each booster weighs 2 million pounds. Solid rockets in general produce much more thrust per pound than their liquid-fuel counterparts. The drawback is that once the solid rocket fuel has been ignited, it cannot be turned off or even controlled, so it was extremely important that the shuttle SRBs were properly designed. Morton Thiokol was awarded the contract to design and build the SRBs in 1974. Thiokol's design is a scaled-up version of the Titan missile booster, which had been used successfully for years. NASA accepted the design in 1976. The booster comprises seven hollow metal cylinders. The solid rocket fuel is cast into the cylinders at the Thiokol plant in Utah, and the cylinders are assembled into pairs for transport to Kennedy Space Center in Florida. At KSC, the four booster segments are assembled into a completed booster rocket. The joints where the segments are joined together at KSC are known as field joints (see Figure 1).

Figure 1. Cross section of a Challenger SRB field joint.

These field joints consist of a tang and clevis joint. The tang and clevis are held together by 177 clevis pins. Each joint is sealed by two O-rings, the bottom ring known as the primary O-ring and the top known as the secondary O-ring. (The Titan booster had only one O-ring. The second ring was added as a measure of redundancy, since the boosters would be lifting humans into orbit. Except for the increased scale of the rocket's diameter, this was the only major difference between the shuttle booster and the Titan booster.) The purpose of the O-rings is to prevent hot combustion gases from escaping from the inside of the motor. To provide a barrier between the rubber O-rings and the combustion gases, a heat-resistant putty is applied to the inner section of the joint prior to assembly. The gap between the tang and the clevis determines the amount of compression on the O-ring. To minimize the gap and increase the squeeze on the O-ring, shims are inserted between the tang and the outside leg of the clevis.

Launch Delays

The first delay of the Challenger mission was because of a weather front expected to move into the area, bringing rain and cold temperatures. Usually a mission wasn't postponed until inclement weather actually entered the area, but the Vice President was expected to be present for the launch, and NASA officials wanted to spare him an unnecessary trip to Florida, so they postponed the launch early. The Vice President was a key spokesperson for the President on the space program, and NASA coveted his good will.

The weather front stalled, and the launch window had perfect weather conditions; but the launch had already been postponed to keep the Vice President from unnecessarily traveling to the launch site. The second launch delay was caused by a defective micro switch in the hatch locking mechanism and by problems in removing the hatch handle. By the time these problems had been sorted out, winds had become too high. The weather front had started moving again, and appeared to be bringing record-setting low temperatures to the Florida area.

NASA wanted to check with all of its contractors to determine if there would be any problems with launching in the cold temperatures. Alan McDonald, director of the Solid Rocket Motor Project at Morton Thiokol, was convinced that there were cold weather problems with the solid rocket motors and contacted two of the engineers working on the project, Robert Ebeling and Roger Boisjoly. Thiokol knew there was a problem with the boosters as early as 1977, and had initiated a redesign effort in 1985. NASA Level I management had been briefed on the problem on August 19, 1985. Almost half of the shuttle flights had experienced O-ring erosion in the booster field joints. Ebeling and Boisjoly had complained to Thiokol that management was not supporting the redesign task force.

Engineering Design

The size of the gap is controlled by several factors, including the dimensional tolerances of the metal cylinders and their corresponding tang or clevis, the ambient temperature, the diameter of the O-ring, the thickness of the shims, the loads on the segment, and quality control during assembly. When the booster is ignited, the putty is displaced, compressing the air between the putty and the primary O-ring.

The air pressure forces the O-ring into the gap between the tang and clevis. Pressure loads are also applied to the walls of the cylinder, causing the cylinder to balloon slightly. This ballooning of the cylinder walls causes the gap between the tang and clevis to open, an effect that has come to be known as joint rotation. Morton Thiokol discovered this joint rotation as part of its testing program in 1977. Thiokol discussed the problem with NASA and started analyzing and testing to determine how to increase the O-ring compression, thereby decreasing the effect of joint rotation. Three design changes were implemented:

1. Dimensional tolerances of the metal joint were tightened.

2. The O-ring diameter was increased, and its dimensional tolerances were tightened.

3. The use of the shims mentioned above was introduced.

Further testing by Thiokol revealed that the second seal, in some cases, might not seal at all. Additional changes in the shim thickness and O-ring diameter were made to correct the problem.
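To make the relationship between gap, shim thickness, and O-ring squeeze concrete, the short sketch below computes the fractional compression of an O-ring of circular cross section seated in a tang-and-clevis gap. It is a minimal illustration only: the dimensions are hypothetical placeholders rather than actual SRB joint values, and the formula ignores real-joint details such as groove fill, thermal contraction, and O-ring resiliency.

    # Illustrative O-ring "squeeze" calculation. All dimensions are hypothetical
    # placeholders; the real SRB field-joint geometry is considerably more complex.

    def oring_squeeze(oring_diameter_in, gap_in, shim_thickness_in=0.0):
        """Fractional compression of an O-ring with a circular cross section.

        A shim reduces the effective gap between the tang and the outside
        leg of the clevis, which increases the squeeze on the O-ring.
        """
        effective_gap = gap_in - shim_thickness_in
        if effective_gap <= 0:
            raise ValueError("shim cannot close the gap completely")
        return (oring_diameter_in - effective_gap) / oring_diameter_in

    # A 0.280 in cross-section O-ring in a 0.260 in gap, with and without a 0.010 in shim:
    print(f"no shim:       {oring_squeeze(0.280, 0.260):.1%} squeeze")
    print(f"with shim:     {oring_squeeze(0.280, 0.260, 0.010):.1%} squeeze")

    # Joint rotation at ignition widens the gap, so squeeze drops just when sealing matters most:
    print(f"rotated joint: {oring_squeeze(0.280, 0.260 + 0.020):.1%} squeeze")

With these made-up numbers, a few hundredths of an inch of joint rotation is enough to eliminate the squeeze entirely, which is why the design changes above focused on tightening tolerances, enlarging the O-ring, and shimming the joint.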

A new problem was discovered in November 1981, after the flight of the second shuttle mission. Examination of the booster field joints revealed that the O-rings were eroding during flight. The joints were still sealing effectively, but the O-ring material was being eaten away by hot gases that escaped past the putty. Thiokol studied different types of putty and different application methods to determine their effects on reducing O-ring erosion. Shuttle flight 51-C of January 24, 1985, was launched during some of the coldest weather in Florida history. Upon examination of the booster joints, engineers at Thiokol noticed black soot and grease on the outside of the booster casing, caused by actual gas blow-by. This prompted Thiokol to study the effects of O-ring resiliency at low temperatures. They conducted laboratory tests of O-ring compression and resiliency between 50°F and 100°F. In July 1985, Morton Thiokol ordered new steel billets to be used for a redesigned case field joint. At the time of the accident, these new billets were not yet ready, because such billets take many months to manufacture.

The Night Before the Launch

Temperatures for the next launch date were predicted to be in the low 20s (°F). This prompted Alan McDonald to ask his engineers at Thiokol to prepare a presentation on the effects of cold temperature on booster performance. A teleconference was scheduled for the evening before the rescheduled launch in order to discuss the low-temperature performance of the boosters. This teleconference was held between engineers and management from Kennedy Space Center, Marshall Space Flight Center in Alabama, and Morton Thiokol in Utah. Boisjoly and another engineer, Arnie Thompson, knew this would be another opportunity to express their concerns about the boosters, but they had only a short time to prepare their data for the presentation.[1]

Thiokol's engineers gave an hour-long presentation, presenting a convincing argument that the cold weather would exaggerate the problems of joint rotation and delayed O-ring seating. The lowest temperature experienced by the O-rings in any previous mission was 53°F, the January 24, 1985 flight. With a predicted ambient temperature of 26°F at launch, the O-rings were estimated to be at 29°F. After the technical presentation, Thiokol's Engineering Vice President Bob Lund presented the conclusions and recommendations.

His main conclusion was that 53°F was the lowest temperature for which Thiokol had data on the effects of cold on the operational boosters, and the boosters had experienced O-ring erosion at that temperature. Since his engineers had no low-temperature data below 53°F, they could not prove that it was unsafe to launch at lower temperatures. He read his recommendations and commented that the predicted temperature for the morning's launch was outside the database, and that NASA should delay the launch so the ambient temperature could rise until the O-ring temperature was at least 53°F. This confused NASA managers because the booster design specifications called for booster operation as low as 31°F. (It later came out in the investigation that Thiokol understood the 31°F limit to apply to storage of the booster, and that the launch temperature limit was 40°F. Because of this, dynamic tests of the boosters had never been performed below 40°F.)

Marshall's Solid Rocket Booster Project Manager, Larry Mulloy, commented that the data was inconclusive and challenged the engineers' logic. A heated debate went on for several minutes before Mulloy bypassed Lund and asked Joe Kilminster for his opinion. Kilminster was in management, although he had an extensive engineering background. By bypassing the engineers, Mulloy was calling for a middle-management decision, but Kilminster stood by his engineers. Several other managers at Marshall expressed their doubts about the recommendations, and finally Kilminster asked for a meeting off the net so Thiokol could review its data.

Boisjoly and Thompson tried to convince their senior managers to stay with their original decision not to launch. A senior executive at Thiokol, Jerald Mason, commented that a management decision was required. The managers seemed to believe the O-rings could be eroded up to one third of their diameter and still seat properly, regardless of the temperature. The data presented to them showed no correlation between temperature and the blow-by gases which had eroded the O-rings in previous missions. According to testimony by Kilminster and Boisjoly, Mason finally turned to Bob Lund and said, "Take off your engineering hat and put on your management hat." Joe Kilminster wrote out the new recommendation and went back online with the teleconference.

The new recommendation stated that the cold was still a safety concern, but their people had found that the original data was indeed inconclusive and their "engineering assessment" was that launch was recommended, even though the engineers had no part in writing the new recommendation and refused to sign it. Alan McDonald, who was present with NASA management in Florida, was surprised to see the recommendation to launch and appealed to NASA management not to launch. NASA managers decided to approve the boosters for launch despite the fact that the predicted launch temperature was outside of their operational specifications.

During the night, temperatures dropped to as low as 8°F, much lower than had been anticipated. In order to keep the water pipes in the launch platform from freezing, safety showers and fire hoses had been turned on. Some of this water had accumulated, and ice had formed all over the platform. There was some concern that the ice would fall off of the platform during launch and might damage the heat resistant tiles on the shuttle. The ice inspection team thought the situation was of great concern, but the launch director decided to go ahead with the countdown.

Note that safety limitations on low-temperature launching had to be waived and authorized by key personnel several times during the final countdown. These key personnel were not aware of the teleconference about the solid rocket boosters that had taken place the night before. At launch, the impact of ignition broke loose a shower of ice from the launch platform. Some of the ice struck the left-hand booster, and some ice was actually sucked into the booster nozzle itself by an aspiration effect. Although there was no evidence of any ice damage to the Orbiter itself, NASA's analysis of the ice problem was wrong. The booster ignition transient started six hundredths of a second after the igniter fired. The aft field joint on the right-hand booster was the coldest spot on the booster: about 28°F. The booster's segmented steel casing ballooned and the joint rotated, expanding inward as it had on all other shuttle flights.

The primary O-ring was too cold to seat properly, the cold-stiffened heat-resistant putty that protected the rubber O-rings from the fuel collapsed, and gases at over 5000°F burned past both O-rings across seventy degrees of arc. Eight hundredths of a second after ignition, the shuttle lifted off. Engineering cameras focused on the right-hand booster showed about nine smoke puffs coming from the booster aft field joint. Before the shuttle cleared the tower, oxides from the burnt propellant temporarily sealed the field joint before flames could escape. Fifty-nine seconds into the flight, the Challenger experienced the most violent wind shear ever encountered on a shuttle mission. The glassy oxides that sealed the field joint were shattered by the stresses of the wind shear, and within seconds flames from the field joint burned through the external fuel tank. Hundreds of tons of propellant ignited, tearing the shuttle apart. One hundred seconds into the flight, the last bit of telemetry data was transmitted from the Challenger.
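Before turning to the discussion issues, it is worth revisiting the claim, made during the teleconference, that the available data showed no correlation between temperature and blow-by. A well-documented lesson of the subsequent investigation is that the comparison was effectively limited to flights that had already shown O-ring distress; when flights with no distress are included, the temperature effect is much easier to see. The sketch below illustrates that statistical point with made-up figures, not the historical flight record.

    # Illustrative sketch of how excluding "no-incident" flights can hide a
    # temperature effect. Temperatures (degrees F) and incident counts below are
    # invented for demonstration purposes; they are not the actual flight data.
    flights = [
        (53, 3), (57, 1), (58, 1), (63, 1), (70, 1), (70, 0), (72, 0),
        (75, 2), (75, 0), (76, 0), (78, 0), (79, 0), (81, 0),
    ]

    def mean(values):
        return sum(values) / len(values)

    def correlation(pairs):
        """Pearson correlation between launch temperature and incident count."""
        temps, counts = zip(*pairs)
        mt, mc = mean(temps), mean(counts)
        cov = sum((t - mt) * (c - mc) for t, c in pairs)
        st = sum((t - mt) ** 2 for t in temps) ** 0.5
        sc = sum((c - mc) ** 2 for c in counts) ** 0.5
        return cov / (st * sc)

    incident_flights_only = [(t, c) for t, c in flights if c > 0]
    print(f"all flights:           r = {correlation(flights):+.2f}")
    print(f"incident flights only: r = {correlation(incident_flights_only):+.2f}")

With these illustrative numbers, the correlation looks weak when only the incident flights are compared but is strongly negative once the incident-free flights are included, which is exactly the kind of check that a fuller look at the flight history would have supported.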

Issues For Discussion

The Challenger disaster raises several issues that are relevant to engineers. These issues pose many questions that may not have definite answers, but they can serve to heighten the awareness of engineers faced with similar situations. One of the most important issues concerns engineers who are placed in management positions. It is important that these managers not ignore their own engineering experience, or the expertise of their subordinate engineers. Often a manager, even if she has engineering experience, is not as up to date on current engineering practices as the actual practicing engineers are. She should keep this in mind when making any decision that involves an understanding of technical matters. Another issue is that managers encouraged launching despite there being insufficient low-temperature data.

Since there was not enough data available to make an informed decision, the lack of data was not, in their opinion, grounds for stopping a launch. This was a reversal of the thinking that prevailed in the early years of the space program, which discouraged launching until all the facts were known about a particular problem. The same reasoning can be traced back to an earlier phase of the shuttle program, when upper-level NASA management was alerted to problems in the booster design yet did not halt the program while the problem was being solved. To better understand the responsibility of the engineer, some key elements of an engineer's professional responsibilities should be examined. This will be done from two perspectives: the implicit social contract between engineers and society, and the guidance of the codes of ethics of professional societies.

As engineers test designs for ever-increasing speeds, loads, capacities, and the like, they must always be aware of their obligation to society to protect the public welfare. After all, the public has provided engineers, through the tax base, with the means for obtaining an education and, through legislation, the means to license and regulate themselves. In return, engineers have a responsibility to protect the safety and well-being of the public in all of their professional efforts. This is part of the implicit social contract all engineers agreed to when they accepted admission to an engineering college. The first canon in the ASME Code of Ethics urges engineers to "hold paramount the safety, health and welfare of the public in the performance of their professional duties." Every major engineering code of ethics reminds engineers of the importance of their responsibility to keep the safety and well-being of the public at the top of their list of priorities. Although company loyalty is important, it must not be allowed to override the engineer's obligation to the public. Marcia Baron, in an excellent monograph on loyalty, states: "It is a sad fact about loyalty that it invites...single-mindedness. Single-minded pursuit of a goal is sometimes delightfully romantic, even a real inspiration. But it is hardly something to advocate to engineers, whose impact on the safety of the public is so very significant. Irresponsibility, whether caused by selfishness or by magnificently unselfish loyalty, can have most unfortunate consequences."[2]

Annotated Bibliography and Suggested References

Feynman, Richard P. What Do You Care What Other People Think?: Further Adventures of a Curious Character. Bantam Doubleday Dell, 1992. ISBN 0553347845.

Lewis, Richard S. Challenger: The Final Voyage. Columbia University Press, New York, 1988.

McConnell, Malcolm. Challenger: A Major Malfunction. Doubleday, Garden City, N.Y., 1987.

Trento, Joseph J. Prescription for Disaster. Crown, New York, 1987.

United States Congress, House Committee on Science and Technology. Investigation of the Challenger Accident: Hearings before the Committee on Science and Technology, U.S. House of Representatives, Ninety-ninth Congress, Second Session. U.S. G.P.O., Washington, 1986.

United States Congress, House Committee on Science and Technology. Investigation of the Challenger Accident: Report of the Committee on Science and Technology, House of Representatives, Ninety-ninth Congress, Second Session. U.S. G.P.O., Washington, 1986.

United States Congress, House Committee on Science, Space, and Technology. NASA's Response to the Committee's Investigation of the "Challenger" Accident: Hearing before the Committee on Science, Space, and Technology, U.S. House of Representatives, One Hundredth Congress, First Session, February 26, 1987. U.S. G.P.O., Washington, 1987.

United States Congress, Senate Committee on Commerce, Science, and Transportation, Subcommittee on Science, Technology, and Space. Space Shuttle Accident: Hearings before the Subcommittee on Science, Technology, and Space of the Committee on Commerce, Science, and Transportation, United States Senate, Ninety-ninth Congress, Second Session, on the Space Shuttle Accident and the Rogers Commission Report, February 18, June 10, and 17, 1986. U.S. G.P.O., Washington, 1986.

1 "Challenger: A Major Malfunction." (See above) p. 194.

[2] Baron, Marcia. The Moral Status of Loyalty. Illinois Institute of Technology, Center for the Study of Ethics in the Professions, 1984, p. 9. One of a series of monographs on applied ethics that deals specifically with the engineering profession. Provides arguments both for and against loyalty. 28 pages, with notes and an annotated bibliography.

Department of Philosophy and Department of Mechanical Engineering, Texas A&M University. NSF Grant Number: DIR-9012252



Cases Available Online

  • Cases on Engineering Practice Cases from the Online Ethics Center for Science and Engineering
  • National Society of Professional Engineers Search the archive of case studies brought before the NSPE's Board of Ethical Review. Cases run from 1959 to 2006.
  • Engineering Ethics Cases from Texas A & M University Though the design of this website is a bit dated, this is an extremely good collection of famous case studies maintained by Texas A&M University's Murdoch Center for Engineering.
  • Engineering.com Ethics Cases A collection of eighteen famous case studies which include a time line of events, a detailed examination of the incident, and oftentimes a bibliography for further investigation.
  • Engineering Ethics: Concepts and Cases Cases from the first edition of the book "Engineering Ethics: Concepts and Cases" by Charles Harris. Listed by name, engineering specialty, and a taxonomy of cases by subject dealt with.
  • Engineering Disasters and Learning from Failure A collection of cases on famous engineering failures maintained by the Materials Science and Engineering Department at the State University of New York at Stony Brook.
  • Ethics Education Library - Engineering Cases A collection of over 500 cases on various aspects of engineering ethics. Searchable by topic and keyword.

Books Containing Cases


  • Engineering Ethics: Concepts, Viewpoints, Cases and Codes by Jimmy Smith, Patricia Harper, and Richard Burgess Along with a number of seminal readings on engineering ethics, this book includes a large collection of codes of ethics and case studies.


Engineering Ethics Films

  • Incident at Morales Incident at Morales involves a variety of ethical issues faced by a company that wants to quickly build a plant in order to develop a new chemical product and gain a competitive edge.
  • Gilbane Gold by National Institute for Engineering Ethics This video is the fictional story of a chemical company in the city of Gilbane. The city has implemented a plan in which its sewage is processed and sold as fertilizer to local farmers, a product known locally as Gilbane Gold. The fertilizer project creates sizable revenue for the city, which has also sought to make itself attractive to business through numerous tax incentives. Not long ago, the city put in place stringent regulations on the amount of heavy metals that manufacturing plants located in Gilbane may discharge into the city's sewage. Z Corp, a company in Gilbane, has been releasing a higher level of heavy metals into the city's sewage than allowed. As a new employee, should the engineer David Jackson tell the city about this fact? A summary of the film, discussion questions, and a teaching guide are available at http://www.onlineethics.org/Resources/Cases/22240.aspx.

Mechanical Engineering Cases (Markkula Center for Applied Ethics)

A recently promoted manager at an industrial engineering company discovers that factory workers are asked to work more than eight hours a day without getting paid overtime.

A project engineer believes his company is providing the wrong form of technology to an in-need community in East Africa.

A systems engineering company employee quits after getting pressured to falsify product testing paperwork.

A manager at a nonprofit mechanical engineering firm questions how responsible her company should be for ongoing maintenance on past projects.

Should a production engineer prioritize a customer's desires over safety?
