J. Herkert et al. (2020). The Boeing 737 MAX: Lessons for Engineering Ethics
https://doi.org/10.1007/s11948-020-00252-y
ORIGINAL RESEARCH/SCHOLARSHIP
Abstract
The crash of two 737 MAX passenger aircraft in late 2018 and early 2019, and sub-
sequent grounding of the entire fleet of 737 MAX jets, turned a global spotlight on
Boeing’s practices and culture. Explanations for the crashes include: design flaws
within the MAX’s new flight control software system designed to prevent stalls;
internal pressure to keep pace with Boeing’s chief competitor, Airbus; Boeing’s
lack of transparency about the new software; and the lack of adequate monitoring
of Boeing by the FAA, especially during the certification of the MAX and follow-
ing the first crash. While these and other factors have been the subject of numerous
government reports and investigative journalism articles, little to date has been writ-
ten on the ethical significance of the accidents, in particular the ethical responsi-
bilities of the engineers at Boeing and the FAA involved in designing and certifying
the MAX. Lessons learned from this case include the need to strengthen the voice
of engineers within large organizations. There is also the need for greater involve-
ment of professional engineering societies in ethics-related activities and for broader
focus on moral courage in engineering ethics education.
Introduction
In October 2018 and March 2019, Boeing 737 MAX passenger jets crashed minutes
after takeoff; these two accidents claimed nearly 350 lives. After the second inci-
dent, all 737 MAX planes were grounded worldwide. The 737 MAX was an updated
version of the 737 workhorse that first began flying in the 1960s. The crashes were
* Joseph Herkert
jherkert@ncsu.edu
1 North Carolina State University, Raleigh, NC, USA
2 Georgia Institute of Technology, Atlanta, GA, USA
3 University of Missouri – St. Louis, St. Louis, MO, USA
According to published reports, these notices were the first time that airline pilots
learned of the existence of MCAS (e.g., Bushey 2019).
On March 10, 2019, about four months after the Lion Air crash, Ethiopian Airlines Flight ET302 crashed six minutes after takeoff in a field 39 miles from Addis Ababa Airport. The accident caused the deaths of all 157 passengers and crew. The
Preliminary Report of the Ethiopian Airlines Accident Investigation (Federal Demo-
cratic Republic of Ethiopia 2019), issued in April 2019, indicated that the pilots fol-
lowed the checklist from the Boeing Flight Crew Operations Manual Bulletin posted
after the Lion Air crash but could not control the plane (Ahmed et al. 2019). This
was followed by an Interim Report (Federal Democratic Republic of Ethiopia 2020)
issued in March 2020 that exonerated the pilots and airline, and placed blame for the
accident on design flaws in the MAX (Marks and Dahir 2020). Following the sec-
ond crash, the 737 MAX was grounded worldwide with the U.S., through the FAA,
being the last country to act on March 13, 2019 (Kaplan et al. 2019).
As noted above, believing that it had to keep pace with its main competitor, Airbus, Boeing elected to modify the latest generation of the 737 family, the 737NG, rather than design an entirely new aircraft. Yet this decision raised a significant engineering
challenge for Boeing. Mounting larger, more fuel-efficient engines, similar to those
employed on the A320neo, on the existing 737 airframe posed a serious design
problem, because the 737 family was built closer to the ground than the Airbus
A320. In order to provide appropriate ground clearance, the larger engines had to be
mounted higher and farther forward on the wings than previous models of the 737
(see Fig. 2). This significantly changed the aerodynamics of the aircraft and created
the possibility of a nose-up stall under certain flight conditions (Travis 2019; Glanz
et al. 2019).
Boeing’s attempt to solve this problem involved incorporating MCAS as a soft-
ware fix for the potential stall condition. The 737 was designed with two AOA sen-
sors, one on each side of the aircraft. Yet Boeing decided that the 737 MAX would
use input from only one of the plane’s two AOA sensors. If that single sensor registered a dangerously high angle of attack, MCAS would infer a nose-up condition and send a signal
to the horizontal stabilizer located in the tail. Movement of the stabilizer would then
force the plane’s tail up and the nose down (Travis 2019). In both the Lion Air and
Ethiopian Air crashes, the AOA sensor malfunctioned, repeatedly activating MCAS
(Gates 2018; Ahmed et al. 2019). Since the two crashes, Boeing has made adjustments to MCAS, including having the system rely on input from both AOA sensors instead of just one. Still more problems with MCAS have been uncov-
ered. For example, an indicator light that would alert pilots if the jet’s two AOA sen-
sors disagreed, thought by Boeing to be standard on all MAX aircraft, would only
operate as part of an optional equipment package that neither airline involved in the
crashes purchased (Gelles and Kitroeff 2019a).
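The single-point-of-failure structure described above can be illustrated with a minimal sketch. All names, thresholds, and logic below are hypothetical illustrations introduced for this example; the actual MCAS implementation is proprietary and far more complex:

```python
# Hypothetical illustration of single-sensor activation logic.
# The threshold and function name are invented for this sketch; they
# are not Boeing's actual design values.

AOA_STALL_THRESHOLD_DEG = 14.0  # hypothetical nose-up limit

def mcas_command_single_sensor(aoa_reading_deg: float) -> bool:
    """Act on one angle-of-attack (AOA) sensor with no cross-check.

    A faulty reading is indistinguishable from a real stall, so a bad
    sensor can repeatedly command nose-down stabilizer trim.
    """
    return aoa_reading_deg > AOA_STALL_THRESHOLD_DEG

# A sensor stuck at an erroneously high value triggers nose-down trim
# even in normal flight:
print(mcas_command_single_sensor(22.5))  # True: nose-down trim commanded
print(mcas_command_single_sensor(5.0))   # False: normal flight
```

With a single input, the system has no way to distinguish sensor failure from an actual stall, which is why the lack of redundancy figures so prominently in the critiques discussed below.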
As with its responses to previous accidents, Boeing has been reluctant to admit to a design flaw in its aircraft, instead blaming pilot error (Hall and Goelz 2019). In
the 737 MAX case, the company pointed to the pilots’ alleged inability to control
the planes under stall conditions (Economy 2019). Following the Ethiopian Airlines
crash, Boeing acknowledged for the first time that MCAS played a primary role in
the crashes, while continuing to highlight that other factors, such as pilot error, were
also involved (Hall and Goelz 2019). For example, on April 29, 2019, more than
a month after the second crash, then Boeing CEO Dennis Muilenburg defended
MCAS by stating:
We’ve confirmed that [the MCAS system] was designed per our standards,
certified per our standards, and we’re confident in that process. So, it operated
according to those design and certification standards. So, we haven’t seen a
technical slip or gap in terms of the fundamental design and certification of the
approach. (Economy 2019)
Fig. 2 Boeing 737 MAX (left) compared to Boeing 737NG (right), showing the larger 737 MAX engines mounted higher and farther forward on the wing. (Image source: https://www.norebbo.com)

The view that MCAS was not primarily at fault was supported in an article by noted journalist and pilot William Langewiesche (2019). While not
denying Boeing made serious mistakes, he placed ultimate blame on the use of
inexperienced pilots by the two airlines involved in the crashes. Langewiesche
suggested that the accidents resulted from the cost-cutting practices of the air-
lines and the lax regulatory environments in which they operated. He argued that
more experienced pilots, despite their lack of information on MCAS, should have
been able to take corrective action to control the planes using customary stall pre-
vention procedures. Langewiesche (2019) concludes in his article that:
What we had in the two downed airplanes was a textbook failure of airman-
ship. In broad daylight, these pilots couldn’t decipher a variant of a simple
runaway trim, and they ended up flying too fast at low altitude, neglecting
to throttle back and leading their passengers over an aerodynamic edge into
oblivion. They were the deciding factor here — not the MCAS, not the Max.
Others have taken a more critical view of MCAS, Boeing, and the FAA. These
critics prominently include Captain Chesley “Sully” Sullenberger, who famously
crash-landed an A320 in the Hudson River after bird strikes had knocked out both
of the plane’s engines. Sullenberger responded directly to Langewiesche in a let-
ter to the Editor:
… Langewiesche draws the conclusion that the pilots are primarily to blame
for the fatal crashes of Lion Air 610 and Ethiopian 302. In resurrecting this
age-old aviation canard, Langewiesche minimizes the fatal design flaws and
certification failures that precipitated those tragedies, and still pose a threat
to the flying public. I have long stated, as he does note, that pilots must be
capable of absolute mastery of the aircraft and the situation at all times,
a concept pilots call airmanship. Inadequate pilot training and insufficient
pilot experience are problems worldwide, but they do not excuse the fatally
flawed design of the Maneuvering Characteristics Augmentation System
(MCAS) that was a death trap.... (Sullenberger 2019)
Noting that he is one of the few pilots to have encountered both accident
sequences in a 737 MAX simulator, Sullenberger continued:
These emergencies did not present as a classic runaway stabilizer problem,
but initially as ambiguous unreliable airspeed and altitude situations, mask-
ing MCAS. The MCAS design should never have been approved, not by
Boeing, and not by the Federal Aviation Administration (FAA)…. (Sullen-
berger 2019)
In June 2019, Sullenberger noted in Congressional Testimony that “These crashes
are demonstrable evidence that our current system of aircraft design and certifi-
cation has failed us. These accidents should never have happened” (Benning and
DiFurio 2019).
Others have agreed with Sullenberger’s assessment. Software developer and
pilot Gregory Travis (2019) argues that Boeing’s design for the 737 MAX vio-
lated industry norms and that the company unwisely used software to compensate
for inadequacies in the hardware design. Travis also contends that the existence
of MCAS was not disclosed to pilots in order to preserve the fiction that the 737
MAX was just an update of earlier 737 models, which served as a way to cir-
cumvent the more stringent FAA certification requirements for a new airplane.
Reports from government agencies seem to support this assessment, emphasizing
the chaotic cockpit conditions created by MCAS and poor certification practices.
In its September 2019 Safety Recommendations to the FAA, the U.S. National Transportation Safety Board (NTSB 2019) indicated that Boeing underestimated the effect an MCAS malfunction would have on the cockpit environment (Kitroeff et al. 2019a, b). The FAA Joint Authorities Technical Review (2019), which included international participation, issued its Final Report in October 2019. The Report faulted Boeing and the FAA for the certification of MCAS (Koenig 2019).
Despite Boeing’s attempts to downplay the role of MCAS, the company began work on a fix for the system shortly after the Lion Air crash (Gates 2019). MCAS operation
will now be based on inputs from both AOA sensors instead of just one, with a cockpit indicator light when the sensors disagree. In addition, MCAS will activate only once per AOA warning rather than multiple times, so the system will attempt to prevent a stall only once per event. MCAS’s authority will also be limited in how far it can move the stabilizer, and manual override by the pilot will always be possible (Bellamy 2019; Boeing n.d. b; Gates 2019). For over a year after the Lion Air crash, Boeing held that pilot simulator
training would not be required for the redesigned MCAS system. In January 2020,
Boeing relented and recommended that pilot simulator training be required when the
737 MAX returns to service (Pasztor et al. 2020).
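The revised behavior described in these reports (cross-checking both sensors, flagging disagreement, and activating at most once per event) can be sketched as follows. Again, every name and threshold here is a hypothetical illustration, not Boeing’s actual logic:

```python
# Hypothetical sketch of the redesigned activation logic: compare both
# AOA sensors, inhibit MCAS when they disagree, and activate at most
# once per high-AOA event. Thresholds are invented for illustration.

AOA_STALL_THRESHOLD_DEG = 14.0  # hypothetical
MAX_DISAGREEMENT_DEG = 5.5      # hypothetical sensor-disagreement limit

class McasRevisedSketch:
    def __init__(self) -> None:
        self.activated_this_event = False

    def command_nose_down(self, aoa_left_deg: float, aoa_right_deg: float) -> bool:
        # Disagreeing sensors: inhibit MCAS (and light a cockpit indicator).
        if abs(aoa_left_deg - aoa_right_deg) > MAX_DISAGREEMENT_DEG:
            return False
        high_aoa = min(aoa_left_deg, aoa_right_deg) > AOA_STALL_THRESHOLD_DEG
        if not high_aoa:
            self.activated_this_event = False  # event over; re-arm
            return False
        if self.activated_this_event:
            return False  # only one activation per high-AOA event
        self.activated_this_event = True
        return True
```

In this sketch a stuck sensor no longer drives repeated nose-down commands: either the two readings disagree and the system stands down, or it fires once and then waits for the event to clear.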
Boeing and the FAA
There is mounting evidence that Boeing, and the FAA as well, had warnings about
the inadequacy of MCAS’s design, and about the lack of communication to pilots
about its existence and functioning. In 2015, for example, an unnamed Boeing engi-
neer raised in an email the issue of relying on a single AOA sensor (Bellamy 2019).
In 2016, Mark Forkner, Boeing’s Chief Technical Pilot, in an email to a colleague
flagged the erratic behavior of MCAS in a flight simulator noting: “It’s running
rampant” (Gelles and Kitroeff 2019c). Forkner subsequently came under federal investigation into whether he misled the FAA about MCAS (Kitroeff and Schmidt 2020).
In December 2018, following the Lion Air Crash, the FAA (2018b) conducted
a Risk Assessment that estimated that fifteen more 737 MAX crashes would occur
in the expected fleet life of 45 years if the flight control issues were not addressed;
this Risk Assessment was not publicly disclosed until Congressional hearings a
year later in December 2019 (Arnold 2019). After the two crashes, a senior Boeing
engineer, Curtis Ewbank, filed an internal ethics complaint in 2019 about management’s squelching of a system that might have uncovered errors in the AOA sensors.
Ewbank has since publicly stated that “I was willing to stand up for safety and qual-
ity… Boeing management was more concerned with cost and schedule than safety
or quality” (Kitroeff et al. 2019b).
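The scale of the FAA’s projection (fifteen crashes over a 45-year fleet life) can be put in perspective with a back-of-the-envelope calculation. Every figure below other than the reported projection is a hypothetical assumption introduced for illustration:

```python
# Back-of-the-envelope view of a fleet-level risk projection.
# Only "15 crashes over a 45-year fleet life" comes from the reported
# FAA assessment; fleet size and utilization are hypothetical.

fleet_size = 4800                 # hypothetical number of MAX aircraft
flights_per_aircraft_year = 1000  # hypothetical utilization
fleet_life_years = 45             # from the reported assessment
projected_crashes = 15            # from the reported assessment

total_flights = fleet_size * flights_per_aircraft_year * fleet_life_years
implied_p_per_flight = projected_crashes / total_flights

print(total_flights)                  # total fleet flights under these assumptions
print(f"{implied_p_per_flight:.1e}")  # implied per-flight crash probability
```

Under these assumptions the implied probability of a catastrophic outcome is on the order of 10⁻⁸ to 10⁻⁷ per flight, far above the 10⁻⁹ per-flight-hour guideline commonly cited for catastrophic failure conditions in transport-aircraft certification, which helps explain why the assessment was so alarming.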
Boeing’s apparent reluctance to heed such warnings may be attributable in part to the transformation of the company’s engineering and safety culture into a finance orientation, a shift that began with Boeing’s merger with McDonnell Douglas in 1997 (Tkacik 2019; Useem 2019). Critical changes after the merger
included replacing many in Boeing’s top management, historically engineers, with
business executives from McDonnell–Douglas and moving the corporate head-
quarters to Chicago, while leaving the engineering staff in Seattle (Useem 2019).
According to Tkacik (2019), the new management even went so far as “maligning
and marginalizing engineers as a class”.
Financial drivers thus began to place inordinate strain on Boeing employees, including engineers. During the development of the 737 MAX, production pressure to keep pace with the Airbus A320neo was ever-present.
For example, Boeing management allegedly rejected any design changes that would
prolong certification or require additional pilot training for the MAX (Gelles et al.
2019). As Adam Dickson, a former Boeing engineer, explained in a television docu-
mentary (BBC Panorama 2019): “There was a lot of interest and pressure on the
certification and analysis engineers in particular, to look at any changes to the Max
as minor changes”.
Production pressures were exacerbated by the “cozy relationship” between Boe-
ing and the FAA (Kitroeff et al. 2019a; see also Gelles and Kaplan 2019; Hall and
Goelz 2019). Beginning in 2005, the FAA increased its reliance on manufacturers to
certify their own planes. Self-certification became standard practice throughout the
U.S. airline industry. By 2018, Boeing was certifying 96% of its own work (Kitroeff
et al. 2019a).
The serious drawbacks to self-certification became acutely apparent in this case.
Of particular concern, the safety analysis for MCAS delegated to Boeing by the
FAA was flawed in at least three respects: (1) the analysis underestimated the power
of MCAS to move the plane’s horizontal tail and thus how difficult it would be for
pilots to maintain control of the aircraft; (2) it did not account for the system deploy-
ing multiple times; and (3) it underestimated the risk level if MCAS failed, thus per-
mitting a design feature—the single AOA sensor input to MCAS—that did not have
built-in redundancy (Gates 2019). Related to these concerns, the ability of MCAS to
move the horizontal tail was increased without properly updating the safety analy-
sis or notifying the FAA about the change (Gates 2019). In addition, the FAA did
not require pilot training for MCAS or simulator training for the 737 MAX (Gelles
and Kaplan 2019). Since the MAX grounding, the FAA has become more independent in its assessments and certifications; for example, it will not use Boeing personnel when certifying new 737 MAX planes for approval (Josephs 2019).
The role of the FAA has also been subject to political scrutiny. The report of a
study of the FAA certification process commissioned by Secretary of Transportation
Elaine Chao (DOT 2020), released January 16, 2020, concluded that the FAA certi-
fication process was “appropriate and effective,” and that certification of the MAX
as a new airplane would not have made a difference in the plane’s safety. At the
same time, the report recommended a number of measures to strengthen the pro-
cess and augment FAA’s staff (Pasztor and Cameron 2020). In contrast, a report of
The 737 MAX case is still unfolding and will continue to do so for some time. Yet
important lessons can already be learned (or relearned) from the case. Some of
those lessons are straightforward, and others are more subtle. A key and clear lesson
is that engineers may need reminders about prioritizing the public good, and more
specifically, the public’s safety. A more subtle lesson pertains to the ways in which
the problem of many hands may or may not apply here. Other lessons involve the
need for corporations, engineering societies, and engineering educators to rise to
the challenge of nurturing and supporting ethical behavior on the part of engineers,
especially in light of the difficulties revealed in this case.
All contemporary codes of ethics promulgated by major engineering societies
state that an engineer’s paramount responsibility is to protect the “safety, health,
and welfare” of the public. The American Institute of Aeronautics and Astronautics
Code of Ethics indicates that engineers must “[H]old paramount the safety, health,
and welfare of the public in the performance of their duties” (AIAA 2013). The
Institute of Electrical and Electronics Engineers (IEEE) Code of Ethics goes further,
pledging its members: “…to hold paramount the safety, health, and welfare of the
public, to strive to comply with ethical design and sustainable development prac-
tices, and to disclose promptly factors that might endanger the public or the environ-
ment” (IEEE 2017). The IEEE Computer Society (CS) cooperated with the Associa-
tion for Computing Machinery (ACM) in developing a Software Engineering Code
of Ethics (1997) which holds that software engineers shall: “Approve software only
if they have a well-founded belief that it is safe, meets specifications, passes appro-
priate tests, and does not diminish quality of life, diminish privacy or harm the envi-
ronment….” According to Gotterbarn and Miller (2009), the latter code is a useful
guide when examining cases involving software design and underscores the fact that
during design, as in all engineering practice, the well-being of the public should be
the overriding concern. While engineering codes of ethics are plentiful,
they differ in their source of moral authority (i.e., organizational codes vs. profes-
sional codes), are often unenforceable through the law, and formally apply to dif-
ferent groups of engineers (e.g., based on discipline or organizational membership).
However, the codes are generally recognized as a statement of the values inherent to
engineering and its ethical commitments (Davis 2015).
An engineer’s ethical responsibility does not preclude consideration of factors
such as cost and schedule (Pinkus et al. 1997). Engineers always have to grapple
with constraints, including time and resource limitations. The engineers working
at Boeing did have legitimate concerns about their company losing contracts to its
competitor Airbus. But being an engineer means that public safety and welfare must
be the highest priority (Davis 1991). The aforementioned software and other design
errors in the development of the 737 MAX, which resulted in hundreds of deaths,
would thus seem to be clear violations of engineering codes of ethics. In addition
to pointing to engineering codes, Peterson (2019) argues that Boeing engineers and
managers violated widely accepted ethical norms such as informed consent and the
precautionary principle.
From an engineering perspective, the central ethical issue in the MAX case arguably centers on the decision to use software (i.e., MCAS) to “mask” a ques-
tionable hardware design—the repositioning of the engines that disrupted the aero-
dynamics of the airframe (Travis 2019). As Johnston and Harris (2019) argue: “To
meet the design goals and avoid an expensive hardware change, Boeing created the
MCAS as a software Band-Aid.” Though such reliance on software fixes is common, it places a safety burden on the software that it may not be able to bear, as illustrated by the case of the Therac-25 radiation therapy
machine. In the Therac-25 case, hardware safety interlocks employed in earlier mod-
els of the machine were replaced by software safety controls. In addition, information about how the software might malfunction was missing from the machine’s user manual. Thus, when certain types of errors appeared on its inter-
face, the machine’s operators did not know how to respond. Software flaws, among
other factors, contributed to six patients being given massive radiation overdoses,
resulting in deaths and serious injuries (Leveson and Turner 1993). A more recent
case involves problems with the embedded software guiding the electronic throttle
in Toyota vehicles. In 2013, “…a jury found Toyota responsible for two unintended
acceleration deaths, with expert witnesses citing bugs in the software and throttle
fail safe defects” (Cummings and Britton 2020).
Boeing’s use of MCAS to mask the significant change in hardware configuration
of the MAX was compounded by not providing redundancy for components prone to
failure (i.e., the AOA sensors) (Campbell 2019), and by failing to notify pilots about
the new software. In such cases, it is especially crucial that pilots receive clear docu-
mentation and relevant training so that they know how to manage the hand-off with
an automated system properly (Johnston and Harris 2019). Part of the necessity for
such training is related to trust calibration (Borenstein et al. 2020; Borenstein et al.
2018), a factor that has contributed to previous airplane accidents (e.g., Carr 2014).
For example, if pilots do not place enough trust in an automated system, they may
add risk by intervening in system operation. Conversely, if pilots trust an automated
system too much, they may lack sufficient time to act once they identify a problem.
This is further complicated in the MAX case because pilots were not fully aware of MCAS’s existence and functioning, if they were aware of the system at all.
In addition to engineering decision-making that failed to prioritize public safety,
questionable management decisions were also made at both Boeing and the FAA.
As noted earlier, Boeing managerial leadership ignored numerous warning signs that
the 737 MAX was not safe. Also, the FAA’s shift to greater reliance on self-regulation
by Boeing was ill-advised; that lesson appears to have been learned at the expense of
hundreds of lives (Duncan and Aratani 2019).
Actions, or inaction, by large, complex organizations, in this case corporate and gov-
ernment entities, suggest that the “problem of many hands” may be relevant to the
737 MAX case. At a high level of abstraction, the problem of many hands involves
the idea that accountability is difficult to assign in the face of collective action, espe-
cially in a computerized society (Thompson 1980; Nissenbaum 1994). According to
Nissenbaum (1996, 29), “Where a mishap is the work of ‘many hands,’ it may not
be obvious who is to blame because frequently its most salient and immediate causal
antecedents do not converge with its locus of decision-making. The conditions for
blame, therefore, are not satisfied in a way normally satisfied when a single indi-
vidual is held blameworthy for a harm”.
However, there is an alternative understanding of the problem of many hands. In
this version of the problem, the lack of accountability is not merely because multiple
people and multiple decisions figure into a final outcome. Instead, in order to “qual-
ify” as the problem of many hands, the component decisions should be benign, or at
least far less harmful, if examined in isolation; only when the individual decisions
are collectively combined do we see the most harmful result. In this understanding,
the individual decision-makers should not have the same moral culpability as they
would if they made all the decisions by themselves (Noorman 2020).
Both of these understandings of the problem of many hands could shed light on
the 737 MAX case. Yet we focus on the first version of the problem. We admit the
possibility that some of the isolated decisions about the 737 MAX may have been
made in part because of ignorance of a broader picture. While we do not stake a
claim on whether this is what actually happened in the MAX case, we acknowl-
edge that it may be true in some circumstances. However, we think the more impor-
tant point is that some of the 737 MAX decisions were so clearly misguided that a
competent engineer should have seen the implications, even if the engineer was not
aware of all of the broader context. The problem then is to identify responsibility for
the questionable decisions in a way that discourages bad judgments in the future,
While overall responsibility for each of these decisions may be difficult to allocate
precisely, at least points 1–3 above arguably reflect fundamental errors in engi-
neering judgement (Travis 2019). Boeing engineers and FAA engineers either par-
ticipated in or were aware of these decisions (Kitroeff and Gelles 2019) and may
have had opportunities to reconsider or redirect such decisions. As Davis (2012) has noted, responsible engineering professionals make it their business to address problems even when they did not cause them, or, we would argue, did not solely cause them. As noted earlier, reports indicate that at least one Boeing engineer expressed res-
ervations about the design of MCAS (Bellamy 2019). Since the two crashes, one
Boeing engineer, Curtis Ewbank, filed an internal ethics complaint (Kitroeff et al.
2019b) and several current and former Boeing engineers and other employees have
gone public with various concerns about the 737 MAX (Pasztor 2019). And yet, as
is often the case, the flawed design went forward with tragic results.
The MAX case is eerily reminiscent of other well-known engineering ethics case
studies such as the Ford Pinto (Birsch and Fielder 1994), Space Shuttle Challenger
(Werhane 1991), and GM ignition switch (Jennings and Trautman 2016). In the
Pinto case, Ford engineers were aware of the unsafe placement of the fuel tank well
before the car was released to the public and signed off on the design even though
crash tests showed the tank was vulnerable to rupture during low-speed rear-end col-
lisions (Baura 2006). In the case of the GM ignition switch, engineers knew for at
least four years about the faulty design, a flaw that resulted in at least a dozen fatal
accidents (Stephan 2016). In the case of the well-documented Challenger accident,
Conclusions and Recommendations
The case of the Boeing 737 MAX provides valuable lessons for engineers and engi-
neering educators concerning the ethical responsibilities of the profession. Safety
is not cheap, but careless engineering design in the name of minimizing costs and
adhering to a delivery schedule is a symptom of ethical blight. Using almost any
standard ethical analysis or framework, Boeing’s actions regarding the safety of the
737 MAX, particularly decisions regarding MCAS, fall short.
Boeing failed in its obligations to protect the public. At a minimum, the company
had an obligation to inform airlines and pilots of significant design changes, espe-
cially the role of MCAS in compensating for repositioning of engines in the MAX
from prior versions of the 737. Clearly, it was a “significant” change because it had a
direct, and unfortunately tragic, impact on the public’s safety. The Boeing and FAA
interaction underscores the fact that conflicts of interest are a serious concern in reg-
ulatory actions within the airline industry.
Internal and external organizational factors may have interfered with Boeing and
FAA engineers’ fulfillment of their professional ethical responsibilities; this is an
all too common problem that merits serious attention from industry leaders, regula-
tors, professional societies, and educators. The lessons to be learned in this case are
not new. After large scale tragedies involving engineering decision-making, calls for
change often emerge. But such lessons apparently must be retaught and relearned by
each generation of engineers.
Acknowledgement The authors would like to thank the anonymous reviewers for their helpful comments.
References
ACM/IEEE-CS Joint Task Force. (1997). Software Engineering Code of Ethics and Professional Practice,
https://ethics.acm.org/code-of-ethics/software-engineering-code/.
Adamson, G., & Herkert, J. (2020). Addressing intelligent systems and ethical design in the IEEE Code
of Ethics. In Codes of ethics and ethical guidelines: Emerging technologies, changing fields. New
York: Springer (in press).
Ahmed, H., Glanz, J., & Beech, H. (2019). Ethiopian Airlines pilots followed Boeing’s safety procedures before crash, report shows. The New York Times, April 4, https://www.nytimes.com/2019/04/04/world/asia/ethiopia-crash-boeing.html.
AIAA. (2013). Code of Ethics, https://www.aiaa.org/about/Governance/Code-of-Ethics.
Arnold, K. (2019). FAA report predicted there could be 15 more 737 MAX crashes. The Dallas Morning News, December 11, https://www.dallasnews.com/business/airlines/2019/12/11/faa-chief-says-boeings-737-max-wont-be-approved-in-2019/
Baura, G. (2006). Engineering ethics: an industrial perspective. Amsterdam: Elsevier.
BBC News. (2019). Work on production line of Boeing 737 MAX ‘Not Adequately Funded’. July 29,
https://www.bbc.com/news/business-49142761.
Bellamy, W. (2019). Boeing CEO outlines 737 MAX MCAS software fix in congressional hearings. Aviation Today, November 2, https://www.aviationtoday.com/2019/11/02/boeing-ceo-outlines-mcas-updates-congressional-hearings/.
Benning, T., & DiFurio, D. (2019). American Airlines pilots union boss prods lawmakers to solve ’Crisis of Trust’ over Boeing 737 MAX. The Dallas Morning News, June 19, https://www.dallasnews.com/business/airlines/2019/06/19/american-airlines-pilots-union-boss-prods-lawmakers-to-solve-crisis-of-trust-over-boeing-737-max/.
Birsch, D., & Fielder, J. (Eds.). (1994). The Ford Pinto case: A study in applied ethics, business, and technology. New York: The State University of New York Press.
Boeing. (2003). Boeing Releases Independent Reviews of Company Ethics Program. December 18, https://boeing.mediaroom.com/2003-12-18-Boeing-Releases-Independent-Reviews-of-Company-Ethics-Program.
Boeing. (2018). Flight crew operations manual bulletin for the Boeing company. November 6, https://www.avioesemusicas.com/wp-content/uploads/2018/10/TBC-19-Uncommanded-Nose-Down-Stab-Trim-Due-to-AOA.pdf.
Boeing. (n.d. a). About the Boeing 737 MAX. https://www.boeing.com/commercial/737max/.
Boeing. (n.d. b). 737 MAX Updates. https://www.boeing.com/737-max-updates/.
Boeing. (n.d. c). Initial actions: sharpening our focus on safety. https://www.boeing.com/737-max-updates/resources/.
Bogaisky, J. (2020). Boeing stock plunges as coronavirus imperils quick ramp up in 737 MAX deliveries. Forbes, March 11, https://www.forbes.com/sites/jeremybogaisky/2020/03/11/boeing-coronavirus-737-max/#1b9eb8955b5a.
Boisjoly, R. P., Curtis, E. F., & Mellican, E. (1989). Roger Boisjoly and the Challenger disaster: The ethical dimensions. J Bus Ethics, 8(4), 217–230.
Borenstein, J., Mahajan, H. P., Wagner, A. R., & Howard, A. (2020). Trust and pediatric exoskeletons:
A comparative study of clinician and parental perspectives. IEEE Transactions on Technology and
Society, 1(2), 83–88.
Borenstein, J., Wagner, A. R., & Howard, A. (2018). Overtrust of pediatric health-care robots: A prelimi-
nary survey of parent perspectives. IEEE Robot Autom Mag, 25(1), 46–54.
Bushey, C. (2019). The Tough Crowd Boeing Needs to Convince. Crain’s Chicago Business, October 25,
https://www.chicagobusiness.com/manufacturing/tough-crowd-boeing-needs-convince.
The Boeing 737 MAX: Lessons for Engineering Ethics
Campbell, D. (2019). The many human errors that brought down the Boeing 737 MAX. The Verge, May 2, https://www.theverge.com/2019/5/2/18518176/boeing-737-max-crash-problems-human-error-mcas-faa.
Carr, N. (2014). The glass cage: Automation and us. New York: W. W. Norton.
Cummings, M. L., & Britton, D. (2020). Regulating safety-critical autonomous systems: past, present, and future perspectives. In Living with robots (pp. 119–140). Academic Press, New York.
Davis, M. (1991). Thinking like an engineer: The place of a code of ethics in the practice of a profession. Philosophy & Public Affairs, 20(2), 150–167.
Davis, M. (2012). “Ain’t no one here but us social forces”: Constructing the professional responsibility of engineers. Science and Engineering Ethics, 18(1), 13–34.
Davis, M. (2015). Engineering as profession: Some methodological problems in its study. In Engineering
identities, epistemologies and values (pp. 65–79). Springer, New York.
Department of Transportation (DOT). (2020). Official report of the special committee to review the Federal Aviation Administration’s Aircraft Certification Process, January 16. https://www.transportation.gov/sites/dot.gov/files/2020-01/scc-final-report.pdf.
Duncan, I., & Aratani, L. (2019). FAA flexes its authority in final stages of Boeing 737 MAX safety review. The Washington Post, November 27, https://www.washingtonpost.com/transportation/2019/11/27/faa-flexes-its-authority-final-stages-boeing-max-safety-review/.
Duncan, I., & Laris, M. (2020). House report on 737 Max crashes faults Boeing’s ‘culture of concealment’ and labels FAA ‘grossly insufficient’. The Washington Post, March 6, https://www.washingtonpost.com/local/trafficandcommuting/house-report-on-737-max-crashes-faults-boeings-culture-of-concealment-and-labels-faa-grossly-insufficient/2020/03/06/9e336b9e-5fce-11ea-b014-4fafa866bb81_story.html.
Economy, P. (2019). Boeing CEO Puts Partial Blame on Pilots of Crashed 737 MAX Aircraft for Not ’Completely’ Following Procedures. Inc., April 30, https://www.inc.com/peter-economy/boeing-ceo-puts-partial-blame-on-pilots-of-crashed-737-max-aircraft-for-not-completely-following-procedures.html.
Federal Aviation Administration (FAA). (2018a). Airworthiness directives; the Boeing company airplanes. FR Doc No: R1-2018-26365. https://rgl.faa.gov/Regulatory_and_Guidance_Library/rgad.nsf/0/fe8237743be9b8968625835b004fc051/$FILE/2018-23-51_Correction.pdf.
Federal Aviation Administration (FAA). (2018b). Quantitative Risk Assessment. https://www.documentcloud.org/documents/6573544-Risk-Assessment-for-Release-1.html#document/p1.
Federal Aviation Administration (FAA). (2019). Joint authorities technical review: observations, findings, and recommendations. October 11, https://www.faa.gov/news/media/attachments/Final_JATR_Submittal_to_FAA_Oct_2019.pdf.
Federal Democratic Republic of Ethiopia. (2019). Aircraft accident investigation preliminary report. Report No. AI-01/19, April 4, https://leehamnews.com/wp-content/uploads/2019/04/Preliminary-Report-B737-800MAX-ET-AVJ.pdf.
Federal Democratic Republic of Ethiopia. (2020). Aircraft Accident Investigation Interim Report. Report No. AI-01/19, March 20, https://www.aib.gov.et/wp-content/uploads/2020/documents/accident/ET-302%2520%2520Interim%2520Investigation%2520%2520Report%2520March%25209%25202020.pdf.
Gates, D. (2018). Pilots struggled against Boeing’s 737 MAX control system on doomed Lion Air flight. The Seattle Times, November 27, https://www.seattletimes.com/business/boeing-aerospace/black-box-data-reveals-lion-air-pilots-struggle-against-boeings-737-max-flight-control-system/.
Gates, D. (2019). Flawed analysis, failed oversight: how Boeing, FAA Certified the Suspect 737 MAX Flight Control System. The Seattle Times, March 17, https://www.seattletimes.com/business/boeing-aerospace/failed-certification-faa-missed-safety-issues-in-the-737-max-system-implicated-in-the-lion-air-crash/.
Gelles, D. (2019). Boeing can’t fly its 737 MAX, but it’s ready to sell its safety. The New York Times, December 24 (updated February 10, 2020), https://www.nytimes.com/2019/12/24/business/boeing-737-max-survey.html.
Gelles, D. (2020). Boeing expects 737 MAX costs will surpass $18 Billion. The New York Times, January 29, https://www.nytimes.com/2020/01/29/business/boeing-737-max-costs.html.
Gelles, D., & Kaplan, T. (2019). F.A.A. Approval of Boeing jet involved in two crashes comes under scrutiny. The New York Times, March 19, https://www.nytimes.com/2019/03/19/business/boeing-elaine-chao.html.
J. Herkert et al.
Gelles, D., & Kitroeff, N. (2019a). Boeing Believed a 737 MAX warning light was standard. It wasn’t. The New York Times, May 5, https://www.nytimes.com/2019/05/05/business/boeing-737-max-warning-light.html.
Gelles, D., & Kitroeff, N. (2019b). Boeing board to call for safety changes after 737 MAX Crashes. The New York Times, September 15 (updated October 2), https://www.nytimes.com/2019/09/15/business/boeing-safety-737-max.html.
Gelles, D., & Kitroeff, N. (2019c). Boeing pilot complained of ‘Egregious’ issue with 737 MAX in 2016. The New York Times, October 18, https://www.nytimes.com/2019/10/18/business/boeing-flight-simulator-text-message.html.
Gelles, D., & Kitroeff, N. (2020). What needs to happen to get Boeing’s 737 MAX flying again? The New York Times, February 10, https://www.nytimes.com/2020/02/10/business/boeing-737-max-fly-again.html.
Gelles, D., Kitroeff, N., Nicas, J., & Ruiz, R. R. (2019). Boeing was ‘Go, Go, Go’ to beat Airbus with the 737 MAX. The New York Times, March 23, https://www.nytimes.com/2019/03/23/business/boeing-737-max-crash.html.
Glanz, J., Creswell, J., Kaplan, T., & Wichter, Z. (2019). After a Lion Air 737 MAX Crashed in October, Questions About the Plane Arose. The New York Times, February 3, https://www.nytimes.com/2019/02/03/world/asia/lion-air-plane-crash-pilots.html.
Gotterbarn, D., & Miller, K. W. (2009). The public is the priority: Making decisions using the software
engineering code of ethics. Computer, 42(6), 66–73.
Hall, J., & Goelz, P. (2019). The Boeing 737 MAX Crisis Is a Leadership Failure. The New York Times, July 17, https://www.nytimes.com/2019/07/17/opinion/boeing-737-max.html.
Harris, C. E. (2008). The good engineer: Giving virtue its due in engineering ethics. Science and Engineering Ethics, 14(2), 153–164.
Hashemian, G., & Loui, M. C. (2010). Can instruction in engineering ethics change students’ feelings about professional responsibility? Science and Engineering Ethics, 16(1), 201–215.
Herkert, J. R. (1997). Collaborative learning in engineering ethics. Science and Engineering Ethics, 3(4),
447–462.
Herkert, J. R. (2004). Microethics, macroethics, and professional engineering societies. In Emerging technologies and ethical issues in engineering: papers from a workshop (pp. 107–114). National Academies Press, Washington, DC.
Hess, J. L., & Fore, G. (2018). A systematic literature review of US engineering ethics interventions. Science and Engineering Ethics, 24(2), 551–583.
House Committee on Transportation and Infrastructure (House TI). (2020). The Boeing 737 MAX Aircraft: Costs, Consequences, and Lessons from its Design, Development, and Certification-Preliminary Investigative Findings, March. https://transportation.house.gov/imo/media/doc/TI%2520Preliminary%2520Investigative%2520Findings%2520Boeing%2520737%2520MAX%2520March%25202020.pdf.
IEEE. (2017). IEEE Code of Ethics. https://www.ieee.org/about/corporate/governance/p7-8.html.
IEEE. (2018). Ethically Aligned Design: A Vision for Prioritizing Human Well-being with Autonomous and Intelligent Systems (version 2). https://standards.ieee.org/content/dam/ieee-standards/standards/web/documents/other/ead_v2.pdf.
Jennings, M., & Trautman, L. J. (2016). Ethical culture and legal liability: The GM switch crisis and lessons in governance. Boston University Journal of Science and Technology Law, 22, 187.
Johnston, P., & Harris, R. (2019). The Boeing 737 MAX Saga: Lessons for software organizations. Software Quality Professional, 21(3), 4–12.
Josephs, L. (2019). FAA tightens grip on Boeing with plan to individually review each new 737 MAX Jetliner. CNBC, November 27, https://www.cnbc.com/2019/11/27/faa-tightens-grip-on-boeing-with-plan-to-individually-inspect-max-jets.html.
Kaplan, T., Austen, I., & Gebrekidan, S. (2019). The New York Times, March 13. https://www.nytimes.com/2019/03/13/business/canada-737-max.html.
Kitroeff, N. (2019). Boeing underestimated cockpit chaos on 737 MAX, N.T.S.B. Says. The New York Times, September 26, https://www.nytimes.com/2019/09/26/business/boeing-737-max-ntsb-mcas.html.
Kitroeff, N., & Gelles, D. (2019). Legislators call on F.A.A. to say why it overruled its experts on 737 MAX. The New York Times, November 7 (updated December 11), https://www.nytimes.com/2019/11/07/business/boeing-737-max-faa.html.
Kitroeff, N., & Gelles, D. (2020). It’s not just software: New safety risks under scrutiny on Boeing’s 737 MAX. The New York Times, January 5, https://www.nytimes.com/2020/01/05/business/boeing-737-max.html.
Kitroeff, N., & Schmidt, M. S. (2020). Federal prosecutors investigating whether Boeing pilot lied to F.A.A. The New York Times, February 21, https://www.nytimes.com/2020/02/21/business/boeing-737-max-investigation.html.
Kitroeff, N., Gelles, D., & Nicas, J. (2019a). The roots of Boeing’s 737 MAX Crisis: A regulator relaxes its oversight. The New York Times, July 27, https://www.nytimes.com/2019/07/27/business/boeing-737-max-faa.html.
Kitroeff, N., Gelles, D., & Nicas, J. (2019b). Boeing 737 MAX safety system was vetoed, Engineer Says. The New York Times, October 2, https://www.nytimes.com/2019/10/02/business/boeing-737-max-crashes.html.
Kline, R. R. (2001). Using history and sociology to teach engineering ethics. IEEE Technology and Society Magazine, 20(4), 13–20.
Koenig, D. (2019). Boeing, FAA both faulted in certification of the 737 MAX. AP, October 11, https://apnews.com/470abf326cdb4229bdc18c8ad8caa78a.
Langewiesche, W. (2019). What really brought down the Boeing 737 MAX? The New York Times, September 18, https://www.nytimes.com/2019/09/18/magazine/boeing-737-max-crashes.html.
Leveson, N. G., & Turner, C. S. (1993). An investigation of the Therac-25 accidents. Computer, 26(7),
18–41.
Marks, S., & Dahir, A. L. (2020). Ethiopian report on 737 Max Crash Blames Boeing. The New York Times, March 9, https://www.nytimes.com/2020/03/09/world/africa/ethiopia-crash-boeing.html.
Martin, D. A., Conlon, E., & Bowe, B. (2019). The role of role-play in student awareness of the social
dimension of the engineering profession. European Journal of Engineering Education, 44(6),
882–905.
Miller, G. (2019). Toward lifelong excellence: navigating the engineering-business space. In The Engineering-Business Nexus (pp. 81–101). Springer, Cham.
National Transportation Safety Board (NTSB). (2019). Safety Recommendations Report, September 19,
https://www.ntsb.gov/investigations/AccidentReports/Reports/ASR1901.pdf.
Nissenbaum, H. (1994). Computing and accountability. Communications of the ACM, January, https://dl.acm.org/doi/10.1145/175222.175228.
Nissenbaum, H. (1996). Accountability in a computerized society. Science and Engineering Ethics, 2(1), 25–42.
Noorman, M. (2020). Computing and moral responsibility. In Zalta, E. N. (Ed.), The Stanford Encyclopedia of Philosophy (Spring), https://plato.stanford.edu/archives/spr2020/entries/computing-responsibility.
Pasztor, A. (2019). More Whistleblower complaints emerge in Boeing 737 MAX Safety Inquiries. The Wall Street Journal, April 27, https://www.wsj.com/articles/more-whistleblower-complaints-emerge-in-boeing-737-max-safety-inquiries-11556418721.
Pasztor, A., & Cameron, D. (2020). U.S. News: Panel Backs How FAA gave safety approval for 737 MAX. The Wall Street Journal, January 17, https://www.wsj.com/articles/panel-clears-737-maxs-safety-approval-process-at-faa-11579188086.
Pasztor, A., Cameron, D., & Sider, A. (2020). Boeing backs MAX simulator training in reversal of stance. The Wall Street Journal, January 7, https://www.wsj.com/articles/boeing-recommends-fresh-max-simulator-training-11578423221.
Peters, D., Vold, K., Robinson, D., & Calvo, R. A. (2020). Responsible AI—two frameworks for ethical design practice. IEEE Transactions on Technology and Society, 1(1), 34–47.
Peterson, M. (2019). The ethical failures behind the Boeing disasters. Blog of the APA, April 8, https://blog.apaonline.org/2019/04/08/the-ethical-failures-behind-the-boeing-disasters/.
Pinkus, R. L. B., Shuman, L. J., Hummon, N. P., & Wolfe, H. (1997). Engineering ethics: Balancing cost, schedule, and risk-lessons learned from the space shuttle. Cambridge: Cambridge University Press.
Republic of Indonesia. (2019). Final Aircraft Accident Investigation Report. KNKT.18.10.35.04, https://knkt.dephub.go.id/knkt/ntsc_aviation/baru/2018%2520-%2520035%2520-%2520PK-LQP%2520Final%2520Report.pdf.
Rich, G. (2019). Boeing 737 MAX should return in 2020 but the crisis won’t be over. Investor’s Business Daily, December 31, https://www.investors.com/news/boeing-737-max-service-return-2020-crisis-not-over/.
Schnebel, E., & Bienert, M. A. (2004). Implementing ethics in business organizations. Journal of Business Ethics, 53(1–2), 203–211.
Schwartz, M. S. (2013). Developing and sustaining an ethical corporate culture: The core elements. Business Horizons, 56(1), 39–50.
Stephan, K. (2016). GM Ignition Switch Recall: Too Little Too Late? [Ethical Dilemmas]. IEEE Technology and Society Magazine, 35(2), 34–35.
Sullenberger, S. (2019). My letter to the editor of New York Times Magazine. https://www.sullysullenberger.com/my-letter-to-the-editor-of-new-york-times-magazine/.
Thompson, D. F. (1980). Moral responsibility of public officials: The problem of many hands. American
Political Science Review, 74(4), 905–916.
Thompson, D. F. (2014). Responsibility for failures of government: The problem of many hands. The
American Review of Public Administration, 44(3), 259–273.
Tkacik, M. (2019). Crash course: how Boeing’s managerial revolution created the 737 MAX Disaster. The New Republic, September 18, https://newrepublic.com/article/154944/boeing-737-max-investigation-indonesia-lion-air-ethiopian-airlines-managerial-revolution.
Travis, G. (2019). How the Boeing 737 MAX disaster looks to a software developer. IEEE Spectrum, April 18, https://spectrum.ieee.org/aerospace/aviation/how-the-boeing-737-max-disaster-looks-to-a-software-developer.
Useem, J. (2019). The long-forgotten flight that sent Boeing off course. The Atlantic, November 20, https://www.theatlantic.com/ideas/archive/2019/11/how-boeing-lost-its-bearings/602188/.
Watts, L. L., & Buckley, M. R. (2017). A dual-processing model of moral whistleblowing in organizations. Journal of Business Ethics, 146(3), 669–683.
Werhane, P. H. (1991). Engineers and management: The challenge of the Challenger incident. Journal of
Business Ethics, 10(8), 605–616.
Publisher’s Note Springer Nature remains neutral with regard to jurisdictional claims in published
maps and institutional affiliations.