‘I don’t know about you, but I don’t want Albert Einstein to be my pilot. I want great flying professionals that are allowed to easily and quickly take control of a plane!’ tweeted Donald J. Trump.
Safety, convenience, greater efficiency, and a better flying experience: these are the reasons autopilots are so widely used in today’s aircraft industry. However, two tragedies involving the Boeing 737 MAX 8 have reignited controversy over autopilot.
Pilot, not autopilot, to blame
Autopilot plays an indispensable role in the modern aerospace industry. Informed by thousands of past air accidents, autopilot systems have been steadily refined. Most commercial aircraft carry multiple redundant systems that monitor one another and safeguard the flight. As a result, the risk of a complete autopilot failure has been reduced to a minimum, almost to the point of impossibility.
As the old saying goes, no one is perfect. Especially in an emergency, pilots are expected to weigh every factor within a few seconds, which is nearly impossible. Seventy-one people were killed in the Überlingen disaster because of a conflict between ATC instructions and the TCAS advisory. All 228 people aboard Air France Flight 447 lost their lives because of improper pilot inputs. Three students were killed on Asiana Airlines Flight 214 because of a botched landing by an experienced pilot. Compared with the system, the human appears to be the weak link in emergencies: physical condition, mental state, and individual skill can all interfere with a pilot’s decisions. In the Überlingen disaster, the advisory provided by the automated system was entirely correct, but the limited time forced the pilots into the wrong choice: not to trust the system. If the automated advisory had held the highest priority, like a binding code of conduct, those victims might have survived.
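The priority rule argued for here can be made concrete. After Überlingen, international procedures were in fact revised so that a TCAS resolution advisory overrides a conflicting ATC instruction. The sketch below is a hypothetical illustration of that rule; the function and advisory names are invented for this example and do not reflect real avionics software.

```python
from typing import Optional

# Hypothetical sketch of a "TCAS has highest priority" rule, as argued above.
# Names and structure are invented for illustration; this is not avionics code.

def resolve_conflict(tcas_advisory: Optional[str],
                     atc_instruction: Optional[str]) -> str:
    """When TCAS issues a resolution advisory, it overrides any
    conflicting ATC instruction; otherwise the ATC instruction is followed."""
    if tcas_advisory is not None:
        return tcas_advisory
    if atc_instruction is not None:
        return atc_instruction
    return "maintain"  # no advisory and no instruction: hold current flight path

# Überlingen-style conflict: TCAS says climb, ATC says descend.
# Under this rule the crew follows TCAS.
assert resolve_conflict("climb", "descend") == "climb"
# No advisory active: follow ATC as normal.
assert resolve_conflict(None, "descend") == "descend"
```

Under such a rule, the crew never has to arbitrate between two authorities in the final seconds; the precedence is decided in advance.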
What if the human is the cause of the accident? According to research published in the journal Environmental Health, nearly 4.1% of pilots admitted to having had suicidal thoughts. Furthermore, according to 2017 statistics, roughly 580 fatal accidents were caused by human pilot error annually. After reading this far, do you still have enough faith to trust a human pilot in an emergency? Without a doubt, the autopilot should be trusted, because of its higher precision, quicker emergency response, and stronger information-processing capability under pressure. Relying on autopilot might seem disrespectful to well-trained pilots, but from a utilitarian standpoint, passengers’ lives are valued more highly.
What we are advocating is not automation to the point where humans do nothing at all. Without a doubt, autopilot will revolutionize the aerospace industry. However, this revolution does not mean the retirement of the human pilot; rather, it provides insurance for safety in an emergency. According to Kantian theory, the only thing that can be unqualifiedly good is a good will. In this respect, autopilot deserves moral support, because its underlying intention is nothing but saving lives.
Based on moral theory and the experience of the past, autopilot ought to be trusted.
Can we really trust Autopilot?
A hazard exists if a technology, or its use, can cause undesirable damage or effects in some situations. Safety, in the context of air crashes, is defined as the absence of risk or hazard. Regarding the engineer’s responsibility for safety, consequentialism, duty ethics, and virtue ethics all provide arguments for why engineers must guard against the occurrence of tragedy.
From 2018 to 2019, two Boeing 737 MAX airplanes crashed, in Indonesia and Ethiopia respectively, for the same reason. When Boeing developed the 737 MAX, it found a way to fit larger, more fuel-efficient engines to the airframe. However, the changes tended to nudge the MAX’s nose skyward. To compensate for these handling characteristics, Boeing quietly added an automatic system called MCAS, which pushes the nose down when the jet approaches a stall. In both accidents, a faulty angle-of-attack sensor caused MCAS to force the nose down during normal flight, and the aircraft crashed.
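The failure mode described above comes down to a single faulty input driving an automatic command. The following is a simplified, hypothetical sketch of that logic; the threshold, trim values, and function name are invented for illustration and are not Boeing’s actual implementation, which was far more complex.

```python
# Simplified, hypothetical sketch of the MCAS failure mode described above.
# Threshold and trim values are invented for illustration only.

STALL_AOA_THRESHOLD = 14.0  # degrees; illustrative value, not a real limit

def mcas_trim_command(aoa_sensor_reading: float) -> float:
    """Return nose-down stabilizer trim (negative) when the sensed angle
    of attack suggests an imminent stall. Because the command depends on
    a single sensor, one faulty reading is enough to trigger it."""
    if aoa_sensor_reading > STALL_AOA_THRESHOLD:
        return -2.5  # nose-down trim, illustrative units
    return 0.0

# Normal flight, healthy sensor: no intervention.
assert mcas_trim_command(5.0) == 0.0
# Normal flight, but the sensor is stuck at a falsely high value:
# the system commands nose-down trim anyway.
assert mcas_trim_command(74.5) == -2.5
```

The sketch makes the design flaw visible: nothing cross-checks the single sensor, so a hardware fault translates directly into a dangerous control command.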
Consequentialism states that engineers must design safe products. In this air-crash case, MCAS was poorly designed and contained a flaw in the particular situation where the sensor failed. From the perspective of engineering design, defects in computational products cannot be avoided entirely, for various complex reasons, so a machine can never be totally reliable. Yet Boeing told airlines that pilots needed no extra training for the 737 MAX and only had to learn the operating conditions of the new software. In effect, pilots were led to believe MCAS was reliable, so they relaxed their vigilance. The result was an absurd scene: in the final minutes, the crew spent their energy ‘FIGHTING’ with the computer while the airplane kept flying according to the ‘STUBBORN IDEA’ of the machine.
In an emergency, for instance an instrument failure in bad weather or the recognition of an abnormal flight condition, pilots must have freedom of action, which is defined as not acting under compulsion. According to Kantian ethics, human action is spontaneous: conscious and self-controlled. Today, most civil airliners adopt mixed human-machine operation. Studies have found that many serious air crashes occur because automated components gradually replace pilots, who then become confused when the aircraft malfunctions and cannot tell where automatic flight ends and their own duties begin. A good pilot has foreseeability: the ability to know the consequences of his or her actions. Had the pilots not blindly trusted the new system, they might have fallen back on their experience and saved all the passengers.
We all agree that autopilot is the future of the civil aviation industry. However, humans and autopilot systems are each indispensable to the other, which requires aircraft manufacturers and airlines to assume greater moral responsibility, optimize their systems, and take pilot training seriously, so that the cooperation between human and autopilot becomes ever more seamless.