Mayday!!! Autopilot or Human?

Group 53

‘I don’t know about you, but I don’t want Albert Einstein to be my pilot. I want great flying professionals that are allowed to easily and quickly take control of a plane!’ Donald J. Trump said.

Safety, convenience, greater efficiency and a better flying experience: these are the reasons autopilots are so widely used in today’s aviation industry. However, two tragedies involving the Boeing 737 MAX 8 have reignited the controversy over autopilot.

Pilot, not autopilot, to blame

Autopilot plays an indispensable role in today’s aerospace industry. Drawing on the lessons of thousands of previous air accidents, autopilot systems have been refined continuously. On most commercial aircraft, multiple redundant systems monitor one another and safeguard the flight. As a result, the risk of a complete autopilot failure has been reduced to a minimum, to the point of being almost negligible.

As the old saying goes, no one is perfect. In an emergency especially, pilots are expected to weigh every factor within a few seconds, which is barely imaginable. Seventy-one people were killed in the Überlingen disaster because of a conflict between air traffic control (ATC) and the Traffic Collision Avoidance System (TCAS). The 228 people on Air France Flight 447 lost their lives because of the pilots’ improper handling. Three students were killed on Asiana Airlines Flight 214 because of a botched landing by an experienced pilot. Compared with the system, the human appears to be the weak link in emergencies: physical condition, mental state and individual skill all interfere with pilots’ decisions. In the Überlingen case, the advisory issued by the automatic system was entirely correct, but the limited time pushed the pilots into the wrong choice: not to trust the system. Had the automatic advisory carried the highest priority, like a binding code of conduct, those victims might have survived.

What if the human is the one who causes the accident? According to research published in the journal Environmental Health, nearly 4.1% of pilots admitted to having had suicidal thoughts. Furthermore, according to 2017 statistics, nearly 580 fatal accidents per year were caused by pilot error. After reading this far, do you still have enough faith to trust a human pilot in an emergency? Autopilot deserves that trust because of its higher precision, quicker emergency response and stronger information-processing capability under pressure. Relying on it may feel like a disrespect to well-trained pilots, but on a utilitarian view, passengers’ lives carry far greater weight.

What we are arguing for is not automation taken to the point where human beings do nothing at all. Autopilot will undoubtedly revolutionize the aerospace industry, but this revolution does not mean the retirement of the human pilot; rather, it provides an extra layer of insurance for safety in an emergency. According to Kantian theory, the only thing that is unqualifiedly good is a good will. In this respect, autopilot deserves moral support, because the intention behind it is nothing other than saving lives.

Based on moral theory and the lessons of past experience, autopilot ought to be trusted.

Can we really trust Autopilot?

A hazard exists when a technology, or its use, can cause undesirable damage or effects in some situations; safety, in the context of an air crash, is defined as the absence of such risk or hazard. On the engineer’s responsibility for safety, consequentialism, duty ethics and virtue ethics all supply arguments for why engineers must guard against the occurrence of tragedy.

In 2018 and 2019, two Boeing 737 MAX airplanes crashed, in Indonesia and Ethiopia, for essentially the same reason. When Boeing developed the 737 MAX, it found a way to fit larger, more fuel-efficient engines on the airframe, but the change tended to nudge the MAX’s nose skyward. To compensate for these handling characteristics, Boeing quietly added an automatic system called MCAS, which pushes the nose down when the jet approaches a stall. In both accidents, a faulty angle-of-attack sensor misled MCAS, the system forced the nose down during normal flight, and the crashes followed.
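To make the failure mode concrete, here is a minimal, hypothetical sketch in Python. It is not Boeing’s actual MCAS logic; the threshold values, names and the cross-check are invented purely to illustrate why an automatic nose-down command based on a single faulty sensor is dangerous, and how a simple plausibility check could hand authority back to the crew instead.

```python
# Hypothetical illustration only -- NOT Boeing's actual MCAS control law.
# Thresholds and names are invented for the sake of the argument.

STALL_AOA_DEG = 14.0           # assumed angle-of-attack threshold for this sketch
MAX_SENSOR_DISAGREEMENT = 5.0  # assumed plausibility limit between the two vanes

def naive_trim_command(aoa_left_deg: float) -> str:
    """Single-sensor logic: one stuck vane can command nose-down in normal flight."""
    if aoa_left_deg > STALL_AOA_DEG:
        return "NOSE_DOWN_TRIM"
    return "NO_ACTION"

def cross_checked_trim_command(aoa_left_deg: float, aoa_right_deg: float) -> str:
    """Cross-checked logic: disagreeing sensors disable the automatic trim
    and alert the crew instead of fighting them."""
    if abs(aoa_left_deg - aoa_right_deg) > MAX_SENSOR_DISAGREEMENT:
        return "DISABLE_AUTO_TRIM_AND_ALERT_CREW"
    if min(aoa_left_deg, aoa_right_deg) > STALL_AOA_DEG:
        return "NOSE_DOWN_TRIM"
    return "NO_ACTION"

# A vane stuck at 22 degrees while the aircraft is actually at about 3 degrees:
print(naive_trim_command(22.0))               # NOSE_DOWN_TRIM (wrong)
print(cross_checked_trim_command(22.0, 3.0))  # DISABLE_AUTO_TRIM_AND_ALERT_CREW
```

The point of the sketch is not the particular numbers but the design choice: an automatic system that acts on one unverified input, and keeps acting on it, leaves the crew fighting the machine rather than being helped by it.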

Consequentialism holds that engineers must design safe products. In these crashes, MCAS was not designed well and misbehaved in exactly the situation where the sensor was out of order. From the perspective of engineering design, defects in computational products can never be ruled out entirely, so the machine can never be totally reliable. Yet Boeing stated that pilots needed no extra training for the 737 MAX and only had to be aware of the new software’s operating conditions. In effect, the airlines led pilots to believe that MCAS was reliable, so they relaxed their vigilance. The result was an absurd scene: in the final minutes, the crew spent their energy ‘fighting’ the computer while the airplane kept flying according to the ‘stubborn idea’ of the machine.

In an emergency, for instance instrument failure in bad weather or an abnormal flight condition, pilots should have freedom of action, defined as not having acted under compulsion. According to Kantian ethics, human actions are spontaneous: conscious and self-controlled. Today, most civil airliners adopt mixed human-machine operation. Studies have found that many serious air crashes arise because automated components gradually replace the pilots, who then become confused when something breaks and cannot tell where the automatic flight ends and their own duties begin. A good pilot has foreseeability, the ability to anticipate the consequences of their actions. Had the pilots not trusted the new system blindly, they might have drawn on their experience and saved everyone on board.

Initial Decision

We all agree that autopilot is the future of the civil aviation industry. But the human pilot and the autopilot system are both indispensable, which requires aircraft manufacturers and airlines to assume more moral responsibility, optimize their systems and take pilot training seriously, so that the cooperation between humans and the autopilot system can come ever closer to perfect.

8 thoughts on “Mayday!!! Autopilot or Human?”

  1. Obviously, Boeing developed the MCAS system in good faith, but the two crashes in Indonesia and Ethiopia have still left people grieving.
    Insufficient trials of the new system and the neglect of extra training for MCAS seem to be the major causes of these tragedies. Admittedly, the automatic system does have a great advantage in making up for human shortcomings, but no program or system is perfect or free of risk, because in the final analysis it is designed by people.
    Therefore, we need to prevent such accidents through continuous improvement of the automatic system as well as excellent operating skills.

  2. Maybe we shouldn’t let computers fly airplanes and have humans take over when something goes wrong. Since computers are tireless, patient and don’t require practice, it might be better to do the reverse: let humans fly airplanes, let computers supervise them, and have the computers intervene when something goes wrong.

  3. A human life is given only once, and all science and technology should serve the human race. I hope that in the future, unmanned piloting and artificial intelligence can develop safely and stably.

  4. Autopilot has its own unique advantages, such as higher precision, quicker emergency response and stronger information-processing capability in an emergency. Although the autopilot system may not yet be perfect, we all agree that autopilot is the future of the civil aviation industry. Therefore, with the advance of science and technology and through our efforts, the machines’ ability to learn will become stronger and safer.

  5. First of all, my deepest condolences to the victims.
    But given the present status of the investigation, I think it is much too soon to say why the planes went down, or to rule out possibilities like pilot error, mechanical breakdown or maintenance problems.
    There is no doubt that the automatic system has advantages that humans cannot match, but how humans are to keep proper control of such a system deserves deeper study.
    Last but not least, I agree with the author of this article: taking pilot training seriously and optimizing the system are both necessary in order to avoid such occurrences.

  6. This problem is hard to settle. From a technical point of view, having an unmanned system replace the pilot is completely feasible. However, as passengers, we are not yet willing to entrust our lives entirely to intelligent devices. People may make mistakes, but the machine also needs humans to ensure its normal operation. A win-win arrangement is what we would prefer to see.

  7. Autonomous aircraft flight dates back to the 1910s, and by the 1930s engineers had matured the technique by coupling the airplane’s elevator, ailerons and rudder to gyroscopes and altimeters, allowing the plane to hold a set heading and altitude. To this day, the basic structure has not changed substantially. An airplane’s autopilot is a device that lets the airplane follow a set trajectory and speed. To put it simply, the autopilot replaces the operation of “the pilot watching the instrument pointer and applying a fixed correction whenever it deviates from the set value”, as the rough sketch below illustrates.
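As a minimal sketch of that “watch the pointer, correct the deviation” loop, assume a simple proportional altitude-hold rule; the target altitude, gain and function name below are invented for illustration and are not taken from any real autopilot.

```python
# Toy proportional altitude-hold rule -- values invented for illustration.

SET_ALTITUDE_FT = 10_000.0   # assumed target altitude
GAIN_DEG_PER_FT = 0.002      # assumed gain: elevator degrees per foot of error

def elevator_correction(measured_altitude_ft: float) -> float:
    """Return an elevator deflection (degrees) proportional to the altitude error."""
    error_ft = SET_ALTITUDE_FT - measured_altitude_ft
    return GAIN_DEG_PER_FT * error_ft

# Flying 250 ft low gives a small nose-up command; 250 ft high, a nose-down one.
print(elevator_correction(9_750.0))   #  0.5 (degrees nose-up)
print(elevator_correction(10_250.0))  # -0.5 (degrees nose-down)
```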

  8. A very topical issue to choose; the investigation into the 737 MAX is ongoing and more information will come to light. For the moment, the issue you are exploring, I think, is whether we should give full control, or the final say, to the autopilot system. Perhaps that question is best answered by asking what the autopilot is programmed to do. In emergencies an autopilot does not have the ability to panic; that said, airline pilots train constantly for such situations, equipping them with the necessary reflexes. I think I need more information before I can answer that question. Certainly, providing pilots with as much assistance as possible is a good thing.
