Hands-Free Auto-Pilot

Who Should Take The Liability For The Crash Caused By Autopilot

Group 20

How the Tesla Crashed

Autopilot technology promises to replace human driving and lead a transport revolution in the foreseeable future. However, several accidents caused by Autopilot failures have already brought loss of property and life. In 2016, a Tesla Model S with Autopilot engaged hit a trailer that was crossing the highway. Neither the driver nor the Autopilot system responded, not even with an emergency brake. So who should take the liability for this accident: the engineers or the driver?

The engineers’ fault: they should have designed a more advanced system that adapts to the environment, and performed more road tests.

The Tesla Model S, with Autopilot engaged, crashed into the chassis of a trailer that was crossing the road perpendicular to the Tesla’s direction of travel. The brake system was not activated because Autopilot did not recognise the white trailer against the bright sky. After the accident, Tesla announced that “the ultrasonic radar judged it as an overhead road sign” and that “the autopilot was turned off by default; when the driver turned it on, they had to accept the fact that this system is still under beta test.”
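To make this failure mode concrete, here is a minimal sketch in Python. It is purely illustrative: the class names, braking threshold, and filtering rule are our assumptions, not Tesla’s actual code. It shows how a perception pipeline that discards anything classified as an overhead structure can suppress the emergency brake once a trailer has been misclassified as an overhead sign.

```python
from dataclasses import dataclass
from enum import Enum, auto


class ObjectClass(Enum):
    """Hypothetical labels a perception classifier might assign."""
    VEHICLE = auto()
    OVERHEAD_SIGN = auto()  # assumed not to block the lane, so never braked for
    UNKNOWN = auto()


@dataclass
class Detection:
    object_class: ObjectClass
    distance_m: float  # range from the car to the detected object


# Assumed rule: overhead structures are filtered out before braking logic runs.
NON_OBSTACLE_CLASSES = {ObjectClass.OVERHEAD_SIGN}
BRAKING_DISTANCE_M = 40.0  # hypothetical emergency-braking threshold


def should_emergency_brake(detections):
    """Brake only for detections that survive the non-obstacle filter."""
    for d in detections:
        if d.object_class in NON_OBSTACLE_CLASSES:
            continue  # a trailer mislabelled as OVERHEAD_SIGN is skipped here
        if d.distance_m < BRAKING_DISTANCE_M:
            return True
    return False


# The white trailer, misclassified as an overhead road sign: no brake is
# applied even though it sits well inside braking distance.
trailer = Detection(ObjectClass.OVERHEAD_SIGN, distance_m=25.0)
print(should_emergency_brake([trailer]))  # prints False: the crash scenario
```

The point of the sketch is that a single wrong label upstream silently disables every downstream safety check, which is why the recognition defect matters so much in the liability debate.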

The system was released while it was still immature, which led to misunderstanding of the terms, and the default setting effectively forced users to accept those terms. The company and the engineers should act. The company aimed to let drivers make full use of Autopilot, promote the technology, and profit from it.

The company and the engineers were the stakeholders. After the accident, the company’s reputation and growth were at stake. The engineers’ reputation and their standing within the company were at stake because of the system’s failure. The engineers and the company should be responsible to their customers while still making a profit.

The engineers could have acted:

  1. Fully consider road conditions and design a better system.
  2. Communicate with the company to perform more road tests, improving the system’s reliability and adaptability to the environment.

The company could have acted:

  1. Release the system only when it is mature.
  2. Communicate with the engineers to define the system fully, so that customers understand its capabilities.

The accident was caused by a recognition defect rooted in design flaws in the system’s data processing and decision making. However, the engineers could have identified these problems during the design stage and eliminated them before the system’s release. The engineers did bring profit to the company, so from the company’s perspective, or more specifically from the perspective of the engineers who developed the system, they did the right thing in the short term. But it brought bad consequences for some customers, including loss of property and life. In the long term, the accident would breed public distrust of the system, and that distrust would directly damage the engineers’ reputation and lead to considerable losses for them: a lose-lose situation for both the engineers and the customers. The release of the immature system caused the accident indirectly. The action did not conform to the utility-maximising maxim, because it did not bring the greatest benefit to the greatest number of people affected by it. Therefore, from this perspective, the fault lies with the engineers, because the behaviour itself was unethical.

The driver’s fault: the driver relied excessively on Autopilot rather than using it as a supplementary system.

The driver did not keep his hands on the steering wheel while driving; instead, he was watching a movie. When the car approached the trailer, two protection systems in the car failed to react. As the last line of protection, the driver did not brake. Hence the accident happened, and he lost his life.

The driver relied on the Autopilot system excessively and lost his concentration while driving; he did not take full responsibility on the road. Both the driver and the trailer driver needed to act, and the trailer driver could have avoided the accident. The driver aimed to make full use of Autopilot and benefit from it.

The driver and the trailer driver were the stakeholders. The Tesla driver was watching a video while driving; his action was linked to his entertainment and to his safety, even his life. The trailer driver wanted to cross the road successfully, which is why he was in front of the Tesla Model S. In this situation, the driver should have complied with the rules and been responsible both for himself and for society.

The driver had several options that could have saved his life:

  1. Keep his hands on the steering wheel and focus on driving.
  2. Watch the video after the car has been parked in a safe area.
  3. Watch the video after arriving at the destination.

The trailer driver could have acted:

  1. Double-check the traffic conditions before crossing the road.
  2. Sound the horn to warn the Tesla driver upon realising something was wrong.
  3. Reverse quickly to avoid the Tesla Model S.

The accident happened because of the driver’s carelessness and his failure to notice the change in traffic conditions. The problem could have been prevented easily had he paid enough attention to the road. The behaviour itself was irresponsible and brought negative consequences to both the individual and society; it was therefore wrong and illicit. From another point of view, his behaviour did not follow a maxim that could serve as a universal law for other drivers. What the driver did was aimed at satisfying himself, and at the same time other people were merely means to his end. Moreover, the driver’s behaviour could not be willed as universal legislation. In conclusion, drivers live in a world under the universal law, yet, for lack of autonomy, what this driver did was in fact against that universal law. Therefore, the driver’s fault is indisputable.

46 thoughts on “Who Should Take The Liability For The Crash Caused By Autopilot”

  1. I think in this particular instance there is a clear case for it being the driver’s fault. The Tesla autopilot system was in an experimental phase and the driver had a responsibility both to accept that status and for the overall operation of the vehicle. Most aircraft operate using autopilot, but the pilots continue to actively supervise. This should have been the case here.

    Of course, what is interesting is where this argument goes. Ultimately, driverless vehicles should mean all of their occupants can relax with videos, etc. However, should one occupant actively supervise or have some degree of oversight?

  2. Neither the engineer’s fault nor the driver’s can be ignored. It cannot be said that the driver is totally wrong while the engineer is totally right, because the system is not mature and yet the engineer allowed it to be released. However, the driver should always focus on the situation on the road, as your article also mentions. Therefore, the problem should be analysed dialectically. Of course, the driver’s fault is more serious than the engineer’s.

  3. Both sides bear inescapable responsibility. I think the main question is why a car still in the test phase was on the road at all. What remains is to upgrade the system after the accident.

  4. Definitely, I think this is the engineers’ mistake. Although the direct cause was that the driver was watching a movie, the engineers did not give the user detailed instructions, so the user did not pay enough attention to this problem. Alternatively, the engineers may have intentionally exaggerated their product. That is the cause of the accident.

  5. This must be the driver’s fault. Even if an automatic driving system makes driving easier, the driver cannot play with a mobile phone while driving and ignore the road conditions. The automatic driving system is only an auxiliary means of reducing driving fatigue; it cannot completely replace the driver’s operation and, when necessary, still needs driver intervention.

  6. The driver’s fault cannot be ignored, and neither can the engineers’. The engineers should alert the driver that the system is still in the testing stage and is not one hundred percent guaranteed, and the driver should be told to pay attention to the road even while using Autopilot. The engineers should also develop a system that reminds the driver if he looks away from the road for a while. Still, the driver should always pay attention to the road, as safety is the most important thing not only for himself but also for other drivers, and the system is not sensitive or stable in some particular situations.

  7. The driver’s fault is more obvious than the engineers’. Although the system was released, it was still under testing. Your article says the testing terms had to be confirmed by the driver, which means he already knew he should stay focused while driving. But he didn’t do that; he just let Autopilot take full control of the car, even though the system had warned him. I think the engineers shouldn’t be held responsible for this case, because their fault didn’t cause the accident directly.

  8. Both the driver and the engineers should be liable for this accident. On one hand, as the report pointed out, the Autopilot system failed to recognize the trailer and caused the accident, which means the system is not fit for the road at its current stage. On the other hand, the driver should not take his attention off the road even when the Autopilot system is fully functioning.
    Overall, I think the engineers of the Autopilot system should be accountable for the liability more than the driver.

  9. First of all, I insist that both the engineers and the driver himself should take liability. That said, the engineers should take more responsibility, because this technology is the reason customers buy these cars. The engineers should have perfected it to avoid a tragedy caused by technical deficiency.

  10. Personally, I think the driver should take more responsibility in this case, as he is responsible for his own safety and cannot focus on a video while driving; the system is more of a supplementary tool, and he had been told it was still under test. Besides, the engineers also have a responsibility to develop a mature system that improves quality of life without accidents, and to release it only once it is mature.

  11. At this experimental stage, I would say yes, it is the driver’s fault, since he should have concentrated more on his driving and accepted the risk himself during the experiment.

    However, once autopilot has been fully developed and opened to the public, the company should take the responsibility, as it is their aim to let customers or passengers enjoy their time while driving.

  12. The Autopilot system aims to replace the driver and thereby let him enjoy the whole journey. From that point of view, the driver’s behaviour was right, because it made full use of the system. But the system has not reached that level, meaning Autopilot cannot fully replace the driver’s control, as your article shows. Because of that, the driver’s fault is dominant: he was informed that the system was not fully developed, yet he still let it take full control of the car. Thus the driver’s fault dominates, even though the engineers’ fault exists.

  13. From my point of view, the driver should take full responsibility for the crash. The autopilot is just an auxiliary system; it should help people rather than do the work by itself. Even when it is fully developed, people should still supervise the system.

  14. What I am thinking is: if the autopilot could handle everything, why would we need a driver? A driver needs to be responsible for himself and for other vehicles on the road. The engineers did not design a fully autonomous system; they were limited by the technology. The driver therefore cannot relax his vigilance on the road because of it. A machine cannot do things beyond its ability.

  15. I don’t think it’s Tesla’s responsibility. In fact, our company has a Tesla and I have driven it myself. From my point of view, first of all, Tesla provides clear driving instructions, and the car offers a good driving experience; I think that, apart from this man, no one would watch a movie while driving. Finally, once you take your hands off the steering wheel, your life is no longer under your own control.

  16. It’s clearly the driver’s fault. You cannot expect a program that is not 100% finished to do 100% of your work. This technology is still in progress and should not be relied on in our environment, as there are too many variables. The engineer made a mistake, but he warned you by saying that it is still under beta test.

  17. There is no doubt that the engineers bear responsibility; however, the main fault should lie with the driver. A machine has nothing as sensitive and flexible as the human brain for adapting to sudden situations. Additionally, the system follows its program and cannot always be correct or make the best decision; in other words, the best solution the system gives may not be the most suitable one for the particular situation. Therefore, drivers should not divert their attention from the road. I agree they can relax and hand control to the autopilot, but they need to be aware of the road at all times so that, if an emergency happens, they can react in time.

  18. In my view, both the engineers and the driver should take responsibility for the incident. The engineers need to consider the various situations users might face in reality, including sudden incidents on the road, and test against them to avoid safety issues, while the driver cannot completely trust the Autopilot system either. The driver was supposed to focus on the road rather than the movie.

  19. I think both the engineers and the driver are responsible, but most of the responsibility lies with the driver himself. First of all, drivers need to be responsible for their own safety: when told that the system is still in the testing phase, the driver should not rely on it entirely and ignore his own observation of the road and control of the car. On the other hand, engineers should release mature products and educate consumers about the system; they should not evaluate the system’s performance through consumers’ real-world experience before the safety hazards are resolved.

  20. In this case, I think both the engineers and the driver are responsible for the accident. Engineers should ensure the reliability of their designs before they come into use. Also, autonomous vehicles are designed to give drivers more freedom rather than to be supervised by them. Still, drivers should be aware that Autopilot is in an experimental phase and should not rely fully on the autonomous system.

  21. A professional and attractive article. However, if such an accident actually happens in the future, I don’t think the major responsibility should be taken by either the engineers or the driver; it should be the company that takes responsibility. As an engineer, I understand that all engineers work hard to develop technologies and their applications, and faults and mistakes during the development of a new technology are unavoidable. Fortunately, this technology is still at the testing stage. If the company pushes it onto the market without fully testing and repairing the system, then it should be responsible for any accident.

  22. In my point of view, the driver should take most of the responsibility for the accident, but the company still needs to take secondary responsibility. Let’s say, for example, that ‘driver’ means to drive and to control: anyone sitting in the driver’s seat has the responsibility to control the vehicle. Furthermore, the driver had to accept the terms and conditions in order to use Autopilot, and that acceptance means he knew the autopilot would not answer for the damage; this is the legal side. On the other hand, the company needs to take secondary responsibility because Autopilot is still in beta. Morally speaking, the company must answer for not providing a complete program; a beta program should not be facing the public.

  23. The engineers’ fault: because driving on the roads is by nature unpredictable, I believe the system is a constant hazard at this point in its development and therefore should not have been released.

  24. In fact, it was the engineers’ fault that led to the accident. Nowadays autonomous driving technology is not yet widespread, so mutual recognition between vehicles basically depends on physical instruments or human senses. In the future, if self-driving vehicles become widespread, they will form a huge networked system, and the main way to identify vehicles will be not physical instruments but the Internet of Things. So, in my view, the accident was caused by the immaturity of the entire autopilot system; after all, even precision instruments can be damaged or mistaken.

  25. I think both of them were at fault. Tesla certainly did not provide detailed usage instructions and danger warnings; otherwise drivers would not risk their lives to watch movies. Frankly, this accident is absurd. As for the driver, starting to use the automatic driving function without understanding the driving instructions is tantamount to suicide.

  26. Obviously, both Tesla and the driver were at fault in this traffic accident. But from my personal point of view, the driver has to take most of the responsibility. First of all, there are detailed instructions for the automatic driving system. Second, to use it you must agree to the terms, and the driver still violated the terms after accepting them. When the accident finally occurred, the driver had to bear most of the responsibility, even at the cost of his life.

  27. The autopilot system still needs to develop, which means errors cannot be totally avoided; we need to use it as a supplementary system until it is fully developed. However, the driver didn’t see it that way. There is no doubt that the driver should take the responsibility: the company had told him the system was in testing, so he should have been familiar with the limitations of the immature system.
    I think the engineers did nothing wrong, because road testing of the system is necessary and all they can do is inform drivers before use. I don’t understand why they would need to take liability for the accident.

  28. It is clearly the driver’s fault. Keeping control of the vehicle is the most important thing for a driver; no system, however advanced, can replace him. That is why the air force still uses pilots to perform missions while drones only perform inspections. A machine is still a machine, especially one still under beta test. Conditions on the road are always changing, which makes the driver’s judgement all the more necessary.

  29. As I see it, no accident is caused by a single error; there must be multiple faults in a system for a crash to happen: the Tesla driver, the engineers, and even the driver of the truck. However, in this case I believe the Tesla engineers should take the main responsibility. The reports pointed out that the system failed to recognize the trailer in bright sunlight; the engineers neither found and fixed this deadly defect nor told customers that the system could fail in this situation. Even if the driver had noticed something was wrong and taken action, the accident could not have been avoided, because there was not enough time to respond and brake. In fact, Tesla only describes the risk vaguely in the handbook and has no safety or redundancy equipment to prevent the driver from doing anything irrelevant to driving, such as taking his hands off the steering wheel or watching movies.

  30. From my point of view, the driver should take the major responsibility. The system can be used only if the user agrees to the terms; if they don’t, they cannot use it. That was also the disclaimer provided by the engineers and the company. Meanwhile, the driver was the party who did not comply with the rules. Although the system failure was the direct cause, the driver has to face the consequences. However, the engineers should also take minor responsibility, because the system failure was caused by a system defect.
    Recently, though, Uber’s self-driving car crash happened. For that accident, the engineers should take the whole responsibility, because the car was designed to run without a driver. That means Uber let the system take complete control of the car at all times, so the system should have been mature enough; yet the accident still happened. What I want to say is that it depends on the terms provided by the company and on how the system is meant to be used.
    Coming back to this accident: it is simply an accident caused by dangerous driving.

  31. The responsibility of neither the driver nor the engineers can be ignored. The driver did not operate the vehicle as required: when he was supposed to concentrate, he watched a video and ignored the traffic situation, failing in his duty to himself and being irresponsible towards other people. The engineers did not improve the Autopilot system enough to deal with all possible conditions, which is a secondary factor in the accident.

  32. Obviously, the driver needs to take most of the responsibility. In fact, I can’t understand why someone would entrust his life to an imperfect system, which is no different from suicide. And this behaviour not only caused his own death but also damaged Tesla’s reputation, caused loss of others’ property, and threatened public safety.

  33. The engineers bear no major responsibility for this accident. The system was never meant to be unmanned in the full sense; the specification clearly states that if an emergency occurs, the driver must still take the necessary measures. Therefore, the driver’s negligence was the main cause of the accident.

  34. I think the engineers and Tesla need to bear most of the responsibility. First, objectively speaking, this autopilot system is incomplete and requires the driver to judge whether human operation is needed. Second, the engineers did nothing to force the driver to pay attention to road conditions. Finally, Tesla’s advertisements may have been false propaganda, exaggerating product features and misleading consumers.

  35. I think both of them were at fault. For the engineers, it is a fact that this imperfect autopilot system directly caused the driver’s death. For the company, the driver’s blind faith in the imperfect system, and thus his death, is closely tied to the company’s advertising. As for the driver, he did not carefully read the operating instructions and precautions, so his death was ultimately his own responsibility. However, the accident made the company regulate its advertisements, pushed the engineers to complete the system, and stopped people from trusting the autopilot system blindly. In summary, the accident still had a positive effect on society.

  36. At first sight, the engineers and the Tesla company should be blamed, because they released an immature system. But after studying this case further, I found that they should not be. There is no perfect system in the world; even iOS, the most famous mobile operating system, is imperfect, and continuous updating is the key to preventing mistakes. But look at the driver’s actions: he showed no responsibility, and his behaviour strongly threatened the safety of the whole society. The company never announced that the system could be fully autonomous, so how could he behave like that? He is a driver, not a passenger.

  37. I don’t think the engineers need to be responsible for the crash. Engineers cannot anticipate every situation on the road, and the system needs to learn as well. We can’t say the system failure was caused by the engineers’ fault; they couldn’t have prevented it, though they still need to take minor liability.
    For the driver, it’s different: he could have avoided the crash simply by concentrating on the road, as he was taught in driving school. He did not consider other people and put them in danger, which is a serious problem. He should take the main liability.

  38. Neither the engineers nor the driver can be exempted, but it is clear that the driver needs to take more responsibility. Even with an autopilot system, it is absolutely forbidden to look at a phone with both hands off the steering wheel while driving. This behaviour shows the driver was responsible neither for his own life nor to other drivers on the road. The engineers knew the system was still incomplete, yet placed no limits on how people could use it.

  39. In my opinion, the driver should take most of the responsibility, as the system is designed to let the driver cooperate with the car for a better driving experience and safer driving conditions. As a driver, it was his responsibility to take care of his own safety rather than watch movies and hand all responsibility to the system.

    However, the engineers should keep working hard to build a better system for the car. Personally, I think the current system should highlight that its main function is supportive, and warn users to pay attention to the road and take care of themselves.

  40. This issue is quite controversial, but I think the engineers should take more responsibility for this accident, because their job is to make sure the car is well designed, well manufactured, and fully secure. They didn’t manage that, and it led to this tragedy. On the other hand, the customers are quite innocent: they just trusted what Tesla presented and advocated, and bought the products. In this case, the engineers deserve more of the blame.

  41. I think the engineers should be blamed. Based on the case, the system was not ready for road testing, because it could not detect the oncoming truck. The engineers should have run the system through simulation tests before any real road test. So I think the engineering team should be held accountable for this accident.

  42. To be honest, I think the engineers and the driver should take minor liability, while the Tesla company should take the major liability. The engineers are just the developers, and the driver is just the user. The autopilot system is the car’s selling point, and the company would do anything to promote sales; in other words, the company considers only itself. The engineers are the designers, but it was the company that made the call to ship the immature system. I don’t think we should ignore the company’s fault in this accident.

  43. I think the driver should be blamed. No matter what kind of system was being tested, the driver should obey the regulations as a qualified driver and pay attention to the situation on the road. In this case, the driver would have had time to steer the car to prevent the accident, or at least reduce the damage, but he did nothing and let the Autopilot system run into error and fail to identify the truck.

  44. I think the driver should be blamed. He did not watch the traffic situation or focus on driving (he should not have played with his smartphone while driving), and he overestimated the Autopilot system. He had no sense of social responsibility.

  45. The engineers should have considered most of the failure modes, especially recognition failure, which concerns the most important part of the system. For that reason, the engineers should take most of the liability for this accident. Indeed, in this accident the driver has already paid for the engineers’ fault with his life; therefore, the driver shouldn’t be blamed.
