Self-Driving Cars
Self-driving cars are supposed to be safer than the average drunk, distracted, or texting driver. But if one gets into an accident, who is responsible? The owner? The car company? The programmers?
“Self-driving cars” is an umbrella term that covers a wide range of technology developed to automate the dynamic driving task (DDT), defined as “all of the real-time operational and tactical functions required to operate a vehicle in on-road traffic” (ANSI 2016). Terms such as “self-driving car” and “unmanned car” are often misused and are too vague. Within this broad category, the long-standing Society of Automotive Engineers (SAE) offers a scale of definitions, numbered 0-5 and ranging from “no automation” to “full automation”, which measures “the level of driving automation exhibited in any given instance…[as] determined by the feature(s) that are engaged” (SAE 2018).
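For readers who prefer a compact reference, the SAE taxonomy can be summarised in a few lines of code. The sketch below is illustrative only: the level names follow SAE J3016 (SAE 2018), while the class name and comments are my own paraphrase.

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """The six SAE J3016 levels (names per SAE 2018); comments paraphrase the standard."""
    NO_AUTOMATION = 0           # human driver performs the entire DDT
    DRIVER_ASSISTANCE = 1       # system supports steering OR speed, not both
    PARTIAL_AUTOMATION = 2      # system supports steering AND speed; driver handles OEDR
    CONDITIONAL_AUTOMATION = 3  # system performs the DDT but may ask the driver to take over
    HIGH_AUTOMATION = 4         # system performs the DDT and its own fallback, within a limited domain
    FULL_AUTOMATION = 5         # system performs the DDT everywhere; no driver needed
```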
In the event of an accident, any assessment of the liability of the human in a self-driving car versus that of the manufacturer must take account of the vehicle’s level of automation. Where a vehicle falls under level 2 of the SAE scale (“partial automation”), the human driver remains responsible for the driving task, and liability is apportioned according to the balance between technical failure and human error, just as it would be for any conventional vehicle. Where a vehicle falls under levels 3-4 of the SAE scale (“conditional driving automation” and “high driving automation”), the manufacturer is subject to liability for accidents even when a driver is present. As the SAE has stated about levels 3-4, “you are not driving when these features are engaged, even when you are in the driver’s seat” (SAE 2018).
In the extreme case of level 5 automation (“full automation”), responsibility for accidents rests entirely with the manufacturer, because the driver has no opportunity to intervene. At this stage, “an automated driving system performs all aspects of the dynamic driving task in all driving modes even if the human driver will… not respond appropriately to a request to intervene” (Barabás et al. 2017). An instructive parallel is Boeing being held responsible for the two recent crashes of its 737 Max aircraft (German 2020). According to CNET, a report into these crashes said, “The MCAS [manoeuvring characteristics augmentation system] function was not a fail-safe design and did not include redundancy.” Though these planes are not motor cars, the MCAS is an example of a fully automated piloting feature; when something went wrong, the fault lay not with the pilots, nor with the victims, but with the manufacturer. There was an expectation that the system would perform in a predictable manner, which it failed to do, and no intervention could have prevented these accidents. The same reasoning applies to any self-driving system operating under “full automation” or “high driving automation” (SAE 2018).
Though conditional driving automation does allow the driver to intervene when necessary, the fallback systems of these vehicles are responsible for acting accordingly. In the early rollout of Tesla’s Autopilot feature, the driver was alerted to touch the steering wheel when the car detected no interaction for a prolonged period. Accidents still occurred, however, and Tesla became subject to legal action. Tesla subsequently added stronger safeguards to Autopilot, which bring the vehicle to a halt or pull it over if the driver fails to engage, consistent with its level 3 obligations (Barabás et al. 2017). Phases of the DDT in which the vehicle takes full control are the responsibility of the manufacturer no matter what the driver does. However, if the vehicle indicates that the driver must drive, responsibility for anything that happens afterwards falls to the driver, not the manufacturer (SAE 2018).
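To make the escalation logic concrete, here is a minimal sketch of a driver-engagement fallback loop of the kind described above. It is emphatically not Tesla’s actual implementation: the vehicle interface, method names, and thresholds are all invented for illustration.

```python
import time

# Hypothetical thresholds; a real system would tune these to its design domain.
ALERT_AFTER_S = 30   # seconds without wheel contact before warning the driver
STOP_AFTER_S = 60    # seconds without wheel contact before a controlled stop

def monitor_driver_engagement(vehicle):
    """Escalating fallback loop of the kind described above.

    `vehicle` is a stand-in interface: autopilot_engaged(),
    seconds_since_wheel_touch(), alert_driver(), and begin_controlled_stop()
    are invented names for this sketch, not a real vehicle API.
    """
    while vehicle.autopilot_engaged():
        idle = vehicle.seconds_since_wheel_touch()
        if idle >= STOP_AFTER_S:
            # Level 3 fallback: the system, not the driver, must bring
            # the vehicle to a minimal-risk condition (pull over or halt).
            vehicle.begin_controlled_stop()
            break
        if idle >= ALERT_AFTER_S:
            vehicle.alert_driver("Apply slight force to the steering wheel")
        time.sleep(1.0)  # poll roughly once per second
```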
At SAE level 2 automation, the driver is responsible for accidents that occur while driver-support features are enabled. Examples include lane-keeping and adaptive cruise control, during which the driver retains responsibility for object and event detection and response (OEDR). At level 2, unlike levels 3-5 of the SAE scale, OEDR remains the driver’s responsibility in an accident (SAE 2018).
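The level-by-level rules set out in the preceding paragraphs amount to a simple decision procedure, sketched below. This is a schematic summary of the essay’s argument, not a statement of law; the function and enum names are invented for illustration.

```python
from enum import Enum

class Liable(Enum):
    DRIVER = "driver"
    MANUFACTURER = "manufacturer"
    SHARED = "balance of technical failure and human error"

def liable_party(level: int, takeover_requested: bool = False) -> Liable:
    """Map an SAE level (0-5) to the liable party, following the argument above.

    `takeover_requested` models the level 3-4 case in which the vehicle
    has told the driver to resume the DDT (SAE 2018).
    """
    if level == 5:
        return Liable.MANUFACTURER              # driver cannot intervene
    if level in (3, 4):
        # Responsibility transfers once the system requests a takeover.
        return Liable.DRIVER if takeover_requested else Liable.MANUFACTURER
    return Liable.SHARED                        # levels 0-2: as in a conventional car

# Examples:
#   liable_party(5)                           -> Liable.MANUFACTURER
#   liable_party(3, takeover_requested=True)  -> Liable.DRIVER
#   liable_party(2)                           -> Liable.SHARED
```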
When considering liability for accidents involving self-driving vehicles, it is important to remember that the parties involved are not just the manufacturer and the driver; crashes can also occur because of faults in the programming. Technology companies that create autonomous driving software may build their tools alone or in partnership with large car manufacturers (Volvo 2020). However, car manufacturers are ultimately responsible for the testing of their vehicles, so programmers, whether individually or as a group, should not be held liable for any accidents. Technology companies may roll out their final solutions within a finished vehicle, but it is the car manufacturer that typically accepts liability when accidents occur during “self-driving” tasks, as it should (Iozzio 2016).
In conclusion, the term “self-driving” casts a wide net, covering a variety of levels of automation, and liability for an accident varies accordingly. Where a vehicle promises complete automation, the car manufacturer is responsible for any accidents: where there are lapses in the performance of a level 5 “self-driving” feature, the driver cannot be held responsible. At levels 3-4 the manufacturer is partially subject to liability even when a driver is present, but if the vehicle indicates that the driver must drive, responsibility transfers to the driver. At levels 2 and below, liability for an accident depends, as it would in any standard motor vehicle, on the balance of technical and human error. It is therefore necessary to break down the term “self-driving cars” to fully understand the range of systems it covers, and the corresponding range of liabilities in the event of an accident.
References
American National Standards Institute (ANSI). 2016. “Defining Self-Driving Car and Automated Vehicle Systems in SAE J 3016.” Accessed October 2, 2020. https://blog.ansi.org/2016/11/defining-automated-vehicle-system-sae-j-3016/#gref
Barabás, I., A. Todoruț, N. Cordoș, and A. Molea. 2017. “Current Challenges in Autonomous Driving.” IOP Conference Series: Materials Science and Engineering 252: 012096. https://doi.org/10.1088/1757-899X/252/1/012096
German, Kent. 2020. “FAA, EU Finish Boeing 737 Max Recertification Flights.” CNET, September 16, 2020. Accessed October 3, 2020. https://www.cnet.com/news/boeing-737-max-8-all-about-the-aircraft-flight-ban-and-investigations/
Iozzio, Corinne. 2016. “Who’s Responsible When a Self-Driving Car Crashes?” Scientific American, May 1, 2016. Accessed October 4, 2020. https://www.scientificamerican.com/article/who-s-responsible-when-a-self-driving-car-crashes/
Society of Automotive Engineers (SAE). 2018. “Taxonomy and Definitions for Terms Related to Driving Automation Systems for On-Road Motor Vehicles.” Accessed October 1, 2020. https://www.sae.org/standards/content/j3016_201806/
Volvo. 2020. “Volvo Cars and Veoneer Complete Divide of Zenuity.” July 2, 2020. Accessed October 4, 2020. https://www.media.volvocars.com/global/en-gb/media/pressreleases/269593/volvo-cars-and-veoneer-complete-divide-of-zenuity