NTSB Cites Driver Errors, Overreliance on Automation, Lack of Safeguards as Causes of Autonomous Vehicle Crash
September 12, 2017

Pete Goldin
ITSdigest

The National Transportation Safety Board (NTSB) determined that a truck driver's failure to yield the right of way and a car driver's inattention due to overreliance on vehicle automation were the probable cause of the fatal May 7, 2016, crash near Williston, Florida, involving an autonomous vehicle.

The NTSB also determined the operational design of the Tesla's vehicle automation permitted the car driver's overreliance on the automation, noting its design allowed prolonged disengagement from the driving task and enabled the driver to use it in ways inconsistent with manufacturer guidance and warnings.

NTSB pointed out that if automated vehicle control systems do not automatically restrict their own operation to conditions for which they were designed and are appropriate, the risk of driver misuse remains.

"While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles," said NTSB Chairman Robert L. Sumwalt III. "Smart people around the world are hard at work to automate driving, but systems available to consumers today, like Tesla's Autopilot' system, are designed to assist drivers with specific tasks in limited environments. These systems require the driver to pay attention all the time and to be able to take over immediately when something goes wrong. System safeguards, that should have prevented the Tesla's driver from using the car's automation system on certain roadways, were lacking and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened."

NTSB found the Tesla's automated vehicle control system was not designed to, and could not, identify the truck crossing the Tesla's path or recognize the impending crash. Therefore, the system did not slow the car, the forward collision warning system did not provide an alert, and the automatic emergency braking did not activate.

NTSB also found that the way the Tesla "Autopilot" system monitored and responded to the driver's interaction with the steering wheel was not an effective method of ensuring driver engagement. Notably, Tesla made design changes to its "Autopilot" system following the crash: the changes reduced the period of time before the system issues a warning when the driver's hands are off the steering wheel and added a preferred road constraint to the alert timing sequence.

The NTSB issued several recommendations including:

■ Event data should be captured and available in standard formats on new vehicles equipped with automated vehicle control systems

■ Manufacturers should incorporate system safeguards that limit the use of automated control systems to the conditions for which they were designed, and should provide a method to verify those safeguards

■ Applications should be developed to more effectively sense a driver's level of engagement and alert the driver when engagement is lacking

■ Manufacturers should report incidents, crashes, and exposure numbers involving vehicles equipped with automated vehicle control systems.

NTSB also reiterated two recommendations issued to the National Highway Traffic Safety Administration in 2013:

■ The need for minimum performance standards for connected vehicle technology for all highway vehicles

■ The need to require installation of the technology, once developed, on all newly manufactured highway vehicles

A statement from the National Safety Council reads: "Labeling cars as 'self-driving' or technologies as 'auto-pilot' implies the driver can abdicate responsibility to the machine. The nomenclature is misleading. In reality, advanced driver assistance technologies can work with us and help mitigate driver error – but they cannot work without us. This crash serves as a stark reminder that cars cannot yet drive themselves. We must understand the systems in our vehicles and how to properly interface with them so they provide the intended safety benefits. Shutting them off because we do not understand them is just as counterproductive as depending on them to make decisions for us. We are decades away from an autonomous fleet. It is critical to remember that the driver still is the car's best safety feature, and humans are the 'self' in 'self-driving.'"