Tesla Autopilot, distracted driving to blame in deadly 2018 crash
Feb 25, 2020
The National Transportation Safety Board said Tuesday that Tesla’s Autopilot driver assistance system was one of the probable causes of a fatal 2018 crash into a concrete barrier. The safety board also said the driver was playing a mobile game while using Autopilot before the crash, and investigators determined he was overly confident in the system’s capabilities.
The safety board arrived at those probable causes after a nearly two-year investigation into the crash. NTSB investigators also named a number of contributing factors, including that the crash attenuator in front of the barrier was damaged and had not been repaired in a timely manner by California’s transportation department, Caltrans. Investigators said Tuesday that had the crash attenuator been replaced, the driver, Walter Huang, likely would have survived.
The NTSB shared its findings at the end of a three-hour-long hearing on Tuesday. During the hearing, board members took issue with Tesla’s approach to mitigating the misuse of Autopilot, the National Highway Traffic Safety Administration’s lax approach to regulating partial automation technology, and Apple — Huang’s employer — for not having a distracted driving policy. (Huang was playing the mobile game on a company-issued iPhone.)
“In this crash we saw an over-reliance on technology, we saw distraction, we saw a lack of policy prohibiting cell phone use while driving, and we saw infrastructure failures, which, when combined, led to this tragic loss,” NTSB chairman Robert Sumwalt said at the end of the hearing on Tuesday. “We urge Tesla to continue to work on improving their Autopilot technology and for NHTSA to fulfill its oversight responsibility to ensure that corrective action is taken where necessary. It’s time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars.”
The investigators’ findings
On March 23rd, 2018, Huang was traveling south on US-101 using Autopilot on his way to work in Mountain View, California. He eventually approached a section of the highway where State Route 85 begins by splitting off to the left of US-101. He was in the leftmost HOV lane thanks to the clean air sticker for which electric vehicle owners are eligible.
As the exit lane for State Route 85 began splitting off to the left, the Autopilot system in Huang’s Model X briefly lost sight of the lines marking his lane. (Investigators showed photos on Tuesday of the worn-down lane markers.) Autopilot then started following the rightmost lane marker of the exit lane, steering Huang’s Model X into the “gore area” that separates the exit lane from the rest of the highway. A second or so later, the Model X smashed into the damaged crash attenuator and the concrete barrier behind it. Huang died a few hours later at a local hospital.
Investigators recapped the crash during Tuesday’s hearing, presenting evidence that the NTSB made public last week. At the end of the presentation, and after a few hours of questions from the five members of the safety board, the team of investigators presented 23 findings and made nine new safety recommendations, in addition to naming the probable causes.
One of the team’s findings was that the crash was made possible, in part, by the limits of Autopilot’s vision-based processing system. Tesla CEO Elon Musk has long argued that autonomous cars don’t need LIDAR (a laser sensor that builds a real-time 3D model of the world), so Autopilot is designed around a system of cameras, along with ultrasonic sensors and a forward-facing radar. That reliance on cameras has limits, investigators said Tuesday, and the way Huang’s car drifted out of the HOV lane is an example of them. In fact, investigators found that Huang’s car had performed the same dangerous maneuver multiple times in the days and weeks before his crash.
In addition, the investigators said that Tesla’s method of making sure drivers are paying attention while using Autopilot — using a torque sensor to measure force on the steering wheel — “did not provide an effective means of monitoring the driver’s level of engagement with the driving task.”
NTSB investigators also found that the Model X’s forward collision warning system didn’t alert Huang to the coming impact, nor did it slow the vehicle down at all; in fact, the Model X sped up before impact because the system thought it was free to resume the 75 mph cruise control speed Huang had set. The NTSB said Tesla had not designed these emergency systems to handle a situation like this one, and it placed some blame on the National Highway Traffic Safety Administration for not requiring companies like Tesla to make those systems work in such a crash.
If Tesla doesn’t add new safeguards that limit the use of Autopilot outside of its advertised applications, investigators wrote, the “risk for future crashes will remain.”
The investigators gave similar weight to distracted driving’s role in Huang’s death. They found that he was playing a mobile game prior to the crash and said that was “likely” why he didn’t try to steer away from the barrier. The team said countermeasures, like limiting distracting features or locking out smartphones entirely, would help lower the rate of crashes tied to distracted driving.
Apple does offer a setting that disables many of a phone’s features while driving, but one NTSB board member, Thomas Chapman, said he was “frankly unaware I had such an option on my phone.”