DETROIT (AP) — A design flaw in Tesla’s Autopilot semi-autonomous driving system and driver inattention combined to cause a Model S electric car to slam into a firetruck parked along a California freeway, a government investigation has found.
The National Transportation Safety Board determined that the driver was overly reliant on the system and that Autopilot’s design let him disengage from driving.
The agency released a brief report Wednesday that outlined the probable cause of the January 2018 crash in the high occupancy vehicle lane of Interstate 405 in Culver City near Los Angeles.
The findings raise questions about the effectiveness of Autopilot, which was engaged but failed to brake in the Culver City crash and three others in which drivers were killed since 2016.
No one was hurt in the I-405 crash involving a 2014 Tesla Model S that was traveling 31 mph at the time of impact, according to the report.
The crash occurred after a larger vehicle ahead of the Tesla, which the driver described as an SUV or pickup truck, moved out of the lane. The Tesla then hit the firetruck, which had been parked with its emergency lights flashing while firefighters handled an earlier crash.
The probable cause of the rear-end crash was the driver’s lack of response to the firetruck “due to inattention and overreliance on the vehicle’s advanced driver assistance system; the Tesla Autopilot design, which permitted the driver to disengage from the driving task, and the driver’s use of the system in ways inconsistent with guidance and warnings from the manufacturer,” the NTSB wrote in the report.
Tesla has said repeatedly that the semi-autonomous system is designed to assist drivers, who must pay attention and be ready to intervene at all times. The company says Teslas with Autopilot are safer than vehicles without it, and that the system does not prevent all crashes.
CEO Elon Musk has promised a fully autonomous system next year using the same sensors as current Teslas, but with a more powerful computer and software. Current Teslas have more sensors than the 2014 model in the crash.
The report says the Tesla’s automatic emergency braking did not activate, and there was no braking from the driver, a 47-year-old man commuting to Los Angeles from his home in Woodland Hills. Also, the driver’s hands were not detected on the wheel in the moments leading up to the crash, the report said.
Cellphone data showed the driver was not using his phone to talk or text in the minutes leading up to the crash, but the NTSB could not determine if any apps were being used.
A statement from a driver in a nearby vehicle provided by Tesla said the driver appeared to be looking down at a cellphone or other device before the crash.
The NTSB’s finding is another black mark against the Autopilot system, which was activated in three fatal crashes in the U.S., including two in Florida and one in Silicon Valley.
In the Florida crashes, one in 2016 and another in March of this year, the system failed to brake for a semi turning in front of the Teslas, and the vehicles went under the turning trailers. In the other fatality, in Mountain View, California, in March of 2018, Autopilot accelerated just before the Model X SUV crashed into a freeway barrier, killing its driver, the NTSB found.
The NTSB investigates highway crashes and makes safety recommendations largely to another federal agency, the National Highway Traffic Safety Administration, which has the power to seek recalls and make regulations.
David Friedman, a former acting NHTSA administrator who now is vice president of advocacy at Consumer Reports, said Tesla has known for years that its system allows drivers to not pay attention, yet it hasn’t taken the problem seriously.
Autopilot can steer a car in its lane, change lanes with driver permission, keep a safe distance from vehicles ahead of it and automatically brake to avoid a crash.
Some drivers will always rely too much on driver assist systems, and the system must be programmed to handle that, Friedman said. Autopilot, he said, gives drivers a warning if it doesn’t detect torque on the steering wheel at varying intervals. But unlike a similar system from General Motors, it does not watch the driver’s eyes to make sure he or she is paying attention, Friedman said.
“It’s unrealistic to try to train people for automation,” Friedman said. “You’ve got to train automation for people.”
Tesla’s sensors were unable to see the side of an 18-wheeler in previous crashes, he said. “Is it that shocking that it can’t see a firetruck? We’ve known about this for at least three years,” said Friedman, who is calling on NHTSA to declare Autopilot defective and force Tesla to recall it so it keeps drivers engaged.
The Center for Auto Safety, another advocacy group, also called for a recall.
“Put simply, a vehicle that enables a driver to not pay attention, or fall asleep, while accelerating into a parked fire truck is defective and dangerous,” the group said in a statement. “Any company that encourages such behavior should be held responsible, and any agency that fails to act bears equal responsibility for the next fatal incident.”
NHTSA said it will review the NTSB report “and will not hesitate to act if NHTSA identifies a safety-related defect.”
Tesla said in a statement Wednesday that Autopilot repeatedly reminds drivers to remain attentive and prohibits use of the system when warnings are ignored.
“Since this incident occurred, we have made updates to our system including adjusting the time intervals between hands-on warnings and the conditions under which they’re activated,” the statement said. Tesla said the frequency of the warnings varies based on speed, acceleration, surrounding traffic and other factors.
In the Culver City crash, the larger vehicle ahead of the Tesla changed lanes three to four seconds before the crash, revealing the parked fire truck, the NTSB said.
“The system was unable to immediately detect the hazard and accelerated the Tesla toward the stationary truck,” the report said. The system did spot the firetruck and issued a collision warning to the driver just under a half-second before impact — too late for a driver to act, the agency wrote.
The NTSB found that the system has difficulty assessing a stationary vehicle in the Tesla’s field of view as a threat and braking for it. It said that detecting stationary objects is a challenge for all manufacturers of driver-assist systems.