DETROIT (AP) — A U.S. investigation into Teslas operating on partially automated driving systems that have crashed into parked emergency vehicles has moved a step closer to a recall.
The National Highway Traffic Safety Administration said Thursday that it is upgrading the probe to an engineering analysis, another sign of increased scrutiny of the electric vehicle maker and automated systems that perform at least some driving tasks.
An engineering analysis is the final stage of an investigation, and in most cases NHTSA decides within a year if there should be a recall or the probe should be closed.
Documents posted Thursday by the agency raise serious issues about Tesla's Autopilot system. The agency found that it's being used in areas where its capabilities are limited, and that many drivers aren't taking action to avoid crashes despite warnings from the vehicle.
The agency said it has reports of 16 crashes into emergency vehicles and trucks with warning signs, causing 15 injuries and one death.
The probe now covers 830,000 vehicles, almost everything the Austin, Texas, carmaker has sold in the U.S. since the start of the 2014 model year.
Investigators will evaluate additional data, vehicle performance and “explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks undermining the effectiveness of the driver’s supervision,” the agency said.
A message was left Thursday seeking comment from Tesla.
In the majority of the 16 crashes, the Teslas issued forward collision alerts to the drivers just before impact. Automatic emergency braking intervened to at least slow the cars in about half the cases. On average, Autopilot gave up control of the Teslas less than a second before the crash, NHTSA documents said.
In documents detailing the engineering analysis, NHTSA wrote that it also is looking into crashes involving similar patterns that did not include emergency vehicles or trucks with warning signs.
The agency found that in many cases, drivers had their hands on the steering wheel yet failed to take action to avoid a crash. “This suggests that drivers may be compliant with the driver engagement strategy as designed,” the agency wrote.
Investigators also wrote that a driver’s use or misuse of the driver monitoring system “or operation of a vehicle in an unintended manner does not necessarily preclude a system defect.”
The agency will have to decide if there is a safety defect before pursuing a recall.
In total, the agency looked at 191 crashes but removed 85 of them because other drivers were involved or there wasn't enough information for a definitive assessment. In the remaining 106, the main cause of the crash appears to be running Autopilot in areas where it has limitations, or in conditions that can interfere with its operation: "For example, operation on roadways other than limited access highways, or operation in low traction or visibility environments such as rain, snow or ice."
NHTSA began its inquiry in August of last year after a string of crashes since 2018 in which Teslas using the company’s Autopilot or Traffic Aware Cruise Control systems hit vehicles at scenes where first responders used flashing lights, flares, an illuminated arrow board, or cones warning of hazards.
Copyright © 2022 The Washington Times, LLC.