Tesla on Autopilot crashing into stationary vehicles: is a recall needed?


The love story between Tesla and NHTSA, the US federal agency responsible for highway safety, continues at full speed: in recent years the two have been in close contact, with particular attention paid to the autonomous driving systems offered by Tesla, Autopilot and Full Self-Driving.

Tesla Autopilot is the system at the center of the latest NHTSA investigation, opened last August: following 16 separate crashes involving Teslas and vehicles stopped on the roadside, the agency launched a 'Preliminary Evaluation', an initial assessment meant to establish whether a recall or an engineering analysis of the system is necessary.

"Improper use of the systems available on the vehicle, or improper use of the vehicle itself, does not preclude the existence of a system defect. These level 2 ADAS systems must take into account and foresee a whole series of behaviors, even unexpected or involuntary, in all the moments in which the system is active. " NHTSA writes.

According to the agency, automakers and the designers of autonomous driving systems have the important task of finding ways to keep the person in the driver's seat attentive, even while the car is effectively driving itself.

NHTSA's engineering analysis usually takes a year, during which as much data and evidence as possible is collected; once the analysis is concluded there are two possible outcomes: either the agency determines there is no defect to address, or the manufacturer is obliged to launch a recall campaign.

NHTSA deepens its probe into Tesla collisions with stationary emergency vehicles

The National Highway Traffic Safety Administration (NHTSA) has deepened (PDF) its investigation into a series of Tesla crashes involving first responders to an engineering analysis. As The Washington Post explains, that's the last stage of an investigation, and the agency typically decides within a year if a vehicle should be recalled or if the probe should be closed. In addition to upgrading the probe's status, the investigation now covers 830,000 units, or almost all the Tesla Model Y, Model X, Model S and Model 3 vehicles the company has sold since 2014.


This development expands upon the investigation the NHTSA initiated back in 2021 following 11 collisions between Tesla vehicles and parked first-responder vehicles and trucks. Since then, the agency has identified and added six more incidents that occurred over the past couple of years. In most of those crashes, Autopilot gave up vehicle control less than one second before impact, though Automatic Emergency Braking intervened in at least half of them.


The NHTSA also found that the first responders on the road would've been visible to the drivers an average of eight seconds before impact. Plus, forensic data showed no driver took evasive action between 2 and 5 seconds prior to impact, even though they all had their hands on the wheel. Apparently, nine of the 11 vehicles originally involved in the investigation exhibited no driver engagement visual or chime alerts until the last minute before the collision. Four of them didn't exhibit any engagement visual or chime alert at all.


The NHTSA also looked into 191 crashes not limited to incidents involving first responders. In 53 of those collisions, the agency found that the driver was 'insufficiently responsive', as evidenced by their failure to intervene when needed. All of this suggests that while drivers are complying with Tesla's instructions to keep their hands on the wheel at all times, they're not necessarily paying attention to their environment.


That said, the NHTSA noted in its report that 'a driver's use or misuse of vehicle components, or operation of a vehicle in an unintended manner does not necessarily preclude a system defect.' As University of South Carolina law professor Bryant Walker Smith told The Post, monitoring the position of a driver's hands isn't effective enough, because it doesn't ensure a driver's capability to respond to what they encounter on the road. 


In addition, the NHTSA noted that the way a driver may interact with the system is an important design consideration for Level 2 autonomous driving technologies. These systems still aren't fully autonomous and still mostly depend on the human driver, after all. 'As such, ensuring the system facilitates the driver's effective performance of this supervisory driving task presents an important safety consideration,' the agency wrote.

