The American manufacturer is recalling some two million vehicles already on the road because of a flaw in its "Autopilot" system that could increase the risk of collision.
New blow for Tesla: the American electric car manufacturer has initiated a recall in the United States of some two million vehicles over an increased risk of collision linked to "Autopilot", its controversial driving assistance system.
At the end of a two-year investigation, the US National Highway Traffic Safety Administration (NHTSA) announced its conclusions in a letter addressed to the manufacturer on Tuesday. It indicates that in certain circumstances, the assisted driving function of Tesla vehicles may lend itself to misuse, leading to an increased risk of collision.
Specifically, the investigation found that the design of the system is likely to result in "inadequate driver engagement and usage controls," "which may lead to improper use of the system," an NHTSA spokesperson said in an email to AFP on Wednesday.
If a driver uses driver assistance incorrectly, in poor conditions, or fails to recognize whether the function is activated, the risk of an accident could be higher, explains the NHTSA.
For its part, Tesla acknowledged in its recall report that the controls put in place on its Autopilot system "may not be sufficient to prevent misuse by the driver," again according to the authority's email.
Which models are affected?
Affected vehicles are certain Model S cars produced between 2012 and 2023 and equipped with the system, all Model X vehicles produced between 2016 and 2023, all Model 3 vehicles produced between 2017 and 2023, and all Model Y vehicles produced between 2020 and 2023.
They will receive an over-the-air update, which was expected to start rolling out from December 12, 2023.
This is not the first time that "Autopilot", Tesla's assisted driving system, has come under scrutiny. Tesla has been offering assisted driving on all its new cars for several years. At its core, the system can adapt the car's speed to surrounding traffic and keep it in its lane. In all cases, the driver must remain vigilant, with hands on the steering wheel, Tesla specifies on its website.
The manufacturer also offers and tests more advanced options such as lane changes, parking assistance and recognition of traffic lights, bundled depending on the country into the "Enhanced Autopilot" or "Full Self-Driving capability" packages.
Numerous accidents
But the software has been accused by many industry players and experts of giving drivers the false impression that the car is driving itself, with the risk of causing potentially serious accidents.
In early November, Tesla won a first round over the role of its "Autopilot" in a fatal 2019 accident near Los Angeles. In that case, the jury found that the driving assistance system had no manufacturing defect. Another case concerning the role of this assistance system in another fatal accident is expected to go to trial next year.
The NHTSA, for its part, began an evaluation process in 2021 to investigate eleven incidents involving stationary first responder vehicles and Tesla vehicles with the assisted driving system activated.
As a result, and "without agreeing with the analysis" of the NHTSA, Tesla decided on December 5 to issue "a recall for a software update," the road safety authority explains.
The update will in particular add alerts encouraging drivers to maintain control of their vehicle, "which involves keeping their hands on the wheel," the authority notes.
The group has already carried out several recalls in the United States over the past year to remotely fix potentially problematic software. At the beginning of 2022, Tesla had to deactivate an option that, under certain conditions, allowed cars to roll through stop signs without coming to a complete halt.
The manufacturer, which took in $81.5 billion in revenue last year, confirmed in October that it plans to produce 1.8 million vehicles in 2023.