New blow for Tesla: the American electric car manufacturer has initiated a recall in the United States of some two million vehicles over an increased risk of collision linked to “Autopilot,” its controversial driver assistance system.
At the end of a two-year investigation, the National Highway Traffic Safety Administration (NHTSA) announced its conclusions in a letter addressed to the manufacturer on Tuesday.
It indicates that in certain circumstances, the assisted driving function of Tesla vehicles may lend itself to misuse, leading to an increased risk of collision.
Specifically, the investigation found that the design of the system is likely to result in “inadequate driver engagement and usage controls,” “which may lead to improper use of the system,” an NHTSA spokesperson said in an email to AFP on Wednesday.
If a driver uses driver assistance incorrectly, in poor conditions, or fails to recognize whether the function is activated, the risk of an accident could be higher, explains the NHTSA.
For its part, Tesla acknowledged in its information report that the controls put in place on its autopilot system “may not be sufficient to prevent misuse by the driver,” according to the authority’s email.
The recall covers Model Y, S, 3 and X vehicles produced between October 5, 2012 and December 7, 2023. More specifically, it concerns certain Model S vehicles produced between 2012 and 2023 and equipped with the system, all Model X vehicles produced between 2016 and 2023, all Model 3 vehicles produced between 2017 and 2023, and all Model Y vehicles produced since 2020.
They will receive an over-the-air update, which was expected to start rolling out from December 12, 2023.
This is not the first time that Autopilot, which Tesla has offered on all its new cars for several years, has come under scrutiny.
At its core, the system can adapt speed to traffic and keep the car in its lane. In all cases, the driver must remain vigilant, with their hands on the steering wheel, Tesla specifies on its website.
The manufacturer also offers and tests more advanced options such as lane changes, parking assistance and recognition of traffic lights, bundled depending on the country into the “Enhanced Autopilot” or “Full Self-Driving Capability” packages.
Several accidents
But the software has been accused by many industry players and experts of giving drivers the false impression that the car is driving itself, with the risk of causing potentially serious accidents.
At the beginning of November, Tesla won a first round over the role of its Autopilot in a fatal accident near Los Angeles in 2019. In that case, the jury found that the driver assistance system did not present any manufacturing defect. Another case involving the system’s role in a separate fatal crash is expected to go to trial next year.
The NHTSA, for its part, began an evaluation process in 2021 to investigate eleven incidents involving stationary first responder vehicles and Tesla vehicles with the assisted driving system activated.
Consequently, and “without agreeing with the analysis” of the NHTSA, Tesla decided on December 5 to initiate “a recall for a software update,” explains the highway authority.
This will notably add additional alerts to encourage drivers to maintain control of their vehicle, “which involves keeping their hands on the wheel,” notes the authority.
The group carried out several recalls in the United States last year to remotely modify potentially problematic software. At the start of 2022, Tesla had to deactivate an option that allowed cars to roll through stop signs without coming to a complete halt under certain conditions.
The manufacturer, which brought in US$81.5 billion in sales last year, confirmed in October that it plans to produce 1.8 million vehicles in 2023.