Controversial Autopilot: Tesla forced to recall two million vehicles


New blow for Tesla: the American electric car manufacturer has initiated a recall in the United States and Canada of some two million vehicles over an increased risk of collision linked to “Autopilot”, its controversial driver-assistance system. After a two-year investigation, the US highway safety regulator, the National Highway Traffic Safety Administration (NHTSA), indicates that in certain circumstances the assisted driving function of Tesla vehicles may lend itself to misuse, increasing the risk of collision.

A remote update

The investigation found that the design of the system is likely to result in “inadequate driver engagement and usage controls”, “which can lead to misuse of the system”, an NHTSA spokesperson told AFP on Wednesday. If a driver uses the assistance incorrectly, in poor conditions, or fails to recognize whether the function is activated, the risk of an accident could be higher, explains the NHTSA, whose conclusions were sent to the manufacturer in a letter on Tuesday.

For its part, Tesla acknowledged in its recall report that the controls put in place on its Autopilot system “may not be sufficient to prevent misuse by the driver”, according to the authority’s letter. In Canada, authorities said about 193,000 vehicles were also recalled for the same reasons. The vehicles affected include certain Model S cars produced between 2012 and 2023 and equipped with the system, along with other models.

They will receive an over-the-air update, which was to begin rolling out from December 12, 2023. This is not the first time that “Autopilot”, Tesla’s assisted driving system, has come under scrutiny. Tesla has offered assisted driving on all its new cars for several years. At its core, the system can adapt speed to traffic and keep the car in its lane. In all cases, the driver must remain vigilant, with their hands on the steering wheel, Tesla specifies on its website.

The manufacturer also offers and tests more advanced options such as lane changing, parking assistance and traffic-light recognition, bundled depending on the country into the “Enhanced Autopilot” or “Full Self-Driving Capability” packages.

False impression that the car drives itself

But the software has been accused by many industry players and experts of giving drivers the false impression that the car is driving itself, with the risk of causing potentially serious accidents. At the beginning of November, Tesla won a first round over the role of Autopilot in a fatal 2019 accident near Los Angeles: the jury found that the driver-assistance system had no manufacturing defect. Another case concerning the role of this assistance system, in another fatal accident, is expected to go to trial next year.

The NHTSA opened an investigation in 2021 into 11 incidents in which Tesla vehicles with the driver-assistance system activated struck stationary first-responder vehicles. Consequently, and “without agreeing with the analysis” of the NHTSA, Tesla decided on December 5 to initiate “a recall for a software update”, the highway authority explains. The update will notably add alerts to encourage drivers to maintain control of their vehicle, “which involves keeping their hands on the wheel,” the authority notes.

This recall is “something of a setback for Tesla because it confirms that the technology that is the strength of its vehicles has problems. Even if Tesla disputes some of the NHTSA’s claims, it will raise questions in the minds of consumers,” estimated Neil Saunders, analyst at GlobalData, in a note. However, since “the problems can be resolved via a software update, this is not a financial disaster for Tesla and the problems should be quickly resolved,” he added.
