A new fatal accident raises questions, and road safety authorities are getting involved


The National Highway Traffic Safety Administration (NHTSA) is opening a new investigation into Tesla’s Autopilot semi-autonomous driving system after a Tesla crashed into a fire truck, killing the driver. This is not the first time the manufacturer has had to answer to the authorities on this subject.

Is Autopilot really as reliable as Tesla wants us to believe? The question is a fair one, because for several years now, incidents involving the semi-autonomous driving system, which arrived on the Model S in 2015, have been piling up. The recent accident on February 18 in Contra Costa County, California, doesn’t help matters.

A new investigation

A few days ago, firefighters were responding to a road accident, with their truck parked across the road to block access and secure the area, when a Tesla crashed into the rescue vehicle, instantly killing its driver. The passenger was extricated and taken to hospital, and four firefighters were also transported for evaluation.

While nothing yet indicates that Autopilot was engaged at the time of the accident, the NHTSA (National Highway Traffic Safety Administration), the United States’ road-safety regulator, still wants explanations from Tesla. And for good reason: over the past two years, the agency has already been investigating a dozen accidents in which the brand’s electric cars collided with emergency vehicles.

Slow down and move over when approaching emergency vehicles. Truck 1 was struck by a Tesla while blocking I-680 lanes from a previous accident. Driver pronounced dead on-scene; passenger was extricated & transported to hospital. Four firefighters also transported for evaluation. pic.twitter.com/YCGn8We1bK

— Con Fire PIO (@ContraCostaFire) February 18, 2023

Indeed, the level 2 semi-autonomous driving system reportedly tends to deactivate when approaching accident scenes, or simply fails to detect fire trucks parked on the side of the highway.

Although Tesla deployed an over-the-air update in 2021 that was supposed to correct the problem, as Automotive News recalled, it appears not to have fully worked. Caution is warranted, however: the police do not yet know whether Autopilot was active at the time of the accident.

Remember that the manufacturer decided to upgrade its Autopilot by removing the sensors so that it relies solely on cameras, a system called Tesla Vision that the brand considers more efficient.

While the manufacturer has claimed that its semi-autonomous driving system prevents roughly 40 accidents per day, the NHTSA does not entirely share that view and has launched several investigations. The agency has just ordered the recall of more than 362,000 of the brand’s cars over an Autopilot flaw that could cause vehicles to violate traffic laws and behave dangerously.

Many bugs

Among these issues is phantom braking, well known to Tesla owners, in which the car suddenly brakes hard for no apparent reason. The phenomenon is not unique to the brand’s vehicles, but is reportedly common on them; so much so that some customers alerted the authorities, even as Elon Musk claimed to have fixed the problem. It is another blow to the brand’s reputation, recently damaged by the revelations of an engineer who claimed that the 2016 video promoting Autopilot was staged.

In April 2021, we also showed that the figures Tesla published on the effectiveness of its autonomous driving system were in fact somewhat biased. A few months ago, the American justice system also accused Elon Musk of misleading his customers about Autopilot’s real capabilities, after he said in 2016 that “the person in the driver’s seat is only there for legal reasons. They are not doing anything. The car is driving itself.”

That said, the latest data suggests that Teslas are safer than other cars, with a lower risk of accidents.

Autopilot navigation on Tesla Model 3 // Source: Bob Jouy for Frandroid

In June 2022, the threat of a ban on the system hung over the brand, as the NHTSA opened an investigation following an accident involving one of its cars whose Autopilot had reportedly deactivated one second before impact. Could this be a way for the company to absolve itself, by claiming that its system was not active at the moment of impact and that responsibility therefore lies with the driver?

Be that as it may, and while the system is far from perfect, it is best to wait for the results of the various investigations before passing judgment. A few weeks ago, however, during the Super Bowl, the advocacy group The Dawn Project broadcast an ad against Tesla, accusing Autopilot of causing accidents. The same group had previously claimed that the system hits children, a claim disputed by several experts.

The manufacturer is currently preparing a new version of its onboard computer, Hardware 4, which will debut on the Cybertruck and could make autonomous driving, in particular FSD (Full Self-Driving), more efficient and safer.
