Tesla Receives First Private Lawsuit Over Autopilot Promises

Ever since the American electric-car pioneer began offering "Autopilot," there have been serious incidents. The system's performance is often overestimated. Who is at fault?

In recent years, Tesla vehicles have repeatedly been involved in accidents with fire-department vehicles, as in this case from January 2018 in California.

Culver City Fire Department/Reuters

Once again, the electric-car manufacturer Tesla is under pressure. The issue is once more the Autopilot system, which is supposed to spare Tesla drivers from having to watch the road at all times. Many buyers of the Full Self-Driving (FSD) option believe the system takes over all of the driving.

In their view, the system not only maintains the distance to the car ahead but also handles steering, signaled lane changes, and other functions, so they can safely take their hands off the wheel. Numerous videos of risky maneuvers in which drivers rely entirely on the software are circulating on the internet.

In reality, Tesla's Autopilot can do little more than the driver-assistance systems in other cars, such as lane keeping and maintaining distance from the vehicle ahead. In June, the US National Highway Traffic Safety Administration (NHTSA) expanded an Autopilot investigation into a series of rear-end collisions. At least six accidents involving Tesla vehicles have already been traced back to the use of the Autopilot system, resulting in eleven fatalities. Collisions between Teslas and fire-department vehicles were particularly common.

US authorities are targeting Autopilot

The California Department of Motor Vehicles (DMV) filed a complaint against Tesla in August over false advertising claims related to automated driving. Now the first Tesla owner has filed a private lawsuit accusing the company of tech billionaire Elon Musk of misleading advertising. According to the suit, the manufacturer has suggested since 2016 that its self-driving technology is fully functional or on the verge of being so.

The Tesla owner argues that the software is in fact still immature and unsafe. The promises have "turned out to be wrong again and again." Customers who receive updates to the software are acting as "untrained test engineers."

Tesla emphasizes that customers are always advised that Autopilot and the additional FSD option are assistance systems: the driver must keep their hands on the wheel at all times and be ready to take control of the vehicle. This is also stated in the product description when the customer orders the option.

Mercedes reacted differently

Nevertheless, the systems remain controversial, not least because terms such as "Autopilot" and "Full Self-Driving" sound like fully autonomous driving. In a comparable case in 2016, US consumer organizations accused the German carmaker Mercedes of misleading advertising statements.

An advertisement for the then-new E-Class could have led consumers to believe that the vehicle was capable of driving itself entirely. Mercedes rejected the accusation of misleading advertising but nonetheless withdrew the ad immediately.

Tesla has not followed this example and is sticking with the Autopilot name and the FSD option. The manufacturer has not yet commented on the lawsuits.
