Crash course
The family of a man who died after his Tesla crashed while driving on Autopilot is suing the EV maker, The Independent reports, accusing CEO Elon Musk of making misleading claims about the driver assistance software.
In February 2023, 31-year-old Genesis Giovanni Mendoza-Martinez died after his Model S crashed into a fire truck on the side of a freeway in San Francisco.
According to the complaint, Mendoza-Martinez “did not use the accelerator or brake pedal” during the 12 minutes prior to the crash, while the vehicle was on Autopilot.
Mendoza-Martinez’s family claims the driver bought the vehicle under the mistaken belief that it could drive itself, arguing that Tesla has overstated its vehicles’ ability to drive themselves.
The complaint focuses on pages of online posts written by Musk, alleging that he knowingly misled the public, despite knowing that the software did not and still does not allow Teslas to safely drive themselves.
The automaker has since fired back, arguing that it was the driver’s “own negligent acts and/or omissions” that led to his death, as reported by The Independent.
However, attorney Brett Schreiber, who represents the family, told the newspaper that Tesla improperly used its customers to beta test flawed driver assistance software on public roads, with fatal consequences.
“This is yet another example of Tesla using our public roads to conduct research and development in autonomous driving technology,” he told the newspaper.
Deep impact
Tesla is already the subject of several government investigations into the safety of its so-called “self-driving” software.
Mendoza-Martinez’s crash is already part of an active National Highway Traffic Safety Administration investigation dating back to 2021.
The regulator also found earlier this year that those who used FSD were given a false sense of security and were “not sufficiently involved in the driving task.”
According to NBC, there are at least 15 other similar active cases involving Autopilot or the EV maker’s misleadingly named “Full Self-Driving” (FSD) software – an optional add-on – that have resulted in death or injury.
The California Department of Motor Vehicles has also filed a lawsuit against the automaker, accusing it of false advertising about FSD.
More about Autopilot: Workers training Tesla’s Autopilot say they had to ignore road signs to prevent cars from driving like a “robot”