
Family Of Dead Driver Sues Tesla Over “Misleading” And “Flawed” Autopilot And FSD


  • Tesla is in court again over claims that Autopilot is defective.
  • The family of a deceased driver says Tesla intentionally concealed problems with Autopilot.
  • Tesla responded that the driver failed to use the system properly.

Driving is an inherently dangerous task, but modern driver-assistance features are supposed to make it safer. According to a California family, one such feature, Tesla’s Autopilot, is actually to blame for the death of their loved one. He crashed into a parked fire truck around 4 a.m. on February 18, 2023, and didn’t survive.

Carscoops covered the crash when it first happened. At the time, it was unclear whether the driver was using any semi-autonomous driving features. Now we know that the driver, 31-year-old Genesis Giovanni Mendoza Martinez, had engaged Autopilot when he crashed into the fire truck.


According to the Independent, the complaint filed against Tesla on behalf of the Mendoza family says that Giovanni had been using Autopilot for 12 minutes before the crash. Data reportedly shows that he “generally maintained contact with the steering wheel until the time of the crash.” Why he didn’t see the truck’s flashing emergency lights and slow down or move over is unclear.

In any case, the complaint says that Autopilot itself is flawed and that Tesla “undertook a widespread campaign to conceal thousands of consumer reports about problems with [its] ‘Autopilot’ feature, including crashes, unintended braking, and unintended acceleration.” For its part, Tesla has responded as it has in so many similar cases, arguing that the crash and its consequences “were caused by misuse or improper maintenance of the subject product in a manner not reasonably foreseeable to Tesla.”

There’s no question that this entire situation is sad. We all make mistakes, and sometimes that includes misunderstanding what a product is or isn’t capable of. Tesla could no doubt make it clearer that Autopilot and FSD don’t actually provide genuine Level 5 autonomy, perhaps starting with their names, which may mislead consumers into thinking the cars can drive themselves without any human intervention.

This seems to be the case here, according to the lawsuit: “Not only was he aware that the technology itself was called ‘Autopilot,’ he saw, heard, and/or read many of Tesla or Musk’s deceptive claims on Twitter, Tesla’s official blog, or in the news media,” the complaint states. “Giovanni believed those claims were true, and thus believed the ‘Autopilot’ feature with the ‘full self driving’ upgrade was safer than a human driver, and could be trusted to safely navigate public highways autonomously.”

At the same time, previous legal victories for the automaker make one thing clear: ultimately, the driver is responsible for maintaining control of the vehicle at all times.

As we pointed out in our original coverage, countless such accidents happen every year involving all kinds of vehicles, not just Teslas. It’s why the NHTSA has created campaigns like “Slow Down, Move Over” to remind drivers what they should do when they encounter an emergency vehicle.

Image Credit: Contra Costa FD
