Tesla is once again facing legal scrutiny after a lawsuit accused the company of exaggerating the capabilities of its Autopilot and Full Self-Driving (FSD) systems, allegedly contributing to a fatal crash. The case, reminiscent of previous legal challenges against the automaker, will force the courts to once again weigh Tesla’s claims about the safety and reliability of its driver-assistance technologies. The question remains: can Tesla overcome this legal hurdle as it did with similar lawsuits in the past?
Mendoza vs Tesla
The latest lawsuit against Tesla centers around a tragic incident in which driver Genesis Giovanni Mendoza Martinez died, and his brother Caleb sustained severe injuries. The accident occurred when Mendoza’s Tesla Model S collided with a fire truck that was parked diagonally across two lanes of a California interstate highway for traffic control due to an unrelated incident.
Filed in California’s Contra Costa Superior Court in October, the lawsuit claims that Mendoza’s Model S was operating under Tesla’s Autopilot system at the time of the crash. The plaintiffs argue that Mendoza had “generally maintained contact with the steering wheel until the time of the crash.” The case was later moved to the US District Court for the Northern District of California after Tesla intervened.
Similar to other lawsuits stemming from fatal or serious accidents involving Tesla vehicles with Autopilot/FSD engaged, Mendoza’s family contends that, while he used the technology cautiously, he was misled by Tesla’s long-running advertising campaign. This campaign, they argue, suggested that Tesla’s vehicles were capable of self-driving far beyond what the technology could actually achieve.
The lawsuit highlights that Mendoza, familiar with the technology’s “Autopilot” branding, had seen or heard various claims made by Tesla and CEO Elon Musk on Twitter, the official Tesla blog, and in the media. These claims led him to believe that Autopilot, with the “full self-driving” upgrade, was safer than a human driver and capable of autonomously navigating highways.
Lawyer blasts Tesla’s response to fatal crash
Tesla has faced numerous lawsuits and regulatory investigations concerning its Autopilot and FSD features, and the argument that the company overstates the capabilities of these systems is far from new. The automaker has been embroiled in so many legal battles over these issues that it’s difficult to keep track.
One ongoing case involves the 2019 death of Jeremy Banner, who crashed his Model 3 into a tractor-trailer crossing into his lane. This incident mirrors the 2016 death of Joshua Brown, whose Model S collided with a similar vehicle. Although Tesla claimed to have addressed the issue after Brown’s fatal crash, the similarities to Banner’s death have raised concerns among regulators about whether the company is doing enough to prevent such accidents.
Mendoza’s fatal crash shares similarities with both Banner’s and Brown’s incidents, as well as with a previous investigation by the U.S. National Highway Traffic Safety Administration (NHTSA). That inquiry found that Tesla’s Autopilot often failed to detect emergency vehicles stopped on the roadside. This led to a voluntary recall and an over-the-air software update, which, according to Mendoza’s family lawyer, was insufficient to prevent accidents of this nature.
“Genesis Mendoza’s death caused by the failure of Tesla’s vision system is yet another example of Tesla overstating and overhyping what its technology can do; knowing full well that it was incapable of identifying and responding to an emergency vehicle flashing lights,” lawyer Brett Schreiber told The Register via email.
Schreiber further criticized Tesla’s previous decision to address the issue with just an over-the-air software update rather than a full recall. He contends that this limited approach left numerous vehicles vulnerable to similar accidents.
“This limited bug fix left tens of thousands of vehicles on the road continuing to suffer from the same defect, putting both Mr. Mendoza, members of the public and emergency first responders needlessly at risk,” Schreiber continued in the email. “The time for Tesla to be held accountable is coming.”
Are the odds in Tesla’s favor?
Will Tesla’s legal luck run out? That remains uncertain. The company has successfully dodged liability in two previous cases that raised claims similar to those in the Mendoza lawsuit. In both cases, plaintiffs argued that Tesla exaggerated the capabilities of its Autopilot and FSD systems, leading drivers to become overly reliant on these technologies based on Tesla’s safety assurances.
In 2019, Justine Hsu filed a case after her Tesla Model S allegedly swerved onto a curb while in Autopilot mode. A jury sided with Tesla, concluding that the vehicle had functioned as expected and that the company had adequately disclosed the system’s limitations. Months later, Tesla also won a case involving a fatal accident where Micah Lee died and two passengers were injured while using Autopilot.
In response to the Mendoza lawsuit, Tesla’s defense has been consistent: the company claims that Autopilot and FSD operated as designed and that the accident was caused by “the negligence, acts or omissions of another party, not Tesla.”
Whether the Mendoza case will break Tesla’s winning streak in these types of lawsuits is yet to be seen. However, there is a precedent: While Tesla won the aforementioned cases, it opted to settle another case earlier this year involving the death of Walter Huang, with the settlement amount kept confidential. Tesla warned that if the payout became public, it could be seen as evidence of potential liability, which could prompt other legal actions against the company.
Both Tesla and Mendoza’s legal teams have requested a trial by jury for this case.