Elon Musk and Tesla are under the spotlight once again after the tragic death of 31-year-old Genesis Giovanni Mendoza Martinez, whose Tesla, operating in Autopilot mode, collided with a firetruck. Mendoza's family, heartbroken and seeking justice, has filed a lawsuit against Tesla, claiming the company's self-driving technology was at fault. This high-profile case raises serious questions about the safety and reliability of Tesla's autonomous features.
Elon Musk has been named in the lawsuit. Credit: Jared Siskin / Getty
The Tragic Accident That Claimed a Life
On February 18, 2023, Genesis Mendoza was driving his Tesla when the car, allegedly in Autopilot mode, struck a firetruck at high speed. The crash was catastrophic, resulting in Mendoza's untimely death and injuring his brother Caleb, who was a passenger. Four firefighters also suffered minor injuries in the accident.
The Mendoza family believes this tragedy could have been avoided and is now holding Tesla and CEO Elon Musk accountable. Their lawsuit argues that Tesla’s self-driving technology is dangerously flawed and improperly marketed as safer than it actually is.
The Lawsuit Against Tesla
The family’s attorney, Brett Schreiber, has labeled the incident “entirely preventable” and accuses Tesla of using public roads as testing grounds for its autonomous technology. In a statement, Schreiber said, “Tesla knows that many of its earlier model vehicles continue to drive our roadways today with this same defect, putting first responders and the public at risk.”
The lawsuit claims that Mendoza purchased the vehicle based on Tesla's marketing of its Autopilot and Full Self-Driving features, which are portrayed as cutting-edge advancements in vehicle safety. According to the family, these representations gave Mendoza a false sense of security, leading him to trust the technology more than he should have.
“Giovanni believed Tesla’s claims and trusted the autopilot feature to safely navigate highways autonomously,” the lawsuit states. The family is now seeking justice and accountability for what they consider to be deceptive practices by Tesla.
Tesla’s Defense and Position
Tesla has responded to the lawsuit by denying liability. The company asserts that its vehicles are designed with "reasonably safe" systems under applicable state laws and suggests that the crash may have been partially caused by Mendoza's own negligence.
In court filings, Tesla stated, "No additional warnings would have, or could have, prevented the alleged incident." The company has consistently maintained that its Autopilot system requires a fully attentive driver, ready to take control at any moment, a point outlined in its user manuals and on its website.
Details of the Crash
According to the lawsuit, the Tesla was in Autopilot mode for 12 minutes leading up to the crash and was traveling at an average speed of 71 mph. The collision not only resulted in Mendoza's death but also highlighted the potential risks associated with Tesla's Autopilot technology.
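If those figures hold, that works out to roughly 71 × (12/60) ≈ 14 miles driven under Autopilot before the impact.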
This case is not an isolated incident. Between 2015 and 2022, Tesla owners reported over 1,000 crashes and more than 1,500 complaints of sudden, unintended braking while using Autopilot, according to reports cited by the Los Angeles Times. These numbers fuel the growing scrutiny surrounding Tesla's self-driving systems.
Slow down and move over when approaching emergency vehicles. Truck 1 was struck by a Tesla while blocking I-680 lanes from a previous accident. Driver pronounced dead on-scene; passenger was extricated & transported to hospital. Four firefighters also transported for evaluation. pic.twitter.com/YCGn8We1bK
— Con Fire PIO (@ContraCostaFire) February 18, 2023
What Is Tesla’s Autopilot System?
Tesla’s official website describes its autopilot feature as an “advanced driver assistance system that enhances safety and convenience.” The company states that the system, when used properly, reduces the driver’s workload. However, Tesla emphasizes that the feature does not make the vehicle fully autonomous. Drivers are expected to remain engaged, with hands on the wheel and eyes on the road at all times.
This messaging has drawn criticism for being contradictory: while Tesla markets the feature as a step toward autonomous driving, the fine print clarifies that it still relies heavily on human intervention. That mixed message has created confusion among users, some of whom may overestimate the system's capabilities.
Federal Criticism of Tesla’s Self-Driving Claims
The fatal crash involving Mendoza has reignited criticism from federal officials, including Transportation Secretary Pete Buttigieg, who has been vocal about the risks of Tesla's Autopilot technology, particularly what he sees as its misleading branding.
“I don’t think something should be called ‘autopilot’ when the fine print says you need to have your hands on the wheel and eyes on the road at all times,” Buttigieg told The Associated Press.
In February 2023, shortly before the crash, Buttigieg tweeted, "Reminder—ALL advanced driver assistance systems available today require the human driver to be in control and fully engaged in the driving task at all times."
The Family’s Call for Accountability
For the Mendoza family, the loss of Genesis is immeasurable. They are not only seeking justice for their son but also demanding greater accountability from Tesla to prevent similar tragedies in the future. Attorney Brett Schreiber argued that Tesla's reliance on public roads for testing its Autopilot system places everyone at risk.
“This loss was entirely preventable,” Schreiber said. “Tesla needs to answer for its recklessness.”
The lawsuit underscores the need for more stringent regulation and oversight of autonomous driving technologies, which are still in development. The family's fight is not just about seeking justice for their own loss but about raising awareness of the potential dangers of self-driving systems.
The Broader Implications for Autonomous Driving
This case highlights a critical challenge in the race toward fully autonomous vehicles: the balance between innovation and safety. Tesla's Autopilot system is often praised for its potential, but incidents like Mendoza's crash demonstrate the technology's limitations.
The debate surrounding Tesla’s self-driving capabilities raises broader questions about the ethics of deploying such systems without fully addressing their risks. Should companies like Tesla bear greater responsibility for the marketing and functionality of their autonomous features? Or is it ultimately up to drivers to understand the technology’s limitations?
Conclusion: A Fight for Justice and Change
The tragic death of Genesis Giovanni Mendoza Martinez has brought into sharp focus the risks associated with Tesla's Autopilot system. While the company continues to defend its technology, the Mendoza family's lawsuit raises serious concerns about how these systems are marketed and implemented.
For Eduardo, Maria, and Caleb Mendoza, this legal battle is not just about holding Tesla accountable—it’s about ensuring no other family has to endure the heartbreak they’ve faced. As autonomous technology continues to evolve, cases like this serve as a reminder of the importance of prioritizing safety over innovation. The road ahead for self-driving cars may be paved with promise, but it must also be safeguarded by responsibility and transparency.