A Florida judge has ruled that a lawsuit against Tesla over a fatal 2019 crash that occurred while Autopilot was active can proceed to trial, finding there is “reasonable evidence” that CEO Elon Musk and the company knew about defects in the system but misled consumers about its capabilities. The ruling opens the door for a jury to consider whether Tesla’s marketing of its semi-autonomous driving system contributed to the death of a Model 3 owner.
Tesla Under Scrutiny Over Autopilot’s Safety Record
The decision sharpens scrutiny of mounting safety questions around Tesla’s driver assistance technology. Autopilot, first introduced in 2015, has been involved in multiple crashes over the years that resulted in deaths and injuries. While Tesla maintains that drivers must keep their hands on the wheel and pay attention while Autopilot is engaged, critics allege the system lures drivers into placing too much trust in its abilities and misusing it.
The most recent ruling concerns a 2019 accident in Florida that killed 50-year-old Stephen Banner when his Model 3 plowed into an 18-wheeler crossing the road. Banner had activated Autopilot seconds before the crash and taken his hands off the wheel; investigators say the Tesla made no attempt to brake or avoid the collision. In allowing the lawsuit brought by Banner’s wife to proceed, Judge Reid Scott wrote that her attorneys provided evidence showing Tesla “painted the products as autonomous” through its marketing even as it knew Autopilot had trouble detecting perpendicular traffic.
Overselling Autopilot’s Capabilities
The judge pointed to Musk’s public comments about Autopilot’s capabilities, as well as a 2016 promotional video showing a Tesla driving itself with no human intervention, as marketing that may have misled consumers. Musk has made ambitious claims about the technology in recent years, saying full autonomy is close at hand even as engineers caution that human oversight will remain essential for the foreseeable future.
Scott agreed that Banner’s attorneys had presented enough evidence for the case to proceed, writing that after reviewing it he could not “imagine how some ordinary consumers would not have some belief that the Tesla vehicles were capable of driving themselves hands free.”
Echoes of a Similar Deadly Crash in 2016
The Florida crash resembles a 2016 accident that also involved Tesla’s semi-autonomous driving system. In that incident, Joshua Brown was killed when his Model S collided with a tractor-trailer crossing a highway; Autopilot failed to recognize the truck or brake. Brown, an enthusiastic proponent of Autopilot, had posted videos of the system driving for extended periods without human intervention. Following these and other crashes, experts have called for stronger safeguards to ensure drivers using driver assistance systems pay attention and take control when needed.
What This Means for Tesla’s Legal Exposure
The Florida judge’s decision exposes Tesla and Musk to the possibility of punitive damages, which could total millions of dollars if the jury finds for the plaintiff. According to legal experts, the evidence summarized in the ruling suggests alarming inconsistencies between Tesla’s internal knowledge of Autopilot’s limitations and its public messaging. It also indicates Tesla may have put marketing concerns above safety in how it rolled out semi-autonomous features.
The upcoming trial promises uncomfortable scrutiny of the automaker’s claims and safety record around Autopilot and Full Self-Driving, a more advanced option package. Tesla previously prevailed in two other lawsuits in California state courts involving alleged defects in its semi-autonomous technology, but the Florida ruling shows plaintiffs still have a path to victory.
Ongoing Questions Around Regulation of Driver Assistance Tech
Lurking behind Tesla’s legal troubles are larger debates over how to regulate rapidly evolving advanced driver assistance systems (ADAS) like Autopilot. While the technology has the potential to improve traffic safety when used properly, consumer confusion about system capabilities likely contributes to preventable accidents. Bodies such as the National Transportation Safety Board have recommended stronger safeguards and standards for semi-autonomous vehicles to ensure drivers understand the need to monitor these systems and take control when required.
However, regulation has been slow to catch up with the pace of technological change. With dozens of automakers racing to pack vehicles with self-driving features, lawmakers face pressure to balance encouraging innovation against calls for more stringent safety rules. How they choose to regulate claims about driver assistance technology’s abilities could have major consequences for an industry gearing up for a hands-free future.
What Comes Next in Lawsuit Over Fatal Autopilot Crash
Barring an eleventh-hour settlement, Tesla now faces a public trial that will likely air considerable evidence aimed at showing negligence and reckless conduct, including internal communications casting doubt on the company’s official stance on Autopilot. The judge indicated he is prepared to let jurors consider whether Tesla’s written disclaimers and warnings about the system’s limitations are adequate.
With Musk promising full autonomy soon, the legal glare on Autopilot comes at an inopportune time for Tesla as it works to improve its self-driving software. How regulators balance safety concerns against hands-free driving promises in the coming months and years will have major implications for the emerging driver assistance industry, and Tesla’s legal fate may serve as a warning to automakers about how they promote semi-autonomous features to the public.