
On February 21, 2026, Judge Beth Bloom of the U.S. District Court for the Southern District of Florida in Miami formally denied Tesla's motion to vacate the judgment or for a new trial, upholding the massive US$243 million (approximately RMB 1.68 billion) award of compensatory and punitive damages.
For all practical purposes, this lawsuit has reached its conclusion. Even if Tesla pursues further review, it is highly unlikely that the U.S. Supreme Court would agree to hear the case: the Court grants review in well under 1% of the petitions it receives.
On August 1, 2025, an 8‑member jury in the U.S. District Court for the Southern District of Florida unanimously ordered Tesla to pay nearly RMB 1.7 billion in compensation to the victims of a 2019 traffic accident involving a Tesla vehicle (for details, see Tesla's Record‑Setting Damages and the Liability Boundaries of Intelligent Driving Systems). This marks the first jury verdict in years holding Tesla liable for a traffic accident caused in part by its Autopilot driver‑assistance system, making the verdict exceptionally significant.
Elon Musk publicly announced the same day on X, the social media platform he controls, that Tesla would definitely appeal. The outcome of any further challenge had seemed uncertain, given Tesla's deep financial resources and its ability to retain top legal counsel.
Judge Bloom's February 21 ruling carries profound significance. Upholding the record‑setting award amounts to a heavy blow against the blind optimism prevalent in the so‑called intelligent driving industry. It will also help regulators and judicial authorities worldwide accelerate efforts to clarify how liability for assisted driving systems is allocated between manufacturers and consumers, thereby preventing further tragedies.
According to the published judicial opinion, the primary grounds for the judge's denial of Tesla's post‑trial motion can be summarized as follows:
The judge held that the trial evidence was fully sufficient to support the original jury’s finding that Tesla was responsible for the 2019 fatal crash involving the enhanced Autopilot driver assistance feature. Tesla failed to present any new, persuasive legal basis to overturn the original judgment or any procedural flaws sufficient to alter the verdict.
The ruling confirmed that Tesla’s Autopilot system contained design defects. Although the driver was distracted (bending down to pick up a mobile phone), the judge ruled that Tesla’s driving system failed to effectively prevent such an accident and should bear 33% of the liability (the driver 67%).
The judge upheld the large punitive damages award, intended to deter Tesla for its disregard of safety risks related to the system’s inability to handle stationary obstacles and its misleading marketing practices.
The legal foundation central to the judgment, the existence of defects in Tesla's intelligent driving system, was the key focus of the trial. In her February 21, 2026 order denying Tesla's motion, Judge Beth Bloom explicitly analyzed and confirmed that the evidence of technical defects was adequate to support the verdict. On the defective‑design claim, she stated that the evidence introduced at trial was sufficient for a reasonable jury to conclude that Tesla's Autopilot system was defectively designed.
Her analysis of technical flaws centered on five core points:
Inability to detect cross‑traffic or obstacles: Evidence showed the system failed to properly identify, or failed to timely activate braking or warning functions for, vehicles stopped on the road shoulder, as in the present case.
Inadequate driver monitoring mechanisms: The judge endorsed findings that the system failed to ensure driver attentiveness, with defective monitoring practices that tended to create a misleading sense of safety.
Failure to warn: The judge noted that Tesla failed to provide adequate warnings regarding the limitations of Autopilot. Risk disclosures in the user manual were buried deep within touchscreen menus, making them effectively inaccessible to ordinary users.
Awareness of defects without remedial action: Citing internal Tesla documents and engineer testimony, the judge found reasonable evidence that Tesla and Elon Musk had actual knowledge of the system's fatal flaws in detecting cross‑traffic, yet continued marketing it as "autonomous driving."
Disconnect between technical capabilities and actual marketing: The judge found a severe mismatch between Tesla's marketing, including videos claiming the "car was driving itself," and the system's actual technical limitations. Such misleading advertising amplified the risks arising from design defects.
In sum, Judge Bloom concluded that the system not only suffered from design failures that prevented collision avoidance, but that Tesla also undermined driver vigilance through improper marketing while aware of those defects. She therefore declined to set aside the jury's findings of 33% liability and substantial damages, holding that there was no basis for the court to disturb those conclusions.
During the summer 2025 trial and subsequent legal rulings, testimony from the plaintiff’s expert witnesses played a decisive role in shaping the judge’s views. The leading expert was Professor Mary “Missy” Cummings of George Mason University, a former senior safety advisor at the U.S. Department of Transportation’s National Highway Traffic Safety Administration (NHTSA). Her core opinions can be grouped into four areas:
The expert identified severe design flaws in Tesla’s Autopilot system, particularly in handling cross‑traffic and roadside obstacles — most notably stationary objects. The system failed to detect the victim’s SUV parked on the shoulder, trigger Automatic Emergency Braking (AEB), or issue forward collision warnings. Over‑reliance on cameras, without redundant safety safeguards to compensate for visual recognition limits in complex scenarios, was highlighted as a critical limitation.
The expert sharply criticized Tesla's driver monitoring system as insufficient to ensure driver concentration, with obvious monitoring loopholes. The system primarily detected torque on the steering wheel (whether hands were on the wheel) rather than monitoring the driver's gaze. This meant the system continued operating even if the driver looked down to pick up a phone, as occurred in this case, so long as hands remained on the wheel. This approach lagged behind industry peers: General Motors and Ford used infrared cameras for eye tracking, while Tesla's design increased the risk of misuse.
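The monitoring gap the expert described can be sketched in a few lines of Python. This is an illustrative toy model only: the `DriverState` type, the field names, and the torque threshold are invented for the example and do not reflect Tesla's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    steering_torque_nm: float   # torque the driver applies to the wheel (Nm)
    gaze_on_road: bool          # from a hypothetical infrared eye tracker

def torque_only_attentive(s: DriverState, threshold_nm: float = 0.2) -> bool:
    """Torque-based check: any hand pressure on the wheel passes,
    even if the driver is looking down at a phone."""
    return abs(s.steering_torque_nm) >= threshold_nm

def gaze_aware_attentive(s: DriverState, threshold_nm: float = 0.2) -> bool:
    """Camera-based check: additionally requires eyes on the road."""
    return torque_only_attentive(s, threshold_nm) and s.gaze_on_road

# The scenario from the case: hands on the wheel, eyes off the road.
distracted = DriverState(steering_torque_nm=0.5, gaze_on_road=False)
print(torque_only_attentive(distracted))  # True  -> system keeps operating
print(gaze_aware_attentive(distracted))   # False -> system would warn or disengage
```

The point of the contrast is that a torque-only check cannot distinguish the distracted driver in this case from an attentive one, whereas a gaze-aware check can.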
The expert concluded that Tesla's marketing strategies induced driver "misuse." Exaggerated promotions created a false sense of security; videos claiming the "car was driving itself" and statements by Elon Musk led drivers to believe the system was safer than humans, resulting in over‑reliance and distraction. Concealed warnings also contributed significantly. Although safety warnings appeared in the user manual, they were hidden deep within complex touchscreen menus, making them hard for ordinary drivers to access and understand. The trial judge further questioned Tesla's warning method, suggesting its disclaimers functioned more as reminders than genuine warnings, which must clearly communicate the harm resulting from improper use.
Expert testimony revealed that Tesla released the system to market and allowed its activation on inappropriate roads, including those with intersections and road shoulders, as in this case, despite knowing of the above defects. For instance, no geofencing was implemented. The expert argued Tesla could have used technical measures such as geofencing to restrict Autopilot activation in unsafe environments, but chose not to do so for profit and data collection.
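The geofencing idea the expert raised can be illustrated with a minimal sketch: a whitelist of approved operating zones and a great‑circle distance check before the system may activate. The zone coordinates and radius below are invented for illustration; a production system would use map attributes (road class, presence of intersections) rather than simple circles.

```python
import math

# Hypothetical geofence: allow activation only inside mapped approved zones.
APPROVED_ZONES = [
    # (center_lat, center_lon, radius_km) -- illustrative values only
    (27.95, -82.45, 5.0),
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def activation_allowed(lat, lon):
    """Permit driver-assistance activation only inside an approved zone."""
    return any(haversine_km(lat, lon, zlat, zlon) <= zr
               for zlat, zlon, zr in APPROVED_ZONES)

print(activation_allowed(27.95, -82.45))   # inside the illustrative zone -> True
print(activation_allowed(25.70, -80.30))   # far outside the zone -> False
```

The expert's argument was that a check of this kind is technically trivial, which is why the decision not to implement one was presented as a choice rather than a limitation.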
These arguments successfully persuaded the jury and judge that design defects in Autopilot were a “substantial factor” in causing the crash, leading to the final massive punitive award.
Notably, in her February 21, 2026 ruling, Judge Beth Bloom systematically rejected Tesla's core defensive arguments, describing them as "reheated" claims that attempted to evade liability by repeating previously rejected theories, lacked novel legal support, and conflicted with trial evidence. Crucially, these arguments reflect views widely held among intelligent driving manufacturers, and even some judicial experts and practitioners. The judge's key rebuttals included:
Tesla claimed driver McGee's distraction (looking down for a phone) was the sole cause of the crash, and that the vehicle only offered assistance, not autonomous driving, requiring constant driver supervision. Judge Bloom responded that while the driver admitted improper conduct, this did not automatically absolve Tesla. Evidence showed the driver held a reasonable expectation of reliance on Autopilot's collision‑avoidance functions, and the system's design defects, including the failure to detect stationary obstacles, were a "substantial factor" in causing harm.
Tesla contended Autopilot was not defectively designed because it functioned as intended and clear warnings existed in the user manual. The judge ruled trial evidence "amply" demonstrated design defects. She emphasized that Tesla buried risk notices deep in electronic menus, rendering them inaccessible to ordinary users; such "warnings" were deemed ineffective or insufficient.
In its 71‑page motion, Tesla claimed the plaintiff's references to Elon Musk's exaggerated Autopilot claims "misled" the jury and that the verdict defied "common sense." The judge forcefully replied that these arguments were nearly identical to those raised during and before trial, which the court had already considered and rejected. She stated plainly that Tesla provided no new legal basis or evidence to alter the earlier judgment.
Tesla claimed the punitive damages were excessive and illegal under Florida law. The judge upheld the jury's US$200 million punitive damages award, finding the evidence sufficient to support a finding of "recklessness" or "disregard," namely, that Tesla engaged in misleading marketing while aware of technical defects.
This ruling is regarded by legal and automotive industries as a watershed case in the legal history of autonomous driving. It will undoubtedly shape perspectives in the intelligent driving and judicial sectors both in the U.S. and globally. Its precedential impact may be seen in six key dimensions:
For years, Tesla prevailed in numerous lawsuits by emphasizing the driver handbook's requirement to maintain control. This ruling establishes a precedent: even if a driver is distracted, the automaker may still bear significant legal liability if design defects (e.g., failure to detect obstacles) and misleading marketing (creating false security) combine to cause an accident. It breaks the industry's reliance on disclaimers as a universal shield.
The judge upheld claims that Elon Musk and Tesla videos misled users. Going forward, automakers must maintain strict consistency between marketing phrasing (such as “FSD / Full Self‑Driving”) and actual technical limitations. Regulators and courts will have stronger grounds to review whether manufacturers induce dangerous behavior through exaggerated claims.
Criticism of Tesla's steering‑wheel torque‑only monitoring was endorsed by the judge, sending a warning to the entire industry. To reduce legal risk, the sector will accelerate the shift from basic "hands‑off detection" to more robust infrared camera‑based eye and fatigue tracking systems. Systems that fail to effectively prevent driver distraction may legally be deemed "defective in design," a standard already under development by the United Nations and China.
The ruling confirmed Tesla's internal awareness of the system's failure to detect cross‑traffic, without corrective action. This may encourage more insiders to come forward and expose failures of corporate responsibility. If plaintiff attorneys can prove that internal engineers warned of risks ignored by management, manufacturers face severe punitive damages, as seen in the US$200 million award here. Punitive exposure tied to internal knowledge thus becomes a more powerful deterrent than compensatory damages alone.
Dozens of lawsuits involving fatal crashes linked to Tesla's Autopilot and FSD are pending across the U.S. This case may serve as a benchmark for that pending litigation. Its judicial opinion, especially on technical defects and the rejection of Tesla's defenses, will become a "roadmap" for plaintiff attorneys in other cases. It greatly strengthens other victims' positions, likely leading to increased settlement pressure or similar large judgments against Tesla.
Historically, court rulings have often pushed administrative regulators and legislatures to strengthen rules. This verdict is expected to prompt NHTSA and other global regulators, including United Nations bodies, to impose mandatory recalls on Tesla and potentially to legislate stricter entry standards for autonomous driving systems, such as mandatory redundant sensors or geofencing restrictions.
In short, this decision marks the autonomous driving industry’s entry into an era of heightened accountability. Technology is no longer a black box for avoiding liability. Corporate messaging and design must assume a far greater share of responsibility for human life and safety.
