A court has rejected Tesla's attempt to overturn a $243 million verdict in an Autopilot crash case. It's one of the largest verdicts ever entered against Tesla over its driver-assistance technology, and it has now survived its first major legal challenge.
Tesla has spent years fighting liability for Autopilot crashes, arguing that the technology is safe and that drivers misuse it. The courts are increasingly unpersuaded. This verdict is a warning to every other autonomous vehicle maker - you can't simply disclaim responsibility when your "self-driving" car crashes.
The case centered on a crash in which Tesla's Autopilot system was engaged. The details matter: Autopilot is not full self-driving, despite what Tesla's marketing suggests. It's a driver-assistance system that requires constant human supervision. But Tesla's naming and marketing have consistently implied capabilities beyond what the system delivers.
The jury found that Tesla's marketing and naming misled consumers about Autopilot's capabilities, and that this contributed to the crash. The $243 million figure includes not just compensatory damages but punitive damages - the jury's way of saying Tesla's conduct was particularly egregious.
Tesla moved to throw out the verdict, arguing that it was excessive and that the driver was responsible for supervising the system. The court rejected those arguments and let the verdict stand. That's significant: it means the legal system is starting to hold autonomous vehicle makers accountable for the gap between marketing promises and technical reality.
Elon Musk has been promising full self-driving for years. "Next year" has been the answer since 2016. Meanwhile, Tesla has been selling "Full Self-Driving" packages to customers, charging thousands of dollars for software that demonstrably does not drive the car fully by itself. The naming is the problem. When you call something "Autopilot" or "Full Self-Driving," people believe you.
This verdict isn't about whether the technology works - it's about whether Tesla's marketing created dangerous misconceptions. The evidence suggests it did. Videos of Tesla drivers sleeping, reading, or otherwise not supervising Autopilot circulate regularly. That's not just driver negligence - it's a predictable response to Tesla's messaging about what the system can do.

