It sounds like a futuristic dream: a fully self-driving car. No human driver needed, which could mean an end to crashes caused by driver error, inattention or impairment. Want to take a nap or read a book instead of driving? Turn on the autopilot.
Unfortunately, today’s semi-autonomous vehicles are not fully self-driving, and many people do not realize that. Some drivers rely on these so-called self-driving capabilities and get into crashes as a result.
Do not rely on these systems. Numerous investigations and reviews have shown that an attentive human driver is always needed in these vehicles. The engineering reality is that these vehicles are at Level 2 of driving automation, meaning the system is only semi-automated and requires constant supervision by a human driver. Full autonomy would be Level 5.
The State of California has put Tesla “under review” for making public statements that could violate state regulations. The statements allegedly could be taken to imply that the vehicles are fully or nearly fully autonomous when they are not.
For example, Tesla’s CEO Elon Musk stated in a January earnings call that he was “highly confident the car will be able to drive itself with reliability in excess of human this year.” California regulators later confirmed that the technology is not on track to reach Level 5 this year.
Statements like these could have significant repercussions for Tesla, especially if they lead people to believe that their vehicles are fully autonomous. Under product liability law, companies may be held liable for injuries caused by a foreseeable misuse of their products.
Further, the names of these products, the Autopilot system and the Full Self-Driving Capability (FSD) option, could themselves be misleading: Autopilot is not autonomous, and FSD is not fully self-driving.
Mistakes about the cars’ capabilities can have tragic consequences
As of this month, the National Highway Traffic Safety Administration (NHTSA) is investigating 28 crashes involving Teslas with either Autopilot or FSD. At least three Tesla vehicles with Autopilot engaged have been involved in fatal crashes since 2016.
Recently, a Tesla Model 3 collided with an overturned truck in Fontana, California. The Tesla driver was killed, and two others were injured. It has not been determined whether Autopilot was engaged at the time of the crash. However, the now-deceased driver had made numerous TikTok videos praising the Autopilot feature. In at least one, he was filming while driving the Tesla with no hands on the wheel.
Tesla states these systems do not replace the driver
Tesla’s website says that the Autopilot system does not make the vehicle fully autonomous. An alert, sober driver needs to be in control of the vehicle at all times.
An attorney for Tesla told California’s Department of Motor Vehicles in March that vehicles equipped with FSD still need a human driver to steer, brake or accelerate as needed.
Consumer Reports tested the FSD product suite and found it “requires significant driver attention to ensure that these developing-technology features don’t introduce new safety risks to the driver, or other vehicles out on the road.”
Moreover, the nonprofit found that the features in the FSD suite worked inconsistently, which could further put people at risk. The navigation system sometimes skipped steps on the preprogrammed route, drove in the carpool lane or disengaged for no apparent reason. Sometimes, it drove through stop signs, slammed on the brakes for yield signs or otherwise acted against driving rules.
If you have been in a wreck, get help
If you have a Tesla with Autopilot or FSD, you can’t afford to rely on those features. You must remain alert and in control of the vehicle.
If you have been in a crash involving one of these features, whether you were in the Tesla or in another vehicle that collided with one, you could have a product liability claim in addition to any other rights you might have as a car crash victim. Talk to a lawyer for specific information about your case.