The National Highway Traffic Safety Administration (NHTSA) has opened an investigation this week into Tesla’s “Full Self-Driving” (FSD) system after the electric car maker reported four crashes, including one that killed a pedestrian.
According to an Associated Press news report, investigators are looking into the ability of Tesla’s FSD technology to “detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes.”
The probe covers about 2.4 million Tesla vehicles between the 2016 and 2024 model years. The crashes reportedly occurred after Tesla vehicles entered areas of low visibility, including sun glare, fog, and airborne dust.
Tesla’s Plans for Unsupervised Autonomous Driving
Tesla has been going full steam ahead with its plans to put more autonomous vehicles on the road. In October 2024, the company held an event in Hollywood to unveil a fully autonomous robotaxi without a steering wheel or pedals.
Tesla CEO Elon Musk has said the company plans to have fully autonomous vehicles operating without human drivers as early as next year, with robotaxis available in 2026.
Musk said during that Oct. 11 event that this would move vehicles “from supervised Full Self-Driving” to unsupervised. According to Musk, that means you could even fall asleep in the vehicle and wake up at your destination.
The Tesla boss called his robotaxi vision the “glorious future.” Tesla also expects to offer unsupervised FSD on its popular Model 3 and Model Y vehicles in Texas and California next year.
However, the impact of NHTSA’s probe on Tesla’s ambitious plans is still unclear. The agency is already investigating the reported fatal crash that has been linked to FSD’s limitations.
The watchdog has also said it will investigate whether any other similar crashes involving FSD have happened in low-visibility conditions.
Officials have said they will seek information from Tesla about whether any software updates affected the system’s performance in those conditions. Specifically, documents posted by NHTSA said the review will evaluate the timing, purpose, and capabilities of those updates, as well as the company’s assessment of their safety impact.
NHTSA would need to approve any robotaxi that operates without pedals or a steering wheel. Those plans are unlikely to move forward while the investigation remains open.
Tesla’s FSD Criticized for Possible Safety Gaps
Tesla’s FSD has been criticized for relying solely on cameras to spot hazards and for lacking the additional sensors needed to support fully autonomous driving. Other companies working on self-driving vehicles combine cameras with radar and laser (lidar) sensors for better visibility in the dark and in inclement weather.
Tesla has previously recalled FSD under pressure from regulators, and NHTSA separately investigated the company’s Autopilot system after crashes involving emergency vehicles. Following that probe, NHTSA said it found 467 crashes involving the less-sophisticated Autopilot.
Those crashes resulted in 14 deaths and 54 injuries. While Autopilot is essentially an advanced version of cruise control, Musk has touted FSD as being able to operate without human intervention.
This week’s new investigation is uncharted territory for NHTSA, which previously viewed Tesla’s systems as only assisting drivers rather than driving themselves.
This new probe is significant because it focuses on what FSD is capable of doing rather than on whether drivers were focused on the road at the time of the crash.
Safety advocates say the earlier Autopilot investigation did not really look into why the Teslas that crashed into emergency vehicles failed to see and stop for them. In that probe, the focus was on the driver, not the car.
However, our auto defect attorneys will be watching this investigation closely, since its focus is on whether FSD can properly detect dangers in real-life scenarios.
Liability for Defective Autonomous Vehicles
As autonomous vehicles (or AVs) become more widespread, it is natural that questions surrounding liability for accidents involving these vehicles grow louder. The legal framework governing this area intersects with both traditional product liability law and the evolving legislation specific to these types of vehicles.
Traditionally, product liability holds manufacturers responsible for defects in the design, manufacture, or marketing of products that cause harm. In the context of autonomous vehicles, manufacturers may be held liable under strict liability, negligence, or breach of warranty claims.
Strict liability applies when a vehicle is defective and causes injury, regardless of whether the manufacturer was negligent. For a defectively manufactured AV, this means that if a flaw occurs during the production process, whether from faulty materials, poor assembly, or a failure in quality control, the manufacturer could be held accountable when that defect causes an accident.
In the case of autonomous vehicles, the complex interplay of software and hardware also introduces new dimensions to defect liability. Manufacturers not only produce physical vehicles but also rely on sophisticated software systems that control the driving functions.
If an accident results from a defect in the vehicle’s software or sensors, this may also be classified as a manufacturing defect if the software was improperly installed or integrated, adding a whole new layer to car accident lawsuits.
Furthermore, questions about who is responsible for such defects arise in the context of supply chains, because autonomous vehicles are built from parts supplied by multiple companies.
Component suppliers may also share liability if their products are defectively manufactured. Determining fault can become complex and could involve multiple parties, including software developers, component manufacturers, and vehicle assemblers.
Contact an Auto Defect Lawyer
With autonomous vehicle technology still in its infancy, courts and legislatures will continue to adapt and refine the laws governing liability, particularly when it comes to the unique aspects of how these vehicles are manufactured.
If you or a loved one has been injured in a crash involving an autonomous or semi-autonomous vehicle, it is important that you contact an experienced product defect lawyer who understands the nuances of laws and liability issues involving such vehicles and the technology behind them.
Source: https://www.cbsnews.com/news/tesla-fsd-self-driving-autopilot-elon-musk/