

Another Fatal Tesla Crash in California Involving Driver Assist

The interior and steering wheel of a self-driving Tesla car

Tesla Inc. has told U.S. regulators about yet another fatal Tesla crash involving automated driver-assist systems. The latest incident brings Tesla’s total to 17 fatal accidents since June 2021, when the government required automakers to begin submitting data on these accidents.

According to Bloomberg.com, the most recent crash involved a Tesla Model S that collided with an emergency vehicle in February in the San Francisco Bay Area. At the time of that crash, the National Highway Traffic Safety Administration (NHTSA) asked Tesla for more information.

Ongoing Investigation into Fatal Tesla Crash

According to NBC 15, which covered the Feb. 18 crash on Interstate 680, the 2014 Tesla Model S crashed into a fire truck. The vehicle was operating with Autopilot engaged at the time. 

The driver was killed and a passenger was critically injured but survived. Four Contra Costa County firefighters also suffered injuries.

NHTSA is investigating how Tesla’s Autopilot system detects and responds to emergency vehicles parked on highways. At least 15 Tesla vehicles have crashed into emergency vehicles nationwide while using the system. 

Officials said the truck had its lights on and was parked diagonally on the northbound lanes of the freeway. The Model S was among nearly 363,000 vehicles recalled by Tesla in February for potential flaws in the Full Self-Driving system.

This crash was one of 66 reported accidents included in the latest public release of NHTSA data on crashes involving Level 2 automated driving systems, collected under a June 2021 order requiring automakers and tech companies to report such incidents.

The Associated Press also reported on a March 15 crash in Halifax County, North Carolina. In this case, the Tesla may have been operating on a partially automated system when it struck and seriously injured a 17-year-old student who had just exited a school bus. NHTSA officials are also looking into that incident.

State Highway Patrol officials said the driver of the 2022 Tesla Model Y failed to stop for the school bus, which was displaying all of its activated warning devices. 

The student, Tillman Mitchell, had just left the school bus and was walking across the street to his house when he was struck. He was transported to an area hospital with life-threatening injuries, but is expected to survive.

Advocates Seek to Regulate Self-Driving Systems

Auto safety advocates are asking regulators and lawmakers to set firmer rules for self-driving vehicles and other advanced technologies that are surging in popularity, including Tesla’s Autopilot driver-assist feature. 

NHTSA said in January that it had inquired about a Dec. 31 tweet from Tesla CEO Elon Musk in which he said the company’s Full Self-Driving (FSD) program might be upgraded as early as April to allow drivers to disable an alert that activates when they remove their hands from the steering wheel.

Tesla’s website still states that Autopilot and the FSD feature are intended for use with a fully attentive driver who has their hands on the wheel and is prepared to take over at a moment’s notice.

Since August 2021, NHTSA has been looking into how Autopilot handles crash scenes, particularly in collisions with emergency vehicles.

Musk Ordered to Give Deposition in Autopilot Lawsuit

According to Engadget, Musk may have to answer detailed questions regarding a fatal 2018 Tesla crash in Northern California in which Autopilot was engaged.

A judge has ordered Musk to give a three-hour deposition in a lawsuit over the crash, which killed Apple engineer Walter Huang when his Model X crashed into a highway median south of San Francisco. 

Musk will be asked specifically about statements he made about Autopilot’s capabilities in the year before this fatal crash.

The plaintiffs point to a 2016 Code Conference interview where the Tesla CEO said Tesla cars with Autopilot could already drive with “greater safety than a person.”

Plaintiffs say they are also concerned about a 2016 self-driving demo video that engineers said was staged to show features that were not ready. 

The lawsuit alleges that Huang was misled into believing that he could trust his Model X to drive down the highway on its own without him needing to intervene.

If You Have Been Injured

Our auto defect attorneys have said time and again that Tesla has done nothing but use consumers as test subjects for its so-called “ground-breaking” technology.

It is also disturbing to note that drivers may not even receive a warning when they take their hands off the wheel in future iterations of these driver-assist features.

Autopilot and FSD were not ready for prime time when they were first released, and they still appear shaky and riddled with glitches, based on a number of media reports.

It’s time that Tesla did the responsible thing. The company needs to diligently test its software and products before releasing them to consumers. Using consumers as guinea pigs to test this technology, and making them pay for the privilege, is absolutely unconscionable.

If you or a loved one has been injured or if you have lost a loved one in a fatal Tesla crash, or as a result of any other auto defects, an attorney with experience winning auto defect claims can help you better understand your legal rights and options.

FREE Case Evaluation

Our staff will evaluate your case submission and respond in a timely manner.
