Surveillance footage from a Thanksgiving Day crash on the San Francisco Bay Bridge shows a Tesla Model S changing lanes and abruptly braking in the far-left lane, resulting in an eight-vehicle pileup.
According to a report in The Intercept, the crash injured nine people, including a 2-year-old child, and blocked traffic on the bridge for more than an hour. The driver said the car was in Full Self-Driving mode at the time.
Crash Involving Tesla’s Full Self-Driving Feature
The video and new images of the crash, obtained by The Intercept via a California Public Records Act request, give the first direct look at what happened on Nov. 24 and confirm eyewitness accounts. The driver of the Tesla told police that he had been using Tesla’s new Full Self-Driving (FSD) feature when the vehicle’s left turn signal and brakes activated; the car then moved into the left lane and slowed to a stop directly in a second vehicle’s path of travel.
Only hours before the crash, The Intercept reports, Tesla CEO Elon Musk had announced that Tesla’s FSD capability was available in North America, calling it a “major milestone” for the company. The Full Self-Driving Beta is now available to anyone in North America who requests it and has purchased the option, and by the end of 2022 the feature had been rolled out to more than 285,000 drivers.
Ongoing Investigations
The Bay Bridge accident is one of several Tesla crashes in recent months. On November 18, a Tesla Model 3 crashed into a stopped Ohio State Highway Patrol SUV whose hazard lights were flashing. That Tesla was also believed to have been in self-driving mode, and the National Highway Traffic Safety Administration (NHTSA) is investigating that crash.
The agency is also looking into a tweet in which Musk said that FSD users would soon have the option to turn off reminder notifications for drivers to keep their hands on the steering wheel. A Twitter user had tagged Musk in a post: “Users with more than 10,000 miles on FSD Beta should be given the option to turn off the steering nag.” Musk replied: “Agreed, update coming in Jan.”
The San Francisco crash is likewise under investigation by NHTSA. Full Self-Driving mode adds expanded features on top of Tesla’s Autopilot driver-assist system. According to NHTSA data, Tesla vehicles operating on Autopilot were involved in 273 known crashes from July 2021 to June 2022.
Injuries and Fatalities Caused by Self-Driving Tesla Vehicles
Tesla vehicles accounted for nearly 70% of the 329 crashes in which advanced driver assistance systems were engaged, and most of the serious injuries and fatalities were associated with Teslas on Autopilot. Since 2016, NHTSA has investigated 35 crashes in which Tesla’s FSD or Autopilot systems were likely in use.
These accidents killed 19 people. There have also been numerous recent reports of sudden or “phantom” braking, in which drivers say their vehicles slammed on the brakes at high speed without the driver initiating any such move. NHTSA received more than 100 such complaints in a single three-month period.
Evolving Terminology Around Driver Assist Features
Tesla has aggressively developed FSD as an essential vehicle feature as the automaker faces increasing pressure to distinguish its products from other electric vehicles entering a competitive market. The term “Full Self-Driving” has drawn blowback from safety advocates who call it misleading and dangerous, much like “Autopilot,” which is only a driver-assist feature but, because of its name, gives the impression that the vehicle is completely self-driving.
Tesla has consistently defended Autopilot, saying it advises drivers never to take their hands off the steering wheel and to be prepared to take over at a moment’s notice, even though the feature is called Autopilot. Other tech companies recognize the danger of such terminology.
Last year, the autonomous driving technology company Waymo, owned by Google’s parent company Alphabet, announced that it would no longer use the term “self-driving.” The company acknowledged that the term is inaccurate and gives the public a false impression of the capabilities of driver-assist technology. Waymo said in a blog post that this kind of false impression could lead someone to unknowingly take risks, such as taking their hands off the steering wheel, putting themselves, their passengers, and others on the roadway in grave danger.
Transportation Secretary Pete Buttigieg has also urged consumers to be wary of anything marketed as “self-driving,” warning that what is being sold today is driver assistance technology, not driver replacement technology. There are still no federal restrictions on testing autonomous vehicles on public roads, and even after all these crashes and ongoing investigations, Tesla has shown no sign of changing its branding.
If you or a loved one has been injured in a self-driving Tesla crash involving Autopilot or Full Self-Driving, it is important that you understand your legal rights and options. When a driver is using a vehicle as directed but a malfunctioning automated driving system causes a crash, that is an auto defect. An experienced auto defect attorney can provide you with more information about pursuing your legal rights.