
10 Jun, 2023

Report: Autopilot involved in 17 fatalities, 736 crashes

Safety report

According to NHTSA reports, Tesla’s Autopilot driver-assistance software was involved in 17 fatalities and 736 crashes. NHTSA began collecting the data after a 2021 federal order required automakers to disclose crashes involving driver-assistance technology. NHTSA notes that a report of a crash involving driver assistance does not by itself imply that the technology was the cause. It is also unclear whether NHTSA’s data covers every crash involving driver assistance. Of the 17 fatal crashes, four involved motorcycles. NHTSA has received more than a dozen reports of Teslas striking parked emergency vehicles while in Autopilot.

The increase in crashes coincides with Tesla’s rollout of Full Self-Driving, which has expanded from around 12,000 users to nearly 400,000 in just over a year. Nearly two-thirds of all driver-assistance crashes that Tesla has reported to NHTSA occurred in the past year. Tesla claims that, measured by miles driven per collision, Full Self-Driving crashes at a rate at least five times lower than vehicles in normal driving; however, this figure has not been corroborated by external agencies. NHTSA:

NHTSA has an active investigation into Tesla Autopilot, including Full Self-Driving. NHTSA reminds the public that all advanced driver assistance systems require the human driver to be in control and fully engaged in the driving task at all times. Accordingly, all state laws hold the human driver responsible for the operation of their vehicles.

2 Comments

Maria

I wouldn’t want to be riding in a car without being the one driving it; I would panic. Putting it in automatic to park is one thing, but driving on the roads with the car on automatic, I would also have an accident and maybe die, but that’s just me 🤔🤔🤔🤔

Steve

First, both FSD and Autopilot functions require, by definition, that the driver be responsible for what the car does. Autopilot is no different from the lane-keeping software that many car manufacturers employ. FSD Beta is trial software used to teach the autonomous software the nuances of driving, just like your dad taught you. As it learns from thousands of episodes, it gets smarter and smarter about driving and how to avoid accidents. As long as it is in beta release, the driver is always responsible for the car, but the whole program is learning from every drive that occurs.