US Opens Probe Into Tesla's 'Full Self-Driving' System After Fatality
Image credit: Nathan Laine/Bloomberg via Getty Images

The US National Highway Traffic Safety Administration (NHTSA) launched a preliminary evaluation on Thursday into Tesla's Full Self-Driving (FSD) system's alleged failure to 'detect and respond appropriately to reduced roadway visibility conditions.'

by Improve the News Foundation

Facts

  • The US National Highway Traffic Safety Administration (NHTSA) launched a preliminary evaluation on Thursday into Tesla's Full Self-Driving (FSD) system's alleged failure to 'detect and respond appropriately to reduced roadway visibility conditions.'[1][2][3]
  • It covers more than an estimated 2.4M Tesla vehicles equipped with the optional FSD feature, including the 2016-2024 Model S and Model X, 2017-2024 Model 3, 2020-2023 Model Y, and 2023-2024 Cybertruck.[2][4][1]
  • The probe follows reports of four crashes in low-visibility conditions, including one that killed a pedestrian in Arizona last November.[2][3][5]
  • The investigation will also look into whether other accidents in similar conditions are linked to the system, and if Tesla implemented any changes that may affect its performance in low-visibility conditions.[2][6][7]
  • The software, which requires active driver supervision and does not make vehicles fully autonomous, is no stranger to legal scrutiny; another fatal accident involving the system occurred in April.[3][6][8]
  • An earlier NHTSA investigation into Tesla's Autopilot cruise control system identified 467 crashes involving Autopilot, resulting in 14 deaths, and pressured the company to recall more than 2M vehicles in the US to install new safeguards in the system.[3][8]

Sources: [1]NBC, [2]United States Department of Transportation, [3]Associated Press, [4]Verge, [5]New York Post, [6]FOX News, [7]The Register and [8]Reuters.

Narratives

  • Narrative A, as provided by The American Prospect. Recent crashes involving Tesla's FSD system sent a warning signal to auto regulators about serious safety concerns with the system. Now, they are finally taking action against a technology that should never have been allowed on the road in the first place, especially as data shows that this system is far more dangerous than human drivers.
  • Narrative B, as provided by USA Today. It's hard to fault a driver assistance system clearly labeled as 'supervised' for accidents when those crashes are mostly — if not entirely — attributable to human error or bad road conditions. In fact, issues with this technology apparently stem from the fact that it is so safe that it can make drivers too complacent.
