
Fatal Flaw?: Tesla’s Self-Driving Tech Faces Investigation

Tesla is under investigation over its FSD system

Published: October 20th, 2024.

Tesla is currently under investigation by the NHTSA after a fatal crash involving one of its vehicles equipped with Full Self-Driving (FSD) software. The NHTSA initiated the probe following reports of four accidents involving Tesla’s driver-assistance technology, one of which led to the death of a pedestrian. All of these crashes occurred in conditions of reduced visibility, such as fog, sun glare, or airborne dust, raising serious concerns about whether Tesla’s FSD system can handle such conditions.

The investigation will focus on whether Tesla’s FSD system can detect and respond appropriately in low-visibility situations. It will also explore whether similar crashes have occurred and whether Tesla’s software updates have influenced the system’s performance. With over 2.4 million Tesla vehicles equipped with FSD on U.S. roads, including popular models like the Model S, 3, X, Y, and Cybertruck, the investigation could significantly impact the future of Tesla’s self-driving technology.

Tesla has long marketed FSD as a cutting-edge system capable of partial automation, allowing vehicles to handle many driving tasks independently. However, drivers must remain attentive and ready to take over when needed. Tesla’s driver-assistance technologies, including Autopilot, have been involved in numerous accidents, raising safety concerns about the systems' reliability in real-world conditions.

Earlier in 2024, the NHTSA linked Tesla’s Autopilot system to over 200 crashes and 29 deaths, prompting over 50 special investigations into Tesla vehicles thought to be involved in Autopilot-related crashes. Since 2021, more than 1,200 crashes involving Tesla’s driver-assistance systems have been reported, adding to the ongoing scrutiny surrounding these features.

Tesla is also facing legal challenges related to its driver-assistance systems. The company has been sued by families of individuals who died in crashes while using Autopilot or FSD. Additionally, California’s Department of Motor Vehicles has accused Tesla of false advertising, claiming that the company exaggerated the capabilities of its self-driving technology. The Department of Justice is also investigating Tesla’s marketing practices related to FSD and Autopilot.

CEO Elon Musk has strongly advocated for Tesla’s self-driving technology, often making bold claims about its capabilities. Musk recently suggested that Tesla would begin deploying “unsupervised” FSD in Texas and California by 2025, allowing cars to drive without human supervision. However, with mounting regulatory and legal challenges, Tesla’s ambitious plans for fully autonomous vehicles could be delayed.

Even with advanced systems like FSD, drivers must remain alert and prepared to take over control of the vehicle at any time. The technology is not fully autonomous, and drivers are legally required to stay engaged while using it. Tesla markets FSD as a driving aid, but it still demands human supervision, especially in challenging conditions like fog, heavy rain, or poor road visibility.

There is growing evidence that Tesla’s FSD and Autopilot features may not always perform optimally in these conditions, which could lead to dangerous outcomes. The NHTSA’s ongoing investigation aims to determine how well the system can detect and respond to such situations, but until the results are clear, drivers should be cautious.
