NHTSA Escalates Probe into Tesla’s Full Self-Driving — 3.2 Million Vehicles at Risk Over Failures in Fog, Glare, and Low Visibility
The U.S. auto safety watchdog just turned up the heat on Tesla’s Full Self-Driving tech. On Thursday, the National Highway Traffic Safety Administration upgraded its investigation to an “engineering analysis,” the final step before a potential recall, after linking the system to multiple crashes in poor-visibility conditions.
The probe covers roughly 3.2 million Tesla vehicles equipped with FSD (Supervised) or FSD Beta, spanning Model S and X from 2016-2026, Model 3 from 2017-2026, Model Y from 2020-2026, and Cybertrucks from 2023-2026.
NHTSA flagged a core issue: Tesla’s camera-only degradation detection system often fails to spot or warn drivers when visibility drops due to sun glare, fog, dust, or other airborne obscurants. In reviewed crashes, the system didn’t alert drivers until seconds before impact — and sometimes lost track of lead vehicles entirely.
Crashes That Raised Red Flags
The original preliminary evaluation opened in October 2024 after four incidents, including one fatal pedestrian strike. It has since expanded to nine confirmed crashes (one fatality, one injury), with six more under review. In each case, FSD was active, and the system allegedly failed to detect degraded camera performance in time.
One example: a Tesla on FSD struck and killed a pedestrian in low-visibility conditions. In other cases, glare or fog blinded the cameras, yet no timely handover warning came.
What the Engineering Analysis Means
Here’s the kicker: upgrading to an engineering analysis signals serious scrutiny. NHTSA will dig deeper into Tesla’s software updates, how the degradation-detection system performs after those fixes, and whether the changes truly close the safety gaps. This phase often precedes a recall if a defect is confirmed.
Tesla relies heavily on FSD for its future: it’s the $8,000 add-on (or subscription) that powers the company’s robotaxi ambitions and differentiates the brand. But repeated probes into Autopilot and FSD have already forced software tweaks and raised questions about the vision-only approach, which has no radar or lidar backup.
Tesla did not immediately respond to requests for comment. The company has pushed updates to improve how FSD handles tough conditions, but critics argue that relying on cameras alone leaves blind spots in real-world weather.
Broader Safety Stakes
This is the third active FSD-related investigation Tesla faces. With millions of vehicles on the road, any mandated recall or major fix could ripple through owners, the stock price, and the autonomous-driving race.
[Image: interior of a new Tesla Model 3 with Full Self-Driving engaged; the touchscreen shows the system’s driving visualization — the tech at the center of the storm.]
Final Thought
NHTSA’s escalation puts Tesla’s Full Self-Driving under the microscope like never before, zeroing in on whether it can safely handle everyday visibility challenges or if it’s a defect waiting to cause more harm. As the probe deepens, drivers with FSD face uncertainty — is the system ready for prime time, or does it need a hard reset?
What’s your view? Do you trust FSD in fog or bright sun, or would you pull over? Share in the comments below, and pass this along if you’re following Tesla news or the autonomous driving debate.