NHTSA Probes Tesla's FSD Software Amid Safety Concerns

The National Highway Traffic Safety Administration (NHTSA) has raised concerns that Tesla's Full Self-Driving (FSD) software is being misrepresented on social media as a fully autonomous service. An investigation covering 2.4 million Tesla vehicles is examining safety issues, including collisions attributed to FSD's limitations in adverse conditions.


Devdiscourse News Desk | Updated: 09-11-2024 01:16 IST | Created: 09-11-2024 01:16 IST

The National Highway Traffic Safety Administration (NHTSA) has expressed concern over Tesla's social media portrayals of its Full Self-Driving (FSD) software. The agency warns that these portrayals may lead users to treat the software as fully autonomous, even though it is a partial-automation system that requires continuous driver oversight.

In October, NHTSA opened an investigation into 2.4 million Tesla vehicles equipped with FSD software after four collisions, one of them fatal, that occurred in challenging conditions such as sun glare and fog. A publicly released email dated May 14 shows NHTSA advising Tesla not to portray FSD as a hands-free system.

The probe is scrutinizing how the driver-assistance system performs in reduced-visibility conditions and whether the feedback it gives drivers is sufficient. Tesla maintains that its official materials urge drivers to remain attentive and says any public miscommunication will be addressed as part of its ongoing dialogue with NHTSA.

(With inputs from agencies.)
