Federal auto safety regulators have launched a significant investigation into nearly 2.9 million Tesla vehicles equipped with the company’s “Full Self-Driving” (FSD) technology. The probe follows dozens of incident reports in which the advanced driver-assistance system allegedly caused vehicles to violate traffic laws, leading to crashes and injuries. The National Highway Traffic Safety Administration (NHTSA) is scrutinizing the software after receiving numerous complaints about dangerous driving maneuvers, including running red lights and steering into oncoming traffic.
The investigation by NHTSA’s Office of Defects Investigation represents a critical examination of Tesla’s automated driving capabilities and could ultimately lead to a mandatory recall if the agency determines the system poses an unreasonable risk to safety. This inquiry is the latest in a series of federal reviews of Tesla’s driver-assistance features, which have been linked to numerous crashes over the past several years and have raised fundamental questions about the technology’s readiness for widespread public use. The probe focuses specifically on versions of the software marketed as “FSD (Supervised)” and “FSD (Beta),” both of which Tesla states require a fully attentive driver prepared to take control at any moment.
Scope and Scale of the Federal Inquiry
The new probe is one of the broadest ever undertaken by NHTSA, covering an estimated 2.88 million Tesla vehicles across all models equipped with the FSD software. This includes essentially all vehicles sold in the United States with the feature enabled. The agency’s action was prompted by a pattern of alarming field reports from vehicle owners and law enforcement, as well as data collected by the agency itself. According to official documents, NHTSA has logged at least 58 separate incidents in which vehicles operating in FSD mode allegedly engaged in unsafe and illegal actions.
These incidents have resulted in significant consequences, including more than a dozen crashes, some of which involved fires. Across all reported events, a total of 23 injuries have been documented. The investigation is a preliminary evaluation, the first formal step in a process that could compel Tesla to issue a recall and implement a software or hardware fix for any identified safety defects. The agency stated its goal is “to assess the scope, frequency, and potential safety consequences” of the reported traffic violations.
Specific Safety Violations Under Review
Regulators are focusing on specific, repeatable driving behaviors that violate traffic safety laws. A primary concern is the system’s performance at intersections. NHTSA has identified six crashes where a Tesla with FSD engaged drove through a red traffic signal and collided with other vehicles in the intersection. In four of these crashes, one or more injuries were reported. Beyond collisions, the agency has received at least 18 complaints of Teslas failing to stop for red lights or not remaining stopped.
Erratic and Dangerous Maneuvers
Another critical area of the investigation involves improper lane movements. Reports indicate that vehicles using FSD have initiated lane changes that steer the car into the path of opposing traffic. Many of the drivers involved told regulators that the car provided no warning before executing the unexpected and dangerous maneuver, leaving them with little to no time to intervene and prevent a potential collision. Investigators noted that some of these problems may be location-specific and repeatable, citing multiple incidents at the same intersection in Joppa, Maryland, which Tesla later attempted to address with a software update.
The Technology and Its Limitations
The systems under investigation are “FSD (Supervised)” and its pre-release version, “FSD (Beta).” Despite the name “Full Self-Driving,” this software is not autonomous. It is classified as a Level 2 driver-assistance system, meaning it can provide steering, braking, and acceleration support but requires constant human oversight. Tesla’s own materials explicitly state that the driver must remain fully attentive with their hands on the wheel, prepared to take over immediately.
This distinction is central to the safety debate. Critics and safety advocates argue the “Full Self-Driving” branding overstates the system’s capabilities, potentially leading drivers to become complacent and over-reliant on the technology. The investigation will likely examine whether the system’s design and marketing contribute to this foreseeable misuse. This probe is separate from, but related to, other ongoing NHTSA investigations into Tesla’s less advanced “Autopilot” system, as well as its “Summon” feature, which has been linked to minor collisions in parking lots.
Regulatory Context and Broader Implications
This is not the first time NHTSA has scrutinized Tesla’s automated driving technology. Regulators have been investigating the company’s systems for over three years due to a series of high-profile crashes. One fatal incident from 2024, in which a Tesla on FSD struck and killed a motorcyclist in the Seattle area, remains a significant point of concern for safety officials. Furthermore, the agency opened a separate inquiry last year into 2.4 million Teslas after several crashes occurred in low-visibility conditions like fog or sun glare.
The outcome of this comprehensive evaluation could have far-reaching consequences for Tesla and the broader autonomous vehicle industry. A recall would not only be a significant logistical challenge but could also impact public trust in automated driving systems. The probe’s findings will likely influence future regulations and standards for driver-assistance technology. As states like California prepare new laws to hold manufacturers accountable for traffic violations caused by their systems, the pressure on companies to ensure the safety and reliability of their software has never been greater.