US federal regulators have launched an investigation into Tesla’s self-driving system after multiple crashes were reportedly linked to the company’s Full Self-Driving (FSD) software. The National Highway Traffic Safety Administration (NHTSA) said it is investigating approximately 2.9 million Tesla vehicles across various models equipped with the FSD functionality.
The probe will assess whether the technology operates safely in real-world conditions, particularly with respect to traffic violations and collisions. The action follows complaints of cars running stop signs and red lights, as well as striking stationary objects, while the self-driving system was engaged.
Regulators examine Tesla’s Full Self-Driving technology
According to federal investigators, the probe covers several Tesla models, including the Model S, Model 3, Model X, and Model Y. The vehicles in question were produced from 2016 onward and are equipped with the company’s advanced driver-assistance systems.
Investigators said the probe followed a series of crashes involving Tesla cars in which drivers claimed the self-driving software was active at, or just before, the moment of impact. The review will determine whether the FSD system poses unreasonable risks to public safety and whether a recall is warranted.
The NHTSA’s Office of Defects Investigation has received dozens of complaints from drivers and law enforcement agencies about erratic driving behavior attributed to Tesla’s autonomous mode. Regulators are also collecting data logs and crash reports to determine the frequency and severity of such incidents.
Tesla’s response to the ongoing inquiry
Tesla has not released an official statement about the self-driving investigation. However, CEO Elon Musk has repeatedly defended the company’s autonomous technology, saying it is safer than human driving when used correctly.
The company’s website emphasizes that Full Self-Driving does not make a car fully autonomous but provides assistance capabilities, including lane centering, automatic lane changes, and traffic light recognition. Drivers are required to stay alert and be prepared to take control at any time.
In past safety inspections, Tesla has worked with regulators to issue over-the-air updates that improve driver monitoring and vehicle behavior. It is not known whether the company will release further updates while the investigation is underway or wait for regulators’ findings.
Crash patterns and previous investigations
The Tesla self-driving probe is not the first inquiry into the company’s autonomous technology. The NHTSA has previously reviewed Tesla’s Autopilot and FSD systems following numerous crashes involving emergency vehicles and roadside hazards.
In January, regulators closed an earlier investigation into Tesla’s driver-assistance software after identifying more than 900 crashes linked to the system’s limitations. The agency subsequently required the company to revise its driver-monitoring alerts to minimise misuse.
The recent crashes under review reportedly occurred when drivers relied heavily on the system in complex traffic conditions. Investigators are examining whether the system provides adequate warnings to the driver and whether it can correctly interpret traffic signals and stop signs.
Growing doubts about autonomous driving safety
The Tesla self-driving investigation is part of growing scrutiny of self-driving technologies across the auto industry. Regulators worldwide are tightening safety standards as car manufacturers race to introduce advanced driver-assistance features.
In the United States, government bodies have raised concerns about companies marketing their technology as self-driving, potentially misleading drivers who are unaware of how much control they must retain. The current probe may affect how future systems are designed, tested, and labeled for public use.
Tesla’s competitors, such as Waymo and Cruise, have also faced investigations over collisions or traffic violations involving their autonomous vehicles. The NHTSA’s findings could therefore shape policy on the broader deployment of automated vehicles.
Consumer safety and public perception
Safety advocates say the Tesla self-driving probe is key to restoring public trust in new vehicle technologies. Many consumers remain wary of autonomous driving systems, citing a lack of transparency and accountability in how these tools are tested and certified.
According to public reports, some of the crashes involved injuries, but no fatalities have been reported. The agency continues to urge motorists to stay attentive even when automation features are engaged.
Consumer watchdogs have also called for stricter rules requiring companies to disclose safety performance data and software limitations. Such measures, they argue, would allow innovation to continue without compromising public safety.
Ongoing evaluation and potential outcomes
The NHTSA has not given a timeline for when the Tesla self-driving probe will be completed. Investigators will review crash reports, software data, and corrective steps taken by the company. Depending on the outcome, regulators could order a recall, require safety changes, or close the investigation without further action.
Analysts say the result could have far-reaching implications not only for Tesla but for the entire autonomous driving industry. Investors are watching closely, as the findings could affect production, software updates, or future vehicle sales.
Final thoughts
The Tesla self-driving probe marks another significant test for the regulation of automated vehicles in the United States. As regulators continue to collect evidence and assess the system’s performance, the case is likely to shape how governments balance innovation against road safety. In the meantime, drivers are urged to remain cautious, attentive, and in control when using self-driving features.
FAQs
1. What prompted the Tesla self-driving probe?
The Tesla self-driving probe was launched after multiple crashes allegedly involving vehicles using the Full Self-Driving (FSD) feature. Federal regulators received complaints that some cars failed to stop at traffic lights or signs and performed unsafe maneuvers while the feature was active.
2. How many Tesla vehicles are under investigation?
Approximately 2.9 million Tesla vehicles are being reviewed by the National Highway Traffic Safety Administration (NHTSA). The models include the Model S, Model 3, Model X, and Model Y produced between 2016 and 2025.
3. What kinds of crashes are being examined in the probe?
The investigation focuses on incidents where Tesla vehicles, while in Full Self-Driving mode, allegedly ran stop signs, ignored red lights, or collided with stationary objects. Some crashes also involved emergency vehicles and roadside hazards.
4. Is Tesla disputing that its self-driving feature was involved?
Tesla has not publicly disputed the investigation but maintains that the FSD system improves safety when used correctly. The company states that drivers must remain attentive and ready to take control at all times while using the technology.
5. What will regulators do if they find faults?
If regulators determine that Tesla’s FSD software poses safety risks, they may require a recall or mandate software updates. The NHTSA could also impose stricter monitoring requirements or issue new safety guidelines for autonomous systems.
6. Has Tesla faced similar investigations before?
Yes, Tesla has faced previous federal investigations into its Autopilot and FSD systems. Earlier probes examined crashes involving emergency vehicles and driver misuse of automation features, leading to safety updates and stricter monitoring alerts.
7. When might the probe’s findings be released?
The NHTSA has not provided a specific timeline for releasing the findings. The review involves analysing crash data, system logs, and vehicle software updates, which could take several months to complete.