Federal Probe Heats Up as Tesla’s Self-Driving Tech Faces Scrutiny after Dozens of Crashes

    Federal safety regulators have launched another sweeping investigation into Tesla’s Full Self-Driving (FSD) technology, following a series of high-profile incidents where Teslas reportedly ignored traffic laws and were involved in crashes. The probe could have wide-ranging implications for both Tesla and the future of automated driving in the US.

    Key Takeaways

    • Federal regulators are investigating 58 incidents where Teslas allegedly broke traffic laws or crashed while using FSD.
    • The probe covers nearly 2.9 million Tesla vehicles equipped with the technology.
    • Tesla faces mounting pressure to address bugs and marketing around FSD following injuries and fatalities.
    • CEO Elon Musk continues to promise widespread rollout of self-driving features despite regulatory concerns.

    Investigation Targets Tesla’s Full Self-Driving Technology

    The National Highway Traffic Safety Administration (NHTSA) has broadened its investigation to cover nearly all Teslas equipped with FSD. The agency cited dozens of cases in which Teslas ran red lights, drove on the wrong side of the road, or engaged in other hazardous maneuvers, sometimes resulting in collisions, fires, and injuries. Despite its marketing of the feature, Tesla maintains that the system requires active driver supervision.

    Range of Incidents Raises Safety Questions

    Many drivers involved in FSD-related crashes reported that their vehicles gave no warning before making erratic or unsafe maneuvers. In some cases, the incidents caused injuries and raised questions about whether the technology is ready for public roads. The system under scrutiny is classified as Level 2 automation, which requires the driver to remain attentive and ready to intervene at any moment.

    Wider Regulatory and Legal Pressure

    This latest probe adds to several ongoing investigations into Tesla’s semi-automated systems, including those for its Autopilot and vehicle "summon" feature. Regulators have also questioned Tesla’s crash reporting practices and criticized the company’s choice to call its system "Full Self-Driving," given its current limitations. In a separate case, a jury found Tesla partly responsible for a fatal 2019 crash involving Autopilot, ordering the company to pay hefty damages.

    Tesla’s Future and Industry Impact

    With Tesla’s traditional car sales facing headwinds—including competition from other EV manufacturers and consumer concerns tied to CEO Elon Musk’s public stances—the success of FSD has become increasingly central to the company’s growth story. Meanwhile, some investors and industry experts argue the company should rethink its hardware approach and communicate the limitations of its technology more clearly.

    The ongoing investigations highlight a broader debate over how—and how quickly—advanced driver assistance systems should be deployed on public roads. The outcomes may not only affect Tesla’s business but also set precedents for autonomous vehicle regulation across the industry.