In 2019, in Key Largo, Florida, a Tesla traveling on Autopilot blew through a T intersection and plowed into a car parked on the side of the road, killing a young woman standing near the vehicle and gravely injuring her boyfriend. The driver had taken his eyes off the road to retrieve a dropped phone; he later said he had been “driving on cruise.” The incident highlights a recurring problem: Tesla’s Autopilot being used on roads it was not designed for, a factor in at least eight fatal or serious crashes.
An investigation by The Washington Post identified approximately 40 fatal or serious crashes since 2016 involving Tesla’s driver-assistance software. While many occurred on controlled-access highways, at least eight took place on roads where Autopilot was not designed to be used. Lawsuits, NHTSA data, and court records reveal a troubling pattern: Autopilot deployed beyond its operational design domain, with tragic consequences.
Autopilot Limitations Ignored Despite Safety Concerns
Tesla’s user manuals and legal documents explicitly state that Autosteer, the central feature of Autopilot, is intended for use on controlled-access highways with clear lane markings and no cross traffic. According to Tesla’s warnings, the technology may falter on roads with hills or sharp curves. Despite this, the company has taken minimal steps to restrict Autopilot’s use based on geography, and federal regulators have yet to impose significant limitations.
Regulatory Dilemma: NTSB Urges Action, NHTSA Hesitates
Following the 2016 crash that killed Tesla driver Joshua Brown, the National Transportation Safety Board (NTSB) recommended limits on where driver-assistance technology like Autopilot could be activated. However, the NTSB lacks regulatory power over Tesla, leaving the decision to the National Highway Traffic Safety Administration (NHTSA). In an October interview, NTSB Chair Jennifer Homendy criticized NHTSA’s inaction and said that safety does not appear to be a priority for Tesla.
Tensions Rise Between NTSB and NHTSA
The lack of enforceable rules has strained relations between the NTSB and the NHTSA, with the former urging the latter to act decisively. Homendy put it bluntly: “How many more people have to die before you take action as an agency?” The NHTSA defended its position, arguing that verifying the conditions for which systems like Autopilot are designed would be complex and resource-intensive and might not solve the problem. Despite issuing recommendations and urging sensible safeguards, the agency has not adopted rules restricting the technology’s use to its intended conditions. That passivity contrasts sharply with how regulators in other industries respond to safety concerns, raising questions about whether the current framework for overseeing driver-assistance technologies is adequate.
Tesla’s Deflection of Responsibility
Tesla, led by CEO Elon Musk, has resisted implementing safeguards or restrictions on Autopilot’s use despite repeated safety concerns. In an August 2021 letter, Homendy urged Musk to act on safety recommendations the board had issued four years earlier; he did not respond. In court cases and public statements, Tesla consistently deflects responsibility for Autopilot-related crashes, arguing that the driver is ultimately responsible for the vehicle’s path. Despite fatalities and serious accidents, Tesla has not imposed geographic restrictions on Autopilot use. Critics argue that this laissez-faire approach to a rapidly evolving technology on public roads puts Tesla drivers, other motorists, and pedestrians at risk.
A Pattern of Inadequate Government Oversight
The string of Autopilot-related crashes underscores the consequences of inadequate government oversight of rapidly evolving automotive technologies. Unlike the aviation and railroad industries, where regulators have acted swiftly after fatal failures, NHTSA has moved slowly on driver-assistance systems, allowing a potentially flawed technology to remain in widespread use.
These crashes have also sparked a critical examination of the regulatory framework governing driver-assistance technologies. With lives at stake, the urgent need for enforceable rules and proactive oversight cannot be overstated as the industry grapples with the twin demands of innovation and safety on the nation’s roadways.