According to a police report, Tillman Mitchell, a 17-year-old student, was stepping off a school bus one afternoon in March. The bus had stopped on North Carolina Highway 561 with its stop sign extended and its red warning lights flashing. A Tesla Model Y, allegedly operating in Autopilot mode, approached without slowing down.
Traveling at 45 mph, the car struck Mitchell, throwing him into the windshield and then into the air. He landed face down on the road, according to his great-aunt, Dorothy Lynch. Alerted by the crash, Mitchell’s father hurried from his porch to find his son lying in the middle of the road. “If it had been a younger child,” Lynch said, “that child would not have survived.”
The crash unfolded in a rural stretch of North Carolina’s Halifax County, where a futuristic technology produced devastating real-world consequences. The Washington Post analyzed data from the National Highway Traffic Safety Administration (NHTSA) and found 736 crashes involving Teslas in Autopilot mode since 2019, far more than previously reported. The surge in crashes over the past four years underscores the risks that accompany the increasingly widespread use of Tesla’s driver-assistance technology as these vehicles proliferate on American roads.
Challenges and Concerns with Autopilot Technology
The data also reveals a significant increase in fatalities and serious injuries linked to Autopilot. When authorities released a partial account in June 2022, they had confirmed only three deaths connected to the technology. The latest data shows at least 17 fatal incidents, 11 of them since last May, and five serious injuries.

Although Tillman Mitchell survived the March crash, he suffered a fractured neck and a broken leg and had to be placed on a ventilator. He still struggles with memory problems and with walking. His great-aunt sees the incident as a warning about the perils of relying too heavily on technology. “I pray that this is a learning process,” Lynch said. “People are too trusting when it comes to a piece of machinery.”
Tesla CEO Elon Musk has championed the safety of cars in Autopilot mode, citing crash-rate comparisons with human drivers. He envisions a future with virtually no accidents, in which the technology navigates roads and hazards such as stopped school buses and pedestrians on its own. Real-world testing on American highways, however, has revealed notable flaws.
The Washington Post’s analysis of the 17 fatal crashes uncovered distinct patterns, including incidents involving motorcycles and emergency vehicles. Experts suggest that certain decisions by Musk, such as widely expanding the features’ availability and removing radar sensors from new vehicles, may have contributed to the reported increase in incidents.
Tesla and Elon Musk declined to comment. NHTSA, for its part, noted that a crash involving driver-assistance technology does not by itself imply that the technology caused the crash. The agency emphasizes that the human driver must remain in control and actively engaged in the driving task at all times, and that legal responsibility rests with the driver.
Tesla’s Response and Ongoing Investigations
Despite the scrutiny, Musk stands by his decision to advance driver-assistance technologies, arguing that the benefits outweigh any potential harm.
Of the numerous Tesla driver-assistance crashes, NHTSA has selected approximately 40 for further examination, aiming to better understand how the technology functions. One of those cases is the North Carolina crash involving Mitchell, the student struck while getting off the school bus.
After the accident, Mitchell woke up in the hospital with no memory of the event. He has only a limited grasp of how serious the crash was, which complicates his efforts to catch up in school. Local news outlet WRAL reported that the impact shattered the Tesla’s windshield.
The Tesla driver, Howard G. Yee, faces multiple charges related to the crash, including reckless driving, passing a stopped school bus, and striking a person, a Class I felony, according to Sgt. Marcus Bethea of the North Carolina State Highway Patrol.
Calls for Regulation and the Debate on Automated Driving
Authorities said Yee had attached weights to the steering wheel to trick Tesla’s Autopilot system into registering a driver’s hands on the wheel; Autopilot disables its functions if it detects no steering-wheel pressure for an extended period. Yee referred the matter to his attorney, who did not respond to The Washington Post’s request for comment.
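The reporting describes a torque-based check: Autopilot treats pressure on the steering wheel as evidence of a driver’s hands. The minimal sketch below illustrates why a hung weight can defeat that kind of check; a constant weight exerts steering torque just as a resting hand would. Every name, threshold, and timing here is a hypothetical assumption for illustration, not Tesla’s actual implementation.

```python
import time

# Hypothetical parameters; the real thresholds are not public.
TORQUE_THRESHOLD_NM = 0.3   # minimum torque treated as "hands on wheel"
GRACE_PERIOD_S = 30.0       # allowed time with no torque before escalating


def hands_detected(steering_torque_nm: float) -> bool:
    """Treat any torque above the threshold as evidence of hands on the wheel.

    A weight hung on the wheel rim exerts a constant gravitational torque,
    so it passes this check exactly as a resting hand would.
    """
    return abs(steering_torque_nm) >= TORQUE_THRESHOLD_NM


def monitor(read_torque, warn, disengage, poll_interval_s=0.1):
    """Escalate if no steering torque is observed within the grace period."""
    last_input = time.monotonic()
    while True:
        if hands_detected(read_torque()):
            last_input = time.monotonic()  # a hung weight resets this, too
        elif time.monotonic() - last_input > GRACE_PERIOD_S:
            warn()        # e.g., an "apply slight turning force" alert
            disengage()   # then disable the assistance features
            return
        time.sleep(poll_interval_s)
```

Torque alone cannot distinguish a hand from a weight, which is why many driver-monitoring systems pair it with other signals, such as in-cabin cameras that track where the driver is looking.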
NHTSA is still investigating the crash. Tesla asked the agency to keep the company’s incident summary confidential on the grounds that it contains potentially sensitive business information.
Lynch, Mitchell’s great-aunt, expressed sympathy for Yee, calling his actions a mistake born of excessive trust in technology, a phenomenon known as “automation complacency.” She had stronger words for Elon Musk, however: she believes automated driving should be banned altogether.