Yesterday’s chaos over whether Tesla’s autonomous driving technology was used in the Texas car crash has finally been cleared up. Elon Musk tweeted that the car involved in the accident had not purchased FSD Beta, Tesla’s self-driving system.
Because FSD Beta, Tesla’s autonomous technology, is still under development, every data log is sent to Tesla’s headquarters. The company has always insisted that drivers remain attentive at all times.
Your research as a private individual is better than professionals @WSJ!
Data logs recovered so far show Autopilot was not enabled & this car did not purchase FSD.
Moreover, standard Autopilot would require lane lines to turn on, which this street did not have.
— Elon Musk (@elonmusk) April 19, 2021
Ongoing investigation
Two US federal agencies are currently working on the case: both the National Highway Traffic Safety Administration (NHTSA) and the National Transportation Safety Board (NTSB) are working to clarify the details.
Mark Herman, the constable handling the scene, gave a statement to Reuters after Elon Musk’s tweet:
“If he is tweeting that out, if he has already pulled the data, he hasn’t told us that. We will eagerly wait for that data.”
Moreover, the standard Autopilot system cannot engage on a road without lane markers. FSD Beta’s recent update is claimed to make an accident roughly ten times less likely than in an average car. When FSD Beta was first released, Elon Musk assured that the car could handle high speeds, meaning it should be able to take a turn as long as there are lane markers the system can identify.
On the other hand, police say they have witnesses who stated that the owner wanted to show the autonomous driving system to his friend. As Herman said:
“We have witness statements from people that said they left to test drive the vehicle without a driver and to show the friend how it can drive itself.”
Safety is of utmost importance
Regardless of whether Tesla’s FSD Beta is at fault, this incident has made safety a crucial concern. US Senator Richard Blumenthal tweeted on the same issue, arguing that no autonomous technology should carry any risk of death. The NHTSA investigation should shed light on the safety of using semi-automated technology in the future.
Using Tesla’s driverless system—or any other—shouldn’t be a death risk. Advancements in driving technology must first & foremost be safe. A NHTSA investigation, along with comprehensive oversight, is paramount to prevent future semi-automated driving deaths. https://t.co/Kc0G5twfxV
— Richard Blumenthal (@SenBlumenthal) April 19, 2021
Interestingly, Tesla is banned from using the terms “automated system” or “semi-automated system” in its advertisements.
Unlike other autonomous driving companies, Tesla publishes vehicle safety reports on its website. The reports cover everything from dates and miles traveled to the number of accidents that occurred with the system engaged.
How Tesla drivers use FSD Beta
As demonstrated by a Tesla owner with FSD Beta, the car stops immediately when the seatbelt is unbuckled.
Elon Musk has backed the same claim as the Tesla driver who posted that demo. Contradicting this assurance of safety, however, there are older videos (recorded before the recent FSD update) of teenage drivers cunningly using the system without paying attention.
Here is a video recorded last year of a kid sleeping through the entire drive with the seatbelt still buckled; he simply slipped into the back seat without unbuckling it. If all FSD logs are indeed sent to Tesla, this owner surely must have lost their privileges.
https://www.youtube.com/watch?v=VS5zQKXHdpM