Tesla’s new version of Full Self-Driving beta, 10.3, wasn’t received by customers as expected. After users complained of sudden braking, false collision warnings, turning problems, and other issues, Tesla rolled back its latest Full Self-Driving (FSD) beta software less than a day after its release.
The setbacks come as Tesla’s Full Self-Driving beta remains under scrutiny over the safety of its semi-autonomous driving technology. Responding to the complaints on Twitter, Tesla CEO Elon Musk wrote, “Seeing some issues with 10.3, so rolling back to 10.2 temporarily. Please note, this is to be expected with beta software. It is impossible to test all hardware configs in all conditions with internal QA (quality assurance), hence the public beta.”
According to the tweet, Tesla is rolling back to the previous version, FSD 10.2, temporarily until it fixes the issues in the new version, 10.3. Tesla did not immediately respond to a request for comment on the complaints, which was sent outside regular U.S. business hours.
In videos posted by beta users, Tesla vehicles running the new 10.3 software were shown repeatedly issuing forward-collision warnings even when there was no immediate danger. Some vehicles also applied the brakes for no apparent reason, and some users said they lost the FSD beta software entirely after encountering problems with the latest version.
The release of FSD 10.3, which carried new improvements and fixes, had already been delayed by a day. On Saturday, Musk tweeted the reason: “Regression in some left turns at traffic lights found by internal QA in 10.3. Fix in work, probably releasing tomorrow.”
No details about further updates
Although FSD Beta 10.3 continued to face issues after its release, neither Musk nor Tesla has shared any information on social media about the next update or its release date.
Autopilot and Full Self-Driving have repeatedly been questioned over their transparency and proof of technical safety following dozens of collisions involving Tesla vehicles. In each of the incidents, the National Highway Traffic Safety Administration (NHTSA) identified that the drivers had engaged Autopilot before the crash. One of these crashes involved a fatality; the details remain unclear and are under investigation.
The question is whether Musk’s commitment to developing futuristic technology on compressed timelines, paired with his vague answers to safety questions regardless of what he believes or knows about the technology, will ultimately realize his dream of fully autonomous vehicles that make the roads an easier and safer place to drive.