Tesla’s driving data storage system has reportedly been hacked by a Dutch forensic lab, and data pertaining to car crashes that the company had not made public has been recovered in the process. The information brought to light is believed to be helpful in crash investigations.
Storing Tons of User Data
Researchers at the Netherlands Forensic Institute (NFI) have found that Tesla’s EVs store a far greater amount of user data than was originally thought. This includes data on the driving behavior of its customers, such as speed, steering wheel angle, accelerator pedal position, and brake usage. Some of this data can be stored for as long as a year, both to improve Autopilot, the company’s advanced driver assistance system, and to evaluate crashes.
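To make the reported signals concrete, here is a purely illustrative sketch of what one sample of such driving-behavior data might look like, together with the roughly one-year retention the NFI describes. The field names, units, and structure are assumptions made for illustration, not Tesla’s actual log format.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class DrivingLogRecord:
    """One hypothetical sample of the driving-behavior signals the NFI describes."""
    timestamp: datetime           # when the sample was recorded
    speed_kph: float              # vehicle speed
    steering_angle_deg: float     # steering wheel angle
    accelerator_pct: float        # accelerator pedal position, 0-100 %
    brake_applied: bool           # whether the brake was in use
    autopilot_engaged: bool       # whether Autopilot was active


def prune_old_records(log: list[DrivingLogRecord], now: datetime) -> list[DrivingLogRecord]:
    """Keep roughly one year of history, mirroring the retention period the NFI reports."""
    cutoff = now - timedelta(days=365)
    return [rec for rec in log if rec.timestamp >= cutoff]
```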
The Dutch team was looking into a crash involving a Tesla on Autopilot, which apparently remained engaged even after the car rear-ended another vehicle. Interestingly, the researchers decided to “reverse engineer” Tesla’s data logs instead of seeking the data from the company, saying this allowed them to study the information “objectively.”
NFI digital investigator Francis Hoogendijk said in a statement that the retrieved data contains “a wealth of information” that can help both traffic accident analysts and forensic investigators, and that it can also support criminal investigations into traffic accidents that result in injury or death. The organization added that even though Elon Musk’s firm has complied with government requests for data, it has also deliberately left out a significant amount of data that could have been useful.
Selectively Providing Information
According to the NFI’s report, when data is requested, Tesla provides only the specific subsets asked for, and only for a specific timeframe. This, the researchers say, contrasts with the logs themselves, which contain a far wider range of recorded signals.
This work could have implications for US investigators looking into crashes involving Tesla’s vehicles. The data Tesla obtains from its users is encrypted to prevent rivals from accessing it, but vehicle owners can request their own data, including camera footage, if a crash happens.
Then there’s also Autopilot’s shadow mode, which allows Tesla to gather statistical information on false negatives and false positives. In this mode, the car registers when the system would have taken action even though it does not actually intervene, which lets the firm’s team assess whether the autonomous mode is capable of avoiding crashes.
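As a rough illustration of the false-positive/false-negative bookkeeping described above, the sketch below compares a proposed (but never executed) intervention against what actually happened. The function names and classification logic are assumptions for illustration, not Tesla’s implementation.

```python
from collections import Counter


def classify_shadow_event(would_have_braked: bool, collision_occurred: bool) -> str:
    """Classify one hypothetical shadow-mode observation.

    would_have_braked: the inactive autonomous system's proposed intervention
    collision_occurred: what actually happened with the human driving
    """
    if would_have_braked and not collision_occurred:
        return "false_positive"   # system would have intervened unnecessarily
    if not would_have_braked and collision_occurred:
        return "false_negative"   # system would have missed a needed intervention
    if would_have_braked and collision_occurred:
        return "true_positive"    # intervention would likely have helped
    return "true_negative"        # no action needed, none proposed


# Toy data standing in for logged shadow-mode observations.
events = [
    (True, False),   # system too cautious
    (False, True),   # system would have missed a crash
    (True, True),    # system could have prevented a crash
    (False, False),  # uneventful driving
]
stats = Counter(classify_shadow_event(braked, crashed) for braked, crashed in events)
print(stats)  # aggregate counts resembling the statistics shadow mode collects
```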
The NFI has also said that it would be “good” if the data could be made available more often to facilitate forensic investigations.