The Tesla AI event was a huge success. While the Tesla Bot stole the spotlight, the specs of the Dojo supercomputer released during the event are mind-blowing. Musk announced that Dojo will be functional by 2022, at which point Tesla will be operating the world's fifth most powerful supercomputer.
With so many details revealed, Wall Street and industry experts will likely need some time to absorb everything and produce a detailed analysis. Once on stage, Elon Musk apologized for the delay and then joked that Tesla needs AI to solve its technical issues, calling the AI event a recruitment event. For many investors and Tesla fans around the world, however, it turned out to be much more than that.
Karpathy opened by talking about the visual component of Tesla's AI, explaining that it uses the eight cameras mounted on Tesla cars. Everything, from the synthetic visual cortex up, is being built for the AI, so the system can loosely be thought of as a biological being processing vision.
Karpathy also described how Tesla's visual processing strategy has evolved, moving from radar sensors to pure camera vision. He then discussed HydraNets, Tesla's multi-task architecture: a re-engineered neural network designed in-house that enables multi-tasking and supports features like queues, caching, camera calibration, and optimization.
Tesla takes data from its backbone and feeds useful information into different tasks (ie: Object Detection, Traffic Lights, and Lane Prediction) without impacting every other task, which would otherwise waste processing. pic.twitter.com/lm3IVRX4zD
— Teslascope (@teslascope) August 20, 2021
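The shared-backbone idea described above can be illustrated with a minimal sketch. This is not Tesla's actual code; the layer sizes, task names, and the use of simple linear layers are all assumptions chosen purely to show how several task heads can reuse one backbone computation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared "backbone": a single linear layer that turns a camera feature
# vector into a common representation. Dimensions are illustrative only.
W_backbone = rng.standard_normal((64, 128))

# Task-specific "heads": small linear layers that each consume the shared
# representation, so adding a task does not recompute the backbone.
heads = {
    "object_detection": rng.standard_normal((128, 10)),
    "traffic_lights":   rng.standard_normal((128, 4)),
    "lane_prediction":  rng.standard_normal((128, 8)),
}

def forward(x):
    shared = np.tanh(x @ W_backbone)  # backbone runs once per input
    return {name: shared @ W for name, W in heads.items()}

camera_features = rng.standard_normal(64)
outputs = forward(camera_features)
for name, out in outputs.items():
    print(name, out.shape)
```

Because the backbone output is computed once and fanned out, each head only pays for its own small layer, which is the efficiency the tweet above points at.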
Later, the director of Autopilot, Ashok Elluswamy, spoke about Tesla's solutions to problems such as planning in non-convex, high-dimensional spaces. Using a hybrid planning system, Tesla's Autopilot handles lane changes. He added that their cars can predict not only their own moves but also where nearby cars could possibly drive, which matters because traffic behavior is undeniably complex in several parts of the world.
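To give a feel for what "predicting where nearby cars could drive" means in the simplest possible terms, here is a toy sketch. It uses generic constant-velocity extrapolation, not Tesla's planner; the coordinates, time horizon, and step size are made-up assumptions.

```python
# Toy illustration: extrapolate a nearby vehicle's future (x, y) positions
# assuming it keeps its current velocity. Real planners model far more.
def predict_positions(pos, vel, horizon_s, dt=0.5):
    """Return positions at each dt step over horizon_s seconds."""
    steps = int(horizon_s / dt)
    return [(pos[0] + vel[0] * dt * k, pos[1] + vel[1] * dt * k)
            for k in range(1, steps + 1)]

# A neighbor in the adjacent lane (positions in metres, speed in m/s).
neighbor = predict_positions(pos=(0.0, 3.5), vel=(15.0, 0.0), horizon_s=2.0)
print(neighbor[-1])  # predicted position two seconds ahead: (30.0, 3.5)
```

A real system would replace the constant-velocity assumption with learned behavior models, but the interface, current state in, candidate future positions out, is the same shape of problem.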
Tesla's director of Autopilot engineering, Milan Kovac, then took the stage to talk about Dojo, saying that Tesla's Autopilot needs it, and walked through the neural networks created for their cars.
Elon Musk wanted a superfast computer so the Tesla Autopilot could be trained accordingly; that was the origin of Project Dojo, but its possible applications are endless. Its distributed compute architecture is connected by a network fabric.
The most important aspect of the event was Dojo's specifications. All of its power appears aimed at making autonomous cars possible. It is a pure learning machine built from 500,000 training nodes, with 36 terabytes per second of off-tile bandwidth and 9 petaflops of compute per training tile. These announced specifications are just a fraction of what Dojo is capable of, because it is not done yet and is still evolving.
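The announced per-tile figures invite some back-of-the-envelope arithmetic. The sketch below uses only the two numbers stated above; the tile count is a hypothetical assumption for illustration, not an announced Dojo configuration.

```python
# Announced per-tile figures from the event.
PFLOPS_PER_TILE = 9   # 9 petaflops of compute per training tile
OFF_TILE_TBPS = 36    # 36 TB/s of off-tile bandwidth

# Hypothetical tile count (assumption, purely for illustration).
tiles = 120

aggregate_pflops = PFLOPS_PER_TILE * tiles
print(f"{tiles} tiles -> {aggregate_pflops} PFLOPS "
      f"(~{aggregate_pflops / 1000:.2f} exaflops)")
```

At that hypothetical scale the aggregate crosses the exaflop mark, which gives a sense of why the announced per-tile numbers are described as only a fraction of what Dojo could become.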