Meta CEO Mark Zuckerberg unveiled new plans for the metaverse in a recent live stream. The company is working on several new features, including the ability to generate worlds and items by simply describing them.
The outspoken billionaire demonstrated creating a basic virtual world. Using Builder Bot, an AI feature, Zuckerberg added trees, a beach, and an island to a custom virtual environment.
The Facebook founder explained that Builder Bot is part of Meta’s Project CAIRaoke, an effort to improve AI assistants. The design also supports AI that interacts with the world based on what a user experiences through headsets or glasses.
Zuckerberg added that in addition to the Bot feature, his team was also developing a universal speech translator. He capped this off by saying, “The ability to communicate with anyone in any language is a superpower that was dreamt of forever.”
In the demo, Builder Bot responded to spoken commands to create a beach, clouds, and trees, generating each element as it was described. The demonstration was impressive in itself, but it remains to be seen how far the approach can go and whether it can support robust experiences.
Meta has gone all in on the metaverse. The bet comes as the company reels from recent setbacks: its market cap has dropped by hundreds of billions of dollars, Facebook’s user growth slowed for the first time, and Apple’s privacy changes cost the company a considerable amount of advertising revenue.
Project CAIRaoke, pronounced like the sing-along activity, reflects ambitions much like those of Google’s machine-learning subsidiary DeepMind. To advance its AI capabilities, Facebook is investing heavily in machine-learning technology. The company said it has started to implement self-supervised learning (SSL), a technique that lets AI models recognize patterns in data on their own.
In a typical AI learning model, including those Facebook has used, algorithms learn from labels and tags supplied by humans, such as Facebook content moderators, and then make decisions based on those labels. Self-supervised learning removes that dependency by deriving the training signal from the data itself. So far, Facebook’s tests of SSL are outperforming previous AI models for text and video, Zuckerberg said.
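The contrast between the two approaches can be shown with a toy sketch. This is an illustration of the general idea, not Meta’s actual system: in the supervised case every training example needs a human-supplied label, while in the self-supervised case the “label” is manufactured from raw text by masking a word and asking the model to predict it.

```python
def supervised_examples(posts, human_labels):
    """Supervised: each post needs a label supplied by a human annotator."""
    return list(zip(posts, human_labels))


def self_supervised_examples(posts):
    """Self-supervised: mask one word at a time and use it as the target.

    The training signal comes from the raw text itself, so no human
    annotator is required, and every post yields many examples.
    """
    examples = []
    for post in posts:
        words = post.split()
        for i, target in enumerate(words):
            masked = words[:i] + ["[MASK]"] + words[i + 1:]
            examples.append((" ".join(masked), target))
    return examples


posts = ["the cat sat"]
print(self_supervised_examples(posts)[0])  # → ('[MASK] cat sat', 'the')
```

Three words of unlabeled text already produce three training pairs, which is why self-supervised methods can exploit far larger datasets than human labeling allows.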
The same technology is being put to use in adaptive models that learn from human experience, such as Ego4D, the long-term “egocentric perception” data set Meta has been building to help AI understand the world from a first-person perspective. Using data2vec, an AI model can quickly learn about complicated subjects such as a game of soccer, baking bread, or playing a musical instrument. The objective is to build AI that is more adaptive than today’s technology.
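The core idea behind data2vec can be sketched in a few lines. This is a deliberately simplified toy, not Meta’s implementation: a “student” network sees a masked version of the input and is trained to predict the representations a “teacher” network produces from the full input, where the teacher’s weights slowly track the student’s via an exponential moving average (EMA). All names and hyperparameters below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8
W_student = rng.normal(size=(dim, dim)) * 0.1
W_teacher = W_student.copy()  # teacher starts as a copy of the student


def represent(W, x):
    """A one-layer stand-in for a real encoder network."""
    return np.tanh(x @ W)


for step in range(100):
    x = rng.normal(size=(4, dim))         # a batch of raw, unlabeled inputs
    mask = rng.random(x.shape) < 0.3      # hide ~30% of each input
    x_masked = np.where(mask, 0.0, x)

    target = represent(W_teacher, x)        # teacher sees the full input
    pred = represent(W_student, x_masked)   # student sees the masked input

    # Gradient step on squared error between student and teacher outputs
    err = pred - target
    grad = x_masked.T @ (err * (1 - pred ** 2))
    W_student -= 0.05 * grad

    # Teacher weights slowly track the student (EMA update)
    W_teacher = 0.99 * W_teacher + 0.01 * W_student
```

Because the prediction target is a representation rather than a word, pixel, or audio sample, the same training recipe applies across text, images, and speech, which is what makes the approach attractive for general-purpose models.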