Google today announced a series of updates to its Search, Translate, and Maps products that combine artificial intelligence (AI) and augmented reality (AR) to give users more comprehensive answers and live search tools that can instantly recognize items and locations around them. At an event in Paris, the company showcased new features for electric vehicle (EV) drivers and for people who walk, cycle, or take public transportation, along with improvements to Google Maps' Immersive View and Live View technologies, which depict 3D routes in real time. Live View, introduced in 2020, gives users a small map at the bottom of the smartphone screen along with directions overlaid on the real world.
Immersive View is now live in five cities. The new Google Maps navigation feature combines billions of Street View and aerial photographs with AI and AR to generate a digital representation of the real world that can guide users along a route. It is currently available in London, Los Angeles, New York, San Francisco, and Tokyo.
New features of Google Maps
Search with Live View and Immersive View are intended to help users find nearby amenities, including ATMs, restaurants, parks, and public transit stops, simply by raising their phone while walking down the street. The features also surface useful details such as a business's opening hours, how busy it currently is, and its rating.
Immersive View also offers aerial views. Users can, for instance, virtually fly over the Rijksmuseum in Amsterdam and see where the entrances are. A “time slider” shows the area at different times of day, along with weather forecasts, and, as Google noted, you can also see where it tends to be most crowded, so you have all the information you need to decide where and when to go. If you're hungry, you can drop down to street level to explore nearby restaurants, and even look inside to get a sense of a place's atmosphere before making a reservation.
Google employs neural radiance fields
Google said the superimposed 3D maps created with AR technology could be especially useful for navigating challenging environments, such as an unfamiliar airport. The company introduced Indoor Live View in 2021 for a handful of locations in the US, Zurich, and Tokyo; it directs users to places such as restrooms, lounges, taxi stands, and car rentals via AR-generated arrows.
Google intends to expand Indoor Live View to more than 1,000 additional airports, train stations, and shopping centers in Barcelona, Berlin, Frankfurt, London, Madrid, Melbourne, Paris, Prague, São Paulo, Singapore, Sydney, and Taipei over the coming months.
Google employs neural radiance fields (NeRF), a cutting-edge AI technique that turns ordinary photographs into 3D representations, to produce the lifelike scenes in Immersive View.
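To make the idea concrete, here is a minimal sketch of the core of NeRF's rendering step: sampling points along a camera ray, querying a field for density and color at each point, and alpha-compositing the results into a pixel. This is an illustrative toy, not Google's implementation; the `toy_field` function below is a hypothetical stand-in (a solid red sphere) for the trained neural network that a real NeRF learns from photographs.

```python
import numpy as np

def toy_field(points):
    """Stand-in for NeRF's trained MLP, which maps a 3D point to
    (density, RGB). Here: a red sphere of radius 1 at the origin."""
    inside = np.linalg.norm(points, axis=-1) < 1.0
    density = np.where(inside, 5.0, 0.0)            # sigma >= 0
    color = np.zeros(points.shape)
    color[..., 0] = 1.0                             # constant red
    return density, color

def render_ray(origin, direction, near=0.0, far=4.0, n_samples=64):
    """Volume-render one ray: sample depths, query the field, and
    alpha-composite front to back (NeRF's rendering integral)."""
    t = np.linspace(near, far, n_samples)
    delta = np.diff(t, append=far)                  # spacing between samples
    points = origin + t[:, None] * direction
    sigma, rgb = toy_field(points)
    alpha = 1.0 - np.exp(-sigma * delta)            # per-sample opacity
    trans = np.cumprod(np.append(1.0, 1.0 - alpha[:-1]))  # transmittance T_i
    weights = trans * alpha
    return (weights[:, None] * rgb).sum(axis=0)     # composited pixel color

# A ray through the sphere renders red; one that misses stays black.
hit = render_ray(np.array([0.0, 0.0, -3.0]), np.array([0.0, 0.0, 1.0]))
miss = render_ray(np.array([0.0, 3.0, -3.0]), np.array([0.0, 0.0, 1.0]))
```

In an actual NeRF, `toy_field` is replaced by a neural network optimized so that rays rendered this way reproduce the input photographs, which is how a collection of 2D pictures becomes a queryable 3D scene.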
Users can also pair Immersive View with Google's Live View technology, using their device's camera to experience a neighborhood, landmark, restaurant, or other well-known location before going in. Pointing the camera at a store, for instance, lets a person virtually “enter” the business and wander around without actually walking inside. The feature recognizes locations in recorded footage as well as in real time.