Google Search is the company’s oldest and most central product, and it has evolved continuously alongside advances in technology. At the Google I/O 2021 developer event, Chief Executive Officer Sundar Pichai announced that Google Search has reached a new AI milestone in understanding information, powered by a new technology called the Multitask Unified Model, or MUM.
Usually, a Google Search involves a single, well-defined query, such as “How do I file an FIR at a police station?” The results would include a step-by-step guide for that task. However, not every search can be framed that specifically, and many questions branch out into complex tasks for the search engine to process.
For instance, you could Google Search “How to prepare for hiking Mount Fuji in October? I have hiked on a mountain before, but what level of fitness do I require?”
This question is multi-fold, and the search engine perceives it as a complex task when trying to return relevant results. Ask the same question of a mountain guide, and they could tell you precisely what you need: a conversational answer, with an elaborate and detailed briefing on how to prepare for your hike. Not to mention, preparing in October might differ from preparing in February for the same mountain, and again the guide would help you better than a page of search results.
“What clothes should I wear for hiking Mount Fuji in October?”, “What gear do I need to hike Mount Fuji in October?” or “Where can I find an affordable stay when planning to hike Mount Fuji?”
All of these questions have complex, multi-fold answers, and that is exactly what Google wants to deliver to its search users: expertly curated answers to every question and query, with the help of its cutting-edge natural language processing.
Google researchers made a breakthrough back in 2018 with the development of Bidirectional Encoder Representations from Transformers (BERT), a natural language model that changed the entire functionality of the Google Search Engine.
Google’s Bidirectional Encoder Representations from Transformers (BERT):
BERT is a natural language model developed by Google that changed the way its search engine worked. Earlier approaches trained neural networks on text examples hand-labelled with their meaning; BERT, by contrast, is fed enormous quantities of unannotated digital text. During training, some words in the text are masked out, and the neural network must predict them from the surrounding context, which forces the model to learn the patterns of language and the basic relationships between words. It was a successful model that changed how Google Search functioned behind the curtains.
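To make the masked-training idea concrete, here is a minimal sketch of how masked-token training data can be prepared. This is an illustration only, not Google’s implementation; the 15% masking rate follows the published BERT recipe, and all function names here are hypothetical.

```python
# Sketch of BERT-style masked-token data preparation (illustrative only).
import random

MASK_TOKEN = "[MASK]"

def mask_tokens(tokens, mask_prob=0.15, seed=0):
    """Randomly hide tokens behind [MASK]; the model must predict the originals
    from the surrounding context, learning word relationships as it does so."""
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            masked.append(MASK_TOKEN)
            targets[i] = tok  # position -> original token the model must recover
        else:
            masked.append(tok)
    return masked, targets

sentence = "how to prepare for hiking mount fuji in october".split()
masked, targets = mask_tokens(sentence)
print(masked)   # sentence with some words replaced by [MASK]
print(targets)  # the hidden words the model is trained to predict
```

The key point is that no human labels are needed: the text itself supplies both the puzzle and the answer.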
Google’s Multitask Unified Model (MUM):
In recent announcements made during its I/O developer event, Google CEO Sundar Pichai revealed that Google AI has built a new model based on BERT that will revolutionise Google Search for users, offering a new way to deal with complex or conversational searches: the Multitask Unified Model (MUM).
Google claims that MUM is 1,000 times more powerful than BERT, a claim we found hard to believe until the company explained how MUM works. MUM is multimodal: rather than working on text alone, it can take in and connect information across different formats, such as text and images. It is designed to learn from data available on the open web, with low-quality content automatically filtered out.
Like BERT, MUM is also built on the Transformer architecture, but it becomes far more powerful through its larger scale and its multimodal, multilingual training.
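The core operation of that Transformer architecture is scaled dot-product attention, in which every word weighs its relevance to every other word. The toy sketch below shows that single step in plain Python; real models like BERT and MUM use large learned matrices and stack many such layers, so this is only a conceptual illustration.

```python
# Toy scaled dot-product self-attention, the building block of Transformers.
import math

def softmax(xs):
    """Turn raw scores into weights that sum to 1."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(queries, keys, values):
    """Each output is a weighted mix of all value vectors, weighted by
    how strongly its query matches every key (scaled dot product)."""
    d = len(keys[0])
    outputs = []
    for q in queries:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in keys]
        weights = softmax(scores)
        outputs.append([sum(w * v[j] for w, v in zip(weights, values))
                        for j in range(len(values[0]))])
    return outputs

# Three toy 2-dimensional token vectors attending to each other:
x = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
out = attention(x, x, x)
```

Because the attention weights always sum to 1, each output vector is a blend of the inputs, which is how a Transformer lets every word in a query inform the representation of every other word.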
MUM is capable of generating language, so a user who types a complex query into Google might get a narrative answer similar to what a human expert or guide would say. As mentioned earlier, MUM is multimodal, which means it can understand images and videos from across the World Wide Web and deliver the most relevant content for our Google Searches alongside the text.
Like BERT, MUM understands the meaning of words and analyses patterns in text; unlike BERT, it can also relate images, videos and links to a complex search on Google. And as meaning exists across languages, Google’s MUM can map out the meanings and relations of words across 75 languages to deliver expertly curated answers, including relevant images and videos.
This is a ground-breaking development made by Google with its Multitask Unified Model.
Other than this, MUM is also capable of comprehending images, attaching descriptive labels to them as a form of meaning. This means that if a user uploads a picture of their shoes and asks Google whether this type of shoe will work for a trek up Mount Fuji, MUM will be able to understand the image and provide a relevant answer, including a yes/no and even recommendations for similar products.
Furthermore, Google is also working to reduce the carbon footprint of its servers alongside these latest developments.
That said, MUM is still in its experimental stages, and users cannot yet enjoy the fully AI-equipped Google Search with MUM integration.
All things considered, MUM is a major milestone in the history of Google Search, and Google’s plan to make AI-powered search more capable is clearly succeeding.