Updates On Google’s Custom Whitechapel Chipset For Google Pixel 6

Google’s Whitechapel Chip, LaMDA and AI to design chips

Since 2019, we’ve heard rumours that Google has been working on its own smartphone chipset. Google’s custom semiconductors so far include the Pixel Visual Core, the Titan M security chip, and the Pixel Neural Core, to name a few — but these were simply co-processors, not an entire system-on-a-chip (SoC).

Things are about to change: Google has announced a collaboration with Samsung to develop an in-house chipset to compete with Apple’s A14 Bionic and Qualcomm’s Snapdragon 888. The first Google-designed chipset, Whitechapel (internally designated GS101), will appear in the Pixel 6 series this autumn.

Google manufactures some of the best Android phones money can buy, and their cameras are a big part of what sets them apart. Although Google hasn’t updated its camera sensor in almost three years, the company has used its software expertise to produce genuinely excellent photographs from Pixel phones, including the $350 Pixel 4a.

While Google prefers to differentiate its phones through software, it appears that the firm is increasingly focusing on bespoke silicon to offer Pixels an advantage. The proprietary chipsets may power Google’s phones and possibly Chromebooks in the future, and the first of them, the GS101, is slated to debut in the Pixel 6 later this year.

Here’s everything you need to know about Google’s proprietary silicon, including a breakdown of the hardware, security features, camera enhancements, and why this is such a significant step forward for Google’s hardware aspirations.

Whitechapel GS101 Chip

Google is expected to release the Pixel 6 this year using its own proprietary chip, rather than the Qualcomm chips it has used since the first Pixel phone. According to 9to5Google, the Pixel 6 will be the first device powered by a “GS101” chip, also known as the “Whitechapel” chip. If confirmed, the processor will power not just Pixel phones but also Chromebooks and the rumoured Pixel Fold.


Rumours of the Whitechapel chip first surfaced last year. It was believed to be Google’s attempt to make its own processors, much as Apple does for its iPhones, iPads, and MacBooks. It has also been reported that Google and Samsung are working together on the chip. The South Korean firm develops its own Exynos chipsets for its Android smartphones, the most recent of which is the Exynos 2100.

The chip is being created by Samsung Semiconductor’s system large-scale integration (SLSI) group, according to the article.

This implies that Google’s chip will be comparable to Samsung’s Exynos processors. The move also indicates that Google intends to build its own hardware rather than relying solely on software, which has been its strong point in the past.

According to the report, Whitechapel is also linked to the codename “Slider” in internal documentation. The same codename has been spotted in Google’s camera app. “Raven” and “Oriole” will be the first phones built on the Slider platform, and the Pixel 6 may be one of these codenamed devices.

The chip is known as “GS101” within the company; the abbreviation GS might stand for “Google Silicon.” So far, Google has said nothing about the custom-made chips, and there have been no leaks concerning the display, design, camera configuration, or other aspects of its flagship smartphone. Last year’s Google Pixel 5 used the Qualcomm Snapdragon 765G chipset, so it will be fascinating to see whether the forthcoming Pixel 6 gets a flagship-grade processor.

Rumours about Google working on bespoke chipsets first appeared when the search engine giant recruited an ex-Apple developer in mid-2017. Google released the Pixel Visual Core processor a year later, and has since revealed the Titan M security chip as well as the Pixel Neural Core.

Qualcomm would suffer a significant setback as a result of the move. Qualcomm is the world’s largest semiconductor supplier for modern smartphones, and its processors have been used in Google Pixel devices since the beginning. For example, the Pixel 4 has a Snapdragon 855 chipset, while the Pixel 4a uses a Snapdragon 730G.


It’s only logical for Google to create its own chipset. Google can manage both hardware and software by moving from Qualcomm to its own SoC. As a result, Google will be able to compete with Apple, which has complete control of its hardware, software, and chipset.

The biggest challenge for Google, though, is to first fix its smartphone business. The company’s Pixel smartphones have sold at a far lower rate than expected, hampered by unclear positioning and Google’s limited presence in the wider smartphone market. Google will not be able to win the premium smartphone fight until these flaws are addressed.

AI model LaMDA

Google claims to have taken a significant step forward in conversational artificial intelligence (AI), creating a system capable of holding its own in genuine discussions. Last night, at the first-day keynote of Google I/O, the company’s annual developer conference, the firm revealed LaMDA. The Language Model for Dialogue Applications (LaMDA) project aims to replace robotic interactions with more natural dialogue.

(Source: 9to5google.com)

Google presented demonstrations of how LaMDA works, though one might argue that its choice of topics undersold the system’s practical utility.

Google showed two demos: in one, LaMDA answered questions as the dwarf planet Pluto; in the other, as a paper aeroplane. It’s unclear why Google picked these two subjects, but they were intended to demonstrate how the system responds to real queries rather than becoming confused when users make ordinary conversational remarks.

In other words, Google wants AI conversations to feel more natural, which would lower the learning curve involved in engaging with such systems. Human interactions are natural in the sense that we don’t respond to a question like “hello, how are you?” with the same five answers every time. You might say, “I’m good, how are you?” or alter your response depending on who you’re talking to, because the other person can interpret your tone and meaning and reply accordingly.

Interactions with AI, on the other hand, are constrained and confined, rather than free-flowing and open-ended as they are in real life.

You must currently utter wake phrases such as “Hey Google,” followed by specific instructions such as “turn off the lights.” Systems like LaMDA could make such instructions feel more natural, or even turn them into genuine conversations. They may also be used to improve chatbots and other automation applications, a market that is expected to grow significantly in the coming years.
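To see why today’s assistants feel constrained, here is a toy sketch — purely illustrative, not how Google Assistant or LaMDA is actually implemented — of the wake-phrase-plus-fixed-command pattern described above. Any utterance that doesn’t match a known template fails, which is exactly the rigidity an open-ended dialogue system aims to remove:

```python
# Hypothetical command matcher: a fixed table of utterance templates,
# each mapped to an intent the assistant knows how to execute.
COMMANDS = {
    "turn off the lights": "lights_off",
    "turn on the lights": "lights_on",
    "set a timer": "timer_start",
}

def handle(utterance: str, wake_word: str = "hey google") -> str:
    """Return an intent for a recognised command, else a fallback."""
    text = utterance.lower().strip()
    if not text.startswith(wake_word):
        return "ignored"  # assistant only reacts after the wake phrase
    command = text[len(wake_word):].strip(" ,")
    # Exact-match lookup: anything off-script gets the canned fallback.
    return COMMANDS.get(command, "sorry_didnt_understand")

print(handle("Hey Google, turn off the lights"))   # matches a template
print(handle("Hey Google, make it dark in here"))  # same meaning, off-script
```

The second utterance means the same thing as the first, yet the matcher returns its fallback — a free-flowing model like LaMDA is meant to bridge precisely that gap.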

AI is designing chipsets in hours

Google is also using artificial intelligence (AI) to design its next-generation chipsets, and the AI takes only six hours to produce a chip floor plan, compared with the months required by human designers.


Chip floor planning is the process of creating a computer chip’s physical layout. Despite decades of research, the task has eluded automation, requiring months of effort from physical design engineers to produce manufacturable layouts.

In all essential parameters, such as power consumption, performance, and chip area, the new chips are reported to be superior or similar to those made by humans.


To do so, the designers framed chip floor planning as a reinforcement learning problem and created a neural network that could learn rich representations of chips.

The technique draws on previous experience to improve and speed up problem-solving in new situations, allowing the artificial designer to accumulate more expertise than any human designer could.
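The idea of floor planning as a reinforcement learning problem can be sketched in miniature. The toy below (an assumption-laden illustration, not Google’s system) places macros one at a time on a grid and scores a layout by total wirelength between connected macros — the kind of reward signal a learned policy would optimise. Here a seeded random policy plus best-of-many sampling stands in for a trained network:

```python
import itertools
import random

GRID = 4                          # 4x4 placement grid (hypothetical)
NETS = [(0, 1), (1, 2), (0, 3)]   # which macros are wired together

def wirelength(placement):
    """Sum of Manhattan distances over all connected macro pairs."""
    return sum(abs(placement[a][0] - placement[b][0]) +
               abs(placement[a][1] - placement[b][1]) for a, b in NETS)

def place(rng, n_macros=4):
    """Place macros sequentially; each step picks a free grid cell."""
    free = list(itertools.product(range(GRID), range(GRID)))
    placement = []
    for _ in range(n_macros):
        cell = rng.choice(free)   # a trained policy would choose here
        free.remove(cell)
        placement.append(cell)
    return placement

rng = random.Random(0)
# Sample many episodes and keep the lowest-wirelength layout, mimicking
# how a policy improves by favouring placements with higher reward
# (i.e. shorter wires).
best = min((place(rng) for _ in range(200)), key=wirelength)
print("best wirelength:", wirelength(best))
```

Real floor planning also weighs power, timing, congestion, and density constraints, and the action space is vastly larger — which is why sequential decision-making with a learned, transferable policy is such a good fit for the problem.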