Google has pioneered a number of leading inventions and projects that have not only changed the way we perceive science and technology but have also made our lives easier and more leisurely.
From Google Cardboard to driverless cars to Google Glass, Google has never disappointed us when it comes to showcasing its talent in the world of technology and innovation.
Have you ever fantasized about lowering the volume of your system or changing the time on your phone by making tiny movements with your hands in the air, without even touching the screen, like in the Minority Report and Iron Man movies? Yes? Then Google’s next project has come as an answer to all your fantasies.
In 2015, during its annual developer conference, Google announced the world’s first radar-based interaction technology project, Project Soli, which would change the way people interact with wearable devices.
Soli
Soli is a creation of Google’s research and development lab, ATAP (Advanced Technology and Projects). The project is led by Dr. Ivan Poupyrev, an award-winning scientist, inventor, and designer known for blending the digital and physical worlds. Before joining Google, Dr. Poupyrev was a Principal Research Scientist at Walt Disney Imagineering’s research division.
Soli is a new sensing technology that detects touchless gestures using miniature radar. The entire sensor and antenna array is incorporated into an ultra-compact 8 mm x 10 mm chip, about the size of a fingernail.
The Soli chip can be embedded in wearables, phones, computers, cars, and IoT devices around us. Soli has no moving parts; it fits onto a chip and consumes little energy. The chip is not affected by lighting conditions and works through most materials.
The radar waves emitted by the chip track movements with sub-millimeter accuracy, letting you control the volume of a speaker or dial a number in mid-air. Soli’s radar is so accurate that it detects not only the exterior of an object but also its internal structure and rear surface, helping identify what the object is.
How does Soli work?
The Soli Sensor:
Project Soli is based on the concept of touchless interaction, achieved with a radar system. One system built on this idea is RadarCat (Radar Categorization for Input and Interaction), which functions like any conventional radar. It is a small, versatile radar-based system that classifies materials and objects so that you can interact with digital devices.
Imagine you are walking in the dark and using a torch to see. The light beam travels out from the torch, reflects off objects in front of you, and bounces back into your eyes, letting you see what lies nearby. RadarCat functions in a very similar manner.
The base unit fires electromagnetic waves at its target; some of them bounce back to the radar antenna, helping detect the movements and gestures of the hands.
It is not just the reflected signal itself that helps Soli analyze the hand’s movements, size, shape, orientation, material, distance, and velocity; properties of the returned signal such as energy, time delay, and frequency shift also feed into the analysis.
In short, the radar emits a broad beam of electromagnetic waves, which are reflected by the moving hand. The sensor receives this motion as a series of reflections, which the radar chip transforms into multiple representations. From these representations, engineers can extract features such as “range-Doppler distance”.
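To make those “representations” concrete, here is a minimal, hypothetical sketch of how a range-Doppler map can be computed from raw radar frames using NumPy. The frame layout, sizes, and windowing are illustrative assumptions, not Soli’s actual pipeline.

```python
# Illustrative sketch (not Google's actual pipeline): building a range-Doppler
# map from raw radar frames with NumPy. Frame layout, sizes, and windowing are
# assumptions made for demonstration only.
import numpy as np

def range_doppler_map(frames: np.ndarray) -> np.ndarray:
    """frames: complex baseband samples, shape (num_chirps, samples_per_chirp)."""
    # Window each chirp to reduce spectral leakage.
    window = np.hanning(frames.shape[1])
    # FFT along fast time (within a chirp) resolves distance (range bins).
    range_profile = np.fft.fft(frames * window, axis=1)
    # FFT along slow time (across chirps) resolves velocity (Doppler bins).
    doppler = np.fft.fftshift(np.fft.fft(range_profile, axis=0), axes=0)
    # Magnitude in dB gives the familiar range-Doppler image.
    return 20 * np.log10(np.abs(doppler) + 1e-12)

# Toy usage: 64 chirps of 128 samples of simulated noise.
rd_map = range_doppler_map(np.random.randn(64, 128) + 1j * np.random.randn(64, 128))
print(rd_map.shape)  # (64, 128): Doppler bins x range bins
```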
These features are then passed to machine-learning algorithms, which interpret them and approximate hand motions from the received signals. A device capable of classifying and interpreting gestures can then map those motion signals to actions, for example (a toy classifier sketch follows the list below):
- A pinching motion can indicate the press of a button.
- Rubbing a finger along the length of a thumb can indicate moving a slider, say for volume control.
- Pulling your thumb across your pointer knuckle simulates dragging a screen.
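In the same spirit, the mapping from extracted radar features to micro-gestures could look like the toy scikit-learn sketch below. The feature vectors, labels, and model choice are invented for illustration and are not Soli’s real classifier.

```python
# Minimal, illustrative gesture classifier (assumed feature names and labels,
# not Soli's real model). Each sample is a small vector of per-frame radar
# features such as total reflected energy, range-Doppler distance and velocity.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

GESTURES = ["button_press", "slider", "drag"]

# Pretend training data: 300 feature vectors of length 4 with known labels.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(300, 4))
y_train = rng.integers(0, len(GESTURES), size=300)

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)

# At runtime, each incoming radar frame is reduced to the same features
# and classified into one of the known micro-gestures.
new_frame_features = rng.normal(size=(1, 4))
print(GESTURES[clf.predict(new_frame_features)[0]])
```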
Unlike traditional radars, the radar sensor used in Soli does not require large bandwidth or high spatial resolution. Instead, Soli’s evaluation board has just two transmit antennas and four receive antennas.
Its sensing relies instead on motion resolution: subtle changes in the signal received over time. It is this signal variation that lets Soli distinguish complex finger movements and deforming hand shapes within its field.
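As a rough illustration of how sub-millimeter motion can be recovered from signal variation over time rather than from spatial resolution, the sketch below converts a phase change between two frames into radial displacement. The 60 GHz carrier and the two-frame setup are assumptions made for this example.

```python
# Illustrative sketch (assumed parameters): recovering tiny motions from the
# phase change of the returned signal across frames, rather than from spatial
# resolution. A 60 GHz carrier is assumed for a Soli-class radar.
import numpy as np

C = 3e8                      # speed of light, m/s
F_CARRIER = 60e9             # assumed carrier frequency, Hz
WAVELENGTH = C / F_CARRIER   # ~5 mm

def displacement_from_phase(phase_t0: float, phase_t1: float) -> float:
    """Round-trip phase change maps to radial displacement: d = dphi * lambda / (4*pi)."""
    dphi = np.angle(np.exp(1j * (phase_t1 - phase_t0)))  # wrap to [-pi, pi]
    return dphi * WAVELENGTH / (4 * np.pi)

# A 36-degree phase shift between two frames corresponds to ~0.25 mm of motion.
print(f"{displacement_from_phase(0.0, np.deg2rad(36)) * 1e3:.3f} mm")
```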
Limitations of RadarCat
- RadarCat occasionally confuses objects with similar material properties (e.g. the glass of a mirror and the glass of a mug).
- It takes longer to detect hollow objects than solid objects with flat surfaces.
- It also needs to be taught exactly what each object looks like before it can recognize it.
How is Project Soli helpful?
Project Soli is a futuristic interface that could forever change the way we use technological devices, not just wearables. A smartwatch with a Soli chip wouldn’t need a digital crown, as you could just move your fingers in the air to get things done.
The work done by ATAP is not only marvelous but commendable: the team shrank a briefcase-sized radar system into a fingernail-sized chip that fits in a smartwatch. After its release, Soli could be updated to work from metres away, which would mean that with a snap of your fingers you could control your devices.
This comes as a boon for areas of the medical field that are under constant threat of infection and contamination. The benefit of gesture recognition over present multi-touch input is that we would no longer require screens, which would be a whole new accomplishment in the world of machines and electronics.
Gesture-recognition input could be a revolutionary and groundbreaking feature for VR devices and gaming consoles. The project can also help improve operator safety in the workplace: if an operator’s hands aren’t near the moving parts of a machine, the chances of on-site accidents drop significantly.
Conclusion
Developments in technology and the digital world are moving faster than expected. Project Soli could forever change our perspective on how we interact with machines, especially wireless ones. Soli was initiated not just to quantify and interpret human gestures, but also to refine that interpretation to the point where gestures can be used without interference from the environment.
Project Soli is also expected to have a major impact on virtual and augmented reality. Today, VR headsets require us to hold a controller; with this technology, simple gesture movements could define how a user roams a virtual world.
ATAP officials confirmed at the Game Developers Conference that Google will start shipping development kits for the project worldwide later this year. It will be interesting to see where developments in Soli’s interface take us.
(Disclaimer: This is a guest post submitted on Techstory by the mentioned authors. All the contents and images in the article have been provided to Techstory by the authors of the article. Techstory is not responsible or liable for any content in this article.)
Image Source: YouTube
About The Author:
Ms. Shireen Shukla is an intern at Khurana and Khurana, Advocates and IP Attorneys. Views expressed in this article are solely those of the intern and do not reflect the views of any of the employees or employers.
Queries regarding this may be directed to [email protected]