Imagine your hands turning into an interface: you interact with electronic devices through hand gestures alone. An invisible button sits between your fingers, and as soon as you press it, things happen like magic. Project Soli is here to turn this fantasy into reality.
Soli is a purpose-built interaction sensor that uses radar for motion tracking of the human hand.
Currently, most gesture-sensing technology on the market is erratic and bulky. Project Soli, one of the latest experiments from Google's ATAP (Advanced Technology and Projects) group, is ready to showcase a powerful motion controller that will change the way we communicate and interact with everything, especially everyday electronic devices like smartphones and tablets.
Project Soli is a sensor based on radar technology that, like motion and gesture-tracking controllers, detects hand movements accurately in real time. The sensor is small enough to fit on a tiny chip, so it can be embedded in even the smallest wearables.
The basic idea behind this project is to replace traditional input devices like the mouse, keyboard or touch screen with touchless hand gestures. Rather than touching a physical object, with Project Soli we can control a device with our hand motions alone.
What Is Radar?
Radar is a technology that uses radio waves to detect the range, angle and velocity of nearby objects. When a radar transmits radio waves over a specified range, it can detect the motion of any object that reflects them. Earlier, the scope of radar was limited to army and defense agencies tracking enemy movements, but its applications have since extended to marine navigation, weather sensing, biological research, traffic control and more.
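The two basic radar relations behind range and velocity detection can be sketched in a few lines of Python. This is only an illustration of the general principle, not Soli's actual signal-processing chain: range comes from the round-trip time of the echo, and radial velocity from the Doppler shift of the reflected wave.

```python
C = 3.0e8  # speed of light in m/s

def echo_range(round_trip_s: float) -> float:
    """Distance to the target, given the echo's round-trip time in seconds."""
    # Divide by 2 because the wave travels out to the target and back.
    return C * round_trip_s / 2

def doppler_velocity(doppler_shift_hz: float, carrier_hz: float) -> float:
    """Radial velocity of the target, given the Doppler shift of the echo."""
    return C * doppler_shift_hz / (2 * carrier_hz)

# An echo returning after 20 nanoseconds puts the target 3 metres away.
print(echo_range(20e-9))
# At a 60 GHz carrier, a 400 Hz Doppler shift corresponds to about 1 m/s.
print(doppler_velocity(400, 60e9))
```

At 60 GHz even a slow-moving finger produces a measurable Doppler shift, which is one reason such a high carrier frequency suits fine gesture tracking.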
Highlights of Project Soli
The sensor used in Project Soli is the major highlight of this development, as it captures sub-millimeter finger motion in 3D space. The built-in interaction sensor operates at 60 GHz, allowing it to capture finger motion at a remarkable rate of ten thousand frames per second, which is far beyond what camera-based systems achieve.
With this technology, you can do things with your hands that you have never done before. With high accuracy and incredible speed, interacting with devices becomes extremely easy. The sensor's tiny circuit board can determine hand size, velocity and motion; embedded machine learning translates these movements into pre-programmed commands, while the Doppler effect is used to detect speed.
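The "movements translated into pre-programmed commands" step can be pictured as a simple lookup from a recognized gesture to a device action. The gesture names and commands below are hypothetical placeholders, not Soli's real API; in practice a trained classifier would produce the gesture label.

```python
# Hypothetical gesture-to-command table; names are illustrative only.
GESTURE_COMMANDS = {
    "thumb_rub": "scroll",        # sliding the thumb along the index finger
    "finger_tap": "select",       # tapping thumb and index finger together
    "dial_turn": "adjust_volume", # twisting an invisible dial
}

def dispatch(predicted_gesture: str) -> str:
    """Map a classifier's predicted gesture label to a device command."""
    # Unrecognized gestures are ignored rather than raising an error.
    return GESTURE_COMMANDS.get(predicted_gesture, "ignore")

print(dispatch("finger_tap"))  # → select
print(dispatch("wave"))        # → ignore
```

The mapping is what makes Soli-style gestures feel like "virtual tools": the same physical motion always triggers the same command, so users build muscle memory just as with a real dial or button.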
Soli’s utility is not limited to wearables; it can be used in any device, including objects that don’t have a traditional display. The concept of virtual tools is key to Soli interactions: it refers to gestures that mimic interactions with physical tools. Though these controls are virtual in nature, the interactions feel real and responsive.
The major advantages of this project are that it lets you control gadgets with hand gestures, allows free-hand typing, permits highly accurate control and reduces the need to hold a gadget while using it. However, the limited radar range and the small set of recognized gestures restrict the sensor's potential. The sensors are also expensive and raise security concerns.
Currently, the biggest problem with wearable devices is input: there is no easy way to control them. With gestures as an independent input method, individuals could perform many functions with ease while using electronic devices.
If you have anything to share, please comment in the section below.