What if you didn’t have to press a physical button or touch a screen to move a cursor or otherwise interact with an electronic device? What if a slight rub or tap of your thumb against your index finger in thin air could do everything we normally do by touch? Google’s Project Soli has made this possible: the team has already developed a small chip that enables these kinds of interactions.
Even though these controls are virtual, the interactions feel physical and responsive. Feedback is generated by the haptic sensation of fingers touching each other. Without the constraints of physical controls, these virtual tools can take on the fluidity and precision of our natural human hand motion.
How does it work?
Soli sensor technology works by emitting electromagnetic waves in a broad beam. Objects within the beam scatter this energy, reflecting some portion back towards the radar antenna. Properties of the reflected signal, such as energy, time delay, and frequency shift, capture rich information about the object’s characteristics and dynamics, including size, shape, orientation, material, distance, and velocity.
(Image: radar signal visualization with Baudline)
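To make the link between those signal properties and an object’s dynamics concrete, here is a small sketch (illustrative only, not Soli’s actual processing) of the two textbook radar relationships involved: distance from round-trip time delay, and radial velocity from Doppler frequency shift. The 60 GHz carrier matches the band Soli is known to operate in; the sample numbers are made up.

```python
# Minimal sketch of two basic radar relationships (illustrative values,
# not Soli's actual processing).

C = 3.0e8  # speed of light, m/s

def range_from_delay(time_delay_s: float) -> float:
    """Round-trip time delay -> target distance: d = c * t / 2."""
    return C * time_delay_s / 2.0

def velocity_from_doppler(doppler_shift_hz: float, carrier_hz: float = 60e9) -> float:
    """Doppler frequency shift -> radial velocity: v = f_d * c / (2 * f_c).
    60 GHz is the band Soli is known to use; positive v means approaching."""
    return doppler_shift_hz * C / (2.0 * carrier_hz)

# A hand 30 cm away reflects the signal back after about 2 ns:
print(range_from_delay(2.0e-9))     # ~0.30 m
# A finger moving at 5 cm/s produces a ~20 Hz Doppler shift at 60 GHz:
print(velocity_from_doppler(20.0))  # ~0.05 m/s
```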
Soli tracks and recognizes dynamic gestures expressed by delicate motions of the fingers and hand. To accomplish this with a single chip sensor, we developed a novel radar sensing paradigm with tailored hardware, software, and algorithms. Unlike traditional radar sensors, Soli does not require large bandwidth and high spatial resolution; in fact, Soli’s spatial resolution is coarser than the scale of most fine finger gestures. Instead, our fundamental sensing principles rely on motion resolution: extracting subtle changes in the received signal over time. By processing these temporal signal variations, Soli can distinguish complex finger movements and deforming hand shapes within its field.
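The exact algorithms are not public, but a standard way to achieve this kind of motion resolution is slow-time Doppler processing: watch how the return from a fixed range changes across successive pulses, and an FFT over that axis turns tiny phase variations into a velocity estimate. A minimal numpy sketch, with an assumed pulse rate and frame length:

```python
import numpy as np

# Sketch: resolving motion from temporal signal variation (standard
# slow-time Doppler analysis; Soli's actual algorithms are not public).

prf = 2000.0       # pulse repetition frequency, Hz (assumed)
n_pulses = 200     # pulses in one processing frame (assumed)
carrier_hz = 60e9  # Soli operates in the 60 GHz band
c = 3.0e8

# Simulate the complex return from one range bin containing a finger
# moving toward the sensor at 5 cm/s: the motion shows up as a slowly
# rotating phase across pulses, even though range resolution is coarse.
v = 0.05                                     # m/s
f_doppler = 2 * v * carrier_hz / c           # = 20 Hz
t = np.arange(n_pulses) / prf
signal = np.exp(2j * np.pi * f_doppler * t)  # plus noise, in practice

# An FFT over slow time converts the phase variation into a Doppler peak.
spectrum = np.fft.fftshift(np.fft.fft(signal * np.hanning(n_pulses)))
freqs = np.fft.fftshift(np.fft.fftfreq(n_pulses, d=1.0 / prf))
peak_hz = freqs[np.argmax(np.abs(spectrum))]
print(f"estimated velocity: {peak_hz * c / (2 * carrier_hz):.3f} m/s")  # 0.050
```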
Soli gesture recognition
The Soli software’s architecture consists of a generalized gesture recognition pipeline which is hardware agnostic and can work with different types of radar. The pipeline implements several stages of signal abstraction: from the raw radar data to signal transformations, core and general machine learning features, detection and tracking, gesture probabilities, and finally UI tools to interpret gesture controls.
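Concretely, that staged design can be pictured as a chain of small, swappable processing steps, as in the skeleton below. This is a hypothetical sketch for illustration only; none of these class or function names come from the actual Soli SDK.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

# Hypothetical skeleton of a hardware-agnostic gesture pipeline, mirroring
# the abstraction stages the article lists. All names are illustrative.

@dataclass
class Frame:
    raw: List[float]                                  # raw samples from any radar type
    transforms: Dict = field(default_factory=dict)    # e.g. Doppler spectra
    features: Dict = field(default_factory=dict)      # ML-ready features
    gesture_probs: Dict[str, float] = field(default_factory=dict)

Stage = Callable[[Frame], Frame]

def signal_transforms(f: Frame) -> Frame:
    f.transforms["energy"] = sum(x * x for x in f.raw)  # toy transform
    return f

def extract_features(f: Frame) -> Frame:
    f.features["rms_energy"] = f.transforms["energy"] ** 0.5  # toy feature
    return f

def classify_gesture(f: Frame) -> Frame:
    # A real pipeline would run a trained model here; we threshold instead.
    strong = f.features["rms_energy"] > 1.0
    f.gesture_probs = {"finger_tap": 0.9 if strong else 0.1}
    return f

PIPELINE: List[Stage] = [signal_transforms, extract_features, classify_gesture]

def run(frame: Frame) -> Frame:
    for stage in PIPELINE:  # stages are swappable per radar type
        frame = stage(frame)
    return frame

print(run(Frame(raw=[0.5, -1.2, 0.8])).gesture_probs)
```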
The Soli SDK enables developers to easily access and build upon our gesture recognition pipeline. The Soli libraries extract real-time signals from radar hardware, outputting signal transformations, high precision position and motion data, and gesture labels and parameters at frame rates from 100 to 10,000 frames per second.
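Consuming such a pipeline could look something like the loop below. Since the real SDK interface is not public here, every name and parameter in this sketch (SoliLikeSensor, on_frame, frame_rate_hz) is an assumption; only the 100 to 10,000 frames-per-second range comes from the text above.

```python
import time
from typing import Callable

# Hypothetical sketch of consuming a gesture-recognition SDK's output
# stream. The Soli SDK's real interface is not shown here; every name
# below is invented for illustration.

class SoliLikeSensor:
    """Stand-in for a radar SDK handle that emits per-frame gesture data."""
    def __init__(self, frame_rate_hz: int = 1000):
        assert 100 <= frame_rate_hz <= 10_000  # range quoted in the article
        self.period = 1.0 / frame_rate_hz

    def stream(self, on_frame: Callable[[dict], None], n_frames: int) -> None:
        for i in range(n_frames):
            # A real SDK would deliver transforms, position/motion data,
            # and gesture labels; we fabricate one frame for illustration.
            on_frame({"frame": i, "gesture": "micro_tap", "confidence": 0.87})
            time.sleep(self.period)

def handle(frame: dict) -> None:
    if frame["confidence"] > 0.8:
        print(f"frame {frame['frame']}: {frame['gesture']}")

SoliLikeSensor(frame_rate_hz=1000).stream(handle, n_frames=3)
```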
What are the potential applications of Soli?
The Soli chip can be embedded in wearables, phones, computers, cars, and the IoT devices in our environment. Soli has no moving parts, fits onto a chip, and consumes little energy. It is not affected by light conditions, and it works through most materials. The possibilities are vast.
Source: atap.google.com