Google has taken a significant step toward making it possible to control a smart device with hand gestures from afar. The company received federal approval to continue "Project Soli," its miniature radar project from 2015, at greater power levels for improved accuracy. Google's experimental division has been working on the idea for years: using gestures such as rubbing your index finger and thumb together to control a smart speaker or smartwatch instead of touching a screen directly.
With this technology, you may be able to turn music on and off with a flick of your fingertips, or switch on a JBL smart speaker simply by moving your hand closer to it; the speaker's small radar sensors would detect your hand. Project Soli hit a snag when it was first introduced as a prototype, because the radar could not accurately capture user motions and struggled to pick up each action. As a result, consumers can only experiment with a limited number of gestures that any gadget can recognise.
Apple and Google are both building out wearable health and mobile features that let us use our smartwatches and smartphones as the hub of our health and fitness routines. Even though Google's Wear OS lags behind Apple's watchOS and Health app, it is beginning to catch up. Google recently added a few features that make it easier to take a break and keep track of your progress.
The first is home screen widgets for the Google Fit mobile app. It is now possible to place activity progress meters, such as calories burnt and steps taken, directly on your Android phone's home screen. The second is a new Wear OS feature that guides breathing exercises, mirroring the Apple Watch's default Breathe app.
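As a toy illustration of the kind of progress meter such a widget displays, the sketch below turns activity readings into whole-number percentages of a daily goal. All the goal values and readings here are hypothetical examples, not values taken from Google Fit.

```python
# Toy sketch: compute activity progress percentages of the sort a fitness
# home screen widget might display. The goals and readings are made-up
# illustrative numbers, not Google Fit data.

def progress(current: float, goal: float) -> int:
    """Return progress toward a goal as a whole percentage, capped at 100."""
    if goal <= 0:
        raise ValueError("goal must be positive")
    return min(100, round(100 * current / goal))

if __name__ == "__main__":
    steps_pct = progress(current=6500, goal=10000)    # 65% of a 10,000-step goal
    calories_pct = progress(current=450, goal=400)    # over goal, capped at 100
    print(f"Steps: {steps_pct}%  Calories: {calories_pct}%")
```

Capping the value at 100 keeps the display simple once a goal is exceeded, which is a common choice for ring- or bar-style progress meters.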