Project Soli was demonstrated inside a prototype smartwatch this morning on the last day of Google I/O, drawing loud cheers from the developer crowd. Project Soli is Google's attempt to remove the physical interaction we have with tech devices, and I mean that literally.
By miniaturizing radar chips, which is no easy task, Project Soli reacts to air gestures made by a user, then interprets that data into commands for a smartwatch or other devices, such as a Bluetooth speaker. For example, if a user holds their hand up to a smartwatch and makes a pinch-and-twist motion, they can view and scroll through text on the watch.
As of right now, Project Soli is nowhere near ready for consumers, but with Google showing it off at I/O and beginning to get developers interested, it could help the company break through many existing boundaries. One of those boundaries, specific to the Bluetooth speaker implementation, is the vibration produced by the speaker itself. Soli is designed to recognize the slightest gesture movements in the air in front of it, but inside a vibrating box (the speaker), interpreting those gestures becomes quite difficult. Problems like these are what the Project Soli team is working to overcome.
However, the smartwatch demo given during I/O was pretty incredible, and as soon as the session is posted to YouTube, we will update this post with it. It should be required viewing for all. In the meantime, The Verge went hands-on with Project Soli at Google HQ. Check out their video below.