Google Project Soli: Wearable Gesture Tracking
Jay Liew stashed this in Technology
Source: YouTube Video
Stashed in: Google!, Virtual Reality!
On Project Soli:
At a time when most gesture-sensing technology is unreliable and clunky, Project Soli, one of Google's latest cutting-edge experiments from its secretive Advanced Technology and Projects group (ATAP), provides an enticing example of the type of powerful motion controller that could actually change how we interact with everything from smartwatches and tablets to appliances and other everyday objects.
At a basic level, motion controllers are premised on the idea that a user's hands replace traditional input devices like touch screens or mice and keyboards. Rather than touching a physical object — like a display or button — to control a device, you use hand gestures. Using hand gestures, proponents say, makes user interfaces more intuitive and easier to use, and opens up new ways for designers and developers to create better user experiences.
Radar to gestures
Project Soli's gesture tracking takes a distinctive approach: it relies on radar. Radar, which detects objects in motion using high-frequency radio waves, enables what Project Soli's design lead Carsten Schwesig calls a "fundamentally different approach" to motion tracking.
"A typical model of the way you think about radar is like a police radar or baseball where you just have an object and you measure its speed," explains Schwesig.
"But actually we are beaming out a continuos signal that gets reflected by an arm, for example...so you measure the differences between the emitted and the received signal. It's a very complex wave signal and from that we can provide signal processing and machine learning techniques to detect gestures."
Source: http://mashable.com/2015/05/30/google-project-soli-analysis/
9:37 AM Jun 07 2015