Oculus Rift hack transfers your facial expressions onto your virtual avatar
J Thoendell stashed this in Oculus
Researchers at the University of Southern California (with help from Facebook) have devised a system that tracks a user's facial expressions and translates them onto an avatar in the VR world. It works by bolting an off-the-shelf Intel RealSense 3D camera to the front of an Oculus Rift DK2 to capture movements of the lower half of the face. The really clever part, though, is how it captures movements of the upper half of the face, which the headset obviously covers up.
The researchers mounted eight strain gauges inside the foam liner of the Rift and developed software based on the Facial Action Coding System (FACS), often used by animators, to integrate the data from the depth-sensing camera, the strain gauges, and the Rift itself. The result is an eerily accurate representation of the user's facial expressions, down to the smallest of movements. Even better, latency was generally low, with the researchers measuring 3ms for facial feature detection, 5ms for blend shape optimisation, and 3ms for the mapping in software.
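To give a flavour of the integration step, here is a minimal sketch of fusing the two sensor streams into FACS-style blendshape weights. Everything here is an assumption for illustration: the feature counts, the fixed linear calibration matrix `W`, and the normalisation are stand-ins, since the actual system learns a user-specific mapping that is not described in the post.

```python
import numpy as np

# Hypothetical sketch: fuse strain-gauge readings (upper face) and
# depth-camera features (lower face) into blendshape weights via a
# fixed linear calibration. Sizes and the matrix W are illustrative.
N_GAUGES = 8        # strain gauges in the Rift's foam liner
N_DEPTH = 12        # lower-face depth features (assumed count)
N_BLENDSHAPES = 6   # small FACS-inspired blendshape set (assumed)

rng = np.random.default_rng(0)
# Stand-in calibration matrix mapping fused sensors -> blendshapes.
W = rng.random((N_BLENDSHAPES, N_GAUGES + N_DEPTH))

def blendshape_weights(gauges, depth_features):
    """Map a fused sensor vector to blendshape weights clamped to [0, 1]."""
    x = np.concatenate([gauges, depth_features])
    w = W @ x
    return np.clip(w / max(w.max(), 1e-9), 0.0, 1.0)

gauges = rng.random(N_GAUGES)
depth = rng.random(N_DEPTH)
weights = blendshape_weights(gauges, depth)
print(weights)  # one weight per blendshape, driving the avatar's face
```

In the real pipeline this mapping would run once per frame, which is why the measured per-stage latencies (a few milliseconds each) matter so much for keeping the avatar's face in sync.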
Stashed in: Virtual Reality!
Cool but creepy. It's hard to trust this given that Oculus is owned by Facebook.
Does anyone really want Facebook to know our facial expressions?