
Credit: SciFi Lab/Provided
Human facial movements convey emotions and support nonverbal communication, as well as physical activities such as eating and drinking.
Tracking facial movements, and possibly their causes, is one of the proposed uses for NeckFace, one of the first necklace-type sensing technologies. A team led by Cheng Zhang, assistant professor of information science at the Cornell Ann S. Bowers College of Computing and Information Science, developed NeckFace, which can continuously track full facial expressions by using infrared cameras to capture images of the chin and face from below the neck.
Their work is detailed in “NeckFace: Continuous Tracking of Complete Facial Expressions on Neck-Mounted Wearable Devices,” published June 24 in the Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies.
Co-lead authors are Tuochao Chen of Peking University and Yaxuan Li of McGill University, both visiting students in the Smart Computer Interfaces for Future Interactions (SciFi) Laboratory, and Songyun Tao, a Cornell master of professional studies student. Other contributors include HyunChul Lim, Mose Sakashita and Ruidong Zhang, Cornell doctoral students in information science, and François Guimbretière, professor of information science at Cornell Bowers.
NeckFace is the next generation of Zhang’s previous work, which resulted in C-Face, a similar device in headset form. According to Zhang, NeckFace offers significantly improved performance and privacy, and gives users the option of a less intrusive, neck-mounted device.
In addition to possible emotion tracking, Zhang sees many applications for this technology: video conferencing when a front-facing camera is not an option; facial expression detection in virtual reality scenarios; and silent speech recognition.
“The ultimate goal is for the user to be able to track their own behavior by continuously monitoring facial movements,” said Zhang, principal investigator at the SciFi Lab. “And hopefully this can give us a lot of insight into their physical and mental activities.”
Guimbretière said NeckFace also has the potential to transform video conferencing.
“The user would not have to be careful to stay in the field of view of a camera,” he said. “Instead, NeckFace could recreate the perfect head shot as we move around in a classroom, or even go outside to share a walk with a distant friend.”
To test the effectiveness of NeckFace, Zhang and his colleagues conducted a user study with 13 participants, each asked to perform eight facial expressions while seated and eight more while walking. In the seated scenarios, participants were also asked to rotate their heads while making expressions, and to remove and remount the device within the session.
NeckFace was tested in two versions: a neckband wrapped around the neck, with two cameras sitting just below the collarbone; and a necklace with a pendant-like infrared (IR) camera device that hangs below the neck.
The group collected ground-truth facial movement data using the TrueDepth 3D camera on an iPhone X, then compared it with the data collected by NeckFace. Across the sitting, walking and remounting scenarios, the study participants expressed a total of 52 face shapes.
Using a deep learning model, the group found that NeckFace detected facial movements with nearly the same accuracy as direct measurement with the phone’s camera. The neckband proved more accurate than the necklace, the researchers said, possibly because the neckband’s two cameras could capture information from both sides of the face, while the necklace has a single, centrally mounted camera.
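The accuracy comparison above amounts to measuring how far the face shape predicted from the neck-mounted cameras deviates from the shape captured by the phone’s TrueDepth camera. A minimal sketch of one common way to score such predictions, the mean per-landmark Euclidean error, is below; the function name and toy data are illustrative and not taken from the authors’ code.

```python
import numpy as np

def mean_landmark_error(predicted, ground_truth):
    """Mean Euclidean distance between corresponding 3D facial
    landmarks; both inputs have shape (num_landmarks, 3)."""
    predicted = np.asarray(predicted, dtype=float)
    ground_truth = np.asarray(ground_truth, dtype=float)
    # Distance for each landmark, averaged over the whole face.
    return np.linalg.norm(predicted - ground_truth, axis=1).mean()

# Toy example: three landmarks, predictions offset by 1 unit along x.
truth = np.array([[0.0, 0.0, 0.0],
                  [1.0, 2.0, 0.0],
                  [2.0, 1.0, 1.0]])
pred = truth + np.array([1.0, 0.0, 0.0])
print(mean_landmark_error(pred, truth))  # → 1.0
```

A lower score means the reconstructed face shape more closely matches the ground truth, which is how a two-camera neckband could be judged more accurate than a single-camera necklace.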
Zhang said that if the device is optimized, it could be particularly useful in the mental health field for tracking people’s emotions throughout the day. Although people don’t always show their emotions on their faces, he said, the amount of facial movement over time could indicate emotional swings.
“Can we really see how your emotions change throughout the day?” he said. “With this technology, we could have a database of how you’re doing physically and mentally during the day, and that means you could track your own behavior. A doctor could also use the information to support a decision.”
The research is available in the ACM Digital Library.