Johns Hopkins University Applied Physics Laboratory has created a mixed reality system designed to amplify a person’s facial movements, helping observers detect social gestures, emotional signals and other forms of nonverbal communication.
The Mixed Reality Social Prosthesis technology works by overlaying facial imagery with psychophysiological data using Microsoft’s HoloLens holographic platform, APL said Thursday.
The system uses sensors to gather the signal data, which HoloLens then overlays on a subject’s face.
“The result is dramatic accentuation of subtle changes in the face, including changes that people are not usually aware of, like pupil dilation or nostril flare,” said Ariel Greenberg, APL research scientist.
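APL has not published the details of its accentuation pipeline, but the idea of exaggerating subtle changes can be illustrated with a minimal sketch: scale a psychophysiological signal’s deviation from its running baseline by a gain factor, so small shifts (such as a slight pupil dilation) become obvious. The function name, gain, and window size below are illustrative assumptions, not the actual APL method.

```python
# Hypothetical sketch of signal accentuation: amplify deviations of a
# sensor signal (e.g., pupil diameter in mm) from its running baseline.
# The real APL pipeline is not public; gain and window are assumptions.

def accentuate(samples, gain=5.0, window=10):
    """Exaggerate small changes by scaling deviation from a moving average."""
    out = []
    for i, x in enumerate(samples):
        # Moving average over the last `window` samples (plus the current one).
        recent = samples[max(0, i - window):i + 1]
        baseline = sum(recent) / len(recent)
        out.append(baseline + gain * (x - baseline))
    return out

# A subtle pupil-dilation ramp becomes far more pronounced after accentuation.
pupil_mm = [3.00, 3.01, 3.02, 3.05, 3.08, 3.10]
print(accentuate(pupil_mm))
```

In a mixed reality overlay, the accentuated values would then drive a visual cue rendered on the subject’s face rather than a printed list.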
“The Mixed Reality Social Prosthesis could help train officers to recognize and overcome the impact of stress on perception of emotion,” Greenberg added.
APL noted the technology has potential applications in law enforcement, intelligence and healthcare.