Understanding Human Reactions Looking at Facial Microexpressions With an Event Camera

Bibliographic Details
Published in: IEEE Transactions on Industrial Informatics, Vol. 18, No. 12, pp. 9112-9121
Main Authors: Becattini, Federico; Palai, Federico; Del Bimbo, Alberto
Format: Journal Article
Language: English
Published: Piscataway, NJ: IEEE, 01-12-2022
The Institute of Electrical and Electronics Engineers, Inc. (IEEE)
Description
Summary: With the establishment of Industry 4.0, machines are now required to interact with workers. By observing biometrics, they can assess whether humans are authorized, or mentally and physically fit, to work. Understanding body language makes human-machine interaction more natural, secure, and effective. Nonetheless, traditional cameras have limitations: their low frame rate and dynamic range hinder a comprehensive understanding of humans. This poses a challenge, since faces undergo frequent, instantaneous microexpressions. In addition, this is privacy-sensitive information that must be protected. We propose to model expressions with event cameras, bio-inspired vision sensors that have found application within the Industry 4.0 scope. They capture motion at millisecond rates and work under challenging conditions such as low illumination and highly dynamic scenes. Such cameras are also privacy-preserving, making them extremely interesting for industry. We show that, using event cameras, we can understand human reactions by observing facial expressions alone. Comparison with red-green-blue (RGB)-based modeling demonstrates improved effectiveness and robustness.
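To make the sensing model concrete: unlike an RGB camera, an event camera emits an asynchronous stream of (timestamp, x, y, polarity) events, one per pixel whose brightness changes. A common first step before feeding such data to a recognition model is to accumulate events into a frame-like representation. The sketch below is purely illustrative of that general idea (it is not the pipeline used in the paper, and the event values are hypothetical): positive-polarity events add +1 at their pixel and negative-polarity events add -1.

```python
def events_to_frame(events, height, width):
    """Accumulate (t, x, y, polarity) events into a signed 2D frame.

    Illustrative sketch only: each positive-polarity event adds +1
    at its pixel location, each negative-polarity event adds -1.
    """
    frame = [[0] * width for _ in range(height)]
    for _t, x, y, p in events:
        frame[y][x] += 1 if p > 0 else -1
    return frame

# Hypothetical event stream: two brightness increases at pixel (2, 3)
# and one decrease at pixel (5, 1), with timestamps in seconds.
events = [(0.001, 2, 3, 1), (0.002, 2, 3, 1), (0.003, 5, 1, -1)]
frame = events_to_frame(events, height=8, width=8)
print(frame[3][2], frame[1][5])  # prints: 2 -1
```

Because only changing pixels generate events, a static face produces almost no data, while a microexpression yields a dense burst of events at millisecond resolution; this is what makes the representation well suited to the fast facial motions discussed in the abstract.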
ISSN: 1551-3203; 1941-0050
DOI: 10.1109/TII.2022.3195063