A Hybrid Explainable AI Framework Applied to Global and Local Facial Expression Recognition
Published in: 2021 IEEE International Conference on Imaging Systems and Techniques (IST), pp. 1-5
Format: Conference Proceeding
Language: English
Published: IEEE, 24-08-2021
Summary: Facial Expression Recognition (FER) systems have many applications, such as human behavior understanding, human-machine interfaces, video games, and health monitoring. The main advantage of traditional white-box methods is their explainability; however, their recognition accuracy depends entirely on the extracted features. Deep neural networks, on the other hand, achieve higher overall accuracy than traditional methods, but they are black-box models and thus suffer from a lack of reliability and explainability. In this work, we introduce a Hybrid Explainable AI Framework (HEF) composed of a main functional pipeline, a Convolutional Neural Network (CNN) that classifies input images, and an explainable pipeline that uses Facial Action Units and the model-agnostic LIME method to provide additional information explaining the obtained results and reinforcing the decision of the main functional pipeline. The proposed HEF has been validated on the CK+ dataset and shows very promising results in terms of the explainability of the obtained results.
DOI: 10.1109/IST50367.2021.9651357
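The summary pairs a black-box CNN classifier with the model-agnostic LIME method. As a rough illustration of LIME's core idea (not the paper's actual implementation), the sketch below perturbs superpixel-like regions of a toy image, queries a black-box scorer, and fits an interpretable linear surrogate whose weights rank region importance; the 8x8 "face", the toy `classifier`, and all names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def classifier(img):
    # Toy stand-in for a FER CNN (assumption): the "expression score"
    # depends only on a mouth-like patch in the lower part of the image.
    return img[6:8, 2:6].mean()

def apply_mask(base, mask):
    # Grey out the 2x2 superpixels whose mask bit is 0 (4x4 grid on 8x8).
    img = base.copy()
    for k, on in enumerate(mask):
        if not on:
            r, c = divmod(k, 4)
            img[2 * r:2 * r + 2, 2 * c:2 * c + 2] = 0.0
    return img

base = np.ones((8, 8))  # stand-in "face" image

# LIME's core loop: sample random on/off masks and query the black box.
masks = rng.integers(0, 2, size=(500, 16))
scores = np.array([classifier(apply_mask(base, m)) for m in masks])

# Fit an interpretable linear surrogate over the binary masks.
X = np.column_stack([masks, np.ones(len(masks))]).astype(float)
coef, *_ = np.linalg.lstsq(X, scores, rcond=None)
importance = coef[:16]

# The two mouth-region superpixels (indices 13 and 14) dominate.
top_two = sorted(np.argsort(importance)[-2:].tolist())
print(top_two)  # -> [13, 14]
```

In a real pipeline the linear fit would also weight samples by proximity to the original image; libraries such as `lime` package this loop behind `LimeImageExplainer`.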