A hybrid human recognition framework using machine learning and deep neural networks

Bibliographic Details
Published in: PLoS ONE, Vol. 19, No. 6, p. e0300614
Main Authors: Sheneamer, Abdullah M, Halawi, Malik H, Al-Qahtani, Meshari H
Format: Journal Article
Language: English
Published: Public Library of Science (PLoS), United States, 21-06-2024
Description
Summary: Faces are a crucial environmental trigger: they communicate information about several key features, including identity. However, the 2019 coronavirus pandemic (COVID-19) significantly affected how we process faces, because, to prevent viral spread, many governments ordered citizens to wear masks in public. In this research, we focus on identifying individuals from images or videos by comparing facial features and identifying a person's biometrics, while reducing the weaknesses of person-recognition technology, for example when a person does not look directly at the camera, the lighting is poor, or the person has effectively covered their face. Consequently, we propose a hybrid approach, based on deep and machine learning algorithms, that detects a person with or without a mask, a person who covers large parts of their face, and a person based on their gait. The experimental results compare favourably with current face and gait detectors: we achieved between 97% and 100% in face and gait detection, as measured by F1 score, precision, and recall. Compared to a baseline CNN system, our approach achieves extremely high recognition accuracy.
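
As an illustration of the kind of hybrid face-and-gait pipeline described in the summary, the sketch below fuses pre-extracted face and gait feature vectors and trains both a classical classifier (an RBF SVM) and a small neural network on the combined representation, reporting precision, recall, and F1. The feature dimensions, the synthetic data, and the specific models are assumptions made purely for demonstration; they do not reproduce the authors' architecture or results.

# Minimal, illustrative sketch of a hybrid recognition pipeline (not the
# authors' exact method): fuse face and gait features, then compare a
# classical classifier (SVM) against a small neural network.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import precision_score, recall_score, f1_score

rng = np.random.default_rng(0)

# Hypothetical pre-extracted embeddings: 128-D face features (e.g. from a CNN)
# and 64-D gait features (e.g. from silhouette or pose sequences).
n_subjects, samples_per_subject = 10, 40
face_feats = rng.normal(size=(n_subjects * samples_per_subject, 128))
gait_feats = rng.normal(size=(n_subjects * samples_per_subject, 64))
labels = np.repeat(np.arange(n_subjects), samples_per_subject)

# Add a per-subject offset so the synthetic identities are separable.
face_feats += np.repeat(rng.normal(scale=2.0, size=(n_subjects, 128)),
                        samples_per_subject, axis=0)
gait_feats += np.repeat(rng.normal(scale=2.0, size=(n_subjects, 64)),
                        samples_per_subject, axis=0)

# Feature-level fusion: concatenate the two modalities per sample.
X = np.hstack([face_feats, gait_feats])
X_tr, X_te, y_tr, y_te = train_test_split(
    X, labels, test_size=0.25, stratify=labels, random_state=0)

models = {
    "SVM (RBF)": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0)),
    "MLP": make_pipeline(StandardScaler(),
                         MLPClassifier(hidden_layer_sizes=(256, 128),
                                       max_iter=500, random_state=0)),
}

for name, model in models.items():
    model.fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(f"{name}: precision={precision_score(y_te, pred, average='macro'):.3f} "
          f"recall={recall_score(y_te, pred, average='macro'):.3f} "
          f"f1={f1_score(y_te, pred, average='macro'):.3f}")

In a real system, the face embeddings would come from a CNN face recogniser and the gait vectors from a sequence model over silhouettes or keypoints; the fusion-then-classify step shown here is one common way to combine such modalities.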
Current address: Computer Science Department, Jazan University, Jazan, KSA
Competing Interests: The authors have declared that no competing interests exist.
ISSN: 1932-6203
DOI: 10.1371/journal.pone.0300614