An Unobtrusive Human Activity Recognition System Using Low Resolution Thermal Sensors, Machine and Deep Learning

Bibliographic Details
Published in: IEEE Transactions on Biomedical Engineering, Vol. 70, no. 1, pp. 115-124
Main Authors: Rezaei, Ariyamehr; Stevens, Michael C.; Argha, Ahmadreza; Mascheroni, Alessandro; Puiatti, Alessandro; Lovell, Nigel H.
Format: Journal Article
Language: English
Published: United States: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01-01-2023
Description
Summary: Given the aging population, healthcare systems need to be established to deal with health issues such as injurious falls. Wearable devices can be used to detect falls; however, most are obtrusive, and patients generally dislike or may forget to wear them. In this study, we developed an unobtrusive monitoring system using infrared technology to detect locations and recognize human activities such as sitting, standing, walking, lying, and falling. We prototyped a system consisting of two 24×32 thermal array sensors and collected data from healthy young volunteers performing ten different scenarios. A supervised deep learning (DL)-based approach classified activities and detected locations from the images, and its performance was compared with that of machine learning (ML)-based methods. In addition, we fused the data of the two sensors to form a stereo system, which performed better than a single sensor. Furthermore, to detect critical activities such as falling and lying on the floor, we performed a binary classification in which one class comprised falling plus lying on the floor and the other class comprised all remaining activities. Using the DL-based algorithm on the stereo dataset, the overall average accuracy and F1-score for activity recognition were 97.6% and 0.935, respectively; for location detection, 97.3% and 0.927; and for binary classification, 97.9% and 0.945. Our results suggest that the proposed system accurately recognized human activities, detected locations, and detected critical activities, namely falling and lying on the floor.
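The abstract describes fusing the frames from two 24×32 thermal array sensors into a stereo input before classification, but the record does not specify the fusion method. The sketch below shows one simple, common choice (channel stacking) purely as an illustration; the function names and the flatten-for-ML step are hypothetical, not the authors' published method.

```python
# Hypothetical sketch: fuse two low-resolution thermal frames into one
# stereo input for a classifier. Channel stacking is assumed here; the
# paper's actual fusion scheme is not described in this record.

SENSOR_ROWS, SENSOR_COLS = 24, 32  # per the 24x32 thermal array sensors


def fuse_stereo_frames(frame_a, frame_b):
    """Stack two 24x32 frames into a 2-channel tensor (nested lists)."""
    for frame in (frame_a, frame_b):
        assert len(frame) == SENSOR_ROWS
        assert all(len(row) == SENSOR_COLS for row in frame)
    return [frame_a, frame_b]  # shape: (2, 24, 32)


def flatten_for_ml(stereo):
    """Flatten the fused tensor into a single feature vector, as a
    classical ML classifier might consume it (2 * 24 * 32 = 1536 values)."""
    return [px for channel in stereo for row in channel for px in row]


# Example with dummy temperature readings (degrees Celsius)
frame_a = [[22.0] * SENSOR_COLS for _ in range(SENSOR_ROWS)]
frame_b = [[23.5] * SENSOR_COLS for _ in range(SENSOR_ROWS)]
stereo = fuse_stereo_frames(frame_a, frame_b)
features = flatten_for_ml(stereo)
print(len(features))  # 1536
```

A DL pipeline would instead feed the (2, 24, 32) stack directly to a convolutional network, treating the two sensors as input channels; the flattened vector is only relevant for the ML baselines the abstract mentions.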
ISSN: 0018-9294, 1558-2531
DOI: 10.1109/TBME.2022.3186313