Low-Cost CO Sensor Calibration Using One Dimensional Convolutional Neural Network


Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 23, No. 2, p. 854
Main Authors: Ali, Sharafat, Alam, Fakhrul, Arif, Khalid Mahmood, Potgieter, Johan
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG, 11-01-2023
Description
Summary: The advent of cost-effective sensors and the rise of the Internet of Things (IoT) present the opportunity to monitor urban pollution at a high spatio-temporal resolution. However, these sensors suffer from poor accuracy that can be improved through calibration. In this paper, we propose to use One Dimensional Convolutional Neural Network (1DCNN) based calibration for low-cost carbon monoxide sensors and benchmark its performance against several Machine Learning (ML) based calibration techniques. We make use of three large datasets collected by research groups around the world from field-deployed low-cost sensors co-located with accurate reference sensors. Our investigation shows that 1DCNN performs consistently across all datasets. Gradient boosting regression, another ML technique that has not been widely explored for gas sensor calibration, also performs reasonably well. For all datasets, the introduction of temperature and relative humidity data improves the calibration accuracy. Cross-sensitivity to other pollutants can be exploited to improve the accuracy further. This suggests that low-cost sensors should be deployed as a suite or an array to measure covariate factors.
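The calibration idea described in the abstract — learning a mapping from windows of raw sensor readings plus temperature and relative humidity to a co-located reference signal — can be illustrated with a minimal sketch. The data here is entirely synthetic and hypothetical (the paper's real datasets are not reproduced), and a simple least-squares fit over sliding windows stands in for the learned 1DCNN filters; it is an illustration of the windowed-covariate setup, not the authors' implementation.

```python
import numpy as np

# Hypothetical synthetic stand-in for a co-located deployment:
# a reference CO signal, temperature, relative humidity, and a raw
# low-cost sensor reading that drifts with the covariates.
rng = np.random.default_rng(0)
n = 500
temp = 20 + 5 * np.sin(np.linspace(0, 6, n))          # degrees C
rh = 50 + 10 * np.cos(np.linspace(0, 6, n))           # percent RH
ref_co = 1.0 + 0.5 * np.sin(np.linspace(0, 12, n))    # "reference" ppm
raw = (ref_co + 0.05 * (temp - 20) - 0.02 * (rh - 50)
       + 0.05 * rng.standard_normal(n))               # drifting raw reading

# Sliding windows over all three channels: each training sample is a
# short time window of (raw, temp, rh) -- the same windowed view a
# 1D convolutional model operates on.
win = 8
X = np.stack([
    np.concatenate([raw[i:i + win], temp[i:i + win], rh[i:i + win]])
    for i in range(n - win)
])
y = ref_co[win:]  # predict the reference value at the window's end

# Least-squares linear calibration over the windowed features
# (a simple stand-in for the learned convolution filters).
A = np.hstack([X, np.ones((len(X), 1))])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef

rmse_raw = np.sqrt(np.mean((raw[win:] - y) ** 2))   # uncalibrated error
rmse_cal = np.sqrt(np.mean((pred - y) ** 2))        # calibrated error
print(f"RMSE raw: {rmse_raw:.3f}, calibrated: {rmse_cal:.3f}")
```

Because the synthetic drift is a function of temperature and humidity, including those channels in the window lets even this linear fit recover most of the reference signal, mirroring the abstract's finding that adding temperature and relative humidity improves calibration accuracy.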
ISSN: 1424-8220
DOI: 10.3390/s23020854