Out-of-Distribution Detection for Deep Neural Networks With Isolation Forest and Local Outlier Factor

Bibliographic Details
Published in: IEEE Access, Vol. 9, pp. 132980–132989
Main Authors: Luan, Siyu, Gu, Zonghua, Freidovich, Leonid B., Jiang, Lili, Zhao, Qingling
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 2021
Description
Summary: Deep Neural Networks (DNNs) are extensively deployed in today's safety-critical autonomous systems thanks to their excellent performance. However, they are known to make mistakes unpredictably: a DNN used for perception may misclassify an object, and one used for planning and control may issue unsafe control commands. One common cause of such unpredictable mistakes is Out-of-Distribution (OOD) input samples, i.e., samples that fall outside the distribution of the training dataset. We present a framework for OOD detection based on outlier detection in one or more hidden layers of a DNN, with a runtime monitor based on either Isolation Forest (IF) or Local Outlier Factor (LOF). Performance evaluation indicates that LOF is a promising method both in terms of the machine-learning metrics of precision, recall, F1 score, and accuracy, and in terms of computational efficiency during testing.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2021.3108451
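
The summary above describes the approach only at a high level. As an illustration (not the authors' implementation), the following minimal Python sketch shows how such a runtime monitor could be assembled from scikit-learn's IsolationForest and LocalOutlierFactor, assuming the activations of a monitored hidden layer have already been extracted into feature matrices; the arrays below are synthetic placeholders standing in for those activations.

    import numpy as np
    from sklearn.ensemble import IsolationForest
    from sklearn.neighbors import LocalOutlierFactor

    # Synthetic stand-ins for hidden-layer activations: rows are samples,
    # columns are the units of the monitored hidden layer.
    rng = np.random.default_rng(0)
    train_features = rng.normal(0.0, 1.0, size=(1000, 64))  # in-distribution activations
    test_features = rng.normal(3.0, 1.0, size=(10, 64))     # shifted, likely OOD

    # Fit one runtime monitor per detector on the training activations.
    # LocalOutlierFactor needs novelty=True to allow predict() on unseen data.
    if_monitor = IsolationForest(n_estimators=100, random_state=0).fit(train_features)
    lof_monitor = LocalOutlierFactor(n_neighbors=20, novelty=True).fit(train_features)

    # predict() returns +1 for inliers (in-distribution) and -1 for outliers (OOD).
    print(if_monitor.predict(test_features))
    print(lof_monitor.predict(test_features))

Fitting each detector on in-distribution activations and then calling predict() on activations seen at runtime mirrors the fit-then-monitor pattern the abstract describes; which hidden layer(s) to monitor and the detector hyperparameters shown here are illustrative choices, not values taken from the paper.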