Uncertainty Quantification in Deep Learning Framework for Mallampati Classification
Published in: 2024 IEEE 12th International Conference on Healthcare Informatics (ICHI), pp. 626-631
Main Authors:
Format: Conference Proceeding
Language: English
Published: IEEE, 03-06-2024
Summary: The Mallampati classification is an indicator used to predict whether a patient has a crowded airway. The scale comprises four classes of increasing airway crowding severity, which multiple studies have linked to obstructive sleep apnea. Conventionally, the Mallampati score is assigned manually by an expert in the clinic, but the same assessment can be made from an image of a person's oral cavity. This study therefore develops a deep learning framework that performs Mallampati classification from oral cavity images. The proposed framework constructs a loss function that combines aleatoric and epistemic uncertainty principles to improve the reliability of the predictions of a ConvNeXt image-classification model. Experiments on a dataset of 262 subjects acquired from the sleep lab at the All India Institute of Medical Sciences Bhopal, India, show that the proposed framework outperforms the state of the art on common evaluation metrics. Notably, the uncertainty principle itself proves effective: every state-of-the-art model was evaluated both with and without it.
ISSN: 2575-2634
DOI: 10.1109/ICHI61247.2024.00100
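The record's summary does not give the paper's exact loss formulation. A minimal sketch, assuming the widely used Kendall-and-Gal-style attenuated cross-entropy for aleatoric uncertainty (logits corrupted by a learned noise scale) and Monte-Carlo-dropout predictive entropy for epistemic uncertainty, might look like the following; the function names and the `n_samples` parameter are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the last axis.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def aleatoric_ce_loss(logits, log_var, labels, n_samples=50):
    """Attenuated cross-entropy (hypothetical sketch): corrupt the logits
    with learned per-class Gaussian noise exp(0.5 * log_var) and average
    the likelihood of the true class over the noise samples."""
    sigma = np.exp(0.5 * log_var)
    total = np.zeros(len(labels))
    for _ in range(n_samples):
        noisy = logits + sigma * rng.standard_normal(logits.shape)
        probs = softmax(noisy)
        total += probs[np.arange(len(labels)), labels]
    mean_likelihood = total / n_samples
    return float(-np.log(mean_likelihood + 1e-12).mean())

def epistemic_entropy(mc_probs):
    """Predictive entropy of the mean prediction over T stochastic
    (MC-dropout) forward passes; mc_probs has shape (T, batch, classes)."""
    mean_p = mc_probs.mean(axis=0)
    return -(mean_p * np.log(mean_p + 1e-12)).sum(axis=-1)

# Illustrative combination for 4 Mallampati classes, batch of 2:
logits = np.zeros((2, 4))           # maximally uncertain logits
log_var = np.zeros((2, 4))          # learned noise scale (here sigma = 1)
labels = np.array([0, 3])
mc_probs = softmax(rng.standard_normal((10, 2, 4)))  # 10 dropout passes

total_loss = aleatoric_ce_loss(logits, log_var, labels) \
    + 0.1 * float(epistemic_entropy(mc_probs).mean())  # 0.1 is an assumed weight
```

In training, `log_var` would be a second head of the ConvNeXt backbone and the epistemic term would come from keeping dropout active at inference; both choices here are assumptions for the sketch, not details from the paper.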