An Inception-Residual-Based Architecture with Multi-Objective Loss for Detecting Respiratory Anomalies
Format: | Journal Article |
---|---|
Language: | English |
Published: | 07-03-2023 |
Summary: | This paper presents a deep learning system for detecting anomalies in
respiratory sound recordings. The system begins with audio feature extraction
using Gammatone and Continuous Wavelet transforms, which convert the respiratory
sound input into two-dimensional spectrograms presenting both spectral and
temporal features. The proposed system then integrates Inception-residual-based
backbone models with multi-head attention and a multi-objective loss to classify
respiratory anomalies. Instead of simply concatenating the results from the
various spectrograms, we propose a linear combination that regulates the
contribution of each individual spectrogram equally throughout the training
process. To evaluate performance, we conducted experiments on the SPRSound
benchmark dataset (The Open-Source SJTU Paediatric Respiratory Sound) proposed
by the IEEE BioCAS 2022 challenge. Regarding the Score, computed as the average
of the average score and the harmonic score, our proposed system achieved
improvements of 9.7%, 15.8%, 17.8%, and 16.1% over the challenge baseline
system in Task 1-1, Task 1-2, Task 2-1, and Task 2-2, respectively. Notably, we
achieved the Top-1 performance in Task 2-1 and Task 2-2 with the highest Scores
of 74.5% and 53.9%, respectively. |
---|---|
DOI: | 10.48550/arxiv.2303.04104 |
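The abstract's key design choice is fusing the per-spectrogram branches by a learnable linear combination rather than simple concatenation. A minimal sketch of that idea, assuming (hypothetically, since the record gives no implementation details) that each branch produces a logit vector and that the branch weights are softmax-normalized so their contributions stay balanced and sum to one:

```python
import math

def softmax(weights):
    # Normalize learnable branch weights so they are positive and sum to 1
    m = max(weights)
    exps = [math.exp(w - m) for w in weights]
    total = sum(exps)
    return [e / total for e in exps]

def combine(branch_logits, weights):
    # branch_logits: one logit vector per spectrogram branch (equal lengths)
    # weights: one learnable scalar per branch
    alphas = softmax(weights)
    n = len(branch_logits[0])
    return [sum(a * logits[i] for a, logits in zip(alphas, branch_logits))
            for i in range(n)]

gamma_logits = [2.0, 0.5, -1.0]  # hypothetical Gammatone-branch logits
cwt_logits = [1.0, 1.5, -0.5]    # hypothetical CWT-branch logits
fused = combine([gamma_logits, cwt_logits], [0.0, 0.0])
print(fused)  # equal weights give the element-wise mean: [1.5, 1.0, -0.75]
```

With equal initial weights the fusion reduces to an element-wise mean; during training the scalars would be updated so that the more informative spectrogram branch can earn a larger share of the combination.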
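The abstract describes the challenge Score as the average of an "average score" and a "harmonic score". One plausible instantiation (an assumption here, not stated in this record) takes those as the arithmetic and harmonic means of sensitivity and specificity:

```python
def challenge_score(sensitivity, specificity):
    # Average score: arithmetic mean of sensitivity (SE) and specificity (SP)
    avg_score = (sensitivity + specificity) / 2
    # Harmonic score: harmonic mean of SE and SP (assumed definition)
    harm_score = 2 * sensitivity * specificity / (sensitivity + specificity)
    # Final Score: average of the two, per the abstract's description
    return (avg_score + harm_score) / 2

print(round(challenge_score(0.8, 0.6), 4))  # → 0.6929
```

Averaging in the harmonic mean penalizes imbalanced systems: a classifier with SE = 0.8 and SP = 0.6 scores below one with SE = SP = 0.7, even though both have the same arithmetic mean.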