Leveraging machine learning and similarity judgements to identify perceptually relevant acoustic features of small, unmanned aircraft system signal similarity
Published in: The Journal of the Acoustical Society of America, Vol. 154, No. 4_supplement, p. A145
Format: Journal Article
Language: English
Published: 01-10-2023
Summary: Auditory signals can be described quantitatively by a set of measurable acoustic features (e.g., zero crossing rate, attack slope) or qualitatively with adjectives such as whooping, thunderous, or melodic, or in comparative terms such as different or louder. Listeners can rate the similarity of signals and assign qualitative descriptions relatively easily; however, most lack the ability to articulate the quantitative basis of these judgments. Because qualitative differences between signals typically correlate with measurable differences in acoustic features, signal similarity ratings can be used to recover the acoustic features that define signal similarity. In the present study, subjects were presented with pairs of signals consisting of either two different small, unmanned aircraft systems (SUAS) or an SUAS and a non-SUAS source, and were asked to rate their similarity on a scale from non-similar to highly similar. Using these similarity ratings along with acoustic difference features, machine learning algorithms were trained to predict the human responses. These algorithms predict the position of a withheld SUAS signal within the similarity feature space. The acoustic difference features most important to prediction are extracted from the algorithms via feature importance and sensitivity analysis techniques. The extracted acoustic difference features can be interpreted as prominent information shaping human perception of signal similarity.
ISSN: 0001-4966, 1520-8524
DOI: 10.1121/10.0023069
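
The abstract describes a pipeline in which pairwise acoustic difference features are used to train a model that predicts listener similarity ratings, and feature-importance analysis then identifies the perceptually relevant difference features. The sketch below illustrates that general idea only; it is not the authors' implementation. It assumes librosa for feature extraction and a scikit-learn random forest with permutation importance, and the feature set, model choice, and function names are illustrative.

```python
# Hypothetical sketch: predict pairwise similarity ratings from acoustic
# difference features, then rank features by importance.
# Assumes librosa and scikit-learn; the feature set and model are
# illustrative, not the method reported in the abstract.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Summary features computed per signal (means over frames).
FEATURES = {
    "zero_crossing_rate": lambda y, sr: librosa.feature.zero_crossing_rate(y).mean(),
    "spectral_centroid": lambda y, sr: librosa.feature.spectral_centroid(y=y, sr=sr).mean(),
    "spectral_flatness": lambda y, sr: librosa.feature.spectral_flatness(y=y).mean(),
    "rms": lambda y, sr: librosa.feature.rms(y=y).mean(),
}

def difference_features(path_a, path_b):
    """Absolute differences of summary acoustic features for a signal pair."""
    per_signal = []
    for path in (path_a, path_b):
        y, sr = librosa.load(path, sr=None, mono=True)
        per_signal.append(np.array([f(y, sr) for f in FEATURES.values()]))
    return np.abs(per_signal[0] - per_signal[1])

def fit_and_rank(pairs, ratings):
    """Fit a regressor on difference features and rank features by importance.

    pairs   -- list of (path_a, path_b) audio file pairs presented to listeners
    ratings -- corresponding mean similarity ratings (e.g., scaled 0..1)
    """
    X = np.vstack([difference_features(a, b) for a, b in pairs])
    y = np.asarray(ratings, dtype=float)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
    model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
    # Permutation importance: features whose shuffling most degrades prediction
    # are candidates for perceptually relevant acoustic difference features.
    imp = permutation_importance(model, X_te, y_te, n_repeats=20, random_state=0)
    order = np.argsort(imp.importances_mean)[::-1]
    names = list(FEATURES)
    return model, [(names[i], float(imp.importances_mean[i])) for i in order]
```

A listening-test dataset with one row per presented pair would feed `fit_and_rank`; the abstract additionally mentions sensitivity analysis and predicting the position of a withheld SUAS signal in the similarity feature space, which this sketch omits.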