Measurement of speech signal patterns under borderline mental disorders

Bibliographic Details
Published in: 2017 21st Conference of Open Innovations Association (FRUCT) Vol. 562; no. 21; pp. 26-33
Main Authors: Alimuradov, Alan, Tychkov, Alexander, Kuzmin, Andrey, Churakov, Pyotr, Ageykin, Alexey, Vishnevskaya, Galina
Format: Conference Proceeding; Journal Article
Language: English
Published: FRUCT 01-11-2017
Description
Summary: An algorithm for pitch frequency measurement in systems that detect patterns of borderline mental disorders is developed. The essence of the algorithm is the decomposition of a speech signal into frequency components using an adaptive method for analyzing non-stationary signals, improved complete ensemble empirical mode decomposition with adaptive noise (ICEEMDAN), followed by isolation of the component containing the pitch. A block diagram of the developed algorithm and a detailed mathematical description are presented. A study of the algorithm is conducted using a verified signal base of healthy subjects and of male and female patients with psychogenic disorders, aged 18 to 60. The results are evaluated against known algorithms for pitch frequency measurement based on the autocorrelation function and its modifications, the robust algorithm for pitch tracking (RAPT), and sawtooth waveform inspired pitch estimation (SWIPE). According to the results of the study, the developed algorithm improves the accuracy of detecting borderline mental disorders: on average, it is more accurate by 10.7% with respect to the first-kind error and by 4.7% with respect to the second-kind error.
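The autocorrelation-based baseline that the paper compares against can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the synthetic 120 Hz test frame, and the 75-400 Hz search band are all assumptions chosen for the example; the paper's own method additionally performs ICEEMDAN decomposition before measuring pitch, which is out of scope here.

```python
import numpy as np

def acf_pitch(frame, fs, fmin=75.0, fmax=400.0):
    """Estimate the pitch (F0) of a voiced frame from the peak of its
    autocorrelation function within a plausible pitch-period range.
    fmin/fmax are assumed search bounds, not values from the paper."""
    frame = frame - frame.mean()
    # one-sided autocorrelation (lag >= 0)
    acf = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lag_min = int(fs / fmax)   # shortest plausible pitch period, in samples
    lag_max = int(fs / fmin)   # longest plausible pitch period, in samples
    lag = lag_min + np.argmax(acf[lag_min:lag_max])
    return fs / lag

fs = 16000
t = np.arange(int(0.04 * fs)) / fs   # one 40 ms analysis frame
# synthetic voiced frame: 120 Hz fundamental plus two weaker harmonics
x = (1.0 * np.sin(2 * np.pi * 120 * t)
     + 0.5 * np.sin(2 * np.pi * 240 * t)
     + 0.3 * np.sin(2 * np.pi * 360 * t))
print(acf_pitch(x, fs))  # expect an estimate close to the 120 Hz fundamental
```

Restricting the lag search to the pitch-period band keeps the estimator from locking onto harmonics; modified ACF methods and RAPT/SWIPE refine exactly this step, which is where the paper reports its accuracy comparison.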
ISSN:2305-7254
2343-0737
DOI:10.23919/FRUCT.2017.8250161