Development of Low Bit Rate Speech Encoder based on Vector Quantization and Compressive Sensing

Bibliographic Details
Published in: Journal of Applied Sciences (Asian Network for Scientific Information), Vol. 13, No. 1, p. 49
Main Authors: Kassim, L A S, Gunawan, T S, Khalifa, O O, Kartiwi, M, Sulong, A, Abdullah, K, Hasbullah, N F
Format: Journal Article
Language: English
Published: 2013
Description
Summary: Speech coding is the representation of a digitized speech signal using as few bits as possible while maintaining a reasonable level of speech quality. Due to the growing need for bandwidth conservation in wireless communication, research in speech coding has increased. Recently, Compressive Sensing (CS) has gained considerable interest because of its ability to recover the original signal from only a few measurements; it is a new approach that departs from conventional data acquisition methods. In this research, a new speech encoding system is developed using compressive sensing. Since CS performs well on sparse signals, different sparsifying transforms are analyzed and compared using the Gini coefficient. The quality of the speech coder is evaluated using the Perceptual Evaluation of Speech Quality (PESQ), Signal-to-Noise Ratio (SNR) and subjective listening tests. Results show that the speech coder achieves a PESQ score of 3.16 at 4 kbps, a good quality level that is confirmed by the listening tests. Furthermore, the coder is also compared with a Code Excited Linear Prediction (CELP) coder.
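
To make the pipeline in the summary concrete, the sketch below encodes a synthetic frame with random projections, measures its sparsity with the Gini coefficient, and reconstructs the frame from the compressed measurements. It is only an illustration of the general CS idea, not the authors' method: the DCT sparsifying basis, Gaussian measurement matrix, Orthogonal Matching Pursuit solver, and all sizes are assumptions, not taken from the paper.

    # Minimal CS encode/decode sketch (assumed details: DCT basis, Gaussian
    # measurement matrix, OMP recovery, illustrative frame/measurement sizes).
    import numpy as np
    from scipy.fft import dct, idct
    from sklearn.linear_model import OrthogonalMatchingPursuit

    def gini(x):
        """Gini coefficient as a sparsity measure (near 0 = dense, near 1 = sparse)."""
        c = np.sort(np.abs(x))
        n = len(c)
        k = np.arange(1, n + 1)
        return 1.0 - 2.0 * np.sum((c / c.sum()) * (n - k + 0.5) / n)

    rng = np.random.default_rng(0)
    n, m, s = 256, 64, 8                          # frame length, measurements, nonzeros

    # Synthetic frame that is s-sparse in the DCT domain (stand-in for a speech frame)
    coeffs = np.zeros(n)
    coeffs[rng.choice(n, s, replace=False)] = rng.standard_normal(s)
    x = idct(coeffs, norm="ortho")

    # Encoder: take m << n random projections of the frame
    Phi = rng.standard_normal((m, n)) / np.sqrt(m)
    y = Phi @ x

    # Decoder: recover sparse DCT coefficients from y, then synthesize the frame
    Psi = idct(np.eye(n), axis=0, norm="ortho")   # DCT synthesis basis
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=s, fit_intercept=False)
    omp.fit(Phi @ Psi, y)
    x_hat = Psi @ omp.coef_

    snr_db = 10 * np.log10(np.sum(x**2) / np.sum((x - x_hat) ** 2))
    print(f"Gini of DCT coefficients: {gini(dct(x, norm='ortho')):.2f}")
    print(f"Reconstruction SNR: {snr_db:.1f} dB")

In a coder, the quantized measurements y would be transmitted instead of the full frame, which is where the bit-rate saving comes from; the choice of sparsifying transform (scored here by the Gini coefficient) determines how well the decoder can recover the frame.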
Bibliography:ObjectType-Article-2
SourceType-Scholarly Journals-1
ObjectType-Feature-1
content type line 23
ObjectType-Article-1
ObjectType-Feature-2
ISSN: 1812-5654
1812-5662
DOI: 10.3923/jas.2013.49.59