MIFS-ND: A mutual information-based feature selection method

Bibliographic Details
Published in: Expert Systems with Applications, Vol. 41, No. 14, pp. 6371–6385
Main Authors: Hoque, N., Bhattacharyya, D.K., Kalita, J.K.
Format: Journal Article
Language:English
Published: Amsterdam: Elsevier Ltd, 15-10-2014
Description
Summary:
• We propose a greedy feature selection method using mutual information theory.
• The method uses feature–class and feature–feature mutual information.
• We use the NSGA-II method to select an optimal feature subset.
• The accuracy of the proposed method is evaluated using multiple classifiers.

Feature selection is used to choose a subset of relevant features for effective classification of data. In high-dimensional data classification, the performance of a classifier often depends on the feature subset used for classification. In this paper, we introduce a greedy feature selection method using mutual information. The method combines feature–feature and feature–class mutual information to find an optimal subset of features that minimizes redundancy and maximizes relevance among features. The effectiveness of the selected feature subset is evaluated using multiple classifiers on multiple datasets. On twelve real-life datasets of varied dimensionality and numbers of instances, our method performs significantly well, in terms of both classification accuracy and execution time, when compared with several competing feature selection techniques.
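As a rough illustration of the relevance-versus-redundancy idea the summary describes, the sketch below greedily picks features whose feature–class mutual information is high while their average feature–feature mutual information with already-selected features is low. The scoring rule, helper names, and discrete MI estimator are this sketch's own assumptions, not the paper's exact MIFS-ND procedure (which additionally employs NSGA-II):

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in nats for two discrete 1-D arrays."""
    mi = 0.0
    for xv in np.unique(x):
        px = np.mean(x == xv)
        for yv in np.unique(y):
            py = np.mean(y == yv)
            pxy = np.mean((x == xv) & (y == yv))
            if pxy > 0:
                mi += pxy * np.log(pxy / (px * py))
    return mi

def greedy_mi_selection(X, y, k):
    """Greedily select k columns of X, trading feature-class relevance
    against average redundancy with the features chosen so far."""
    n_features = X.shape[1]
    relevance = [mutual_information(X[:, j], y) for j in range(n_features)]
    # Start with the single most relevant feature.
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            # Average redundancy with already-selected features.
            redundancy = np.mean([mutual_information(X[:, j], X[:, s])
                                  for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected
```

For example, if column 1 is an exact copy of column 0, the redundancy penalty steers the second pick toward a non-redundant informative column instead of the duplicate.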
ISSN: 0957-4174, 1873-6793
DOI: 10.1016/j.eswa.2014.04.019