Improving Harris hawks optimization algorithm for hyperparameters estimation and feature selection in v‐support vector regression based on opposition‐based learning


Bibliographic Details
Published in: Journal of Chemometrics, Vol. 34, no. 11
Main Authors: Ismael, Omar Mohammed, Qasim, Omar Saber, Algamal, Zakariya Yahya
Format: Journal Article
Language: English
Published: Chichester: Wiley Subscription Services, Inc., 01-11-2020
Description
Summary: Many real problems have been solved by support vector regression, especially v‐support vector regression (v‐SVR), but its hyperparameters usually need to be tuned. In addition, v‐SVR cannot perform feature selection. Nature‐inspired algorithms have been used both for feature selection and as hyperparameter estimation procedures. In this paper, the opposition‐based learning Harris hawks optimization algorithm (HHOA‐OBL) is proposed to optimize the hyperparameters of v‐SVR while embedding feature selection simultaneously. The experimental results over four datasets show that HHOA‐OBL outperforms the standard Harris hawks optimization algorithm, grid search, and cross‐validation methods in terms of prediction accuracy, number of selected features, and running time. Moreover, the results confirm the efficiency of the proposed algorithm in improving prediction performance and computational time compared with other nature‐inspired algorithms, demonstrating the ability of HHOA‐OBL to search for the best hyperparameter values and to select the most informative features for prediction tasks. Thus, the experiments and comparisons support the applicability of the proposed approach to other real applications.
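The opposition‐based learning (OBL) idea named in the abstract can be sketched briefly: each random candidate solution x in bounds [lb, ub] is mirrored to its opposite lb + ub − x, both are evaluated, and the fitter of each pair is kept. The sketch below is a minimal illustration, not the paper's implementation: the bounds for a hypothetical (C, nu, gamma) v‐SVR hyperparameter vector and the placeholder `fitness` function are assumptions; in the paper the objective would be the cross‐validated v‐SVR prediction error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed search bounds for a (C, nu, gamma) hyperparameter vector.
lb = np.array([0.1, 0.01, 1e-4])
ub = np.array([100.0, 0.99, 1.0])

def opposite(x, lb, ub):
    # OBL: the opposite of x within [lb, ub] is lb + ub - x.
    return lb + ub - x

def fitness(x):
    # Placeholder objective (assumption): squared distance to an arbitrary
    # "good" point; a real run would use cross-validated v-SVR error.
    target = np.array([10.0, 0.5, 0.1])
    return np.sum((x - target) ** 2)

# Random initial population and its OBL mirror.
pop = lb + rng.random((10, 3)) * (ub - lb)
opp = opposite(pop, lb, ub)

# Keep the fitter individual from each (candidate, opposite) pair.
pop_fit = np.apply_along_axis(fitness, 1, pop)
opp_fit = np.apply_along_axis(fitness, 1, opp)
merged = np.where((pop_fit <= opp_fit)[:, None], pop, opp)
```

This opposition step is typically applied at initialization (and optionally per iteration) so the hawks start closer to promising regions of the hyperparameter space than a purely random population would.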
ISSN:0886-9383
1099-128X
DOI:10.1002/cem.3311