Optimizing model-agnostic random subspace ensembles
Published in: Machine Learning, Vol. 113, No. 2, pp. 993-1042
Main Authors:
Format: Journal Article; Web Resource
Language: English
Published: New York: Springer US (Springer Nature B.V.), 01-02-2024
Summary: This paper presents a model-agnostic ensemble approach for supervised learning. The proposed approach is based on a parametric version of Random Subspace (PRS), in which each base model is learned from a feature subset sampled according to a Bernoulli distribution. Parameter optimization is performed using gradient descent and is rendered tractable by an importance sampling approach that circumvents frequent re-training of the base models after each gradient descent step. The degree of randomization in our parametric Random Subspace is thus automatically tuned through the optimization of the feature selection probabilities. This is an advantage over the standard Random Subspace approach, where the degree of randomization is controlled by a hyper-parameter. Furthermore, the optimized feature selection probabilities can be interpreted as feature importance scores. Our algorithm can also easily incorporate any differentiable regularization term to impose constraints on these importance scores. We show the good performance of the proposed approach, both in terms of prediction and feature ranking, on simulated and real-world datasets. We also show that PRS can be successfully used for the reconstruction of gene regulatory networks.
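To make the mechanism described in the summary concrete, the sketch below illustrates only the Bernoulli feature-subset sampling and ensemble averaging, with the selection probabilities held fixed. It is a minimal sketch, not the authors' implementation: the class name, the decision-tree base learner, and all hyper-parameters are assumptions for illustration, and the paper's gradient-descent optimization of the probabilities via importance sampling is deliberately omitted.

```python
# Illustrative sketch of the parametric Random Subspace (PRS) sampling idea.
# Assumptions: scikit-learn decision trees as base models, fixed selection
# probabilities (the paper optimizes them by gradient descent with importance
# sampling, which is not reproduced here).
import numpy as np
from sklearn.tree import DecisionTreeRegressor

class ParametricRandomSubspaceSketch:
    def __init__(self, n_estimators=50, init_prob=0.5, random_state=0):
        self.n_estimators = n_estimators
        self.init_prob = init_prob
        self.rng = np.random.default_rng(random_state)

    def fit(self, X, y):
        n_features = X.shape[1]
        # Per-feature Bernoulli selection probabilities (fixed in this sketch).
        self.probs_ = np.full(n_features, self.init_prob)
        self.masks_, self.models_ = [], []
        for _ in range(self.n_estimators):
            # Draw a Bernoulli mask; ensure at least one feature is kept.
            mask = self.rng.random(n_features) < self.probs_
            if not mask.any():
                mask[self.rng.integers(n_features)] = True
            # Each base model sees only its own feature subset.
            model = DecisionTreeRegressor(random_state=0).fit(X[:, mask], y)
            self.masks_.append(mask)
            self.models_.append(model)
        return self

    def predict(self, X):
        # Ensemble prediction: average the base-model predictions.
        preds = [m.predict(X[:, mask]) for m, mask in zip(self.models_, self.masks_)]
        return np.mean(preds, axis=0)

# Toy usage on synthetic data.
X = np.random.default_rng(1).normal(size=(200, 10))
y = X[:, 0] + 0.5 * X[:, 1] + 0.1 * np.random.default_rng(2).normal(size=200)
ensemble = ParametricRandomSubspaceSketch().fit(X[:150], y[:150])
print(ensemble.predict(X[150:]).shape)
```

In the full method, the optimized probabilities would additionally serve as feature importance scores; here they remain at their initial value and the sketch only reproduces the randomized-subspace ensemble itself.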
Bibliography: scopus-id:2-s2.0-85176091212
ISSN: 0885-6125; 1573-0565
DOI: 10.1007/s10994-023-06427-5