In machine learning, the random subspace method,[1] also called attribute bagging[2] or feature bagging, is an ensemble learning method that attempts to reduce the correlation between estimators in an ensemble by training them on random samples of features instead of the entire feature set.
In ensemble learning one tries to combine the models produced by several learners into an ensemble that performs better than the original learners. One way of combining learners is bootstrap aggregating, or bagging, which shows each learner a randomly sampled subset of the training points so that the learners will produce different models that can be sensibly averaged.[3] In bagging, one samples training points with replacement from the full training set.
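For concreteness, here is a minimal Python sketch of the bootstrap-sampling step; the toy arrays X_train and y_train are illustrative, not from the article:

```python
# A minimal sketch of the bootstrap-sampling step in bagging.
# The toy arrays X_train and y_train are illustrative only.
import numpy as np

rng = np.random.default_rng(seed=0)
X_train = rng.normal(size=(100, 5))            # 100 points, 5 features (toy data)
y_train = rng.integers(0, 2, size=100)         # binary labels (toy data)

N = X_train.shape[0]
idx = rng.choice(N, size=N, replace=True)      # draw N points with replacement
X_boot, y_boot = X_train[idx], y_train[idx]    # one learner's bootstrap sample
```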
The random subspace method is similar to bagging except that the features (“attributes”, “predictors”, “independent variables”) are randomly sampled, with replacement, for each learner. Informally, this keeps individual learners from over-focusing on features that appear highly predictive/descriptive in the training set but fail to be as predictive for points outside that set. For this reason, random subspaces are an attractive choice for high-dimensional problems where the number of features is much larger than the number of training points, such as learning from fMRI data[4] or gene expression data.[5]
The random subspace method has been used for decision trees; when combined with “ordinary” bagging of decision trees, the resulting models are called random forests.[6] It has also been applied to linear classifiers,[7] support vector machines,[8] nearest neighbours[9][10] and other types of classifiers. The method is also applicable to one-class classifiers.[11][12] It has further been applied to the portfolio selection problem,[13][14][15][16] where it has been shown to outperform the conventional resampled portfolio, which is essentially based on bagging.
To tackle high-dimensional sparse problems, a framework named Random Subspace Ensemble (RaSE) was developed.[17] RaSE combines weak learners trained in random subspaces with a two-layer structure and an iterative process.[18] RaSE has been shown to enjoy appealing theoretical guarantees and practical performance.[17]
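Very roughly, the two-layer idea can be sketched as follows; the base learner (a shallow decision tree) and the selection criterion (training accuracy) are stand-ins chosen for brevity rather than the paper's choices, and the iterative reweighting of the feature-sampling distribution described in [17] is omitted:

```python
# A heavily simplified sketch of RaSE's two-layer structure: for each of B1
# weak learners, B2 candidate subspaces are drawn and the best one is kept.
# Stand-ins, not the paper's choices: a shallow decision tree as the weak
# learner and training accuracy as the selection criterion; the iterative
# reweighting of the feature-sampling distribution is omitted.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def rase_like_ensemble(X, y, B1=20, B2=10, d=5, seed=0):
    rng = np.random.default_rng(seed)
    N, D = X.shape
    ensemble = []
    for _ in range(B1):                              # outer layer: weak learners
        best = None
        for _ in range(B2):                          # inner layer: candidate subspaces
            feats = rng.choice(D, size=min(d, D), replace=False)
            model = DecisionTreeClassifier(max_depth=3, random_state=0)
            score = model.fit(X[:, feats], y).score(X[:, feats], y)
            if best is None or score > best[0]:
                best = (score, feats, model)
        ensemble.append(best[1:])                    # keep the best subspace's model
    return ensemble                                  # combine by majority vote
```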
An ensemble of models employing the random subspace method can be constructed using the following algorithm:
- Let the number of training points be N and the number of features in the training data be D.
- Let L be the number of individual models in the ensemble.
- For each individual model l, choose n_l (n_l < N) to be the number of input points and d_l (d_l < D) to be the number of features for l. It is common to use a single value of n_l, and a single value of d_l, for all the individual models.
- For each individual model l, create a training set by choosing n_l of the N input points and d_l of the D features (the features sampled with replacement), and train the model on it.
Now, to apply the ensemble model to an unseen point, combine the outputs of the L individual models by majority voting or by combining the posterior probabilities; a minimal sketch of the whole procedure follows.
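The following Python sketch is one way to realize this algorithm, assuming scikit-learn's DecisionTreeClassifier as the base learner and non-negative integer class labels; single values of n_l and d_l are used for all models, as is common. It is an illustration, not a reference implementation:

```python
# A sketch of the random subspace ensemble above. Assumptions: scikit-learn's
# DecisionTreeClassifier as the base learner and non-negative integer class
# labels (the bincount-based majority vote relies on this).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def train_ensemble(X, y, L=25, n=None, d=None, seed=0):
    """Train L models, each on n input points and d features."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    n = n if n is not None else N               # n_l: input points per model
    d = d if d is not None else max(1, D // 2)  # d_l: features per model
    ensemble = []
    for _ in range(L):
        idx = rng.choice(N, size=n, replace=True)    # sample input points
        feats = rng.choice(D, size=d, replace=True)  # sample features, with replacement
        model = DecisionTreeClassifier(random_state=0)
        model.fit(X[np.ix_(idx, feats)], y[idx])     # train on the subspace
        ensemble.append((feats, model))
    return ensemble

def predict(ensemble, X):
    """Combine the L outputs on unseen points by majority voting."""
    votes = np.stack([m.predict(X[:, feats]) for feats, m in ensemble]).astype(int)
    return np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
```

Averaging the models' predict_proba outputs instead of voting would give the posterior-combination variant.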
Footnotes
1. Ho, Tin Kam (1998). “The Random Subspace Method for Constructing Decision Forests” (PDF). IEEE Transactions on Pattern Analysis and Machine Intelligence. 20 (8): 832–844. doi:10.1109/34.709601. S2CID 206420153. Archived from the original (PDF) on 2019-05-14.
2. Bryll, R. (2003). “Attribute bagging: improving accuracy of classifier ensembles by using random feature subsets”. Pattern Recognition. 36 (6): 1291–1302. doi:10.1016/s0031-3203(02)00121-8.
3. If each learner follows the same deterministic algorithm, the models produced are necessarily all the same.
4. Kuncheva, Ludmila; et al. (2010). “Random Subspace Ensembles for fMRI Classification” (PDF). IEEE Transactions on Medical Imaging. 29 (2): 531–542. CiteSeerX 10.1.1.157.1178. doi:10.1109/TMI.2009.2037756. PMID 20129853.
5. Bertoni, Alberto; Folgieri, Raffaella; Valentini, Giorgio (2005). “Bio-molecular cancer prediction with random subspace ensembles of support vector machines” (PDF). Neurocomputing. 63: 535–539. doi:10.1016/j.neucom.2004.07.007. hdl:2434/9370.
6. Ho, Tin Kam (1995). Random Decision Forest (PDF). Proceedings of the 3rd International Conference on Document Analysis and Recognition, Montreal, QC, 14–16 August 1995. pp. 278–282.
7. Skurichina, Marina (2002). “Bagging, boosting and the random subspace method for linear classifiers”. Pattern Analysis and Applications. 5 (2): 121–135. doi:10.1007/s100440200011.
8. Tao, D. (2006). “Asymmetric bagging and random subspace for support vector machines-based relevance feedback in image retrieval” (PDF). IEEE Transactions on Pattern Analysis and Machine Intelligence. 28 (7): 1088–1099. doi:10.1109/tpami.2006.134. PMID 16792098.
9. Ho, Tin Kam (1998). “Nearest neighbors in random subspaces”. Advances in Pattern Recognition. Lecture Notes in Computer Science. Vol. 1451. pp. 640–648. doi:10.1007/BFb0033288. ISBN 978-3-540-64858-1.
10. Tremblay, G. (2004). Optimizing Nearest Neighbour in Random Subspaces using a Multi-Objective Genetic Algorithm (PDF). 17th International Conference on Pattern Recognition. pp. 208–211. doi:10.1109/ICPR.2004.1334060. ISBN 978-0-7695-2128-2.
11. Nanni, L. (2006). “Experimental comparison of one-class classifiers for online signature verification”. Neurocomputing. 69 (7): 869–873. doi:10.1016/j.neucom.2005.06.007.
12. Cheplygina, Veronika; Tax, David M. J. (2011-06-15). “Pruned Random Subspace Method for One-Class Classifiers”. In Sansone, Carlo; Kittler, Josef; Roli, Fabio (eds.). Multiple Classifier Systems. Lecture Notes in Computer Science. Vol. 6713. Springer Berlin Heidelberg. pp. 96–105. doi:10.1007/978-3-642-21557-5_12. ISBN 9783642215568.
13. Varadi, David (2013). “Random Subspace Optimization (RSO)”. CSS Analytics.
14. Gillen, Ben (2016). “Subset Optimization for Asset Allocation”. CaltechAUTHORS.
15. Shen, Weiwei; Wang, Jun (2017). “Portfolio Selection via Subset Resampling”. Proceedings of the AAAI Conference on Artificial Intelligence (AAAI 2017).
16. Shen, Weiwei; Wang, Bin; Pu, Jian; Wang, Jun (2019). “The Kelly growth optimal portfolio with ensemble learning”. Proceedings of the AAAI Conference on Artificial Intelligence (AAAI 2019). 33: 1134–1141. doi:10.1609/aaai.v33i01.33011134.
17. Tian, Ye; Feng, Yang (2021). “RaSE: Random Subspace Ensemble Classification”. Journal of Machine Learning Research. 22 (45): 1–93. ISSN 1533-7928.
18. Tian, Ye; Feng, Yang (2021). “R Package ‘RaSEn’: Random Subspace Ensemble Classification and Variable Screening”. CRAN.