Recursive Feature Elimination (RFE)
- Notes on Feature Preprocessing: The What, the Why, and the How | Matthew Mayo - KDnuggets
- Automated Machine Learning (AML) - AutoML
- Principal Component Analysis (PCA)
- Datasets
- Batch Norm(alization) & Standardization
- Data Preprocessing
- Hyperparameters
- Data Augmentation
- Visualization
- Feature Exploration/Learning
- Recursive Feature Elimination (RFE) for Feature Selection in Python | Jason Brownlee - Machine Learning Mastery
- Feature Selection in Python — Recursive Feature Elimination | Dario Radečić - Towards Data Science
…recursive feature elimination (RFE, Guyon et al. (2002)) is basically a backward selection of the predictors. This technique begins by building a model on the entire set of predictors and computing an importance score for each predictor. The least important predictor(s) are then removed, the model is re-built, and importance scores are computed again. In practice, the analyst specifies the number of predictor subsets to evaluate as well as each subset’s size. Therefore, the subset size is a tuning parameter for RFE. The subset size that optimizes the performance criteria is used to select the predictors based on the importance rankings. The optimal subset is then used to train the final model.
- Feature Engineering and Selection: A Practical Approach for Predictive Models | Max Kuhn and Kjell Johnson
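The procedure described in the quote can be sketched with scikit-learn, whose `RFECV` class treats the subset size as a tuning parameter exactly as Kuhn and Johnson describe: it removes the least important feature(s) each round and cross-validates every subset size. This is a minimal sketch on synthetic data, assuming scikit-learn is installed; the estimator and scoring choices are illustrative, not prescribed by the source.

```python
# Sketch of recursive feature elimination with cross-validation (RFECV).
# Assumes scikit-learn is available; dataset and estimator are illustrative.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFECV
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold

# Synthetic data: 20 predictors, only 5 of which are informative.
X, y = make_classification(
    n_samples=500, n_features=20, n_informative=5, random_state=0
)

# Build a model on all predictors, score their importance (here, the
# logistic-regression coefficients), drop the least important one per
# iteration (step=1), and cross-validate each subset size.
selector = RFECV(
    estimator=LogisticRegression(max_iter=1000),
    step=1,
    cv=StratifiedKFold(5),
    scoring="accuracy",
)
selector.fit(X, y)

print("Optimal number of features:", selector.n_features_)
print("Selected feature mask:", selector.support_)
print("Feature rankings (1 = selected):", selector.ranking_)
```

For a fixed subset size instead of a cross-validated one, `sklearn.feature_selection.RFE` with `n_features_to_select` applies the same elimination loop without the tuning step.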