Dimensional Reduction
Revision as of 06:47, 4 February 2019
Youtube search... [http://www.google.com/search?q=Dimensional+Reduction+Algorithm+Dimension+machine+learning+ML ...Google search]
Algorithms:
* [[Principal Component Analysis (PCA)]]
* [[Pooling / Sub-sampling: Max, Mean]]
* [[Kernel Approximation]]
* [[Isomap]]
* [[Local Linear Embedding (LLE)]]
* [[t-Distributed Stochastic Neighbor Embedding (t-SNE)]]
* [[Softmax]]
* [http://en.wikipedia.org/wiki/Canonical_correlation Canonical Correlation Analysis (CCA)]
* [http://en.wikipedia.org/wiki/Independent_component_analysis Independent Component Analysis (ICA)]
* [http://en.wikipedia.org/wiki/Linear_discriminant_analysis Linear Discriminant Analysis (LDA)]
* [http://en.wikipedia.org/wiki/Multidimensional_scaling Multidimensional Scaling (MDS)]
* [http://en.wikipedia.org/wiki/Non-negative_matrix_factorization Non-Negative Matrix Factorization (NMF)]
* [http://en.wikipedia.org/wiki/Partial_least_squares_regression Partial Least Squares Regression (PLSR)]
* [http://en.wikipedia.org/wiki/Principal_component_regression Principal Component Regression (PCR)]
* [http://en.wikipedia.org/wiki/Projection_pursuit Projection Pursuit]
* [https://en.wikipedia.org/wiki/Sammon_mapping Sammon Mapping/Projection]
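As a rough illustration of the first algorithm in the list above, here is a minimal PCA sketch in NumPy (the random data and the choice of two output components are assumptions for the example, not part of this page): center the data, then project it onto the top principal directions obtained from the singular value decomposition.

```python
import numpy as np

# Synthetic example data: 100 samples with 10 features (assumed for illustration)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))

# Center the data, then find the principal directions via SVD
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)

# Project onto the top-2 principal components: shape (100, 2)
X_reduced = Xc @ Vt[:2].T
```

The rows of `Vt` are the principal directions sorted by explained variance, so the first output column always carries at least as much variance as the second.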
Related:
* [[(Deep) Convolutional Neural Network (DCNN/CNN)]]
* [http://en.wikipedia.org/wiki/Factor_analysis Factor analysis]
* [http://en.wikipedia.org/wiki/Feature_extraction Feature extraction]
* [http://en.wikipedia.org/wiki/Feature_selection Feature selection]
* [http://files.knime.com/sites/default/files/inline-images/knime_seventechniquesdatadimreduction.pdf Seven Techniques for Dimensionality Reduction | KNIME]
* [http://en.wikipedia.org/wiki/Nonlinear_dimensionality_reduction#Locally-linear_embedding Nonlinear dimensionality reduction | Wikipedia]
Some datasets contain so many variables that they become very hard to handle. Data collection now often happens at a very detailed level because resources are plentiful, so a dataset may contain thousands of variables, many of them unnecessary. In such cases it is almost impossible to identify by hand the variables that have the most impact on a prediction. Dimensional Reduction Algorithms are used in these situations; they can make use of other algorithms, such as Random Forest and Decision Tree, to identify the most important variables. 10 Machine Learning Algorithms You need to Know | Sidath Asir @ Medium
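A minimal sketch of the idea in the paragraph above, using a random forest's impurity-based importances to rank variables (this assumes scikit-learn is available; the synthetic dataset, where only the first two of ten features drive the label, is invented for the example):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Synthetic data (assumed): only features 0 and 1 actually determine the label
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Fit a random forest, then rank features by impurity-based importance
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
ranking = np.argsort(forest.feature_importances_)[::-1]
```

The top-ranked indices in `ranking` point at the variables the forest relied on most, which is one practical way to decide which columns to keep before further modeling.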