Dimensional Reduction

* [[Kernel Approximation]]
* [[Isomap]]
* [[Local Linear Embedding (LLE)]]
* [[t-Distributed Stochastic Neighbor Embedding (t-SNE)]]
* [[Softmax]]
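As a minimal illustration of one of the methods linked above, the sketch below embeds a high-dimensional dataset into two dimensions with t-SNE. It is a sketch under assumptions, not part of the original article: it assumes scikit-learn is available and uses its bundled digits dataset (64 features per sample) purely as example input.

```python
# Hedged sketch: t-SNE dimensional reduction with scikit-learn (assumed installed).
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE

# 8x8 digit images flattened to 64-dimensional feature vectors.
X, _ = load_digits(return_X_y=True)

# Map every 64-D sample to a 2-D point; perplexity is a tunable choice here.
emb = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(X)

print("Original shape:", X.shape)
print("Embedded shape:", emb.shape)
```

The 2-D output is typically scatter-plotted to inspect cluster structure that is invisible in the raw 64-dimensional space.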

Revision as of 18:22, 7 January 2019


Some datasets contain so many variables that they become very hard to handle, especially now that systems collect data at a very detailed level thanks to abundant resources. Such datasets may hold thousands of variables, many of them unnecessary, making it almost impossible to identify the variables that have the most impact on a prediction. Dimensional reduction algorithms are used in these situations; they utilize other algorithms, such as Random Forest and Decision Tree, to identify the most important variables. 10 Machine Learning Algorithms You need to Know | Sidath Asir @ Medium
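The paragraph above describes reducing dimensionality by letting a tree ensemble rank variables by importance. A minimal sketch of that idea, assuming scikit-learn and a synthetic dataset in place of real data (the feature counts and the choice of keeping the top 5 are illustrative, not from the article):

```python
# Hedged sketch: using a Random Forest's feature importances to keep
# only the most impactful variables (scikit-learn assumed available).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic data: 20 variables, only a handful genuinely informative.
X, y = make_classification(n_samples=500, n_features=20,
                           n_informative=4, n_redundant=2,
                           random_state=0)

forest = RandomForestClassifier(n_estimators=100, random_state=0)
forest.fit(X, y)

# Rank variables by importance and keep the top 5 (illustrative cutoff).
top = np.argsort(forest.feature_importances_)[::-1][:5]
X_reduced = X[:, top]

print("Selected feature indices:", sorted(top.tolist()))
print("Reduced shape:", X_reduced.shape)
```

A Decision Tree's `feature_importances_` could be used the same way; the forest simply averages importances over many trees, which makes the ranking more stable.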