Difference between revisions of "Isomap"

[http://www.youtube.com/results?search_query=Kernel+Approximation YouTube search...] [http://www.google.com/search?q=Kernel+Approximation+machine+learning+ML ...Google search]
 
* [[Principal Component Analysis (PCA)]]
* [[T-Distributed Stochastic Neighbor Embedding (t-SNE)]]
* [[Isomap]]
* [[Local Linear Embedding]]
* [[Kernel Approximation]]
* [http://en.wikipedia.org/wiki/Kernel_method Kernel method | Wikipedia]
* [http://en.wikipedia.org/wiki/Isomap Isomap | Wikipedia]
* [http://staff.ustc.edu.cn/~cheneh/paper_pdf/2017/Chu-Guan-Neurocomputing.pdf Efficient karaoke song recommendation via multiple kernel learning approximation | C. Guan, Y. Fu, X. Lu, E. Chen, X. Li, and H. Xiong]
* [http://science.sciencemag.org/content/295/5552/7 The Isomap Algorithm and Topological Stability | M. Balasubramanian, E. Schwartz, J. Tenenbaum, V. de Silva, and J. Langford]
Isomap is a nonlinear dimensionality reduction method, one of several widely used low-dimensional embedding methods.[1] It computes a quasi-isometric, low-dimensional embedding of a set of high-dimensional data points. The algorithm provides a simple method for estimating the intrinsic geometry of a data manifold based on a rough estimate of each data point's neighbors on the manifold. Isomap is highly efficient and generally applicable to a broad range of data sources and dimensionalities.
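As a minimal sketch of the above, using scikit-learn's <code>Isomap</code> estimator on a synthetic Swiss-roll dataset (the dataset and all parameter values here are illustrative choices, not part of the original description):

```python
# Illustrative sketch: unrolling a Swiss roll with Isomap.
# Dataset and parameter values are assumptions for demonstration.
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

# 1000 points sampled from a 3-D Swiss roll (an intrinsically 2-D manifold).
X, color = make_swiss_roll(n_samples=1000, random_state=0)

# Isomap pipeline: k-nearest-neighbor graph -> geodesic (shortest-path)
# distances along the graph -> classical MDS embedding in 2 dimensions.
embedding = Isomap(n_neighbors=10, n_components=2)
X_2d = embedding.fit_transform(X)

print(X.shape)     # (1000, 3)
print(X_2d.shape)  # (1000, 2)
```

The geodesic-distance step is what lets Isomap "unroll" the manifold where linear methods such as PCA cannot.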
  
The word "kernel" is used in mathematics to denote a weighting function for a weighted sum or integral.
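To illustrate that sense of "kernel" as a weighting function, here is a small NumPy sketch (a hypothetical example, not from the original text) of a Gaussian-kernel weighted average used to smooth noisy samples:

```python
# Hypothetical sketch: a Gaussian kernel as the weighting function
# in a weighted sum (kernel smoothing of noisy samples).
import numpy as np

def gaussian_kernel(u, bandwidth=0.5):
    """Weight that decays smoothly with distance u."""
    return np.exp(-0.5 * (u / bandwidth) ** 2)

x = np.linspace(0, 2 * np.pi, 50)
rng = np.random.default_rng(0)
y = np.sin(x) + rng.normal(scale=0.2, size=x.size)

# Smoothed estimate at a query point x0: kernel-weighted sum of samples.
x0 = np.pi
w = gaussian_kernel(x - x0)
y_smooth = np.sum(w * y) / np.sum(w)

print(round(y_smooth, 3))  # near sin(pi) = 0, despite the noise
```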
http://science.sciencemag.org/content/sci/295/5552/7/F1.medium.gif
  
Kernel approximation provides functions that approximate the feature mappings that correspond to certain kernels, as they are used for example in [[Support Vector Machine (SVM)]]. The feature functions perform non-linear transformations of the input, which can serve as a basis for linear classification or other algorithms. The advantage of using approximate explicit feature maps compared to the kernel trick, which makes use of feature maps implicitly, is that explicit mappings can be better suited for online learning and can significantly reduce the cost of learning with very large datasets. Standard kernelized [[Support Vector Machine (SVM)]]s do not scale well to large datasets, but using an approximate kernel map it is possible to use much more efficient linear [[Support Vector Machine (SVM)]]s. [http://scikit-learn.org/stable/modules/kernel_approximation.html Kernel Approximation | Scikit-Learn]
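A minimal sketch of this idea, assuming scikit-learn's <code>RBFSampler</code> (random Fourier features approximating the RBF kernel's feature map) feeding a linear SVM; the dataset and parameter values are illustrative:

```python
# Illustrative sketch: approximate RBF-kernel feature map + linear SVM,
# as an explicit-feature-map alternative to an exact kernelized SVM.
# Dataset and parameter values are assumptions for demonstration.
from sklearn.datasets import make_circles
from sklearn.kernel_approximation import RBFSampler
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Concentric circles: not linearly separable in the original 2-D space.
X, y = make_circles(n_samples=500, factor=0.3, noise=0.05, random_state=0)

# Explicit approximate feature map, then an efficient linear classifier.
clf = make_pipeline(
    RBFSampler(gamma=2.0, n_components=300, random_state=0),
    LinearSVC(),
)
clf.fit(X, y)
acc = clf.score(X, y)
print(round(acc, 3))
```

In the lifted feature space the linear SVM separates the circles well, at a training cost that scales with the number of random features rather than with the square of the dataset size.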
 
  
 
<youtube>LVwOB_9UEr4</youtube>
http://upload.wikimedia.org/wikipedia/commons/thumb/c/cc/Kernel_trick_idea.svg/750px-Kernel_trick_idea.svg.png
<youtube>sfndt7YYKUw</youtube>
 
<youtube>RPjPLlGefzw</youtube>
 
 
<youtube>mTyT-oHoivA</youtube>
 

Revision as of 18:01, 7 January 2019
