Support Vector Machine (SVM)

Revision as of 12:01, 12 July 2019 by BPeat (talk | contribs)


Support vector machines (SVMs) find optimal decision boundaries for classification problems. Classically they could only handle linearly separable data; say, deciding which images show Garfield and which show Snoopy, with no other outcome possible. Training an SVM can be thought of as plotting all the data (Garfields and Snoopys) as points on a 2D graph and drawing a line between the data points, so that all Snoopys fall on one side and all Garfields on the other. The algorithm then shifts this line to the optimal position: the one that maximises the margin between the line and the nearest data points on both sides. Classifying new data is then simply a matter of plotting a point on the graph and checking which side of the line it lands on (the Snoopy side or the Garfield side). Using the kernel trick, SVMs can also separate data that is not linearly separable: the data is implicitly mapped into a higher-dimensional space (imagine lifting the 2D points into a 3D plot) where a linear boundary does exist. Combined with a multi-class strategy, this lets an SVM distinguish between Snoopy, Garfield AND Simon's cat, or even more cartoon characters. SVMs are not always considered neural networks. Cortes, Corinna, and Vladimir Vapnik. "Support-vector networks." Machine Learning 20.3 (1995): 273-297.
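The kernel trick described above can be sketched in a few lines. This is a hedged illustration, not part of the original article: it assumes scikit-learn is available and uses its `make_circles` toy dataset (two concentric rings, which no straight line can separate) to contrast a linear kernel with an RBF kernel.

```python
# Sketch (assumes scikit-learn): the kernel trick lets an SVM separate
# data that no straight line can split in the original 2D space.
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Two concentric rings of points: not linearly separable in 2D.
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

linear_svm = SVC(kernel="linear").fit(X, y)
rbf_svm = SVC(kernel="rbf").fit(X, y)  # implicit higher-dimensional mapping

print(f"linear kernel accuracy: {linear_svm.score(X, y):.2f}")
print(f"RBF kernel accuracy:    {rbf_svm.score(X, y):.2f}")
```

The linear kernel is stuck near chance on this data, while the RBF kernel separates the rings almost perfectly, because the implicit mapping makes the classes linearly separable in the lifted space.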

Note: SVMs represent a special case of the Radial Basis Function Network (RBFN).

[Image: svm.png]

The algorithm separates the data points with a line chosen to lie as far as possible from the nearest data points of the two categories. - 10 Machine Learning Algorithms You need to Know | Sidath Asir @ Medium
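The maximum-margin property can be checked numerically. The snippet below is a hedged sketch (it assumes scikit-learn and uses a made-up set of six 2D points): for a linear SVM, the margin width equals 2 / ||w||, where w is the learned weight vector, and the nearest points become the support vectors.

```python
# Sketch (assumes scikit-learn): fit a hard-margin linear SVM on two
# small, linearly separable clusters and measure the margin 2 / ||w||.
import numpy as np
from sklearn.svm import SVC

# Two linearly separable clusters in 2D (illustrative, made-up points).
X = np.array([[1, 1], [2, 1], [1, 2], [4, 4], [5, 4], [4, 5]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])

clf = SVC(kernel="linear", C=1e6).fit(X, y)  # very large C ~ hard margin
w = clf.coef_[0]
margin = 2.0 / np.linalg.norm(w)             # distance between the margins

print("support vectors:\n", clf.support_vectors_)
print(f"margin width: {margin:.3f}")
```

Only the points closest to the boundary show up as support vectors; moving any other point (without crossing the margin) leaves the fitted line unchanged, which is what makes SVMs memory-efficient at prediction time.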

Two-Class Support Vector Machine


Two-Class Locally Deep Support Vector Machine (SVM)


Support vector machines (SVMs) find the boundary that separates classes by as wide a margin as possible. When the two classes can't be cleanly separated, the algorithm finds the best boundary it can. As implemented in Machine Learning, the two-class SVM does this with a straight line only (in SVM-speak, it uses a linear kernel). Because it makes this linear approximation, it runs fairly quickly. Where it really shines is with feature-rich data, such as text or genomic data. In these cases SVMs can separate classes more quickly and with less overfitting than most other algorithms, while requiring only a modest amount of memory. - Dinesh Chandrasekar
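The text-classification use case above can be sketched with a linear SVM over sparse tf-idf features. This is a hedged illustration under assumptions not in the source: it uses scikit-learn's `TfidfVectorizer` and `LinearSVC`, and the four-document corpus and its "pets"/"finance" labels are made up for the example.

```python
# Sketch (assumes scikit-learn; corpus and labels are invented):
# a linear SVM on high-dimensional sparse tf-idf features.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

texts = [
    "the cat sat on the mat",
    "cats purr and nap all day",
    "stock prices rose sharply",
    "markets rallied on earnings news",
]
labels = ["pets", "pets", "finance", "finance"]

# tf-idf turns each document into a sparse vector with one dimension
# per vocabulary word; LinearSVC finds a max-margin line in that space.
model = make_pipeline(TfidfVectorizer(), LinearSVC())
model.fit(texts, labels)

print(model.predict(["my cat naps on the mat"]))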