Support Vector Regression (SVR)
[http://www.youtube.com/results?search_query=Support+Vector+Regression+SVR YouTube search...]
[http://www.google.com/search?q=Support+Vector+Regression+SVR+machine+learning+ML ...Google search]
- AI Solver
- Capabilities
- Support Vector Machine (SVM)
- Support vector regression for real-time flood stage forecasting | Pao-Shan Yu, Shien-Tsung Chen and I-Fan Chang
Recently, several studies on calibrating traffic flow models have been undertaken using support vector regression (SVR). To me, the method seems somewhat counter-intuitive. Rather than the sum of the squared errors, it minimizes the sum of the squared coefficients (the squared norm of the weight vector) of the model function. However, this quantity seems to have nothing to do with the fit quality itself. The fit quality enters only indirectly, in the form of constraints, and deviations smaller than the tolerance are not penalized at all. Furthermore, you obtain a sort of black-box model that is lengthy to write down explicitly and cannot be understood intuitively. Under which circumstances should SVR nevertheless be preferred to an ordinary least-squares (LSE) minimization? - Martin Treiber
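The contrast the question describes can be seen directly in code. Below is a minimal sketch, assuming scikit-learn and NumPy are available, that fits both an ordinary least-squares regression and an epsilon-insensitive SVR to the same noisy linear data; the epsilon parameter sets the tube inside which deviations carry no penalty, and only points outside that tube become support vectors. The data, parameter values, and variable names are illustrative, not taken from the studies mentioned above.

<syntaxhighlight lang="python">
# Minimal sketch (assumptions: scikit-learn and NumPy installed; toy data only).
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = np.linspace(0, 10, 50).reshape(-1, 1)            # single input feature
y = 2.0 * X.ravel() + 1.0 + rng.normal(0, 0.3, 50)   # noisy linear target

# Ordinary least squares: directly minimizes the sum of squared residuals.
ols = LinearRegression().fit(X, y)

# SVR: minimizes the squared coefficients (||w||^2) subject to the constraint
# that residuals stay within the epsilon tube; deviations smaller than epsilon
# are not penalized at all, larger ones enter via slack variables weighted by C.
svr = SVR(kernel="linear", C=10.0, epsilon=0.5).fit(X, y)

print("OLS slope / intercept:", ols.coef_[0], ols.intercept_)
print("SVR slope / intercept:", svr.coef_[0][0], svr.intercept_[0])

# Only the points on or outside the epsilon tube become support vectors
# and influence the SVR solution.
print("Number of support vectors:", len(svr.support_))
</syntaxhighlight>

In this sketch, widening epsilon shrinks the set of support vectors and leaves small deviations unpenalized, while lowering C makes keeping the coefficients small more important than fitting the remaining points, which is the trade-off the question is asking about.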