Support Vector Regression (SVR)
Latest revision as of 21:51, 5 March 2024
- AI Solver ... Algorithms ... Administration ... Model Search ... Discriminative vs. Generative ... Train, Validate, and Test
- Regression Analysis
- Support Vector Machine (SVM)
- Math for Intelligence ... Finding Paul Revere ... Social Network Analysis (SNA) ... Dot Product ... Kernel Trick
- Support vector regression for real-time flood stage forecasting | Pao-Shan Yu, Shien-Tsung Chen and I-Fan Chang
Recently, several studies on calibrating traffic flow models have been undertaken using support vector regression (SVR). To me, the method seems somewhat counter-intuitive. Rather than the sum of the squared errors, it minimizes the sum of the squared coefficients of the model function (the squared norm of the weight vector), which by itself has nothing to do with fit quality. The fit quality enters only indirectly, in the form of constraints, and small deviations are not penalized at all. Furthermore, one obtains a sort of black-box model that is lengthy to write down explicitly and cannot be understood intuitively. Under which circumstances should SVR nevertheless be preferred to an ordinary least-squares (LSE) minimization? - Martin Treiber
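The trade-off raised in the question can be seen directly in code. Below is a minimal sketch, assuming scikit-learn and NumPy are available, contrasting epsilon-insensitive SVR with ordinary least squares on the same noisy data. The data, parameter values (C, epsilon), and the RBF kernel choice are illustrative assumptions, not taken from any of the studies mentioned above.

```python
# Sketch: epsilon-insensitive SVR vs. ordinary least squares.
# Assumes scikit-learn and NumPy; all data and parameters are illustrative.
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 5.0, 40)).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(0.0, 0.1, 40)
y[::8] += 2.0  # inject a few large outliers

# SVR minimizes ||w||^2 plus C times the slack variables; residuals
# smaller than epsilon fall inside the "tube" and carry no penalty at
# all -- exactly the property described as counter-intuitive above.
svr = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)

# Ordinary least squares penalizes every residual quadratically, so
# the outliers pull the fit much harder.
ols = LinearRegression().fit(X, y)

# Only the points outside the epsilon tube become support vectors;
# the model is determined by this (usually sparse) subset alone.
print("support vectors:", len(svr.support_), "of", len(X))
```

One practical answer to the question follows from the code: because the quadratic LSE penalty is unbounded, a single gross outlier can dominate the fit, whereas SVR's linear slack penalty (and the flatness term on the coefficients) makes the fit more robust and better regularized, at the cost of interpretability.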