Support Vector Regression (SVR)

 
[http://www.google.com/search?q=Support+Vector+Regression+SVR+machine+learning+ML ...Google search]
 
* [[AI Solver]] ... [[Algorithms]] ... [[Algorithm Administration|Administration]] ... [[Model Search]] ... [[Discriminative vs. Generative]] ... [[Train, Validate, and Test]]
 
** [[...predict values]]
 
* [[Regression]] Analysis



Recently, several studies on calibrating traffic flow models have been undertaken using support vector regression (SVR). To me, the method seems somewhat counter-intuitive: rather than the sum of squared errors, it minimizes the sum of squared prefactors (weights) of the model function, which by itself has nothing to do with the fit quality. The fit quality enters only indirectly, in the form of constraints, and small deviations are not penalized at all. Furthermore, you obtain a sort of black-box model that is lengthy to write down explicitly and cannot be understood intuitively. Under which circumstances should SVR nevertheless be preferred to an ordinary least-squares (LSE) minimization? - Martin Treiber
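
The question above contrasts the SVR objective with ordinary least squares: epsilon-SVR minimizes the squared weight norm (the "prefactors") plus a penalty on slack variables, and residuals smaller than epsilon are not penalized at all, whereas least squares minimizes the squared residuals directly. A minimal sketch of that contrast, assuming scikit-learn is available (the data, kernel, and parameter values are illustrative only, not taken from the studies mentioned):

<syntaxhighlight lang="python">
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR

# Toy 1-D regression data with mild noise (illustrative only)
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 5.0, 40)).reshape(-1, 1)
y = np.sin(X).ravel() + rng.normal(scale=0.1, size=40)

# Ordinary least squares: directly minimizes the sum of squared residuals
ols = LinearRegression().fit(X, y)

# epsilon-SVR: minimizes 0.5*||w||^2 + C * sum of slacks, subject to each
# residual lying inside the epsilon tube (up to its slack); residuals
# smaller than epsilon therefore cost nothing
svr = SVR(kernel="rbf", C=1.0, epsilon=0.1).fit(X, y)

print("OLS R^2:", ols.score(X, y))
print("SVR R^2:", svr.score(X, y))
print("SVR support vectors:", len(svr.support_))
</syntaxhighlight>

The epsilon-insensitive loss is the design choice that makes small deviations cost nothing, which is exactly the behaviour the question finds counter-intuitive; only points on or outside the epsilon tube become support vectors and shape the fitted function.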

