General Regression Neural Network (GRNN)

[http://www.youtube.com/results?search_query=Neural+Network+Regression YouTube search...]
  
 
* [[AI Solver]]
* [[...predict values]]
* [[Capabilities]]
* [[Support Vector Machine (SVM)]]
* [http://docs.microsoft.com/en-us/azure/machine-learning/studio-module-reference/neural-network-regression Neural Network Regression | Microsoft]
* [http://research.ncku.edu.tw/re/articles/e/20080620/3.html Support vector regression for real-time flood stage forecasting | Pao-Shan Yu*, Shien-Tsung Chen and I-Fan Chang]
 
  
 
Recently, several studies on calibrating traffic flow models have been undertaken using support vector regression (SVR). To me, the method seems a bit counter-intuitive: rather than the sum of the squared errors, the sum of the squared prefactors of the model function is minimized, which by itself has nothing to do with the fit quality. The fit quality enters only indirectly, in the form of constraints, and small deviations are not penalized at all. Furthermore, one obtains a sort of black-box model which is lengthy to write down explicitly and which cannot be understood intuitively. Under which circumstances should SVR nevertheless be preferred to an ordinary least-squares (LSE) minimization? - Martin Treiber

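For reference, the objective the comment describes is the standard ε-insensitive SVR primal (the notation below is the usual textbook one, not taken from this page): the squared prefactors <math>\tfrac{1}{2}\lVert w\rVert^2</math> are what is minimized, and the data enter only through the constraints

<math>
\min_{w,b,\xi,\xi^*}\;\tfrac{1}{2}\lVert w\rVert^2 + C\sum_{i=1}^{n}\left(\xi_i+\xi_i^*\right)
\quad\text{s.t.}\quad
y_i-(w\cdot x_i+b)\le\varepsilon+\xi_i,\;\;
(w\cdot x_i+b)-y_i\le\varepsilon+\xi_i^*,\;\;
\xi_i,\xi_i^*\ge 0
</math>

so residuals smaller than ε are indeed not penalized at all, whereas ordinary least squares minimizes <math>\sum_i\left(y_i-\hat y_i\right)^2</math> directly. The following is a minimal sketch, assuming scikit-learn is available, that contrasts the two fits on synthetic data; the data-generating line and all parameter values are illustrative only.

<syntaxhighlight lang="python">
# Illustrative comparison of epsilon-SVR and ordinary least squares
# (synthetic data; not taken from the original page).
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))                 # single explanatory variable
y = 2.5 * X.ravel() + 1.0 + rng.normal(0, 0.5, 200)   # noisy linear relation

# Ordinary least squares: minimizes the sum of squared residuals directly.
ols = LinearRegression().fit(X, y)

# Linear epsilon-SVR: minimizes ||w||^2 plus C times the slack;
# residuals inside the epsilon-tube cost nothing.
svr = SVR(kernel="linear", C=1.0, epsilon=0.5).fit(X, y)

print("OLS slope/intercept:", ols.coef_[0], ols.intercept_)
print("SVR slope/intercept:", svr.coef_[0][0], svr.intercept_[0])
</syntaxhighlight>

As a general (hedged) answer to the question above: the ε-insensitive loss grows only linearly with large residuals, so SVR is often preferred when the data contain outliers or heavy-tailed noise, and the <math>\lVert w\rVert^2</math> term acts as regularization once kernels are introduced; for well-behaved Gaussian noise the two fits are usually very close.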