Difference between revisions of "...predict categories"

 
* ...have multi-class classification; three or more categories...
 
** linear, then try the [[Naive Bayes]]
** fast training, linear, then try the [[Logistic Regression (LR)]]
 
** fast training, accurate, then try the [[Random Forest (or) Random Decision Forest]]
 
** accurate, then try the [[Decision Jungle]] for multi-class classification
 
** accurate, can allow long training times, then try the [[Deep Neural Network (DNN)]]
 
*** which builds on its predecessors, the [[Feed Forward Neural Network (FF or FFNN)]] and the [[Artificial Neural Network (ANN)]]
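The decision list above can be sketched in code. This is a minimal comparison, assuming scikit-learn (the wiki does not name a library) and using the three-class Iris dataset as a stand-in multi-class problem:

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

# Iris has three categories, so this is multi-class classification.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Three of the candidates from the list above (Decision Jungle and DNN
# are omitted; they are not part of scikit-learn's core estimators).
candidates = {
    "Naive Bayes (linear)": GaussianNB(),
    "Logistic Regression (fast training, linear)": LogisticRegression(max_iter=200),
    "Random Forest (fast training, accurate)": RandomForestClassifier(random_state=0),
}

scores = {}
for name, model in candidates.items():
    model.fit(X_train, y_train)
    scores[name] = model.score(X_test, y_test)  # accuracy on held-out data
    print(f"{name}: {scores[name]:.2f}")
```

In practice the held-out accuracy, training time, and interpretability of each candidate would drive the choice, as the bullets suggest.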

Revision as of 01:49, 6 January 2019

Classifiers are ubiquitous in data science: they help identify customers who are likely to churn, predict whether it will rain, and filter out spam e-mail. If the target is binary (two-class classification), a binary classifier is used, and the target takes only the values 0 or 1.
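A minimal sketch of the two-class case, assuming scikit-learn and a synthetic one-feature dataset in place of a real churn or rain dataset:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy binary-classification data: the target takes only the values 0 or 1.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = (X[:, 0] > 0).astype(int)  # label is 1 when the feature is positive

clf = LogisticRegression().fit(X, y)
pred = clf.predict([[2.0], [-2.0]])
print(pred)  # each prediction is 0 or 1
```

The fitted model outputs one of the two classes for any input, which is what distinguishes a binary classifier from the multi-class case above.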

___________________________________________________

If you ...