Difference between revisions of "...predict categories"

 
If you ...

* ...have two-class classification; two predicted categories...
 
** fast training, linear, then try the [[Perceptron (P)]]
** fast training, linear, then try the two-class [[Naive Bayes]] point machine
** linear, greater than 100 features, then try the [[Support Vector Machine (SVM)]]
** fast training, accurate, and can have a large footprint, then try the [[Boosted Decision Tree]]
* ...have multi-class classification; three or more categories...
 
** linear, then try the [[Naive Bayes]]
** fast training, linear, then try the [[Logistic Regression]]
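As a minimal sketch of the choices above (assuming scikit-learn — the estimator names here are scikit-learn's, not part of this page), each recommendation maps onto an off-the-shelf classifier that can be fitted and scored on held-out data:

```python
# Sketch of the classifier choices above, assuming scikit-learn.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression, Perceptron
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import LinearSVC

# Two-class data: the target takes only the values 0 or 1.
X, y = make_classification(n_samples=400, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

candidates = {
    "Perceptron (fast training, linear)": Perceptron(),
    "Naive Bayes (fast training, linear)": GaussianNB(),
    "SVM (linear, many features)": LinearSVC(),
    "Boosted trees (accurate, larger footprint)": GradientBoostingClassifier(),
    "Logistic Regression (fast training, linear)": LogisticRegression(),
}

# Fit each candidate and record its held-out accuracy.
scores = {name: clf.fit(X_train, y_train).score(X_test, y_test)
          for name, clf in candidates.items()}
```

Comparing held-out accuracy this way is the usual tie-breaker once several candidates fit the constraints (training speed, linearity, feature count, memory footprint) listed above.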

Revision as of 19:39, 4 June 2018

Classifiers are ubiquitous in data science: they help identify customers who may churn, predict whether or not it will rain, and filter out spam e-mails. If the target is binary (two-class classification), a binary classifier is used and the target takes only the values 0 and 1.
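For example (a hypothetical churn dataset invented for illustration, assuming scikit-learn), a binary target holds only 0/1 values and the classifier predicts one of those two labels:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical churn data (invented for illustration): each row is
# (tenure in months, monthly charge); target is 1 = churned, 0 = stayed.
X = np.array([[1, 80.0], [3, 75.0], [24, 40.0], [36, 30.0],
              [2, 90.0], [48, 25.0], [5, 70.0], [60, 20.0]])
y = np.array([1, 1, 0, 0, 1, 0, 1, 0])  # binary target: only 0 or 1

clf = LogisticRegression().fit(X, y)
preds = clf.predict(np.array([[2, 85.0], [50, 22.0]]))  # each label is 0 or 1
```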

___________________________________________________

If you ...