Difference between revisions of "...predict categories"

 
** fast training, linear, and the features are independent, then try the two-class [[Naive Bayes]] point machine
** linear, greater than 100 features, then try the [[Support Vector Machine (SVM)]]
** fast training, accurate, and can have a large footprint, then try the [[(Boosted) Decision Tree]]
* ...have multi-class classification; three or more categories...
** linear, then try the [[Naive Bayes]]
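The decision list above can be sketched in code. A minimal example, assuming scikit-learn as the library (the article does not name one) and using its closest equivalents of the classifiers listed:

```python
# Hypothetical sketch: the three two-class classifiers named above, as their
# nearest scikit-learn equivalents, compared on synthetic data.
from sklearn.naive_bayes import GaussianNB               # Naive Bayes
from sklearn.svm import LinearSVC                        # linear SVM (suits many features)
from sklearn.ensemble import GradientBoostingClassifier  # boosted decision trees
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Synthetic two-class data: 500 samples, 20 features, binary target.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit each candidate and report held-out accuracy.
for model in (GaussianNB(), LinearSVC(), GradientBoostingClassifier()):
    model.fit(X_train, y_train)
    print(type(model).__name__, round(model.score(X_test, y_test), 3))
```

Which candidate wins depends on the data; the guide above is a heuristic for narrowing the search, not a substitute for comparing held-out scores as done here.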

Revision as of 01:53, 6 January 2019

Classifiers are ubiquitous in data science. They help identify customers who may churn, predict whether or not it will rain, and filter out spam e-mail. If the target is designed to be binary (two-class classification), then a binary classifier is used and the target takes only the values 0 or 1.
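A minimal sketch of two-class classification as described above, where the target takes only the values 0 and 1. The churn-style feature names and data are hypothetical, and logistic regression is used here only as one example of a binary classifier:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy "churn" data (hypothetical): columns are [monthly_spend, support_calls].
X = np.array([[20, 5], [25, 4], [80, 0], [90, 1], [30, 6], [85, 0]])
# Binary target: 1 = customer churned, 0 = customer stayed.
y = np.array([1, 1, 0, 0, 1, 0])

clf = LogisticRegression().fit(X, y)
# The classifier outputs a 0/1 label for each new customer.
print(clf.predict([[22, 5], [88, 0]]))
```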

___________________________________________________

If you ...