# ...predict categories

Do you have...

- ...two-class classification; two categories to predict?
  - Do you need the results to be explainable?
    - Yes
      - If you need fast training and accuracy, and can accept a large footprint, try the (Boosted) Decision Tree.
      - If the problem is linear and there are more than 100 features, try the Support Vector Machine (SVM).

    - No
      - If you need fast training and the problem is linear, try the Perceptron (P).
      - If you need fast training, the problem is linear, and the features are independent, try the two-class Naive Bayes classifier.
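
The four two-class options above can be sketched in a few lines. This is a minimal illustration only, and assuming scikit-learn as the toolkit (the text names algorithms, not a library) with synthetic data standing in for a real problem:

```python
# Minimal sketch of the two-class options above (assumed toolkit: scikit-learn).
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier  # boosted decision trees
from sklearn.svm import LinearSVC                        # linear SVM
from sklearn.linear_model import Perceptron              # fast, linear
from sklearn.naive_bayes import GaussianNB               # assumes independent features
from sklearn.model_selection import train_test_split

# Synthetic two-class data stands in for a real problem.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

models = {
    "boosted decision tree": GradientBoostingClassifier(random_state=0),
    "linear SVM": LinearSVC(random_state=0),
    "perceptron": Perceptron(random_state=0),
    "naive Bayes": GaussianNB(),
}
# All four share the same fit/score interface, so swapping candidates is cheap.
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.2f}")
```

Because the estimators share one interface, trying several candidates from the flowchart and comparing held-out accuracy costs almost nothing.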

- ...multi-class classification; three or more categories to predict?
  - Do you need the results to be explainable?
    - Yes
      - If you need fast training and the problem is linear, try Logistic Regression (LR).
      - If you need accuracy, try the Decision Jungle for multi-class classification.
      - If you need fast training and accuracy, try the Random Forest (also called a Random Decision Forest).

    - No
      - If the problem is linear, try Naive Bayes.
      - If you need accuracy and can allow long training times, try the Deep Neural Network (DNN), a deeper form of its predecessors, the Feed Forward Neural Network (FF or FFNN) and the Artificial Neural Network (ANN).
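
A similar sketch covers the multi-class branch, again assuming scikit-learn. Two stand-ins are used here: `MLPClassifier` for a small feed-forward deep network, and `RandomForestClassifier` for the forest family (Decision Jungle is an Azure ML-specific algorithm with no direct scikit-learn equivalent):

```python
# Minimal sketch of the multi-class options above (assumed toolkit: scikit-learn).
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)  # three classes: setosa, versicolor, virginica
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "random forest": RandomForestClassifier(random_state=0),
    "naive Bayes": GaussianNB(),
    # A small feed-forward net stands in for a DNN; real DNNs are much deeper
    # and typically need far longer training, as the text notes.
    "feed-forward net": MLPClassifier(hidden_layer_sizes=(32, 32),
                                      max_iter=2000, random_state=0),
}
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te) for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.2f}")
```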


___________________________________________________

Classification problems are sometimes divided into binary (yes or no) and multi-category problems (animal, vegetable, or mineral). Classifiers are ubiquitous in data science: they help identify customers who may churn, predict whether it will rain, and filter out spam e-mail. If the target is designed to be binary (two-class classification), a binary classifier is used, and the target takes only a 0 or 1 value. (Source: "Machine learning algorithms explained," Martin Heller, InfoWorld)
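
The last point takes only a couple of lines to show: a two-class target is just a 0/1 encoding of a yes/no label (the churn/stay label names below are hypothetical):

```python
# Minimal sketch: a binary target takes only the values 0 and 1, so yes/no
# labels are encoded before training (label names here are hypothetical).
raw_labels = ["churn", "stay", "stay", "churn", "stay"]
y = [1 if label == "churn" else 0 for label in raw_labels]
print(y)  # [1, 0, 0, 1, 0]
```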