* [[AI Solver]] ... [[Algorithms]] ... [[Algorithm Administration|Administration]] ... [[Model Search]] ... [[Discriminative vs. Generative]] ... [[Train, Validate, and Test]]

Do you have...
* ...two-class classification; two predicted categories?
** Do you need the results to be explainable?
*** Yes
**** fast training, accurate, and a large footprint is acceptable, then try the [[(Boosted) Decision Tree]]
**** linear, greater than 100 features, then try the [[Support Vector Machine (SVM)]]
*** No
**** fast training, linear, and the features are independent, then try the two-class [[Bayes#Naive Bayes|Naive Bayes]] point machine
* ...multi-class classification; three or more categories?
** Do you need the results to be explainable?
*** Yes
**** fast training, linear, then try [[Logistic Regression (LR)]]
**** accurate, then try the [[Decision Jungle]] for multi-class classification
**** fast training, accurate, then try the [[Random Forest (or) Random Decision Forest]]
*** No
**** linear, then try [[Bayes#Naive Bayes|Naive Bayes]]
**** accurate, can allow long training times, then try the [[Neural Network#Deep Neural Network (DNN)|Deep Neural Network (DNN)]], e.g. for [[Image Classification]]
**** the DNN builds on its predecessors, the [[Feed Forward Neural Network (FF or FFNN)]] and the [[Neural Network]]
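To make the two-class branch concrete, here is a minimal sketch of a two-class Naive Bayes classifier on binary features, in plain Python. The toy spam-style data, feature meanings, and function names are illustrative, not from any particular library; a real project would typically reach for an existing implementation instead.

```python
import math

def train_naive_bayes(X, y):
    """Fit a two-class Bernoulli Naive Bayes model.

    X: list of binary (0/1) feature vectors; y: list of 0/1 labels.
    Uses Laplace (add-one) smoothing so an unseen feature value
    never drives a class probability to zero.
    """
    n_features = len(X[0])
    counts = {0: 0, 1: 0}                               # examples per class
    feat = {0: [0] * n_features, 1: [0] * n_features}   # feature=1 counts per class
    for xi, yi in zip(X, y):
        counts[yi] += 1
        for j, v in enumerate(xi):
            feat[yi][j] += v
    model = {}
    total = len(y)
    for c in (0, 1):
        log_prior = math.log(counts[c] / total)
        # P(feature_j = 1 | class c), Laplace-smoothed
        p1 = [(feat[c][j] + 1) / (counts[c] + 2) for j in range(n_features)]
        log_p1 = [math.log(p) for p in p1]
        log_p0 = [math.log(1 - p) for p in p1]
        model[c] = (log_prior, log_p1, log_p0)
    return model

def predict(model, x):
    """Return the class with the highest posterior log-probability."""
    def score(c):
        log_prior, log_p1, log_p0 = model[c]
        return log_prior + sum(p1 if v else p0
                               for v, p1, p0 in zip(x, log_p1, log_p0))
    return max(model, key=score)

# Toy data: features = [contains "free", contains "meeting"]; label 1 = spam
X = [[1, 0], [1, 0], [0, 1], [0, 1], [1, 1]]
y = [1, 1, 0, 0, 1]
model = train_naive_bayes(X, y)
print(predict(model, [1, 0]))  # a "free"-only message -> 1 (spam)
```

Note how the "features are independent" condition in the list above is exactly the Naive Bayes assumption: the per-feature log-probabilities are simply summed.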
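For the multi-class branch, [[Logistic Regression (LR)]] generalizes from two classes to many via the softmax function. A stdlib-only sketch of multinomial logistic regression trained by gradient descent follows; the toy three-class data, learning rate, and epoch count are made-up values for illustration.

```python
import math

def softmax(z):
    """Numerically stable softmax over a list of scores."""
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def train_softmax(X, y, n_classes, lr=0.5, epochs=500):
    """Multinomial logistic regression via per-sample gradient descent.

    W[c][j] is the weight of feature j for class c; b[c] is the bias.
    The update uses the gradient of the cross-entropy loss: p - one_hot(y).
    """
    n = len(X[0])
    W = [[0.0] * n for _ in range(n_classes)]
    b = [0.0] * n_classes
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = [sum(W[c][j] * xi[j] for j in range(n)) + b[c]
                 for c in range(n_classes)]
            p = softmax(z)
            for c in range(n_classes):
                err = p[c] - (1.0 if c == yi else 0.0)
                for j in range(n):
                    W[c][j] -= lr * err * xi[j]
                b[c] -= lr * err
    return W, b

def predict_class(W, b, x):
    """Pick the class with the largest linear score (argmax of softmax)."""
    z = [sum(wc[j] * x[j] for j in range(len(x))) + bc
         for wc, bc in zip(W, b)]
    return z.index(max(z))

# Toy 3-class, 2-feature data: clusters near (0,0), (1,0), and (0,1)
X = [[0, 0], [0.2, 0.1], [1, 0], [0.9, 0.2], [0, 1], [0.1, 0.9]]
y = [0, 0, 1, 1, 2, 2]
W, b = train_softmax(X, y, n_classes=3)
print([predict_class(W, b, x) for x in X])
```

The decision boundaries remain linear, which is why the list pairs logistic regression with the "linear" condition; data that is not linearly separable calls for one of the tree- or network-based options instead.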
  
 
___________________________________________________
 
* [[Data Quality#Data Augmentation, Data Labeling, and Auto-Tagging|Data Augmentation, Data Labeling, and Auto-Tagging]]
* [https://medium.com/@srnghn/machine-learning-trying-to-predict-a-categorical-outcome-6ba542b854f5 Machine Learning: Trying to classify your data | Stacey Ronaghan - Medium]
* [[Evaluating Machine Learning Models]]
** [https://bookdown.org/max/FES/encoding-categorical-predictors.html Feature Engineering and Selection: A Practical Approach for Predictive Models - 5 Encoding Categorical Predictors | Max Kuhn and Kjell Johnson]
** [https://docs.aws.amazon.com/machine-learning/latest/dg/binary-model-insights.html Binary Model Insights |] [[Amazon|Amazon Web Services]]
** [https://docs.aws.amazon.com/machine-learning/latest/dg/multiclass-model-insights.html Multiclass Model Insights |] [[Amazon|Amazon Web Services]]
Classification problems are sometimes divided into binary (yes or no) and multi-class problems (animal, vegetable, or mineral). Classifiers are ubiquitous in data science: they help identify customers who may churn, predict whether it will rain, and filter out spam e-mail. If the target is designed to be binary (two-class classification), a binary classifier is used and the target takes only the values 0 or 1. [https://www.infoworld.com/article/3394399/machine-learning-algorithms-explained.html Machine learning algorithms explained | Martin Heller - InfoWorld]
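Because a binary target takes only the values 0 and 1, evaluating such a classifier reduces to counting the four outcomes of a confusion matrix. A small stdlib-only sketch (the label and prediction lists here are made up for illustration):

```python
def confusion_counts(y_true, y_pred):
    """Count true/false positives and negatives for 0/1 targets."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

y_true = [1, 0, 1, 1, 0, 0, 1, 0]   # actual 0/1 targets
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]   # classifier's 0/1 predictions
tp, tn, fp, fn = confusion_counts(y_true, y_pred)
accuracy  = (tp + tn) / len(y_true)  # fraction of correct predictions
precision = tp / (tp + fp)           # of predicted 1s, how many were real
recall    = tp / (tp + fn)           # of real 1s, how many were found
print(accuracy, precision, recall)   # 0.75 0.75 0.75
```

These are the same quantities behind the binary-model insights pages linked above; which metric matters most depends on the relative cost of false positives versus false negatives (e.g., flagging good e-mail as spam versus letting spam through).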
Latest revision as of 22:52, 5 March 2024
