Softmax

* [[Activation Functions]]
 
* [http://developers.google.com/machine-learning/crash-course/multi-class-neural-networks/softmax Multi-Class Neural Networks: Softmax]
 
* [http://towardsdatascience.com/softmax-function-simplified-714068bf8156 The Softmax Function, Simplified - How a regression formula improves accuracy of deep learning models | Hamza Mahmood - Towards Data Science]
  
 
The softmax function maps a vector of real-valued scores to a probability distribution over classes. It is used in various multiclass classification methods, such as multinomial logistic regression (also known as softmax regression), multiclass linear discriminant analysis, naive Bayes classifiers, and artificial neural networks.
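As a minimal sketch of the idea above, the following NumPy snippet computes softmax probabilities from a vector of class scores. The function name `softmax` and the example scores are illustrative, not taken from any of the linked references; the max-subtraction step is the standard trick for numerical stability and does not change the result.

```python
import numpy as np

def softmax(z):
    """Map a score vector z to a probability distribution.

    Subtracting the max before exponentiating avoids overflow
    for large scores; it cancels out in the normalization.
    """
    shifted = z - np.max(z)
    exp = np.exp(shifted)
    return exp / exp.sum()

# Hypothetical scores for three classes
scores = np.array([2.0, 1.0, 0.1])
probs = softmax(scores)
# probs is approximately [0.659, 0.242, 0.099]; entries sum to 1,
# and the largest score receives the largest probability.
```

Note that softmax preserves the ranking of the input scores: the class with the highest score always gets the highest probability.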
 

Revision as of 21:48, 30 June 2019

