Difference between revisions of "Softmax"

 
* [http://developers.google.com/machine-learning/crash-course/multi-class-neural-networks/softmax Multi-Class Neural Networks: Softmax]
 
 
* [http://towardsdatascience.com/softmax-function-simplified-714068bf8156 The Softmax Function, Simplified - How a regression formula improves accuracy of deep learning models | Hamza Mahmood - Towards Data Science]
 
* [http://dataaspirant.com/2017/03/07/difference-between-softmax-function-and-sigmoid-function/ Difference Between Softmax Function And Sigmoid Function | Saimadhu Polamuri - Dataaspirant]
  
 
The softmax function is used in various multiclass classification methods, such as multinomial logistic regression (also known as softmax regression), multiclass linear discriminant analysis, naive Bayes classifiers, and artificial neural networks.
 
http://i1.wp.com/dataaspirant.com/wp-content/uploads/2017/03/Softmax_Graph.png
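As a minimal sketch of the function described above, the following Python snippet computes softmax probabilities from a vector of logits; the three-class example values are illustrative and not taken from the page:

```python
import math

def softmax(z):
    # Subtract the maximum logit before exponentiating; softmax is
    # shift-invariant, and this avoids overflow for large inputs.
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    # The outputs are positive and sum to 1, so they can be read
    # as class probabilities.
    return [e / total for e in exps]

# Example: logits for a hypothetical 3-class problem.
probs = softmax([2.0, 1.0, 0.1])
```

Note that, unlike the sigmoid applied independently to each score, the softmax outputs are coupled: raising one logit lowers every other class's probability.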
  
 
<youtube>LLux1SW--oM</youtube>
 

Revision as of 21:52, 30 June 2019