Optimization Methods

[http://www.google.com/search?q=Optimization+LSTM+SGD+Adagrad+Adadelta+RMSprop+Adam+BGF+machine+learning+ML+artificial+intelligence ...Google search]
 
* [[Large Language Model (LLM)]] ... [[Natural Language Processing (NLP)]] ... [[Natural Language Generation (NLG)|Generation]] ... [[Natural Language Classification (NLC)|Classification]] ... [[Natural Language Processing (NLP)#Natural Language Understanding (NLU)|Understanding]] ... [[Language Translation|Translation]] ... [[Natural Language Tools & Services|Tools & Services]]
 
* [[Recurrent Neural Network (RNN)]]
 
** [[Average-Stochastic Gradient Descent (SGD) Weight-Dropped LSTM (AWD-LSTM)]]
 
Methods (a minimal sketch of two of these update rules follows the list):

* Stochastic Gradient Descent (SGD), with and without momentum
* Limited-memory BFGS (L-BFGS)
* Adagrad
* Adadelta
* Root Mean Square Propagation (RMSprop)
* Adam
* Hessian-free (HF)
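
As a rough illustration of how two of the listed rules work, below is a minimal NumPy sketch of the SGD-with-momentum and Adam parameter updates. This sketch is not taken from any cited implementation; the function names and the toy quadratic objective are invented for the example, and the hyperparameter defaults are the values commonly quoted for these two methods.

<syntaxhighlight lang="python">
import numpy as np

def sgd_momentum_step(w, grad, velocity, lr=0.01, momentum=0.9):
    """One SGD-with-momentum step: v <- momentum*v - lr*grad; w <- w + v."""
    velocity = momentum * velocity - lr * grad
    return w + velocity, velocity

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step with bias-corrected moment estimates (t counts from 1)."""
    m = beta1 * m + (1 - beta1) * grad       # running mean of gradients
    v = beta2 * v + (1 - beta2) * grad ** 2  # running mean of squared gradients
    m_hat = m / (1 - beta1 ** t)             # bias correction for early steps
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([3.0, -2.0])
m, v = np.zeros_like(w), np.zeros_like(w)
for t in range(1, 201):
    # lr raised above the 0.001 default so the toy converges in 200 steps
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.1)
print(w)  # both coordinates end close to 0
</syntaxhighlight>

The toy loop feeds the exact gradient at every step; in practice these updates are applied to gradients computed on mini-batches of training data, which is where the "stochastic" in SGD comes from.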