Optimization Methods

{{#seo:
|title=PRIMO.ai
|titlemode=append
|keywords=artificial, intelligence, machine, learning, models, algorithms, data, singularity, moonshot, Tensorflow, Google, Nvidia, Microsoft, Azure, Amazon, AWS
|description=Helpful resources for your journey with artificial intelligence; videos, articles, techniques, courses, profiles, and tools
}}
 
[http://www.youtube.com/results?search_query=Optimization+LSTM+SGD+Adagrad+Adadelta+RMSprop+Adam+BGFS Youtube search...]
 
[http://www.google.com/search?q=Optimization+LSTM+SGD+Adagrad+Adadelta+RMSprop+Adam+BGF+machine+learning+ML+artificial+intelligence ...Google search]
  
 
* [[Natural Language Processing (NLP)]]
 



Methods:

* Stochastic gradient descent (SGD) (with and without momentum; see the sketch below this list)
* Limited-memory BFGS (L-BFGS)
* Adagrad
* Adadelta
* Root Mean Square Propagation (RMSprop)
* Adam
* Hessian-free (HF)
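
Below is a minimal NumPy sketch, added purely as an illustration (it is not taken from any page or library linked above), of the first method in the list: stochastic gradient descent with and without momentum, applied to a small synthetic least-squares problem. All function and variable names (sgd, minibatch_grad, and so on) are made up for this example.

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))          # synthetic inputs
true_w = np.array([2.0, -1.0, 0.5])    # target parameters to recover
y = X @ true_w                         # noiseless targets, for simplicity

def minibatch_grad(w, idx):
    """Gradient of the mean squared error on the rows selected by idx."""
    Xb, yb = X[idx], y[idx]
    return Xb.T @ (Xb @ w - yb) / len(yb)

def sgd(lr=0.1, momentum=0.0, steps=500, batch_size=20):
    """SGD loop; momentum=0.0 is plain SGD, momentum>0 adds a velocity term."""
    w = np.zeros(3)
    velocity = np.zeros_like(w)
    for _ in range(steps):
        idx = rng.integers(0, len(y), size=batch_size)  # sample a mini-batch
        g = minibatch_grad(w, idx)
        velocity = momentum * velocity - lr * g         # accumulate past gradients
        w = w + velocity                                # parameter update
    return w

print(sgd(momentum=0.0))   # plain SGD: approaches [ 2.  -1.   0.5]
print(sgd(momentum=0.9))   # SGD with momentum: typically converges faster
</syntaxhighlight>

Setting momentum=0.0 reduces the update to plain SGD; a positive momentum coefficient reuses a decaying sum of past gradients, which usually speeds convergence on ill-conditioned problems.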