Optimization Methods
- Natural Language Processing (NLP)
- Recurrent Neural Network (RNN)
  - Average-SGD Weight-Dropped LSTM (AWD-LSTM)
- Gradient Boosting Algorithms
Methods (a sketch of several of these update rules follows the list):
- Stochastic gradient descent (SGD) (with and without momentum)
- L-BFGS (Limited-memory BFGS)
- Adagrad
- Adadelta
- Root Mean Square Propagation (RMSprop)
- Adam
- Hessian-free (HF)
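These optimizers differ mainly in how they scale the raw gradient before applying it: momentum smooths it, Adagrad and RMSprop divide by a running measure of its squared magnitude, and Adam combines both. Below is a minimal NumPy sketch of four of the update rules above; the function names, state-passing style, and hyperparameter defaults are illustrative choices, not anything prescribed by this page.

```python
import numpy as np

def sgd_momentum(w, grad, v, lr=0.01, beta=0.9):
    # Momentum accumulates an exponentially decaying sum of past gradients.
    v = beta * v + grad
    return w - lr * v, v

def adagrad(w, grad, g2, lr=0.01, eps=1e-8):
    # Adagrad divides by the root of the running *sum* of squared gradients,
    # so each coordinate's effective learning rate only ever shrinks.
    g2 = g2 + grad ** 2
    return w - lr * grad / (np.sqrt(g2) + eps), g2

def rmsprop(w, grad, g2, lr=0.001, beta=0.9, eps=1e-8):
    # RMSprop replaces Adagrad's sum with an exponential moving average,
    # so the effective learning rate can recover over time.
    g2 = beta * g2 + (1 - beta) * grad ** 2
    return w - lr * grad / (np.sqrt(g2) + eps), g2

def adam(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Adam combines momentum (first moment) with RMSprop-style scaling
    # (second moment), plus bias correction for the early steps (t >= 1).
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Example: minimize f(w) = ||w||^2, whose gradient is 2w, with Adam.
w = np.array([1.0, -2.0])
m, v = np.zeros_like(w), np.zeros_like(w)
for t in range(1, 2001):
    w, m, v = adam(w, 2 * w, m, v, t, lr=0.01)
print(w)  # close to the minimizer at the origin
```

In practice these updates come from a library (e.g. torch.optim.Adam) rather than being hand-rolled; the sketch only makes the gradient-scaling differences between the methods explicit. L-BFGS and Hessian-free optimization are omitted because they are second-order methods built around line searches and curvature estimates, not per-step gradient scalings.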