Optimizer
 
* [[Agents#AI Agent Optimization|AI Agent Optimization]] ... [[Optimization Methods]] ... [[Optimizer]] ... [[Objective vs. Cost vs. Loss vs. Error Function]] ... [[Exploration]]
 
 
* [[Backpropagation]] ... [[Feed Forward Neural Network (FF or FFNN)|FFNN]] ... [[Forward-Forward]] ... [[Activation Functions]] ...[[Softmax]] ... [[Loss]] ... [[Boosting]] ... [[Gradient Descent Optimization & Challenges|Gradient Descent]] ... [[Algorithm Administration#Hyperparameter|Hyperparameter]] ... [[Manifold Hypothesis]] ... [[Principal Component Analysis (PCA)|PCA]]
 
* [[Objective vs. Cost vs. Loss vs. Error Function]]
 
 
* [http://www.tensorflow.org/api_guides/python/train TensorFlow Training Classes Python API]  
 
 
* [http://videos.h2o.ai/watch/4Qx2eUbrsUCZ4rThjtVxeb H2O Driverless AI - Intro + Interactive Hands-on Lab - Video]
 



TensorFlow offers many optimizers. An optimizer is the tool that minimises the loss between the prediction and the real value. A model could learn many different sets of weights, and brute-force testing every one would take forever; instead, an optimizer evaluates the loss value and updates the weights intelligently. The Keras documentation lists the available optimizer functions. The optimizer is one of the two parameters required to compile a model:



model.compile(optimizer='sgd', loss='mean_squared_error')
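
As a minimal, hedged sketch (not taken from the original page), the snippet below places the same compile call in a complete toy example: a one-neuron Keras model is fit on a few (x, y) pairs so the SGD optimizer can evaluate the mean-squared-error loss and update the weights. The layer shape, the data values, and the epoch count are all illustrative assumptions.

import numpy as np
import tensorflow as tf

# Toy one-neuron model; the (1,) input shape is an illustrative assumption.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(1),
])

# Optimizer and loss: the two parameters needed to compile the model.
model.compile(optimizer='sgd', loss='mean_squared_error')

# Toy data following y = 2x - 1 (assumed for illustration).
xs = np.array([-1.0, 0.0, 1.0, 2.0, 3.0, 4.0], dtype=float)
ys = np.array([-3.0, -1.0, 1.0, 3.0, 5.0, 7.0], dtype=float)

# Each epoch, the optimizer evaluates the loss and nudges the weights.
model.fit(xs, ys, epochs=200, verbose=0)

print(model.predict(np.array([10.0])))  # should approach 19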


Genetic Algorithm Optimization
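
As a hedged illustration only (assumed, not taken from the page), the sketch below shows the core loop of a genetic algorithm used as an optimizer: a population of candidate weight vectors is ranked by loss, the fittest are selected as parents, and crossover plus mutation produce the next generation. The toy loss function, population size, and mutation scale are all illustrative assumptions.

import random

# Toy objective with its minimum at w = [2.0, -1.0] (assumed for illustration).
def loss(w):
    return (w[0] - 2.0) ** 2 + (w[1] + 1.0) ** 2

# Random initial population of candidate weight vectors.
population = [[random.uniform(-5, 5), random.uniform(-5, 5)] for _ in range(20)]

for generation in range(100):
    population.sort(key=loss)          # rank by fitness (lower loss is better)
    parents = population[:5]           # selection: keep the best candidates
    children = []
    while len(children) < 15:
        a, b = random.sample(parents, 2)
        child = [random.choice(genes) for genes in zip(a, b)]   # crossover
        child = [g + random.gauss(0, 0.1) for g in child]       # mutation
        children.append(child)
    population = parents + children

best = min(population, key=loss)
print(best, loss(best))   # best weights found and their loss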