Differentiable Programming

* [http://en.wikipedia.org/wiki/Category:Programming_paradigms Programming paradigms | Wikipedia]
* [[Automated Machine Learning (AML) - AutoML]]
* [[Algorithm Administration#Automated Learning|Automated Learning]]
* [http://en.wikipedia.org/wiki/Automatic_differentiation Automatic differentiation - Wikipedia]
* [[Graph]]

“People are now building a new kind of software by assembling networks of parameterized functional blocks and by training them from examples using some form of gradient-based optimization.” - Facebook Chief AI Scientist Yann LeCun

Differentiable programs are programs that rewrite at least one of their own components by optimizing along a gradient, as neural networks do with optimization algorithms such as gradient descent. A graphic illustrating the difference between the differentiable and probabilistic programming approaches can be found in A Beginner's Guide to Differentiable Programming | Chris Nicholson - A.I. Wiki, Pathmind.
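
The following is a minimal sketch of that idea, not taken from the article: one parameterized functional block whose parameters are updated by plain gradient descent on a squared-error loss. It assumes JAX is installed; the names model and loss and the toy data are illustrative assumptions, not part of any particular library.

<syntaxhighlight lang="python">
import jax
import jax.numpy as jnp

def model(params, x):
    # One parameterized functional block: a simple affine transform.
    w, b = params
    return w * x + b

def loss(params, x, y):
    # Squared-error loss between the block's output and the target.
    return jnp.mean((model(params, x) - y) ** 2)

# jax.grad differentiates the whole program with respect to its parameters.
grad_fn = jax.grad(loss)

params = (jnp.array(0.0), jnp.array(0.0))
x = jnp.array([1.0, 2.0, 3.0])
y = jnp.array([2.0, 4.0, 6.0])   # toy target: y = 2x

learning_rate = 0.1
for _ in range(200):
    grads = grad_fn(params, x, y)
    # Gradient-descent update: move each parameter against its gradient.
    params = tuple(p - learning_rate * g for p, g in zip(params, grads))

print(params)   # w approaches 2.0, b approaches 0.0
</syntaxhighlight>

The "program" here is ordinary Python; the gradient-based optimization is what rewrites its parameterized component, which is the sense in which it is differentiable programming rather than hand-coded logic.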

TensorFlow 1 uses a static graph approach, whereas TensorFlow 2 uses a dynamic (eager) graph approach by default. Differentiable programming | Wikipedia
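
To make that distinction concrete, here is a hedged sketch, assuming TensorFlow 2 is installed: the same computation run eagerly (the dynamic-graph default in TF2) and traced into a static graph with tf.function, which is roughly what TensorFlow 1 required through explicit graph and session code. The function names are made up for illustration.

<syntaxhighlight lang="python">
import tensorflow as tf

def square_eager(x):
    # Dynamic graph: each op executes immediately, like ordinary Python.
    return x * x

@tf.function
def square_graph(x):
    # Static graph: traced once into a graph, then run as that graph.
    return x * x

x = tf.constant(3.0)
print(square_eager(x))   # tf.Tensor(9.0, shape=(), dtype=float32)
print(square_graph(x))   # same result, executed through the traced graph
</syntaxhighlight>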