Differentiable Programming



“People are now building a new kind of software by assembling networks of parameterized functional blocks and by training them from examples using some form of gradient-based optimization.” - Facebook Chief AI Scientist Yann LeCun

Differentiable programs are programs that can rewrite at least one of their own components by optimizing it along a gradient, as neural networks do using optimization algorithms such as gradient descent. See A Beginner's Guide to Differentiable Programming | Chris Nicholson - A.I. Wiki (Pathmind) for a graphic illustrating the difference between the differentiable and probabilistic programming approaches.
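To make the idea concrete, here is a minimal sketch (assuming TensorFlow 2 is installed) of a "differentiable program": a single parameterized block whose weight is adjusted along the gradient of a loss, exactly the gradient-based optimization LeCun describes. The toy objective and learning rate are illustrative choices, not from the original text.

<pre>
import tensorflow as tf

w = tf.Variable(0.0)  # the parameterized block's single weight

def loss():
    # Toy objective: drive w so that w * 3.0 gets close to 6.0
    return (w * 3.0 - 6.0) ** 2

optimizer = tf.keras.optimizers.SGD(learning_rate=0.05)

for _ in range(100):
    with tf.GradientTape() as tape:
        current_loss = loss()
    grads = tape.gradient(current_loss, [w])     # differentiate the program w.r.t. w
    optimizer.apply_gradients(zip(grads, [w]))   # gradient-descent update

print(float(w))  # converges toward 2.0
</pre>

The program "rewrites itself" in the sense that its parameter w is repeatedly updated by following the gradient of the loss, rather than by a programmer editing the code.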

TensorFlow 1 uses the static graph approach, in which the computation graph is built first and then executed, whereas TensorFlow 2 uses the dynamic graph approach by default, recording operations as they run. Differentiable programming | Wikipedia
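A minimal sketch (assuming TensorFlow 2) of the contrast: the same function run eagerly with a dynamic graph, which is the TensorFlow 2 default, and traced into a static graph with tf.function, TensorFlow 2's bridge to the TensorFlow 1-style graph approach. The function name and values are illustrative.

<pre>
import tensorflow as tf

def square_plus_one(x):
    return x * x + 1.0

# Dynamic (eager) execution: ops run immediately, ordinary Python control flow works.
print(square_plus_one(tf.constant(3.0)))        # tf.Tensor(10.0, ...)

# Static graph: tf.function traces the Python code once into a reusable graph.
square_plus_one_graph = tf.function(square_plus_one)
print(square_plus_one_graph(tf.constant(3.0)))  # same result, executed as a graph
</pre>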