Difference between revisions of "Activation Functions"
<youtube>9vB5nzrL4hY</youtube>
<youtube>3r65ZuFyi5Y</youtube>
| + | |||
| + | === Sigmoid === | ||
| + | |||
| + | <youtube>0RTvFKCQ_yo</youtube> | ||
| + | <youtube>aHQgFpJ_xp8</youtube> | ||
| + | |||
=== tanh ===

<youtube>pWxSUZWOctk</youtube>
<youtube>Opg63pan_YQ</youtube>

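A small sketch showing that tanh is a rescaled sigmoid, tanh(x) = 2·sigmoid(2x) − 1 (function names are illustrative):

```python
import math

def tanh(x):
    # tanh is a rescaled, shifted sigmoid: tanh(x) = 2*sigmoid(2x) - 1.
    # It squashes input into (-1, 1) and, unlike sigmoid, is zero-centered.
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    return 2.0 * sigmoid(2.0 * x) - 1.0
```

The zero-centered output is one reason tanh is often preferred over sigmoid for hidden layers.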
=== ReLU ===
<youtube>mlaLLQofmR8</youtube>
<youtube>LLux1SW--oM</youtube>
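A one-line sketch of the Rectified Linear Unit (the function name is illustrative):

```python
def relu(x):
    # ReLU: passes positive inputs through unchanged, zeroes out negatives
    return max(0.0, x)
```

Because it does not saturate for positive inputs, ReLU avoids the vanishing gradients that sigmoid and tanh suffer from on that side.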
| + | |||
| + | === identity === | ||
| + | |||
| + | <youtube>aHQgFpJ_xp8</youtube> | ||
| + | <youtube>olR031-wcMs</youtube> | ||
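For completeness, a sketch of the identity activation (the function name is illustrative):

```python
def identity(x):
    # Identity activation: output equals input. Its derivative is 1
    # everywhere, so gradients pass through unchanged; it is commonly
    # used on the output layer of regression networks.
    return x
```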
Revision as of 20:09, 1 May 2018