Difference between revisions of "Neural Architecture"
<youtube>CYUpDogeIL0</youtube>
<youtube>MtuADH7kVeQ</youtube>
| + | |||
| + | |||
== Differentiable Neural Computer (DNC) ==
[http://www.youtube.com/results?search_query=differentiable+neural+computers+%28DNC%29 YouTube search...]

<youtube>r5XKzjTFCZQ</youtube>
<youtube>6SmN8dt8WeQ</youtube>
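The core idea behind the DNC is that reads and writes to an external memory matrix are soft, weighted operations, so the whole system stays differentiable and trainable by gradient descent. Below is a toy sketch (not DeepMind's implementation) of the content-based addressing used for reads: a key vector is compared against every memory row, and the softmaxed similarities become read weights.

```python
import numpy as np

# Toy sketch of DNC-style content-based reading. `beta` is the key
# strength that sharpens the softmax; all names here are illustrative,
# not taken from any DNC library.

def cosine_similarity(key, memory):
    """Cosine similarity between a key vector and each memory row."""
    key_n = key / (np.linalg.norm(key) + 1e-8)
    mem_n = memory / (np.linalg.norm(memory, axis=1, keepdims=True) + 1e-8)
    return mem_n @ key_n

def read(memory, key, beta=10.0):
    """Soft read: a weighted sum of memory rows, differentiable
    with respect to both the key and the memory contents."""
    scores = beta * cosine_similarity(key, memory)
    weights = np.exp(scores - scores.max())  # stable softmax
    weights /= weights.sum()
    return weights @ memory, weights

memory = np.array([[1.0, 0.0],
                   [0.0, 1.0],
                   [1.0, 1.0]])
vec, w = read(memory, key=np.array([1.0, 0.0]))
```

Because the read weights are a softmax rather than a hard index, the read vector `vec` is dominated by the best-matching row (here the first one) while still receiving gradient from every row.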
Revision as of 20:21, 2 July 2019
NAS+machine+learning: YouTube search... | ...Google search
- Automated Machine Learning (AML) - AutoML
- Hyperparameters Optimization
- Other codeless options, Code Generators, Drag n' Drop
- MIT’s AI can train neural networks faster than ever before | Christine Fisher - Engadget
- Neural Architecture Search (NAS) with Reinforcement Learning | Wikipedia
- Neural Architecture Search (NAS) with Evolution | Wikipedia
- Multi-objective Neural architecture search | Wikipedia
- Awesome NAS, a curated list
- Hierarchical Temporal Memory (HTM)
Various approaches to NAS have designed networks that are on par with or even outperform hand-designed architectures. Methods for NAS can be categorized according to the search space, the search strategy, and the performance estimation strategy used:
- The search space defines which types of ANN can be designed and optimized in principle.
- The search strategy defines how the search space is explored to find optimal ANNs.
- Obtaining the performance of an ANN is costly, as it requires training the ANN first. Performance estimation strategies are therefore used to obtain cheaper estimates of a model's performance. Neural Architecture Search | Wikipedia
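The three components above can be sketched in a few lines. This is a minimal, hypothetical illustration: the search space is a small dictionary of choices, the search strategy is plain random search, and the performance estimator is a made-up proxy score standing in for the expensive training step a real NAS system would try to shortcut.

```python
import random

# Search space: which architectures are allowed, in principle.
SEARCH_SPACE = {
    "depth": [2, 4, 8],
    "width": [32, 64, 128],
    "activation": ["relu", "tanh"],
}

def sample_architecture(rng):
    """Search strategy: here, plain random search over the space."""
    return {name: rng.choice(options) for name, options in SEARCH_SPACE.items()}

def estimate_performance(arch):
    """Performance estimation: a cheap proxy score used instead of
    full training (hypothetical formula, for illustration only)."""
    penalty = 2.0 if arch["activation"] == "tanh" else 1.0
    return arch["depth"] * arch["width"] / penalty

def random_search(trials=20, seed=0):
    """Sample `trials` architectures and keep the best-scoring one."""
    rng = random.Random(seed)
    candidates = [sample_architecture(rng) for _ in range(trials)]
    return max(candidates, key=estimate_performance)

best = random_search()
```

Swapping `sample_architecture` for a learned controller gives NAS with Reinforcement Learning, and replacing it with mutation/selection over a population gives NAS with Evolution, the two strategies linked above.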