Neural Architecture
Revision as of 23:17, 7 April 2020
- Hierarchical Temporal Memory (HTM)
- MIT’s AI can train neural networks faster than ever before | Christine Fisher - Engadget
- Other codeless options, Code Generators, Drag n' Drop
- Automated Machine Learning (AML) - AutoML
- Auto Keras
- Evolutionary Computation / Genetic Algorithms
- Hyperparameters Optimization
Neural Architecture Search (NAS)
- Literature on Neural Architecture Search | AutoML.org
- Awesome NAS; a curated list
- Neural Architecture Search (NAS) with Reinforcement Learning | Wikipedia
- Neural Architecture Search (NAS) with Evolution | Wikipedia
- Multi-objective Neural architecture search | Wikipedia
Various approaches to Neural Architecture Search (NAS) have designed networks that are on par with, or even outperform, hand-designed architectures. Methods for NAS can be categorized according to the search space, search strategy, and performance estimation strategy used:
- The search space defines which types of ANNs can be designed and optimized in principle.
- The search strategy defines how the search space is explored to find optimal ANNs.
- Obtaining the performance of an ANN is costly, as this requires training the ANN first. Therefore, performance estimation strategies are used to obtain less costly estimates of a model's performance. Neural Architecture Search | Wikipedia
Differentiable Neural Computer (DNC)