|description=Helpful resources for your journey with artificial intelligence; videos, articles, techniques, courses, profiles, and tools
}}
[https://www.youtube.com/results?search_query=AdaNet+ensemble+AutoML+artificial+intelligence YouTube search...]
[https://www.google.com/search?q=AdaNet+ensemble+AutoML+deep+machine+learning+ML+artificial+intelligence ...Google search]
* [[AdaNet]]
* [[Algorithm Administration#Automated Learning|Automated Learning]]
* [[Reinforcement Learning (RL)]]
AdaNet is a lightweight TensorFlow-based framework for automatically learning high-quality models with minimal expert intervention. It builds on Google's reinforcement learning and evolutionary-based AutoML efforts to be fast and flexible while providing learning guarantees. Importantly, AdaNet provides a general framework not only for learning a neural network architecture, but also for learning to ensemble subnetworks to obtain even better models.
AdaNet is easy to use and creates high-quality models, saving ML practitioners the time normally spent selecting optimal neural network architectures. It implements an adaptive algorithm that learns a neural architecture as an ensemble of subnetworks, and it can add subnetworks of different depths and widths to create a diverse ensemble, trading off performance improvement against the number of parameters. [https://ai.googleblog.com/2018/10/introducing-adanet-fast-and-flexible.html Introducing AdaNet: Fast and Flexible AutoML with Learning Guarantees | Charles Weill]
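
The sketch below illustrates this usage pattern, following the AutoEnsembleEstimator example from the AdaNet README: it learns to ensemble a linear model and a small feed-forward network over a toy synthetic dataset. The dataset, feature column, hidden_units, and max_iteration_steps values here are illustrative assumptions, not values prescribed by AdaNet.

<pre>
# Minimal sketch of adanet.AutoEnsembleEstimator, based on the usage pattern in
# the AdaNet README; the toy data and hyperparameters are illustrative only.
import adanet
import numpy as np
import tensorflow as tf

# Toy synthetic dataset (hypothetical): label is 1 when the single feature x > 0.5.
x = np.random.rand(256, 1).astype(np.float32)
y = (x > 0.5).astype(np.int32)

def input_fn():
    return (tf.data.Dataset.from_tensor_slices(({"x": x}, y))
            .shuffle(256).batch(32).repeat())

# Feature columns describe how examples are processed; the head defines the
# loss and evaluation metrics (binary classification here).
feature_columns = [tf.feature_column.numeric_column("x")]
head = tf.estimator.BinaryClassHead()

# Learn to ensemble a linear model and a small DNN: at each AdaNet iteration
# the candidates are trained and the one that most improves the ensemble is kept.
estimator = adanet.AutoEnsembleEstimator(
    head=head,
    candidate_pool=lambda config: {
        "linear": tf.estimator.LinearEstimator(
            head=head, feature_columns=feature_columns, config=config),
        "dnn": tf.estimator.DNNEstimator(
            head=head, feature_columns=feature_columns, config=config,
            hidden_units=[64, 32]),
    },
    max_iteration_steps=100)  # training steps per AdaNet iteration

estimator.train(input_fn=input_fn, steps=300)
print(estimator.evaluate(input_fn=input_fn, steps=10))
</pre>

For finer control over the search space, the AdaNet documentation also describes lower-level adanet.subnetwork.Builder and adanet.subnetwork.Generator interfaces used with adanet.Estimator to grow an ensemble from custom subnetworks.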
https://2.bp.blogspot.com/-MXSy_I9M6nI/W9cx1LsFKRI/AAAAAAAADdc/HSFi3QnzgNwv5ovScFkLKUT9vyhAqVu2QCLcBGAs/s400/image1.gif
<youtube>3jX-f06Ke74</youtube>