Hypernetworks

[http://www.youtube.com/results?search_query=hyperparameter+deep+learning+tuning+optimization+ai YouTube search...] ...Google search
 
* [[Algorithm Administration]]
* [http://www.quantamagazine.org/researchers-build-ai-that-builds-ai-20220125/ Researchers Build AI That Builds AI] By using hypernetworks, researchers can now preemptively fine-tune artificial neural networks, saving some of the time and expense of training.
 
A hypernetwork is a network that generates the weights of another network (Ha et al., 2017). In multi-task fine-tuning, a shared hypernetwork captures the information common to all tasks, while the task-conditional adapters and layer normalization parameters it generates allow the model to adapt to each individual task, reducing negative task interference. [http://aclanthology.org/2021.acl-long.47.pdf Parameter-efficient Multi-task Fine-tuning for Transformers via Shared Hypernetworks] R.K. Mahabadi, S. Ruder, M. Dehghani, & J. Henderson
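
The sketch below illustrates the core idea under stated assumptions: it uses PyTorch (a library choice assumed here, not prescribed by the sources above), and a small hypernetwork MLP maps a learned per-task embedding to the weight matrix and bias of a target linear layer. All names (HyperLinear, task_embeddings, the dimensions) are illustrative, not taken from the cited paper.

<syntaxhighlight lang="python">
# Minimal hypernetwork sketch (assumed PyTorch; names are illustrative).
# A small "hyper" MLP maps a task embedding to the weights and bias of a
# target linear layer, which is then applied functionally.

import torch
import torch.nn as nn
import torch.nn.functional as F

class HyperLinear(nn.Module):
    """Generates the parameters of a Linear(in_dim -> out_dim) layer
    from a task embedding, instead of storing them directly."""

    def __init__(self, embed_dim, in_dim, out_dim, hidden=64):
        super().__init__()
        self.in_dim, self.out_dim = in_dim, out_dim
        # Hypernetwork: task embedding -> flattened weight matrix + bias.
        self.hyper = nn.Sequential(
            nn.Linear(embed_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, out_dim * in_dim + out_dim),
        )

    def forward(self, x, task_embedding):
        params = self.hyper(task_embedding)              # (out*in + out,)
        weight = params[: self.out_dim * self.in_dim]
        weight = weight.view(self.out_dim, self.in_dim)  # (out, in)
        bias = params[self.out_dim * self.in_dim :]      # (out,)
        # Apply the generated layer to the input batch.
        return F.linear(x, weight, bias)

# One learned embedding per task; the weight-generating MLP is shared.
num_tasks, embed_dim = 3, 16
task_embeddings = nn.Embedding(num_tasks, embed_dim)
layer = HyperLinear(embed_dim, in_dim=32, out_dim=8)

x = torch.randn(4, 32)                    # a batch of inputs
task_id = torch.tensor(1)                 # task this batch belongs to
y = layer(x, task_embeddings(task_id))    # output of shape (4, 8)
</syntaxhighlight>

Because only the task embeddings differ between tasks while the weight-generating MLP is shared, the shared hypernetwork carries the cross-task information and the embeddings play the role of the task-specific conditioning described above.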
