Memory Networks


[http://www.youtube.com/results?search_query=Memory+Networks+attention+model+ai+learning+model YouTube search...] ...Google search


Attention networks act as a kind of short-term memory that allocates attention over the input features they have recently seen. Attention mechanisms are components of memory networks, which focus their attention on external memory storage rather than on the sequence of hidden states in a Recurrent Neural Network (RNN) or Long Short-Term Memory (LSTM) network. Memory networks are a little different, but not by much: they work with an external data store, and they are useful for, say, mapping questions given as input to answers stored in that external memory. The external data store acts as an embedding that the attention mechanism can alter, writing what it learns to the memory and reading from it to make a prediction. While the hidden states of a recurrent neural network form a sequence of embeddings, memory is an accumulation of those embeddings (imagine performing max pooling over all your hidden states – that would be like memory). A Beginner's Guide to Attention Mechanisms and Memory Networks | Skymind
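To make the read/write idea above concrete, here is a minimal NumPy sketch of attention over an external memory. It is not taken from the cited article; the slot count, embedding size, dot-product scoring, and the simple write rule are all illustrative assumptions. It also shows the max-pooling analogy from the paragraph above.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

rng = np.random.default_rng(0)
num_slots, embed_dim = 5, 8          # illustrative sizes, not canonical

# External memory: one embedding per stored item (e.g., a fact or answer).
memory = rng.standard_normal((num_slots, embed_dim))

# Query embedding (e.g., an embedded question).
query = rng.standard_normal(embed_dim)

# Attention read: score each memory slot against the query, normalize
# the scores with softmax, and take the weighted sum of the slots.
weights = softmax(memory @ query)    # attention over slots, sums to 1
read_vector = weights @ memory       # shape (embed_dim,), feeds a prediction

# Simple write (one possible variant): blend the query into the most
# attended slot, so the memory accumulates what the network has seen.
memory[np.argmax(weights)] += 0.1 * query

# The RNN analogy from the text: max pooling over a sequence of hidden
# states collapses them into one accumulated vector, roughly what the
# external memory provides.
hidden_states = rng.standard_normal((10, embed_dim))   # 10 timesteps
pooled = hidden_states.max(axis=0)                      # (embed_dim,)

print("attention weights:", np.round(weights, 3))
print("read vector shape:", read_vector.shape)
print("pooled state shape:", pooled.shape)
```

In a trained memory network the scoring, reading, and writing would be learned end to end; the fixed dot product and the ad hoc write rule here only illustrate the data flow.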


memory-network.png