Memory Networks

 
* [[Autoencoder (AE) / Encoder-Decoder]]

* [[Natural Language Processing (NLP)]]

* [[Feature Exploration/Learning]]
 
Attention networks are a kind of short-term memory that allocates attention over the input features they have recently seen. Attention mechanisms are also components of memory networks, which focus their attention on external memory storage rather than on a sequence of hidden states in a [[Recurrent Neural Networks (RNN) and Long Short-Term Memory (LSTM)]]. Memory networks differ only slightly: they work with external data storage, which makes them useful for tasks such as mapping input questions to answers stored in that external memory. The external storage acts as an embedding that the attention mechanism can alter, writing what the network learns into memory and reading from it to make a prediction. While the hidden states of a recurrent neural network form a sequence of embeddings, memory is an accumulation of those embeddings (imagine performing max pooling over all of your hidden states; the result would resemble memory). [http://skymind.ai/wiki/attention-mechanism-memory-network A Beginner's Guide to Attention Mechanisms and Memory Networks | Skymind]
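
To make the read/write mechanics concrete, here is a minimal NumPy sketch of one attention-based memory read, plus the max-pooling analogy from the paragraph above. All names, shapes, and the toy write rule are illustrative assumptions rather than a specific published architecture.

<syntaxhighlight lang="python">
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D vector of scores.
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Assumed toy sizes: 4 memory slots, embedding dimension 8.
num_slots, dim = 4, 8
rng = np.random.default_rng(0)

memory = rng.normal(size=(num_slots, dim))  # external memory: one embedding per stored item
query = rng.normal(size=dim)                # embedded input question

# Attention: score each memory slot against the query, then normalize.
scores = memory @ query        # dot-product similarity, shape (num_slots,)
weights = softmax(scores)      # attention distribution over the memory slots

# Read: the prediction input is the attention-weighted sum of memory embeddings.
read_vector = weights @ memory  # shape (dim,)

# Write (toy rule, an assumption for illustration): blend what was just
# read back into the most-attended slot, "altering" the stored embedding.
slot = int(np.argmax(weights))
memory[slot] = 0.9 * memory[slot] + 0.1 * read_vector

# The max-pooling analogy: collapsing a sequence of RNN hidden states
# into a single vector is a crude, fixed form of "memory".
hidden_states = rng.normal(size=(10, dim))  # stand-in for RNN hidden states
pooled_memory = hidden_states.max(axis=0)   # element-wise max pool, shape (dim,)
</syntaxhighlight>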
 

YouTube search... ...Google search



[[File:memory-network.png]]