Transformer

YouTube search... ...Google search

* [[Sequence to Sequence (Seq2Seq)]]
* [[Recurrent Neural Networks (RNN) and Long Short-Term Memory (LSTM)]]
* [[Autoencoders / Encoder-Decoders]]
* [[Natural Language Processing (NLP)]]
* [[Memory Networks]]

Attention mechanisms in neural networks are about memory access. That is the first thing to remember about attention: the name is something of a misnomer. [http://skymind.ai/wiki/attention-mechanism-memory-network A Beginner's Guide to Attention Mechanisms and Memory Networks | Skymind]

http://skymind.ai/images/wiki/attention_mechanism.png

<youtube>W2rWgXJBZhU</youtube>
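The memory-access view above can be made concrete with a small sketch: a query is compared against stored keys, the similarities are normalized into weights, and the output is a weighted blend of the stored values. This is a generic scaled dot-product attention illustration, not code from the Skymind article; the function names and toy memory below are hypothetical.

<pre>
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(query, keys, values):
    """Soft memory read: compare `query` to each row of `keys`,
    then return a weighted average of the corresponding `values`."""
    d = query.shape[-1]
    # Similarity of the query to every stored key (scaled dot product).
    scores = query @ keys.T / np.sqrt(d)
    # Normalize similarities into an addressing distribution over memory slots.
    weights = softmax(scores)
    # Read out: blend the stored values according to the weights.
    return weights @ values, weights

# Toy example (hypothetical data): a memory of 4 slots,
# each storing an 8-dimensional key and value.
rng = np.random.default_rng(0)
keys = rng.normal(size=(4, 8))
values = rng.normal(size=(4, 8))
query = keys[2] + 0.1 * rng.normal(size=8)  # a query resembling slot 2

out, w = attention(query, keys, values)
print(np.round(w, 3))  # the weights concentrate on slot 2
</pre>

Because the weights are a soft distribution rather than a hard index, every memory slot contributes a little to the output; this is why attention is better described as differentiable memory access than as "attending" to a single item.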