Sequence to Sequence (Seq2Seq)

 
* [[Natural Language Processing (NLP)]]
 
* [[Assistants]]
 
* [[Attention Mechanism/Model - Transformer Model]]
 
* [http://jalammar.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/ Visualizing A Neural Machine Translation Model (Mechanics of Seq2seq Models With Attention) | Jay Alammar]
 

Revision as of 19:56, 19 January 2019