Sequence to Sequence (Seq2Seq)
Revision as of 07:13, 30 April 2019
YouTube search... ...Google search
- Open Seq2Seq | NVIDIA
- Recurrent Neural Networks (RNN) and Long Short-Term Memory (LSTM)
- Autoencoder (AE) / Encoder-Decoder
- Attention Models
- Natural Language Processing (NLP)
- Assistants
- Attention Mechanism/Model - Transformer Model
- NLP Keras model in browser with TensorFlow.js
- seq2seq: the clown car of deep learning | Dev Nag - Medium
- Visualizing A Neural Machine Translation Model (Mechanics of Seq2seq Models With Attention) | Jay Alammar
Seq2Seq is a general-purpose encoder-decoder architecture that can be applied to machine translation, text summarization, conversational modeling, image captioning, interpreting dialects of software code, and more.
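The encoder-decoder mechanics can be sketched in a few lines: an encoder RNN compresses the source sequence into a fixed-size context vector, and a decoder RNN starts from that vector and greedily feeds each predicted token back in. This is a minimal illustrative sketch with random, untrained weights and a vanilla RNN cell (real systems use trained LSTM/GRU or Transformer layers); all sizes and names here are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

VOCAB, EMB, HID = 10, 8, 16  # toy vocabulary, embedding, and hidden sizes

# Shared embedding table and randomly initialised weights (untrained sketch).
E = rng.normal(0, 0.1, (VOCAB, EMB))
W_xh = rng.normal(0, 0.1, (EMB, HID))
W_hh = rng.normal(0, 0.1, (HID, HID))
W_hy = rng.normal(0, 0.1, (HID, VOCAB))

def rnn_step(x, h):
    """One vanilla-RNN step: new hidden state from input embedding and prior state."""
    return np.tanh(x @ W_xh + h @ W_hh)

def encode(src_tokens):
    """Run the encoder over the source; return the final hidden state (context vector)."""
    h = np.zeros(HID)
    for t in src_tokens:
        h = rnn_step(E[t], h)
    return h

def decode(context, start_token=0, max_len=5):
    """Greedy decoding: seed the decoder with the context, feed back each prediction."""
    h, tok, out = context, start_token, []
    for _ in range(max_len):
        h = rnn_step(E[tok], h)
        tok = int(np.argmax(h @ W_hy))  # highest-scoring next token
        out.append(tok)
    return out

context = encode([3, 1, 4])
print(decode(context))  # arbitrary tokens here, since the weights are untrained
```

Note how the only bridge between encoder and decoder is the single context vector — the bottleneck that attention mechanisms (see the Jay Alammar link above) were introduced to relieve.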