NLP Keras model in browser with TensorFlow.js
Revision as of 06:07, 30 April 2019
How to use your Keras model in the browser with TensorFlow.js
- NLP Keras model in browser with TensorFlow.js | Mikhail Salnikov - Towards Data Science
- The Unreasonable Effectiveness of Recurrent Neural Networks | Andrej Karpathy
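The referenced article's pipeline is not reproduced here, but one step every browser deployment needs is shipping the tokenizer vocabulary alongside the converted model files, since the JavaScript side must map words to the same integer ids the Keras model was trained on. A minimal sketch, assuming a hypothetical word-index vocabulary (the real one would come from a Keras `Tokenizer` fitted on the training corpus):

```python
import json

# Hypothetical vocabulary; in a real pipeline this comes from the
# Keras Tokenizer fitted on the training corpus.
word_index = {"<PAD>": 0, "<UNK>": 1, "mr.": 2, "johnson": 3,
              "likes": 4, "moscow": 5}

def encode(tokens, word_index, maxlen=8):
    """Map tokens to integer ids and pad to a fixed length,
    mirroring what the browser-side JavaScript must do before
    feeding a tensor to the TensorFlow.js model."""
    ids = [word_index.get(t.lower(), word_index["<UNK>"]) for t in tokens]
    ids = ids[:maxlen]
    return ids + [word_index["<PAD>"]] * (maxlen - len(ids))

# Serialize the vocabulary so the web page can fetch it next to
# the converted model artifacts (model.json + weight shards).
vocab_json = json.dumps(word_index)

print(encode(["Mr.", "Johnson", "likes", "Moscow"], word_index))
```

The browser then fetches this JSON and repeats the same lowercase-lookup-and-pad logic in JavaScript before calling `model.predict`.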
- Tools:
- Datasets:
- Language-Independent Named Entity Recognition (II) (CoNLL-2003): http://www.clips.uantwerpen.be/conll2003/ner/
- Sequence to Sequence (Seq2Seq) Model:
- Natural Language Processing (NLP):
- Named Entity Recognition (NER)
- Word Embedding
- Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), and Recurrent Neural Network (RNN)
- Attention Mechanism/Model - Transformer Model
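Of the topics listed above, word embedding is the first layer of a typical NER model: each integer token id is replaced by a dense vector from a trainable lookup table. A toy pure-Python sketch of that lookup (the matrix values here are random placeholders, not trained weights; in the Keras model this is the `Embedding` layer, and TensorFlow.js performs the same lookup in the browser):

```python
import random

random.seed(0)

EMBED_DIM = 4
vocab = ["<PAD>", "<UNK>", "mr.", "johnson", "likes", "moscow"]

# Toy embedding matrix: one random vector per vocabulary entry.
# In the real model this is the trainable Embedding layer weight,
# learned during training rather than sampled at random.
embeddings = [[random.uniform(-1, 1) for _ in range(EMBED_DIM)]
              for _ in vocab]

def embed(token_ids):
    """Replace each integer id with its embedding vector --
    the same table lookup an Embedding layer performs."""
    return [embeddings[i] for i in token_ids]

vectors = embed([2, 3, 5])   # "mr.", "johnson", "moscow"
print(len(vectors), len(vectors[0]))
```

The resulting sequence of vectors is what the downstream LSTM/GRU (or attention) layers listed above consume.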
SOTA = State of the Art
Example of BIO tagging for NER (B- marks the first token of an entity, I- a continuation of it, and O a token outside any entity):
Mr. [B-PER] Johnson [I-PER] likes Moscow [B-LOC]!
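Decoding a model's BIO tag sequence back into entity spans is a simple scan: start a span on B-, extend it on a matching I-, and close it on O or on the next B-. A minimal sketch over the example sentence above:

```python
def extract_entities(tokens, tags):
    """Collect (entity_text, entity_type) spans from BIO tags:
    B-X starts a new entity of type X, I-X continues the current
    one, and O marks tokens outside any entity."""
    entities, current, current_type = [], [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:                      # close the previous span
                entities.append((" ".join(current), current_type))
            current, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current:
            current.append(token)            # extend the open span
        else:                                # "O" or a stray I- tag
            if current:
                entities.append((" ".join(current), current_type))
            current, current_type = [], None
    if current:                              # flush a span at end of sentence
        entities.append((" ".join(current), current_type))
    return entities

tokens = ["Mr.", "Johnson", "likes", "Moscow", "!"]
tags   = ["B-PER", "I-PER", "O", "B-LOC", "O"]
print(extract_entities(tokens, tags))
# → [('Mr. Johnson', 'PER'), ('Moscow', 'LOC')]
```

The same scan would run in JavaScript in the browser over the argmax of the TensorFlow.js model's per-token predictions.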