Class
Revision as of 08:31, 22 October 2018
https://courses.nvidia.com/dashboard
Linguistic Concepts
- coreference - anaphora
- gang of four design patterns
- null subject
- recursion
Word Embeddings
- HMMs, CRFs, PGMs
- CBoW - Bag of Words / n-grams - one feature per word or n-item sequence
- 1-hot sparse input - create a vector the size of the entire vocabulary
- Stop Words
- TF-IDF
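The bag-of-words, 1-hot, and TF-IDF ideas above can be sketched in a few lines of plain Python. The toy corpus and the names (`docs`, `one_hot`, `bow`, `tf_idf`) are illustrative assumptions, not from the course:

```python
import math
from collections import Counter

# Toy corpus (assumption for illustration only).
docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs are pets",
]

# Vocabulary: one feature (dimension) per word.
vocab = sorted({w for d in docs for w in d.split()})
index = {w: i for i, w in enumerate(vocab)}

def one_hot(word):
    """Sparse 1-hot vector the size of the entire vocabulary."""
    v = [0] * len(vocab)
    v[index[word]] = 1
    return v

def bow(doc):
    """Bag-of-words count vector for a document."""
    counts = Counter(doc.split())
    return [counts[w] for w in vocab]

def tf_idf(doc):
    """Term frequency scaled by inverse document frequency:
    words common in this doc but rare across the corpus score high."""
    tokens = doc.split()
    counts = Counter(tokens)
    n = len(docs)
    vec = []
    for w in vocab:
        tf = counts[w] / len(tokens)
        df = sum(1 for d in docs if w in d.split())
        idf = math.log(n / df) if df else 0.0
        vec.append(tf * idf)
    return vec
```

In practice a library such as scikit-learn's `TfidfVectorizer` would replace this by hand-rolled version; the sketch just makes the per-word feature dimensions explicit.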
Word2Vec
Skip-Gram
- Firth 1957 Distributional Hypothesis
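Skip-Gram puts the Distributional Hypothesis to work by training a model to predict each context word from a center word. A minimal sketch of how the (center, context) training pairs are generated (function name and window size are assumptions; the actual Word2Vec training step is omitted):

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs as in Word2Vec
    skip-gram: each word predicts every word within the window."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # skip the center word itself
                pairs.append((center, tokens[j]))
    return pairs
```

For example, `skipgram_pairs("the cat sat".split(), window=1)` yields the pairs ("the", "cat"), ("cat", "the"), ("cat", "sat"), ("sat", "cat").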