- Natural Language Processing (NLP)
- Global Vectors for Word Representation (GloVe)
- Bag-of-Words (BoW)
- Continuous Bag-of-Words (CBoW)
- A Beginner's Guide to Word2Vec and Neural Word Embeddings | Chris Nicholson - Pathmind A.I. Wiki
- Introduction to Word Embedding and Word2Vec | Dhruvil Karani - Towards Data Science - Medium
- Distributed Representations of Words and Phrases and their Compositionality | Tomas Mikolov - Google
Word2Vec is a shallow, two-layer neural network trained to reconstruct the linguistic contexts of words. It takes a large corpus of text as input and produces a vector space, typically of several hundred dimensions, in which each unique word in the corpus is assigned a corresponding vector.
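A minimal numpy sketch of this idea using the skip-gram variant (the counterpart to CBoW above): a tiny toy corpus, a full softmax over the vocabulary instead of the negative sampling used in practice, and illustrative variable names throughout. It is a sketch of the training loop, not a production implementation.

```python
import numpy as np

# Toy corpus; real word2vec trains on a corpus of millions of words.
corpus = "the quick brown fox jumps over the lazy dog the fox ran".split()
vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}
V, D, window, lr = len(vocab), 8, 2, 0.05  # D would be ~100-300 in practice

rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # input (embedding) vectors
W_out = rng.normal(scale=0.1, size=(V, D))  # output (context) vectors

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Skip-gram: predict each context word from the center word.
for epoch in range(50):
    for pos, word in enumerate(corpus):
        c = idx[word]
        for off in range(-window, window + 1):
            j = pos + off
            if off == 0 or j < 0 or j >= len(corpus):
                continue
            t = idx[corpus[j]]
            h = W_in[c]                     # hidden layer = embedding lookup
            p = softmax(W_out @ h)          # predicted context distribution
            err = p.copy(); err[t] -= 1.0   # gradient of cross-entropy loss
            grad_in = W_out.T @ err
            W_out -= lr * np.outer(err, h)  # update output vectors
            W_in[c] -= lr * grad_in         # update center word's embedding

vec = W_in[idx["fox"]]  # the learned vector for "fox"
print(vec.shape)        # (8,)
```

After training, each row of `W_in` is the learned embedding for one vocabulary word; similar-context words end up with similar vectors.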