Word2Vec
- Large Language Model (LLM) ... Natural Language Processing (NLP) ... Generation ... Classification ... Understanding ... Translation ... Tools & Services
- Doc2Vec
- Node2Vec
- Skip-Gram
- Global Vectors for Word Representation (GloVe)
- Bag-of-Words (BoW)
- Continuous Bag-of-Words (CBoW)
- Similarity
- TensorFlow
- A Beginner's Guide to Word2Vec and Neural Word Embeddings | Chris Nicholson - A.I. Wiki, Pathmind
- Introduction to Word Embedding and Word2Vec | Dhruvil Karani - Towards Data Science - Medium
- Distributed Representations of Words and Phrases and their Compositionality | Tomas Mikolov - Google
Word2Vec is a shallow, two-layer neural network that is trained to reconstruct the linguistic contexts of words. It takes a large corpus of text as input and produces a vector space, typically of several hundred dimensions, with each unique word in the corpus assigned a corresponding vector in that space. Words that share common contexts in the corpus end up with vectors positioned close to one another, which is why simple distance and similarity measures on the vectors capture semantic relatedness.
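As a concrete illustration, here is a minimal sketch of training word vectors using the gensim library; gensim and the toy corpus are assumptions for the example, not something the article prescribes. It trains a skip-gram model (the CBoW variant is one flag away) and then looks up a word's vector and its nearest neighbors.

```python
# Minimal sketch: training skip-gram word vectors with gensim
# (gensim and this toy corpus are illustrative assumptions).
from gensim.models import Word2Vec

# A toy corpus: in practice the input is a large corpus of tokenized sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "word", "is", "known", "by", "the", "company", "it", "keeps"],
]

model = Word2Vec(
    sentences,
    vector_size=100,  # dimensionality of the learned vector space
    window=5,         # context window: words this many positions away count as context
    min_count=1,      # keep every word in this toy corpus
    sg=1,             # 1 = skip-gram; 0 = Continuous Bag-of-Words (CBoW)
)

vector = model.wv["king"]             # the 100-dimensional vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbors by cosine similarity
```

With a real corpus of millions of sentences, the most_similar query is where the familiar Word2Vec behavior shows up: words used in similar contexts rank as each other's nearest neighbors.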