# Node2Vec

Resources found via YouTube and Google searches:

- Word2Vec
- Doc2Vec
- node2vec: Scalable Feature Learning for Networks | Aditya Grover & Jure Leskovec
- node2vec-example | Github

From "Think your Data Different" by Yoel Zeldes:

> We can apply a reduction from the graphical structure of our data to a linear structure such that the information encoded in the graphical structure isn't lost. Doing so, we'll be able to use good old Word2Vec. The key point is to perform random walks in the graph. Each walk starts at a random node and performs a series of steps, where each step goes to a random neighbor. Each random walk forms a sentence that can be fed into Word2Vec.
>
> Using the right data structure to represent your data is important. Each data structure implies a different learning algorithm, or in other words, introduces a different inductive bias. Identifying that your data has a certain structure, so you can use the right tool for the job, can be challenging. Since so many real-world datasets are naturally represented as graphs, we think Graph Neural Networks are a must-have in our toolbox as data scientists.
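The random-walk idea above can be sketched with plain Python. This is a minimal, uniform-random-walk version (node2vec proper biases each step with return/in-out parameters p and q from the Grover & Leskovec paper); the graph, node labels, and parameter names here are illustrative assumptions, not code from the cited repositories:

```python
import random

def random_walks(graph, num_walks=2, walk_length=4, seed=0):
    """Generate uniform random walks over a graph given as an adjacency dict.

    Each walk starts at a node and repeatedly steps to a uniformly random
    neighbor. The resulting node sequences play the role of "sentences"
    that could then be fed to a Word2Vec implementation such as gensim's.
    """
    rng = random.Random(seed)
    walks = []
    for _ in range(num_walks):
        for start in graph:
            walk = [start]
            while len(walk) < walk_length:
                neighbors = graph[walk[-1]]
                if not neighbors:          # dead end: stop this walk early
                    break
                walk.append(rng.choice(neighbors))
            # Word2Vec expects string tokens, so stringify the node ids.
            walks.append([str(n) for n in walk])
    return walks

# A tiny toy graph: two triangles joined by an edge between nodes 2 and 3.
graph = {
    0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
    3: [2, 4, 5], 4: [3, 5], 5: [3, 4],
}
walks = random_walks(graph)
# `walks` is now a list of node-token "sentences"; it could be passed as
# the sentences argument to gensim.models.Word2Vec to learn embeddings.
```

Each consecutive pair of tokens in a walk is an edge of the graph, which is what lets Word2Vec's sliding window treat graph neighborhoods like word contexts.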