[CS224W]03.Node Embeddings
- YouTube, Lecture Notes
- Embeddings → remove the need for manual feature engineering and can be reused for many downstream tasks.
- Encoder: maps nodes into the embedding space so that nodes that are similar in the graph end up close together (a GNN is a deep encoder).
- Decoder (node similarity function): maps a pair of embeddings to a similarity score (usually the dot product).
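A minimal sketch of this encoder/decoder pair, assuming a shallow encoder (a plain embedding lookup table `Z`) and the dot product as the decoder; the sizes and random initialization below are toy values, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)
num_nodes, dim = 5, 8

# Shallow encoder: just an embedding lookup table, one row per node.
Z = rng.normal(size=(num_nodes, dim))

def encode(v):
    """ENC(v): look up the embedding of node v."""
    return Z[v]

def decode(z_u, z_v):
    """DEC(z_u, z_v): dot product as the similarity score."""
    return float(z_u @ z_v)

# Similarity of nodes 0 and 1 in the embedding space.
print(decode(encode(0), encode(1)))
```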
- Random Walk Approaches for Node Embeddings
- The dot product of two node embeddings is trained to match the probability that the two nodes co-occur on a random walk over the graph; random-walk co-occurrence is the notion of similarity being approximated.
- The softmax denominator of the loss sums over all nodes, which is computationally expensive, so it is approximated using only a small sample of negative nodes (negative sampling).
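A sketch of both points, assuming a toy adjacency dict (`adj: node -> neighbor list`) and negatives chosen by the caller; in practice negatives are typically sampled with probability related to node degree rather than picked by hand:

```python
import random
import numpy as np

def random_walk(adj, start, length):
    """Uniform random walk of `length` nodes starting at `start`
    (assumes every node has at least one neighbor)."""
    walk = [start]
    for _ in range(length - 1):
        walk.append(random.choice(adj[walk[-1]]))
    return walk

def neg_sampling_loss(Z, u, v, neg_nodes):
    """-log sigma(z_u . z_v) - sum_n log sigma(-z_u . z_n):
    the softmax denominator is replaced by the sampled negatives."""
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    loss = -np.log(sigmoid(Z[u] @ Z[v]))
    for n in neg_nodes:
        loss -= np.log(sigmoid(-(Z[u] @ Z[n])))
    return float(loss)

adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
Z = np.random.default_rng(0).normal(size=(4, 8))
walk = random_walk(adj, start=0, length=5)
# Each pair of nodes co-occurring within the walk window contributes one loss term.
print(neg_sampling_loss(Z, u=walk[0], v=walk[1], neg_nodes=[3]))
```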
- Node2Vec: keeps the random-walk objective but adds a biased, second-order walk strategy that interpolates between BFS-like (local) and DFS-like (global) exploration (biased walk); see the sketch below.
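A sketch of the biased second-order walk with the standard Node2Vec weighting (return parameter `p`, in-out parameter `q`); the toy graph is illustrative, not from the lecture:

```python
import random

def node2vec_step(adj, prev, cur, p, q):
    """One biased step from `cur`, given the previous node `prev`:
    weight 1/p to return to prev, 1 to a common neighbor of prev and
    cur (BFS-like, stays local), 1/q to move farther away (DFS-like)."""
    nbrs = adj[cur]
    weights = []
    for x in nbrs:
        if x == prev:
            weights.append(1.0 / p)
        elif x in adj[prev]:
            weights.append(1.0)
        else:
            weights.append(1.0 / q)
    return random.choices(nbrs, weights=weights, k=1)[0]

def node2vec_walk(adj, start, length, p=1.0, q=1.0):
    walk = [start, random.choice(adj[start])]  # the first step is uniform
    while len(walk) < length:
        walk.append(node2vec_step(adj, walk[-2], walk[-1], p, q))
    return walk

adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
print(node2vec_walk(adj, start=0, length=6, p=0.5, q=2.0))
```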
- Graph embeddings (embed an entire graph, e.g., to classify graphs)
- 1) Sum (or average) the embeddings of all nodes in the graph, where the node embeddings come from random walks or Node2Vec (see the pooling sketch after this list).
- 2) Virtual node: add a virtual node connected to the (sub)graph and use its learned node embedding as the graph embedding.
- 3) Anonymous walks: replace the node IDs in a walk with the order of first appearance, so walks become identity-free patterns.
- Sampling anonymous walks: sample walks and represent the graph as the empirical probability distribution over anonymous walk types (see the sketch after this list).
- Learn walk embeddings: learn an embedding z_i for each anonymous walk together with a graph embedding z_G, trained to predict the next walk in a sampled sequence from a window of the Δ previous walks (Δ is the fixed context-window size).
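A minimal sketch of approach 1), assuming `Z` holds node embeddings already trained with random walks or Node2Vec (the random `Z` here is just a stand-in):

```python
import numpy as np

def graph_embedding(Z, node_ids, mode="mean"):
    """Pool the embeddings of the graph's nodes into a single vector."""
    pooled = Z[node_ids]
    return pooled.mean(axis=0) if mode == "mean" else pooled.sum(axis=0)

Z = np.random.default_rng(0).normal(size=(10, 8))  # pretrained node embeddings (toy)
z_G = graph_embedding(Z, node_ids=[0, 1, 2, 3], mode="mean")
```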
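And a sketch of the sampling variant of 3): anonymize sampled walks and use the empirical distribution over walk types as the graph's representation. The learned-embedding variant with the Δ-sized window is omitted; the function names and the toy graph are mine, not from the lecture:

```python
import random
from collections import Counter

def anonymize(walk):
    """Map node IDs to first-appearance order: [5, 2, 5, 8] -> (0, 1, 0, 2)."""
    first_seen = {}
    for node in walk:
        if node not in first_seen:
            first_seen[node] = len(first_seen)
    return tuple(first_seen[node] for node in walk)

def anonymous_walk_distribution(adj, walk_len, num_samples):
    """Empirical distribution over anonymous walk types, used as the
    graph's feature vector (assumes every node has a neighbor)."""
    counts = Counter()
    nodes = list(adj)
    for _ in range(num_samples):
        walk = [random.choice(nodes)]
        for _ in range(walk_len - 1):
            walk.append(random.choice(adj[walk[-1]]))
        counts[anonymize(walk)] += 1
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
print(anonymous_walk_distribution(adj, walk_len=3, num_samples=1000))
```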