DL-DIY potential project ideas

Run the code on the SBM datasets for community detection. Create new datasets to compare performance for various parameters, test generalization, etc.
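The SBM datasets shipped with benchmarking-gnns are pre-generated, but new stochastic block model graphs are easy to synthesize for this kind of experiment. Below is a minimal sketch using networkx; the function name and parameters are illustrative and do not reproduce the repo's SBM data pipeline. Varying `p_in`, `p_out`, or the block sizes gives datasets of controllable difficulty for testing generalization.

```python
# Minimal sketch: sample a stochastic block model graph for community
# detection experiments. Parameters are illustrative only; they do not
# reproduce the SBM datasets used in the repository.
import networkx as nx
import numpy as np

def make_sbm_graph(block_sizes=(20, 20, 20), p_in=0.5, p_out=0.05, seed=0):
    """Sample an SBM graph; return its dense adjacency matrix and block labels."""
    sizes = list(block_sizes)
    k = len(sizes)
    # Edge-probability matrix: dense inside a block, sparse across blocks.
    probs = [[p_in if i == j else p_out for j in range(k)] for i in range(k)]
    g = nx.stochastic_block_model(sizes, probs, seed=seed)
    adj = nx.to_numpy_array(g)
    labels = np.concatenate([[i] * n for i, n in enumerate(sizes)])
    return adj, labels

if __name__ == "__main__":
    adj, labels = make_sbm_graph()
    print(adj.shape, labels.shape)  # (60, 60) (60,)
```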

Graph Transformer Architecture

Source code for the paper "A Generalization of Transformer Networks to Graphs" by Vijay Prakash Dwivedi and Xavier Bresson, at AAAI'21 Workshop on Deep Learning on Graphs: Methods and Applications (DLG-AAAI'21).

We propose a generalization of the transformer neural network architecture to arbitrary graphs: the Graph Transformer.
Compared to the standard Transformer, the highlights of the presented architecture are (a minimal code sketch follows the figure below):

  • The attention mechanism is a function of neighborhood connectivity for each node in the graph.
  • The position encoding is represented by Laplacian eigenvectors, which naturally generalize the sinusoidal positional encodings often used in NLP.
  • The layer normalization is replaced by a batch normalization layer.
  • The architecture is extended with edge representations, which can be critical for tasks with rich information on the edges or pairwise interactions (such as bond types in molecules or relationship types in knowledge graphs).

Figure: Block diagram of the Graph Transformer architecture
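To make the highlights above concrete, here is a minimal, self-contained sketch of two of the central ingredients: Laplacian eigenvector positional encodings and dot-product attention restricted to a node's neighbours, followed by batch normalization. It is an illustration under simplified assumptions (one graph, dense adjacency, a single head, no edge features, positional encodings simply concatenated to the node features), not the repository's implementation.

```python
# Minimal sketch (not the repo's code): Laplacian positional encodings and
# neighbourhood-masked attention with batch normalization, for one graph
# given as a dense adjacency matrix.
import numpy as np
import torch
import torch.nn as nn

def laplacian_pos_enc(adj: np.ndarray, k: int) -> torch.Tensor:
    """Return k eigenvectors of the symmetric normalized Laplacian
    (smallest non-trivial eigenvalues) as node positional encodings."""
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    lap = np.eye(adj.shape[0]) - d_inv_sqrt @ adj @ d_inv_sqrt
    eigval, eigvec = np.linalg.eigh(lap)                  # ascending eigenvalues
    return torch.from_numpy(eigvec[:, 1:k + 1]).float()   # drop the trivial eigenvector

class MaskedGraphAttentionLayer(nn.Module):
    """Single-head dot-product attention over graph neighbours, then BatchNorm."""
    def __init__(self, dim: int):
        super().__init__()
        self.q, self.k, self.v = (nn.Linear(dim, dim) for _ in range(3))
        self.norm = nn.BatchNorm1d(dim)                   # batch norm instead of layer norm

    def forward(self, h: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        scores = self.q(h) @ self.k(h).T / h.shape[-1] ** 0.5
        # Attention is a function of connectivity: non-neighbours are masked out.
        scores = scores.masked_fill(adj == 0, float("-inf"))
        out = torch.softmax(scores, dim=-1) @ self.v(h)
        return self.norm(h + out)                         # residual + BatchNorm

if __name__ == "__main__":
    n, dim, k = 60, 16, 8
    adj_np = (np.random.rand(n, n) < 0.1).astype(float)
    adj_np = np.maximum(adj_np, adj_np.T)                 # symmetrize
    np.fill_diagonal(adj_np, 1.0)                         # self-loops: every row attends somewhere
    pe = laplacian_pos_enc(adj_np, k)                     # (n, k) positional encodings
    h = torch.cat([torch.randn(n, dim - k), pe], dim=-1)  # attach PE to node features
    layer = MaskedGraphAttentionLayer(dim)
    print(layer(h, torch.from_numpy(adj_np)).shape)       # torch.Size([60, 16])
```

In the full model, the paper stacks several such attention layers with feed-forward sub-layers and residual connections, and carries an analogous channel for edge representations.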

1. Repo installation

This project is based on the benchmarking-gnns repository.

Follow these instructions to install the benchmark and set up the environment.


2. Download datasets

Proceed as follows to download the datasets used to evaluate Graph Transformer.


3. Reproducibility

Use this page to run the code and reproduce the published results.


4. Reference

📃 Paper on arXiv
📝 Blog on Towards Data Science
🎥 Video on YouTube

@article{dwivedi2021generalization,
  title={A Generalization of Transformer Networks to Graphs},
  author={Dwivedi, Vijay Prakash and Bresson, Xavier},
  journal={AAAI Workshop on Deep Learning on Graphs: Methods and Applications},
  year={2021}
}



