This repository is a PyTorch implementation of Graph Convolutional Networks (GCN) and Graph Attention Networks (GAT) for extracting spatial information from time-series traffic flow data (PeMS04, METR_LA).
Go to `/config/` and edit the corresponding `.conf` file as needed:
- `nums_of_nodes`: number of nodes in the road network
- `len`: length of the time window used for prediction
- `time_interval`: time period of each time window
- `batch_size`: number of time slots per batch
- `learning_rate`: learning rate of the optimizer
- `epochs`: number of training epochs
- `hidden_dim`: hidden dimensions of the middle layers
- `training_days`: length of the training data (days)
- `validation_days`: length of the validation data (days)
- `test_days`: length of the test data (days)
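As a quick illustration, the snippet below reads these keys with Python's standard `configparser`. The file name `pems04.conf` and the section name `Data` are assumptions made for the example; check the actual files under `/config/` for the real layout.

```python
# Hedged sketch: reading a .conf file with Python's standard configparser.
# The file name "config/pems04.conf" and the section name "Data" are
# illustrative assumptions -- the real files under /config/ may differ.
import configparser

config = configparser.ConfigParser()
config.read("config/pems04.conf")

section = config["Data"]                      # hypothetical section name
num_nodes = section.getint("nums_of_nodes")   # nodes in the road network
window_len = section.getint("len")            # prediction window length
batch_size = section.getint("batch_size")
learning_rate = section.getfloat("learning_rate")
epochs = section.getint("epochs")
hidden_dim = section.getint("hidden_dim")
print(num_nodes, window_len, batch_size, learning_rate, epochs, hidden_dim)
```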
Then run the demo:

```bash
cd module
python trainer.py
```
You can choose between the following models from `baseline.py`:

- `GCN3`: three-layer GCN
- `GCN2`: two-layer GCN
- `bGCN`: graph convolutional network (Thomas N. Kipf, Max Welling, Semi-Supervised Classification with Graph Convolutional Networks, 2016)
- `ChebConv`: one-layer Chebyshev polynomial version of the graph convolutional network, as described in (Michaël Defferrard, Xavier Bresson, Pierre Vandergheynst, Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering, NIPS 2016)
- `ChebNet`: two-layer Chebyshev polynomial version of the graph convolutional network
- `GraphAttentionLayer`: one-layer graph attention layer with a self-attention mechanism
- `GATSubNet`: one-layer graph attention layer with a multi-head attention mechanism (the time-series input is flattened into a one-dimensional feature vector per node, i.e. [B, N, T * C])
- `GATNet2`: one-layer graph attention layer with a multi-head attention mechanism (the attention is applied to T separate [B, N, C] tensors, so the input should be [B, N, C])
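To make the GCN variants concrete, here is a minimal sketch of a single Kipf & Welling graph convolution layer operating on a traffic tensor of shape [B, N, C]. This is not the exact code in `baseline.py`; the class name `SimpleGCNLayer` and its constructor arguments are illustrative only.

```python
# Minimal sketch (not the repository's implementation) of one graph convolution
# layer following the Kipf & Welling propagation rule:
#   H' = D^-1/2 (A + I) D^-1/2 H W
import torch
import torch.nn as nn


class SimpleGCNLayer(nn.Module):
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)  # the weight matrix W

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x:   [B, N, C]  node features (e.g. flow readings in one time window)
        # adj: [N, N]     adjacency matrix of the road network
        a_hat = adj + torch.eye(adj.size(0), device=adj.device)    # add self-loops
        deg_inv_sqrt = a_hat.sum(dim=1).clamp(min=1e-6).pow(-0.5)  # D^-1/2
        a_norm = deg_inv_sqrt.unsqueeze(1) * a_hat * deg_inv_sqrt.unsqueeze(0)
        return torch.relu(a_norm @ self.linear(x))                 # [B, N, out_dim]
```

`GCN2` and `GCN3` stack two or three layers of this kind, while the Chebyshev variants (`ChebConv`, `ChebNet`) replace the fixed normalized adjacency with Chebyshev polynomial filters of the graph Laplacian.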