# GAT
Graph Attention Networks (Velickovic *et al.*, ICLR 2018): [https://arxiv.org/abs/1710.10903](https://arxiv.org/abs/1710.10903)
GAT layer | t-SNE + Attention coefficients on Cora
:-------------------------:|:-------------------------:
![](http://www.cl.cam.ac.uk/~pv273/images/gat.jpg) | ![](http://www.cl.cam.ac.uk/~pv273/images/gat_tsne.jpg)
## Overview
Here we provide the implementation of a Graph Attention Network (GAT) layer in TensorFlow, along with a minimal execution example (on the Cora dataset). The repository is organised as follows:
- `data/` contains the necessary dataset files for Cora;
- `models/` contains the implementation of the GAT network (`gat.py`);
- `pre_trained/` contains a pre-trained Cora model (achieving 84.4% accuracy on the test set);
- `utils/` contains:
* an implementation of an attention head, along with an experimental sparse version (`layers.py`);
* preprocessing subroutines (`process.py`);
* preprocessing utilities for the PPI benchmark (`process_ppi.py`).
Finally, `execute_cora.py` puts all of the above together and may be used to execute a full training run on Cora.
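The core of the layer in `utils/layers.py` is a single attention head: node features are linearly transformed, pairwise attention logits are computed, masked to the graph's edges, and normalised with a softmax. A minimal NumPy sketch of that computation (not the repository's TensorFlow code; function and variable names here are illustrative):

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))  # shift for stability
    return e / e.sum(axis=axis, keepdims=True)

def gat_head(h, adj, W, a):
    """One dense GAT attention head.

    h:   (N, F)   node features
    adj: (N, N)   binary adjacency matrix, self-loops included
    W:   (F, F')  shared linear transform
    a:   (2*F',)  attention vector, split into source/target halves
    """
    Wh = h @ W                                       # (N, F')
    f_src = Wh @ a[:Wh.shape[1]]                     # source term, (N,)
    f_dst = Wh @ a[Wh.shape[1]:]                     # target term, (N,)
    e = leaky_relu(f_src[:, None] + f_dst[None, :])  # logits e_ij, (N, N)
    e = np.where(adj > 0, e, -1e9)                   # mask non-neighbours
    alpha = softmax(e, axis=1)                       # coefficients sum to 1 per row
    return alpha @ Wh                                # aggregated features, (N, F')
```

The actual implementation stacks several such heads (concatenating or averaging their outputs) and adds dropout and a nonlinearity, as described in the paper.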
## Sparse version
An experimental sparse version is also available; it currently works only with a batch size of 1.
The sparse model may be found at `models/sp_gat.py`.
You may execute a full training run of the sparse model on Cora through `execute_cora_sparse.py`.
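The point of the sparse variant is to avoid materialising the dense N x N logit matrix: attention is computed per edge and the softmax is normalised row-wise over existing edges only. A hedged SciPy sketch of the idea (again illustrative, not the repository's TensorFlow code):

```python
import numpy as np
import scipy.sparse as sp

def sp_gat_head(h, adj, W, a, slope=0.2):
    """Sparse GAT head: one attention logit per edge, not per node pair.

    adj: scipy.sparse adjacency matrix, self-loops included.
    """
    Wh = h @ W                                     # (N, F')
    f_prime = Wh.shape[1]
    f_src = Wh @ a[:f_prime]                       # per-node source term
    f_dst = Wh @ a[f_prime:]                       # per-node target term
    coo = adj.tocoo()
    logits = f_src[coo.row] + f_dst[coo.col]       # one logit per edge
    logits = np.where(logits > 0, logits, slope * logits)  # LeakyReLU
    exp = np.exp(logits - logits.max())            # shared shift; row softmax unchanged
    att = sp.coo_matrix((exp, (coo.row, coo.col)), shape=adj.shape)
    row_sum = np.asarray(att.sum(axis=1)).ravel()  # per-row normaliser
    att = sp.coo_matrix((exp / row_sum[coo.row], (coo.row, coo.col)),
                        shape=adj.shape)
    return att @ Wh                                # aggregated features, (N, F')
```

Memory now scales with the number of edges rather than N squared, which is what makes large graphs tractable at the cost of the batch-size-1 restriction.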
## Dependencies
The script has been tested running under Python 3.5.2, with the following packages installed (along with their dependencies):
- `numpy==1.14.1`
- `scipy==1.0.0`
- `networkx==2.1`
- `tensorflow-gpu==1.6.0`
In addition, CUDA 9.0 and cuDNN 7 have been used.
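For reproducibility, the pins above can be captured in a `requirements.txt` (mirroring the tested versions; on a CPU-only machine, `tensorflow-gpu` would presumably be replaced with `tensorflow`):

```
numpy==1.14.1
scipy==1.0.0
networkx==2.1
tensorflow-gpu==1.6.0
```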
## Reference
If you make use of the GAT model in your research, please cite the following paper:
```
@article{velickovic2018graph,
  title={Graph Attention Networks},
  author={Veli{\v{c}}kovi{\'{c}}, Petar and Cucurull, Guillem and Casanova, Arantxa and Romero, Adriana and Li{\`{o}}, Pietro and Bengio, Yoshua},
  journal={International Conference on Learning Representations},
  year={2018},
  url={https://openreview.net/forum?id=rJXMpikCZ},
  note={accepted as poster},
}
```
You may also be interested in the following unofficial ports of the GAT model:
- \[Keras\] [keras-gat](https://github.com/danielegrattarola/keras-gat), currently under development by [Daniele Grattarola](https://github.com/danielegrattarola);
- \[PyTorch\] [pyGAT](https://github.com/Diego999/pyGAT), currently under development by [Diego Antognini](https://github.com/Diego999).
## License
MIT