TCN-with-attention-master

Category: Other
Development tool: Python
File size: 16KB
Downloads: 26
Upload date: 2020-06-03 14:51:33
Uploader: 对我而言可
Description: An attention-based prediction method using a TCN; the author finds the TCN a better predictor than an LSTM. Sample data is attached.

File list (size in bytes, date):
data (0, 2018-05-31)
dataset.py (1926, 2018-05-31)
kor_char_parser.py (2776, 2018-05-31)
main.py (6164, 2018-05-31)
results (0, 2018-05-31)
results\without_attention.txt (21556, 2018-05-31)
results\with_attention.txt (21586, 2018-05-31)
tcn.py (4598, 2018-05-31)

# TCN with attention

Temporal Convolutional Network with an attention layer. The model concept mostly follows [Simple Neural Attentive Meta-Learner (SNAIL)](https://github.com/sagelywizard/snail), but in this model an attention layer sits on top of every convolution layer, and the attention size differs from SNAIL's.

## Result

Dataset: [Agnews](https://github.com/mhjabreel/CharCNN/tree/master/data/ag_news_csv), without pre-processing

- with attention: 0.82
- without attention: 0.81

### My thoughts on the result

Most simple models on Agnews reach about 0.81 accuracy (as tested on [A Structured Self-Attentive Sentence Embedding](https://github.com/flrngel/Self-Attentive-tensorflow) and [TagSpace](https://github.com/flrngel/TagSpace-tensorflow), both of which use word-based embeddings). So 0.82 accuracy with a **character-based model** seems worthwhile.
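The repo's `tcn.py` is not reproduced here, but the core idea described above, causal convolutions with a self-attention layer placed on top of each convolution's output, can be sketched as follows. This is a minimal NumPy illustration under my own naming and shape conventions, not the repo's actual API:

```python
import numpy as np

def causal_conv1d(x, w, dilation=1):
    """Causal dilated 1-D convolution: the output at step t depends only on
    inputs at steps <= t (the left side is zero-padded).
    x: (T, C_in) sequence; w: (K, C_in, C_out) filter, w[-1] hits step t."""
    T, c_in = x.shape
    K, _, c_out = w.shape
    pad = (K - 1) * dilation
    xp = np.concatenate([np.zeros((pad, c_in)), x], axis=0)
    out = np.zeros((T, c_out))
    for t in range(T):
        for k in range(K):
            # w[K-1-k] is applied to the input k*dilation steps in the past
            out[t] += xp[t + pad - k * dilation] @ w[K - 1 - k]
    return out

def self_attention(x):
    """Plain scaled dot-product self-attention over the time axis,
    stacked on top of a convolution output as the README describes."""
    d = x.shape[1]
    scores = x @ x.T / np.sqrt(d)                       # (T, T) similarities
    scores -= scores.max(axis=1, keepdims=True)         # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)       # softmax per query step
    return weights @ x                                  # (T, d) mixed values

def tcn_attention_block(x, w, dilation=1):
    """One block: causal conv -> ReLU -> attention on top of the conv output."""
    h = np.maximum(causal_conv1d(x, w, dilation), 0.0)
    return self_attention(h)
```

Stacking several such blocks with growing dilations (1, 2, 4, ...) gives the exponentially large receptive field that is the usual argument for TCNs over LSTMs; the attention layer after each block then lets every step also look at the whole (past) sequence directly.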
