Transformer-based-pretrained-model-for-event-extraction
Category: Artificial Intelligence / Neural Networks / Deep Learning
Development tool: Python
File size: 26329KB
Downloads: 3
Upload date: 2020-02-29 03:29:16
Uploader: sh-1993
Description: Transformer-based-pretrained-model-for-event-extraction — event extraction on the ACE2005 dataset using a Transformer-based pretrained model.
File list:
DataLoadAndTrain.py (27868, 2020-02-29)
consts.py (5028, 2020-02-29)
data (0, 2020-02-29)
data\sample.json (1279, 2020-02-29)
migration_model (0, 2020-02-29)
migration_model\DMCNN (0, 2020-02-29)
migration_model\DMCNN\config.py (6499, 2020-02-29)
migration_model\DMCNN\dmcnn.py (3410, 2020-02-29)
migration_model\DMCNN\loader.py (6659, 2020-02-29)
migration_model\DMCNN\test.py (171, 2020-02-29)
migration_model\DMCNN\train.py (153, 2020-02-29)
migration_model\DataLoadAndTrain.py (19674, 2020-02-29)
migration_model\data_load.py (7212, 2020-02-29)
migration_model\enet (0, 2020-02-29)
migration_model\enet\__init__.py (0, 2020-02-29)
migration_model\enet\consts.py (209, 2020-02-29)
migration_model\enet\corpus (0, 2020-02-29)
migration_model\enet\corpus\Corpus.py (987, 2020-02-29)
migration_model\enet\corpus\Data.py (8690, 2020-02-29)
migration_model\enet\corpus\Sentence.py (5943, 2020-02-29)
migration_model\enet\corpus\__init__.py (0, 2020-02-29)
migration_model\enet\corpus\debug.py (279, 2020-02-29)
migration_model\enet\models (0, 2020-02-29)
migration_model\enet\models\DynamicLSTM.py (3979, 2020-02-29)
migration_model\enet\models\EmbeddingLayer.py (6000, 2020-02-29)
migration_model\enet\models\GCN.py (4704, 2020-02-29)
migration_model\enet\models\HighWay.py (1202, 2020-02-29)
migration_model\enet\models\SelfAttention.py (2699, 2020-02-29)
migration_model\enet\models\__init__.py (0, 2020-02-29)
migration_model\enet\models\ee.py (12231, 2020-02-29)
migration_model\enet\models\model.py (1246, 2020-02-29)
migration_model\enet\run (0, 2020-02-29)
migration_model\enet\run\__init__.py (0, 2020-02-29)
migration_model\enet\run\ee (0, 2020-02-29)
migration_model\enet\run\ee\__init__.py (0, 2020-02-29)
migration_model\enet\run\ee\out (0, 2020-02-29)
migration_model\enet\run\ee\out\dev_epoch_1.txt (177357, 2020-02-29)
migration_model\enet\run\ee\out\entity.vec (530, 2020-02-29)
... ...
# Transformer-based-pretrained-model-for-event-extraction
Event extraction on the ACE2005 dataset using pretrained language models such as BERT, OpenAI GPT-2, ALBERT, XLM, RoBERTa, XLNet, CTRL, DistilBERT, and Transformer-XL.
The code is based on the nlpcl-lab/bert-event-extraction framework; the original project's model-building part is replaced with the transformers package and a CRF model.
The model treats the whole task as sequence labeling and uses no auxiliary information: a CRF first identifies triggers, then, conditioned on the identified triggers, another CRF identifies arguments.
With xlm-roberta-large as the pretrained model, trigger F1 = 0.72 and argument F1 = 0.45, a 0.05 improvement on arguments.
#### Note: currently one CRF is used per event type for argument identification, which causes some data sparsity; this can be changed in consts.py to merge multiple event types into a single CRF.
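The note above can be made concrete: with one argument CRF per event type, each CRF has a small label space but sees only that type's (possibly rare) training examples, while merging all types into one CRF trades a larger label space for more data per model. A minimal sketch, where the event types and roles are illustrative ACE-style names (not read from consts.py):

```python
# Illustrative subsets; the real type/role inventories live in consts.py.
EVENT_TYPES = ["Attack", "Transport"]
ARG_ROLES = ["Attacker", "Target", "Artifact"]

# Stage 1: triggers share one BIO label space across all event types.
trigger_labels = ["O"] + [f"{p}-{t}" for t in EVENT_TYPES for p in ("B", "I")]

# Stage 2, per-event-type CRFs: each type gets its own small label space,
# so rare event types see very little training data (the sparsity problem).
per_type_labels = {t: ["O"] + [f"{p}-{r}" for r in ARG_ROLES for p in ("B", "I")]
                   for t in EVENT_TYPES}

# Merged alternative: a single CRF over all (event type, role) pairs.
merged_labels = ["O"] + [f"{p}-{t}.{r}" for t in EVENT_TYPES
                         for r in ARG_ROLES for p in ("B", "I")]
```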
#### trigger classification
P=0.677 R=0.754 F1=0.713
#### argument classification
P=0.588 R=0.384 F1=0.4***
#### trigger identification
P=0.723 R=0.805 F1=0.762
#### argument identification
P=0.617 R=0.403 F1=0.488
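For reference, the F1 scores above are the harmonic mean of precision and recall:

```python
# F1 as the harmonic mean of precision (p) and recall (r).
def f1(p, r):
    return 2 * p * r / (p + r)

print(round(f1(0.677, 0.754), 3))  # trigger classification  -> 0.713
print(round(f1(0.723, 0.805), 3))  # trigger identification  -> 0.762
print(round(f1(0.617, 0.403), 3))  # argument identification -> 0.488
```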
The hyperparameters are as follows.
#### ==================== Hyperparameters ====================
Available pretrained models:
PreTrainModel = ['Bert_large', 'Gpt', 'Gpt2', 'Ctrl', 'TransfoXL',
'Xlnet_base', 'Xlnet_large', 'XLM', 'DistilBert_base', 'DistilBert_large',
'Roberta_base', 'Roberta_large', 'XLMRoberta_base', 'XLMRoberta_large',
'ALBERT-base-v1', 'ALBERT-large-v1', 'ALBERT-xlarge-v1', 'ALBERT-xxlarge-v1',
'ALBERT-base-v2', 'ALBERT-large-v2', 'ALBERT-xlarge-v2', 'ALBERT-xxlarge-v2']
early_stop = 5
lr = 1e-05
l2 = 1e-05
n_epochs = 50
logdir = logdir
trainset = data/train.json
devset = data/dev.json
testset = data/test.json
LOSS_alpha = 1.0
PreTrain_Model = XLMRoberta_large
model_path = /Transformer-based-pretrained-model-for-event-extraction-master/save_model/latest_model.pt
batch_size = 16
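A minimal sketch of how these hyperparameters map to command-line flags, with the values above as defaults (the flag names follow the training command in the Run section below; the repository's actual argument parser may differ):

```python
import argparse

# Hypothetical parser mirroring the hyperparameter list above.
parser = argparse.ArgumentParser()
parser.add_argument("--lr", type=float, default=1e-5)
parser.add_argument("--l2", type=float, default=1e-5)
parser.add_argument("--n_epochs", type=int, default=50)
parser.add_argument("--early_stop", type=int, default=5)
parser.add_argument("--LOSS_alpha", type=float, default=1.0)
parser.add_argument("--PreTrain_Model", default="XLMRoberta_large")
parser.add_argument("--batch_size", type=int, default=16)
parser.add_argument("--logdir", default="logdir")
parser.add_argument("--trainset", default="data/train.json")
parser.add_argument("--devset", default="data/dev.json")
parser.add_argument("--testset", default="data/test.json")

# Empty argv -> all defaults, for illustration.
args = parser.parse_args([])
```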
### Run
1. Obtain the ACE2005 dataset (LDC2006T06) from the LDC website; it is available after purchase by companies and universities: https://catalog.ldc.upenn.edu/byyear#2005
2. Following https://github.com/nlpcl-lab/ace2005-preprocessing, convert the ACE2005 data into JSON-format train/dev/test splits and place them in the \data folder; the processed format should match data\sample.json
3. Install the dependencies
4. Train and evaluate: python DataLoadAndTrain.py --LOSS_alpha=1 --lr=1e-5 --l2=1e-5 --early_stop=5 --PreTrain_Model="XLMRoberta_large" --batch_size=16
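The preprocessed JSON from step 2 can be inspected before training. A sketch with a tiny inline example; the field names here are assumptions modeled on the ace2005-preprocessing output, so check data\sample.json for the exact schema:

```python
import json

# Tiny inline example approximating the preprocessed format (hypothetical fields).
SAMPLE = """[{"words": ["He", "was", "shot"],
  "golden-event-mentions": [{"event_type": "Conflict:Attack",
    "trigger": {"text": "shot", "start": 2, "end": 3},
    "arguments": [{"role": "Target", "text": "He", "start": 0, "end": 1}]}]}]"""

examples = json.loads(SAMPLE)
for ex in examples:
    for event in ex["golden-event-mentions"]:
        print(event["event_type"], "->", event["trigger"]["text"])  # Conflict:Attack -> shot
```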
My email: 491377729@qq.com
My Zhihu profile: https://www.zhihu.com/people/zhang-han-32-13-81