I have recently been compiling document-level NLP work, starting with this curated list of papers on document-level relation extraction (Doc-RE). I hope it proves useful for your study and research!
P.S. Thanks to Dr. Zhenyu Zhang (http://zhenyu.ac.cn/) for corrections provided after this article was completed!
2017
TACL2017: Cross-sentence n-ary relation extraction with graph LSTMs
EACL2017: Distant Supervision for Relation Extraction Beyond the Sentence Boundary
2018
NAACL2018: Simultaneously Self-Attending to All Mentions for Full-Abstract Biological Relation Extraction
2019
ACL2019: DocRED: A large-scale document-level relation extraction dataset
ACL2019: Inter-sentence Relation Extraction with Document-level Graph Convolutional Neural Network
EMNLP2019: Connecting the Dots: Document-level Neural Relation Extraction with Edge-oriented Graphs
AAAI2019: Neural relation extraction within and across sentence boundaries
2020
ACL2020: Reasoning with Latent Structure Refinement for Document-Level Relation Extraction
EMNLP2020: Double Graph Based Reasoning for Document-level Relation Extraction
EMNLP2020: Global-to-Local Neural Networks for Document-Level Relation Extraction
COLING2020: Document-level Relation Extraction with Dual-tier Heterogeneous Graph
COLING2020: Graph Enhanced Dual Attention Network for Document-Level Relation Extraction
COLING2020: Global Context-enhanced Graph Convolutional Networks for Document-level Relation Extraction
PAKDD2020: HIN: Hierarchical Inference Network for Document-Level Relation Extraction
Fine-tune Bert for DocRED with Two-step Process
Entity and Evidence Guided Relation Extraction for DocRED
2021
AAAI2021: Document-Level Relation Extraction with Reconstruction
AAAI2021: Document-Level Relation Extraction with Adaptive Thresholding and Localized Context Pooling
AAAI2021: Entity Structure Within and Throughout: Modeling Mention Dependencies for DocumentLevel Relation Extraction
AAAI2021: Multi-view Inference for Relation Extraction with Uncertain Knowledge
Editor: xj
Original title: 篇章级关系抽取(Doc-RE)论文列表整理 (A Curated List of Document-Level Relation Extraction (Doc-RE) Papers)
Source: WeChat official account 深度学习自然语言处理 (WeChat ID: zenRRan). Follows are welcome; please credit the source when reposting.