Enhancing Pre-Trained Language Representations with Rich Knowledge for Machine Reading Comprehension
This paper augments the pre-trained BERT model with external knowledge-base (KB) knowledge; experiments show the resulting model outperforms BERT on machine reading comprehension (MRC) tasks.

paper: https://drive.google.com/open?id=156rShpAzTax0Pzql1yuHVuT-tg6Qf_xX
source: ACL 2019
code: http://github.com/paddlepaddle/models/tree/develop/PaddleNLP/Research/ACL2019-KTNET
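The core idea is a knowledge-integration layer: each token attends over pre-trained embeddings of its candidate KB concepts (retrieved from resources such as WordNet and NELL), and the attended knowledge vector is fused with the BERT hidden state before the downstream MRC layers. Below is a minimal PyTorch sketch of such a layer, assuming the candidate concept embeddings have already been retrieved per token; the class name, tensor shapes, and the learned "sentinel" candidate (which lets a token attend to no knowledge at all) are illustrative assumptions, not the paper's exact implementation (see the PaddlePaddle repo above for that).

```python
import torch
import torch.nn as nn

class KnowledgeIntegration(nn.Module):
    """Sketch of a KT-NET-style knowledge-integration layer.

    For each token, attend over candidate KB concept embeddings
    (plus a learned sentinel) and concatenate the attended
    knowledge vector with the BERT hidden state.
    """

    def __init__(self, hidden_dim: int, kb_dim: int):
        super().__init__()
        # bilinear attention: project token states into KB embedding space
        self.proj = nn.Linear(hidden_dim, kb_dim, bias=False)
        # learned sentinel: an always-available "no knowledge" candidate
        self.sentinel = nn.Parameter(torch.zeros(kb_dim))

    def forward(self, bert_hidden, kb_embeds, kb_mask):
        # bert_hidden: (batch, seq_len, hidden_dim)       BERT output
        # kb_embeds:   (batch, seq_len, n_cand, kb_dim)   candidate concepts
        # kb_mask:     (batch, seq_len, n_cand)           1 = real, 0 = padding
        b, s, c, k = kb_embeds.shape

        # append the sentinel as an extra, always-valid candidate
        sent = self.sentinel.expand(b, s, 1, k)
        cands = torch.cat([kb_embeds, sent], dim=2)              # (b, s, c+1, k)
        mask = torch.cat([kb_mask, kb_mask.new_ones(b, s, 1)], dim=2)

        # attention scores of each token over its candidates
        q = self.proj(bert_hidden)                               # (b, s, k)
        scores = torch.einsum('bsk,bsck->bsc', q, cands)
        scores = scores.masked_fill(mask == 0, float('-inf'))
        alpha = torch.softmax(scores, dim=-1)                    # (b, s, c+1)

        # attended knowledge vector, fused with the BERT state
        knowledge = torch.einsum('bsc,bsck->bsk', alpha, cands)  # (b, s, k)
        return torch.cat([bert_hidden, knowledge], dim=-1)       # (b, s, h+k)


# Usage with illustrative dimensions (BERT-base hidden size, 100-dim KB embeddings):
layer = KnowledgeIntegration(hidden_dim=768, kb_dim=100)
h = torch.randn(2, 16, 768)                  # BERT hidden states
kb = torch.randn(2, 16, 4, 100)              # up to 4 candidate concepts per token
mask = torch.randint(0, 2, (2, 16, 4))       # some candidates are padding
fused = layer(h, kb, mask)
print(fused.shape)                           # torch.Size([2, 16, 868])
```

Thanks to the sentinel, tokens with no retrieved concepts still produce a well-defined softmax, and the model can learn to ignore noisy KB candidates; the 868-dim fused vector is what the subsequent matching and output layers would consume.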