A BERT-based Multi-step Attention Network for Chinese Reading Comprehension
First published: 2019-12-20
Abstract: Reading comprehension is an important task in natural language understanding, and it serves as a good measure of a natural language processing model's capability. To advance the task, many related datasets and reading comprehension models have been proposed, but most of them target English; work on Chinese reading comprehension remains comparatively scarce. In this paper, we study Chinese reading comprehension and propose a model based on BERT and a multi-step reasoning mechanism, composed mainly of a BERT layer and a multi-step attention layer. BERT alone achieves strong results on several Chinese reading comprehension datasets, and introducing the multi-step reasoning mechanism on top of it further improves the model's performance.
Keywords: Chinese reading comprehension; BERT; multi-step reasoning
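The abstract describes a model that stacks a multi-step attention layer on top of BERT's token representations. The paper's exact update rule is not given in this excerpt, so the following is only a minimal sketch of the general multi-step (multi-hop) attention idea, assuming a BERT-style encoder output `H` and an initial question state `q`; the averaging update is a hypothetical stand-in for whatever recurrent update the model actually uses.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array.
    e = np.exp(x - x.max())
    return e / e.sum()

def multi_step_attention(H, q, steps=3):
    """Hypothetical multi-step attention over encoder outputs.

    H: (L, d) token representations (stand-in for BERT's last hidden layer).
    q: (d,) query/state vector (stand-in for the question representation).
    At each step, attend over H with the current state, then refine the state.
    """
    for _ in range(steps):
        scores = H @ q               # (L,) attention logits
        alpha = softmax(scores)      # (L,) attention distribution over tokens
        context = alpha @ H          # (d,) attended context summary
        q = 0.5 * q + 0.5 * context  # simple state update (a GRU in real models)
    return softmax(H @ q)            # final distribution over token positions

rng = np.random.default_rng(0)
H = rng.normal(size=(8, 4))  # 8 tokens, 4-dim toy representations
q = rng.normal(size=4)
p = multi_step_attention(H, q)
print(p.shape)  # (8,) — one probability per token position
```

Each extra step lets the state re-read the passage conditioned on what was attended to before, which is the intuition behind multi-step reasoning for span extraction.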