Hierarchical Attention Mechanism based Approach for Question Answering over Knowledge Graph
First published: 2019-01-03
Abstract: Question answering over knowledge graphs (KGQA) answers natural language questions using the structured knowledge in a knowledge graph. Because natural language does not map directly onto structured queries, this mapping is a major difficulty in KGQA research. Starting from the syntactic structure of the question, this paper proposes a hierarchical attention mechanism based approach for KGQA (KGQA-HAM). The approach is divided into two stages: encoding and decoding. In the encoding stage, the approach represents the question as a dependency tree and, based on an improved LSTM (Long Short-Term Memory network), encodes a semantic vector for each subtree; the subtrees express the semantics of the entities or relations described by the question, establishing a mapping between the question and the query statement. In the decoding stage, in order to dynamically extract the information corresponding to the query statement, the approach uses a hierarchical attention mechanism to extract context vectors layer by layer from the root node downward, integrates them into an LSTM that generates the query statement, and then executes the query against the knowledge graph to obtain the answer. The approach was evaluated on the public WebQuestion dataset. The results show that, compared with a word-level attention mechanism, the proposed dependency-tree-level hierarchical attention mechanism significantly improves the accuracy of KGQA.
Keywords: Question answering system; Knowledge graph; Dependency tree; Hierarchical attention