MULTI-TEACHER KNOWLEDGE BASED QUESTION ANSWERING MODEL FOR DYNAMIC KNOWLEDGE BASE

    Abstract: Existing knowledge base question answering methods lack the ability to learn continuously, which makes them difficult to deploy on dynamic knowledge bases. To address this, we propose a question answering model for dynamic knowledge bases based on multi-teacher distillation and incremental learning. The model retains its memory of historical knowledge while learning new knowledge, thereby enabling continuous learning. In addition, inspired by multi-teacher distillation, we design a multi-teacher framework that uses several different teacher models to further improve the effect of knowledge distillation. Experimental results show that the model achieves accuracy rates of 91.02%, 72.65%, and 73.82% on three standard datasets.
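The core idea of the multi-teacher framework described above can be sketched as a distillation loss in which the student matches a combined target built from several teachers. The sketch below is a minimal illustration, not the paper's implementation: the function names are hypothetical, and the assumption that the teachers' temperature-softened distributions are averaged with equal weights is ours (the paper's framework may weight or select teachers differently).

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def multi_teacher_distillation_loss(student_logits, teacher_logits_list,
                                    temperature=2.0):
    """Hypothetical multi-teacher distillation loss.

    Averages the teachers' softened distributions into one target
    (an equal-weight assumption) and returns the KL divergence
    KL(avg_teacher || student), scaled by T^2 as is conventional
    for temperature-based distillation.
    """
    p_student = softmax(student_logits, temperature)
    teacher_probs = [softmax(t, temperature) for t in teacher_logits_list]
    n_classes = len(student_logits)
    avg_teacher = [
        sum(tp[i] for tp in teacher_probs) / len(teacher_probs)
        for i in range(n_classes)
    ]
    return temperature ** 2 * sum(
        q * math.log(q / p) for q, p in zip(avg_teacher, p_student) if q > 0
    )
```

The loss is zero when the student's softened distribution already matches the averaged teacher target, and positive otherwise, so minimizing it pulls the student toward the teachers' consensus.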

     

/
