ONTOLOGY-BASED DIALOGUE STATE TRACKING AND ITS KNOWLEDGE DISTILLATION METHOD
Abstract
Dialogue state tracking is an indispensable component of task-oriented dialogue systems: it acquires and manages the user's intentions over the course of a dialogue. Previous dialogue state tracking methods use multi-slot learning to capture the associations between slots, but they do not account for the differing difficulty of each slot's classification task. In addition, most existing models are large models built on pre-trained language models, which hinders deployment and fails to meet the real-time requirements of dialogue systems. To address these problems, the slot losses are jointly optimized with weights assigned according to the difficulty of each slot's classification, and knowledge distillation is used to compress the model while maintaining its accuracy. In the absence of a large teacher model, small models distill each other from scratch and can still reach the accuracy obtained by distilling from a teacher model. Experimental results show that the proposed method achieves good results on the standard WOZ 2.0 task-oriented dialogue dataset.
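The abstract does not give the exact loss formulation, but the combination it describes (per-slot losses weighted by classification difficulty, plus a distillation term against a teacher's soft predictions) can be sketched as follows. All names, the difficulty weights, the temperature, and the mixing coefficient `alpha` here are illustrative assumptions, not the paper's actual hyperparameters.

```python
import numpy as np

def softmax(z, t=1.0):
    """Temperature-scaled softmax along the last axis (numerically stable)."""
    z = z / t
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def weighted_slot_kd_loss(student_logits, teacher_logits, labels,
                          slot_weights, temperature=2.0, alpha=0.5):
    """Illustrative weighted joint slot loss with knowledge distillation.

    student_logits / teacher_logits: dict slot -> (batch, num_values) arrays
    labels:       dict slot -> (batch,) gold value indices
    slot_weights: dict slot -> float, larger for harder-to-classify slots
    """
    total = 0.0
    for slot, s_logits in student_logits.items():
        batch = np.arange(s_logits.shape[0])
        # hard-label cross entropy against the gold slot value
        s_prob = softmax(s_logits)
        ce = -np.log(s_prob[batch, labels[slot]] + 1e-12).mean()
        # soft-label KL to the teacher at temperature T, scaled by T^2
        t_soft = softmax(teacher_logits[slot], temperature)
        s_soft = softmax(s_logits, temperature)
        kd = (t_soft * (np.log(t_soft + 1e-12)
                        - np.log(s_soft + 1e-12))).sum(-1).mean() * temperature ** 2
        # difficulty-weighted mix of the two terms, summed over slots
        total += slot_weights[slot] * ((1 - alpha) * ce + alpha * kd)
    return total
```

In the mutual-distillation setting the abstract mentions, each small model would play the teacher role for the other, so `teacher_logits` would simply come from the peer model's forward pass at each step rather than from a fixed large model.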