JOINT MULTIPLE INTENT DETECTION AND SLOT FILLING VIA SELF-DISTILLATION
Lisong Chen, Peilin Zhou, Yuexian Zou
SPS
Intent detection and slot filling are two main tasks in natural language understanding (NLU). The two tasks are highly related and are often trained jointly. However, most previous works assume that an utterance corresponds to only one intent, ignoring the fact that it can include multiple intents. In this paper, we propose a novel Self-Distillation Joint NLU model (SDJN) for multi-intent NLU. Specifically, we adopt three sequentially connected decoders and a self-distillation approach to form an auxiliary loop that establishes interrelated connections between multiple intents and slots. The output of each decoder serves as auxiliary information for the next decoder, and the auxiliary loop is completed via self-distillation. Furthermore, we formulate multiple intent detection as a weakly supervised task and handle it with multiple instance learning (MIL), which exploits token-level intent information to predict multiple intents and to guide the slot decoder. Experimental results indicate that our model achieves competitive performance compared to other models.
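To illustrate the MIL formulation of multiple intent detection, the following is a minimal sketch of one common MIL pooling choice: max-pooling token-level intent scores into an utterance-level multi-label prediction. The function name, scores, and threshold are illustrative assumptions, not the paper's implementation.

```python
import math


def sigmoid(x):
    """Logistic function mapping a logit to a probability."""
    return 1.0 / (1.0 + math.exp(-x))


def mil_intent_prediction(token_logits, threshold=0.5):
    """Aggregate token-level intent logits into utterance-level
    multi-intent predictions via MIL-style max pooling.

    token_logits: one list per token, each holding a logit per intent label.
    Returns the indices of intents whose pooled probability exceeds threshold.
    """
    num_intents = len(token_logits[0])
    # For each intent label, take the highest token-level probability:
    # in MIL terms, the utterance (bag) is positive for an intent if
    # at least one token (instance) strongly supports it.
    utterance_probs = [
        max(sigmoid(tok[i]) for tok in token_logits)
        for i in range(num_intents)
    ]
    return [i for i, p in enumerate(utterance_probs) if p > threshold]


# Hypothetical example: an utterance of 3 tokens scored over 2 intent labels.
logits = [[2.0, -3.0], [-1.5, -2.0], [0.1, 1.2]]
print(mil_intent_prediction(logits))  # → [0, 1] (both intents predicted)
```

Because the pooled decision is traceable back to individual tokens, the same token-level scores can also serve as guidance for a downstream slot decoder, as the abstract describes.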