Auxiliary Capsules For Natural Language Understanding
Ieva Staliūnaitė, Ignacio Iacobacci
SPS
Recently, joint training of intent detection and slot filling has become the best-performing approach in the field of Natural Language Understanding (NLU). In this work we explore combining the recently introduced capsule network architecture with multi-task learning, using relevant auxiliary tasks. Specifically, our models perform joint intent classification and slot filling with the aid of named entity recognition (NER) and part-of-speech (POS) tagging tasks. This allows us to exploit the hierarchical relationships between the intents of the utterances and the different features of the input text: not only slots but also named entity mentions, parts of speech, quantity indications, etc. The models developed in this work were evaluated on standard benchmarks, achieving state-of-the-art results on the SNIPS dataset while outperforming the best commercial systems on several low-resource datasets.
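To illustrate the multi-task setup described above, the following is a minimal sketch of a shared encoder feeding one head per task (utterance-level intent classification plus token-level slot, NER, and POS tagging). This is not the paper's capsule-based model; all dimensions, parameter names, and the simple linear heads are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (illustrative only, not the paper's actual sizes)
VOCAB, EMB, HID = 100, 16, 32
N_INTENTS, N_SLOTS, N_NER, N_POS = 7, 39, 5, 12

# Shared parameters: embedding table and a shared projection
embed = rng.normal(0, 0.1, (VOCAB, EMB))
W_shared = rng.normal(0, 0.1, (EMB, HID))

# One linear head per task; intent is utterance-level, the rest token-level
W_intent = rng.normal(0, 0.1, (HID, N_INTENTS))
W_slot = rng.normal(0, 0.1, (HID, N_SLOTS))
W_ner = rng.normal(0, 0.1, (HID, N_NER))   # auxiliary task: NER tagging
W_pos = rng.normal(0, 0.1, (HID, N_POS))   # auxiliary task: POS tagging

def forward(token_ids):
    """Encode the utterance once, then apply each task head."""
    h = np.tanh(embed[token_ids] @ W_shared)   # (seq_len, HID) shared states
    intent_logits = h.mean(axis=0) @ W_intent  # pooled, utterance-level
    slot_logits = h @ W_slot                   # per-token slot labels
    ner_logits = h @ W_ner                     # per-token NER labels
    pos_logits = h @ W_pos                     # per-token POS labels
    return intent_logits, slot_logits, ner_logits, pos_logits

tokens = np.array([3, 17, 42, 8])              # a 4-token utterance
intent, slots, ner, pos = forward(tokens)
print(intent.shape, slots.shape, ner.shape, pos.shape)
# → (7,) (4, 39) (4, 5) (4, 12)
```

In training, each head would contribute a loss term so that the shared encoder is shaped by the auxiliary NER and POS signals as well as by the intent and slot objectives.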