10 May 2022

A spoken language understanding (SLU) system typically comprises two tasks: intent detection (ID) and slot filling (SF). Optimizing the two tasks jointly with an attention mechanism has proven effective. However, previous attention-based works rely only on first-order attention designs, which limits their efficacy. To enable richer information interaction between the input intent and slot features, we propose a novel framework built on bilinear attention, which captures second-order feature interactions. By stacking multiple bilinear attention modules and equipping them with the Exponential Linear Unit (ELU) activation, the framework can model higher-order and even infinite-order feature interactions. To demonstrate its effectiveness, we conduct experiments on two benchmark datasets, SNIPS and ATIS. The experimental results show that our framework outperforms multiple baselines as well as the first-order attention model.
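The abstract describes stacking bilinear attention modules with ELU activations to build second- and higher-order interactions between intent and slot features. As a rough illustration of the idea, here is a minimal PyTorch sketch of one such module; the class name, projection scheme, dimensions, and residual connection are assumptions for exposition, not the authors' implementation:

```python
import torch
import torch.nn as nn

class BilinearAttention(nn.Module):
    """One bilinear (second-order) attention module between two feature
    streams, e.g. an intent query attending over slot features.

    Sketch only: the projections and residual wiring are illustrative
    assumptions, not the paper's architecture.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.w_q = nn.Linear(dim, dim)  # projects the querying stream
        self.w_k = nn.Linear(dim, dim)  # projects the attended stream
        self.elu = nn.ELU()             # ELU activation named in the abstract
        self.scale = dim ** -0.5

    def forward(self, query: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
        # Second-order term: each projected query feature multiplies each
        # projected context feature before the softmax, i.e. a factorized
        # bilinear form q^T (W_q^T W_k) k rather than a plain dot product.
        q = self.w_q(query)                                 # (B, Lq, D)
        k = self.w_k(context)                               # (B, Lc, D)
        attn = (torch.bmm(q, k.transpose(1, 2)) * self.scale).softmax(-1)
        fused = torch.bmm(attn, context)                    # (B, Lq, D)
        # Residual + ELU, so stacked modules re-multiply already-interacted
        # features, composing progressively higher-order interactions.
        return query + self.elu(fused)


# Toy usage: the utterance-level intent query attends over token-level
# slot features (shapes are illustrative).
slots = torch.randn(2, 10, 128)   # (batch, tokens, dim)
intent = torch.randn(2, 1, 128)   # (batch, 1, dim)
block = BilinearAttention(128)
out = block(intent, slots)        # -> (2, 1, 128)
```

Stacking several such blocks lets each layer multiply features that earlier layers have already interacted, which is the intuition behind the higher-order (and, in the limit, infinite-order) interactions the abstract claims.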
