
WLINKER: MODELING RELATIONAL TRIPLET EXTRACTION AS WORD LINKING

Yongxiu Xu, Chuan Zhou, Jing Yu, Yue Hu, Heyan Huang

08 May 2022

Relational triplet extraction (RTE) is a fundamental task for automatically extracting information from unstructured text, and it has attracted growing interest in recent years. However, it remains challenging due to the difficulty of extracting overlapping relational triplets. Existing approaches to overlapping RTE either suffer from exposure bias or rely on designing complex tagging schemes. In light of these limitations, we take an innovative perspective on RTE by modeling it as a word linking problem that learns to link subject words to object words for each relation type. To this end, we propose a simple but effective multi-task learning model, WLinker, which extracts overlapping relational triplets in an end-to-end fashion. Specifically, we perform word link prediction based on multi-level biaffine attention to learn word-level correlations under each relation type. Additionally, our model jointly learns the entity detection and word link prediction tasks in a multi-task framework, which combines the local sequential and global dependency structures of words in a sentence and captures the implicit interactions between the two tasks. Extensive experiments are conducted on two benchmark datasets, NYT and WebNLG. The results demonstrate the effectiveness of WLinker in comparison with a range of previous state-of-the-art baselines.
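To make the word-linking idea concrete, the sketch below shows one plausible form of relation-specific biaffine scoring of subject-to-object word pairs. It is only an illustration assembled from the abstract: the module name, layer sizes, and the single-level bilinear scorer are assumptions, not the paper's actual multi-level biaffine design or its multi-task combination with entity detection.

```python
import torch
import torch.nn as nn


class RelationBiaffineLinker(nn.Module):
    """Hypothetical sketch: score subject->object word links per relation type."""

    def __init__(self, hidden_size: int, num_relations: int, ffnn_size: int = 150):
        super().__init__()
        # Separate projections for a word acting as subject vs. object.
        self.subj_mlp = nn.Sequential(nn.Linear(hidden_size, ffnn_size), nn.ReLU())
        self.obj_mlp = nn.Sequential(nn.Linear(hidden_size, ffnn_size), nn.ReLU())
        # One bilinear matrix per relation; the extra row/column absorbs
        # the linear and bias terms of the biaffine form.
        self.U = nn.Parameter(torch.empty(num_relations, ffnn_size + 1, ffnn_size + 1))
        nn.init.xavier_uniform_(self.U)

    def forward(self, word_repr: torch.Tensor) -> torch.Tensor:
        # word_repr: (batch, seq_len, hidden_size), e.g. contextual encoder outputs.
        s = self.subj_mlp(word_repr)                      # (B, L, F)
        o = self.obj_mlp(word_repr)                       # (B, L, F)
        ones = s.new_ones(*s.shape[:-1], 1)
        s = torch.cat([s, ones], dim=-1)                  # append bias feature
        o = torch.cat([o, ones], dim=-1)
        # scores[b, r, i, j]: link score from subject word i to object word j
        # under relation r.
        scores = torch.einsum("bif,rfg,bjg->brij", s, self.U, o)
        return scores
```

Under this reading, a sigmoid threshold on `scores` at inference would yield, for each relation, the set of subject-to-object word links from which entity spans and triplets are decoded; how WLinker actually performs that decoding is not specified in the abstract.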
