
BERT Is Not All You Need For Commonsense Inference

Sunghyun Park, Junsung Son, Seung-Won Hwang, Kyunglang Park

Length: 13:04
04 May 2020

This paper studies the task of commonsense inference, especially natural language inference (NLI) and causal inference (CI), which requires knowledge beyond what is stated in the input sentences. The state of the art has been neural models powered by knowledge or contextual embeddings, for example BERT, as a source of commonsense knowledge. Our research questions are thus: Is BERT all we need for NLI and CI? If not, what information is missing, and where can it be found? While much work has studied what is captured in BERT, its limitations are rather under-studied. Our contribution is to observe the limitations of BERT in commonsense inference, and then to leverage complementary resources containing the missing information. Specifically, we model BERT and the complementary resource as two heterogeneous modalities, and explore the pros and cons of multimodal integration approaches. We demonstrate that our proposed integration models achieve state-of-the-art performance on both NLI and CI tasks.
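To illustrate the general idea of treating BERT and an external knowledge resource as two modalities, here is a minimal late-fusion sketch in PyTorch. It is a hypothetical example, not the paper's actual integration model: the class name FusionClassifier, the concatenation-based fusion, and the 300-dimensional placeholder knowledge vector are all assumptions for illustration only.

```python
# Hypothetical sketch of fusing BERT with an external knowledge vector.
# Not the authors' model; fusion by simple concatenation is assumed here.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class FusionClassifier(nn.Module):
    def __init__(self, knowledge_dim: int, num_labels: int = 3):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size  # 768 for bert-base
        # Concatenate the two "modalities" and classify.
        self.classifier = nn.Linear(hidden + knowledge_dim, num_labels)

    def forward(self, input_ids, attention_mask, knowledge_feats):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]  # [CLS] sentence-pair representation
        fused = torch.cat([cls, knowledge_feats], dim=-1)
        return self.classifier(fused)


tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
enc = tokenizer("A man eats a sandwich.", "A person is having lunch.",
                return_tensors="pt")
knowledge = torch.randn(1, 300)  # placeholder for an external knowledge embedding
model = FusionClassifier(knowledge_dim=300)
logits = model(enc["input_ids"], enc["attention_mask"], knowledge)
```

Richer integration schemes (e.g., attention over knowledge entries rather than plain concatenation) are among the design choices such a paper would compare.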
