
Durham Research Online

ExBERT: An External Knowledge Enhanced BERT for Natural Language Inference

Gajbhiye, Amit and Al Moubayed, Noura and Bradley, Steven (2021) 'ExBERT: An External Knowledge Enhanced BERT for Natural Language Inference.', IEEE 30th International Conference on Artificial Neural Networks (ICANN 2021), Virtual, 14-17 Sept. 2021.

Abstract

Neural language representation models such as BERT, pretrained on large-scale unstructured corpora, lack explicit grounding in real-world commonsense knowledge and are often unable to remember the facts required for reasoning and inference. Natural Language Inference (NLI) is a challenging reasoning task that relies on common human understanding of language and real-world commonsense knowledge. We introduce a new model for NLI, External Knowledge Enhanced BERT (ExBERT), which enriches the contextual representation with real-world commonsense knowledge from external knowledge sources and enhances BERT's language understanding and reasoning capabilities. ExBERT takes full advantage of the contextual word representations obtained from BERT, employing them both to retrieve relevant external knowledge from knowledge graphs and to encode the retrieved knowledge. Our model adaptively incorporates the external knowledge context required for reasoning over the inputs. Extensive experiments on the challenging SciTail and SNLI benchmarks demonstrate the effectiveness of ExBERT: in comparison to the previous state-of-the-art, we obtain an accuracy of 95.9% on SciTail and 91.5% on SNLI.
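The abstract describes the architecture only at a high level. As a rough, illustrative sketch of the general idea (not the authors' implementation), the following PyTorch snippet fuses BERT-style contextual token states with retrieved knowledge-graph fact embeddings via token-to-fact attention and a learned gate. The module name, dimensions, gating scheme, and the random stand-in tensors are all assumptions introduced here for illustration.

# Minimal sketch (assumed design, not the paper's code): adaptively mix BERT-style
# contextual token representations with retrieved external-knowledge embeddings.
import torch
import torch.nn as nn

class KnowledgeFusion(nn.Module):
    """Enriches each token representation with relevant knowledge vectors (illustrative)."""
    def __init__(self, hidden_dim: int, knowledge_dim: int):
        super().__init__()
        self.know_proj = nn.Linear(knowledge_dim, hidden_dim)  # map KG embeddings into the BERT space
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)      # learned gate: how much knowledge to inject

    def forward(self, token_states, knowledge_embs):
        # token_states:   (batch, seq_len, hidden_dim)   -- e.g. BERT last hidden states
        # knowledge_embs: (batch, n_facts, knowledge_dim) -- retrieved fact embeddings (assumed given)
        k = self.know_proj(knowledge_embs)                               # (batch, n_facts, hidden)
        attn = torch.softmax(token_states @ k.transpose(1, 2), dim=-1)   # token-to-fact attention weights
        knowledge_context = attn @ k                                     # (batch, seq_len, hidden)
        g = torch.sigmoid(self.gate(torch.cat([token_states, knowledge_context], dim=-1)))
        return g * knowledge_context + (1 - g) * token_states            # knowledge-enriched token states

# Toy usage with random tensors standing in for BERT outputs and retrieved KG embeddings.
if __name__ == "__main__":
    fusion = KnowledgeFusion(hidden_dim=768, knowledge_dim=300)
    tokens = torch.randn(2, 16, 768)   # stand-in for BERT contextual representations
    facts = torch.randn(2, 8, 300)     # stand-in for retrieved knowledge-graph fact embeddings
    enriched = fusion(tokens, facts)
    print(enriched.shape)              # torch.Size([2, 16, 768])

In the full model, such knowledge-enriched representations would feed an NLI classification head over the premise-hypothesis pair; the actual retrieval and encoding procedure is detailed in the paper.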

Item Type: Conference item (Paper)
Full text: (AM) Accepted Manuscript, PDF (424 KB)
Status: Peer-reviewed
Publisher Web site: https://e-nns.org/icann2021/
Publisher statement: © 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Date accepted: 01 July 2021
Date deposited: 20 July 2021
Date of first online publication: No date available
Date first made open access: 18 September 2021
