Title: Performance evaluation of BERT vectors on natural language inference models
Authors: Ogul, I.U.
Tekir, S.
Keywords: BERT
Decomposable Attention
Natural Language Inference
Issue Date: 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
Abstract: Natural language inference aims to classify the relation between a pair of sentences as a contradiction, entailment, or neutral. To accomplish the task, classifiers transform textual data into numerical representations called vectors or embeddings. In this study, both static (GloVe, OntoNotes5) and contextual (BERT) word embedding methods are used. Classifying the logical relationships between such sentences is difficult: their complex grammatical structures resist conversion into logical representations, and traditional natural language processing solutions are insufficient for the task. This study uses the Decomposable Attention and Enhanced LSTM for Natural Language Inference (ESIM) deep learning methods to perform this classification. The best accuracy, 88%, is achieved with ESIM over BERT vectors on the SNLI corpus. © 2021 IEEE.
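The core of the Decomposable Attention model named in the abstract is a soft-alignment ("attend") step between the two sentences' token embeddings. A minimal NumPy sketch of that step is given below; this is an illustration under assumed names (`soft_align`, random embeddings), not the authors' implementation, and the real model additionally applies learned feed-forward projections before and after alignment.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def soft_align(A, B):
    """Soft-alignment (attend) step, as in Decomposable Attention.

    A: (m, d) premise token embeddings (e.g. GloVe or BERT vectors).
    B: (n, d) hypothesis token embeddings.
    Returns:
      beta:  (m, d) hypothesis subphrase aligned to each premise token.
      alpha: (n, d) premise subphrase aligned to each hypothesis token.
    """
    E = A @ B.T                       # (m, n) unnormalized alignment scores
    beta = softmax(E, axis=1) @ B     # attend over hypothesis tokens
    alpha = softmax(E, axis=0).T @ A  # attend over premise tokens
    return beta, alpha

# Toy usage with random vectors standing in for word embeddings:
rng = np.random.default_rng(0)
premise = rng.normal(size=(5, 4))     # 5 premise tokens, dim 4
hypothesis = rng.normal(size=(3, 4))  # 3 hypothesis tokens, dim 4
beta, alpha = soft_align(premise, hypothesis)
```

The aligned pairs (`A` with `beta`, `B` with `alpha`) are what the full model then compares and aggregates before the three-way contradiction/entailment/neutral classification.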
Description: 29th IEEE Conference on Signal Processing and Communications Applications (SIU 2021), 9–11 June 2021
ISBN: 9781665436496
Appears in Collections:Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection

Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.