Please use this identifier to cite or link to this item: https://hdl.handle.net/11147/14815
Full metadata record
DC Field | Value | Language
dc.contributor.author | Agrali, Mahmut | -
dc.contributor.author | Tekir, Selma | -
dc.date.accessioned | 2024-09-24T15:58:52Z | -
dc.date.available | 2024-09-24T15:58:52Z | -
dc.date.issued | 2024 | -
dc.identifier.isbn | 9798350388978 | -
dc.identifier.isbn | 9798350388961 | -
dc.identifier.issn | 2165-0608 | -
dc.identifier.uri | https://doi.org/10.1109/SIU61531.2024.10600801 | -
dc.description.abstract | Pre-trained language models have introduced significant performance boosts in natural language processing. Fine-tuning these models on downstream tasks' supervised data further improves the results. In the fine-tuning process, combining the learning of multiple tasks is an effective approach. This paper proposes a multi-task learning framework based on BERT. To accomplish the tasks of sentiment analysis, paraphrase detection, and semantic text similarity, we add linear layers, a Siamese network with cosine similarity, and convolutional layers at the appropriate places in the architecture. We conducted an ablation study using the Stanford Sentiment Treebank (SST), Quora, and SemEval STS datasets to test the framework and the effectiveness of its components on each task. The results demonstrate that the proposed multi-task framework improves the performance of BERT. The best results are an accuracy of 0.534 for sentiment analysis, an accuracy of 0.697 for paraphrase detection, and a Pearson correlation coefficient of 0.345 for semantic text similarity. | en_US
dc.language.iso | tr | en_US
dc.publisher | IEEE | en_US
dc.relation.ispartof | 32nd IEEE Signal Processing and Communications Applications Conference (SIU) -- MAY 15-18, 2024 -- Tarsus Univ Campus, Mersin, TURKEY | en_US
dc.relation.ispartofseries | Signal Processing and Communications Applications Conference | -
dc.rights | info:eu-repo/semantics/closedAccess | en_US
dc.subject | Multi-Task Learning | en_US
dc.subject | Sentiment Analysis | en_US
dc.subject | Paraphrase Detection | en_US
dc.subject | Semantic Textual Similarity | en_US
dc.title | Improvements on a Multi-Task Bert Model | en_US
dc.title.alternative | Çok Görevli Bert Modeli Üzerinde İyileştirmeler | en_US
dc.type | Conference Object | en_US
dc.department | İzmir Institute of Technology | en_US
dc.identifier.wos | WOS:001297894700071 | -
dc.identifier.scopus | 2-s2.0-85200919611 | -
dc.relation.publicationcategory | Konferans Öğesi - Uluslararası - Kurum Öğretim Elemanı (Conference Item - International - Institutional Faculty Member) | en_US
dc.identifier.doi | 10.1109/SIU61531.2024.10600801 | -
dc.identifier.wosquality | N/A | -
dc.identifier.scopusquality | N/A | -
dc.description.woscitationindex | Conference Proceedings Citation Index - Science | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
item.languageiso639-1 | tr | -
item.openairetype | Conference Object | -
item.grantfulltext | none | -
item.fulltext | No Fulltext | -
item.cerifentitytype | Publications | -
crisitem.author.dept | 03.04. Department of Computer Engineering | -
Appears in Collections:Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
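The abstract above mentions a Siamese network with cosine similarity for the semantic text similarity task: each sentence is encoded separately, and the score is the cosine of the angle between the two sentence embeddings. A minimal sketch of that scoring step, assuming the embeddings have already been produced by a BERT encoder (the vectors and dimensions here are illustrative, not taken from the paper):

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two embedding vectors, as used in a
    Siamese STS head: dot product divided by the product of norms,
    giving a score in [-1, 1]."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Illustrative 3-dim embeddings; real BERT sentence embeddings
# would typically be 768-dimensional.
emb_a = [0.2, 0.1, 0.9]
emb_b = [0.2, 0.1, 0.9]

print(cosine_similarity(emb_a, emb_b))  # identical vectors -> 1.0
```

In a Siamese setup this similarity score (or a scaled version of it) is compared against the gold STS score, e.g. with a mean-squared-error loss, and evaluated with the Pearson correlation coefficient reported in the abstract.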
Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.