Please use this identifier to cite or link to this item: https://hdl.handle.net/11147/14815
Full metadata record
DC Field | Value | Language
dc.contributor.author | Ağralı, M. | -
dc.contributor.author | Tekir, S. | -
dc.date.accessioned | 2024-09-24T15:58:52Z | -
dc.date.available | 2024-09-24T15:58:52Z | -
dc.date.issued | 2024 | -
dc.identifier.isbn | 979-8-3503-8896-1 | -
dc.identifier.uri | https://doi.org/10.1109/SIU61531.2024.10600801 | -
dc.identifier.uri | https://hdl.handle.net/11147/14815 | -
dc.description | Berdan Civata B.C.; et al.; Figes; Koluman; Loodos; Tarsus University | en_US
dc.description.abstract | Pre-trained language models have brought significant performance gains in natural language processing. Fine-tuning these models on supervised data from downstream tasks improves the results further, and combining the learning of several tasks during fine-tuning is an effective approach. This paper proposes a multi-task learning framework based on BERT. To handle the tasks of sentiment analysis, paraphrase detection, and semantic textual similarity, we add linear layers, a Siamese network with cosine similarity, and convolutional layers at the appropriate places in the architecture. We conducted an ablation study on the Stanford Sentiment Treebank (SST), Quora, and SemEval STS datasets to test the framework and the effectiveness of its components on each task. The results demonstrate that the proposed multi-task framework improves the performance of BERT. The best results for sentiment analysis, paraphrase detection, and semantic textual similarity are accuracies of 0.534 and 0.697 and a Pearson correlation coefficient of 0.345, respectively. © 2024 IEEE. | en_US
dc.language.iso | tr | en_US
dc.publisher | Institute of Electrical and Electronics Engineers Inc. | en_US
dc.relation.ispartof | 32nd IEEE Conference on Signal Processing and Communications Applications (SIU 2024) - Proceedings, 15-18 May 2024, Mersin (conference code 201235) | en_US
dc.rights | info:eu-repo/semantics/closedAccess | en_US
dc.subject | multi-task learning | en_US
dc.subject | paraphrase detection | en_US
dc.subject | semantic textual similarity | en_US
dc.subject | sentiment analysis | en_US
dc.title | Improvements on a Multi-task BERT Model | en_US
dc.title.alternative | Çok Görevli BERT Modeli Üzerinde İyileştirmeler | en_US
dc.type | Conference Object | en_US
dc.department | Izmir Institute of Technology | en_US
dc.identifier.scopus | 2-s2.0-85200919611 | -
dc.relation.publicationcategory | Konferans Öğesi - Uluslararası - Kurum Öğretim Elemanı (Conference Item - International - Institutional Faculty Member) | en_US
dc.identifier.doi | 10.1109/SIU61531.2024.10600801 | -
dc.authorscopusid | 58156314700 | -
dc.authorscopusid | 16234844500 | -
dc.identifier.wosquality | N/A | -
dc.identifier.scopusquality | N/A | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
item.languageiso639-1 | tr | -
item.cerifentitytype | Publications | -
item.grantfulltext | none | -
item.openairetype | Conference Object | -
item.fulltext | No Fulltext | -
Appears in Collections: Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
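
The abstract in the record above describes the architecture only at a high level. The following is a minimal sketch, assuming a PyTorch / Hugging Face transformers implementation, of how such a multi-task BERT model could be wired together: a shared encoder, a linear head over [CLS] for sentiment analysis, a Siamese branch scored by cosine similarity for semantic textual similarity, and a convolutional head for paraphrase detection. The class name MultiTaskBert, the pooling choices, and the head dimensions are illustrative assumptions, not the configuration reported in the paper.

import torch
import torch.nn as nn
from transformers import BertModel

class MultiTaskBert(nn.Module):
    # Hypothetical sketch; the paper's exact layer placement may differ.
    def __init__(self, model_name="bert-base-uncased", num_sentiment_classes=5):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)   # shared encoder
        hidden = self.bert.config.hidden_size
        # Sentiment analysis: a linear layer over the [CLS] representation.
        self.sentiment_head = nn.Linear(hidden, num_sentiment_classes)
        # Paraphrase detection: a 1-D convolution over the token states of
        # the packed sentence pair, max-pooled into a binary classifier.
        self.conv = nn.Conv1d(hidden, hidden, kernel_size=3, padding=1)
        self.paraphrase_head = nn.Linear(hidden, 1)
        self.cos = nn.CosineSimilarity(dim=-1)

    def encode(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        return out.last_hidden_state                        # (B, L, H)

    @staticmethod
    def mean_pool(hidden, mask):
        # Average token states, ignoring padding positions.
        mask = mask.unsqueeze(-1).float()
        return (hidden * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

    def predict_sentiment(self, input_ids, attention_mask):
        cls = self.encode(input_ids, attention_mask)[:, 0]  # [CLS] token
        return self.sentiment_head(cls)                     # class logits

    def predict_similarity(self, ids_1, mask_1, ids_2, mask_2):
        # Siamese branches share one encoder; the pair is scored by cosine
        # similarity and regressed against gold STS scores.
        emb_1 = self.mean_pool(self.encode(ids_1, mask_1), mask_1)
        emb_2 = self.mean_pool(self.encode(ids_2, mask_2), mask_2)
        return self.cos(emb_1, emb_2)                       # (B,) in [-1, 1]

    def predict_paraphrase(self, ids_pair, mask_pair):
        # ids_pair packs both sentences as "[CLS] s1 [SEP] s2 [SEP]".
        h = self.encode(ids_pair, mask_pair).transpose(1, 2)  # (B, H, L)
        pooled = torch.relu(self.conv(h)).max(dim=-1).values  # (B, H)
        return self.paraphrase_head(pooled).squeeze(-1)       # binary logits

In a combined fine-tuning run, batches from SST, Quora, and SemEval STS would be interleaved and each routed through the matching head, with cross-entropy losses on the sentiment and paraphrase logits and a regression loss on the cosine scores; an ablation study of the kind the abstract describes would then drop individual components to measure their contribution.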
Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.