Please use this identifier to cite or link to this item: https://hdl.handle.net/11147/14580
Full metadata record
dc.contributor.author: Erdem, Y.S.
dc.contributor.author: Iheme, L.O.
dc.contributor.author: Uçar, M.
dc.contributor.author: Özuysal, Ö.Y.
dc.contributor.author: Balıkçı, M.
dc.contributor.author: Morani, K.
dc.contributor.author: Ünay, D.
dc.date.accessioned: 2024-06-19T14:29:43Z
dc.date.available: 2024-06-19T14:29:43Z
dc.date.issued: 2024
dc.identifier.issn: 1746-8094
dc.identifier.uri: https://doi.org/10.1016/j.bspc.2024.106514
dc.description: Iheme, Leonardo/0000-0002-1136-3961; Erdem, Yusuf Sait/0000-0002-8515-8303; Morani, Kenan/0000-0002-4383-5732; Yalcin-Ozuysal, Ozden/0000-0003-0552-368X; Unay, Devrim/0000-0003-3478-7318; Toreyin, Behcet Ugur/0000-0003-4406-2783
dc.description.abstract: Recent advancements in the field of image synthesis have led to the development of Neural Style Transfer (NST) and Generative Adversarial Networks (GANs), which have proven to be powerful tools for data augmentation and realistic data generation. While GANs have been widely used for both data augmentation and generation, NST has not been employed for data generation tasks. Nonetheless, the simpler structure of NST compared to GANs makes it a promising alternative. In this research, we introduce an NST-based method for data generation which, to the best of our knowledge, is the first of its kind. By taking advantage of the simplified architecture of NST models, attributable to the use of a real image as the style input, our method improves performance in data generation tasks under limited resource conditions. Additionally, by employing patch-based training and a high-resolution inference process, high-quality images are synthesized with limited resources. Furthermore, multi-model and noised inputs are used to increase the diversity of the novel NST-based data generation approach. Our proposed method uses binary segmentation maps, representing the cell and wound regions, as the condition input. We evaluate the performance of our proposed NST-based method and compare it with a modified and fine-tuned conditional GAN (C-GAN) method for the conditional generation of phase-contrast wound healing assay images. Through a series of quantitative and qualitative analyses, we demonstrate that our NST-based method outperforms the modified C-GAN while using fewer resources. Additionally, we show that our NST-based method improves segmentation performance when used as a data augmentation method. Our findings provide compelling evidence of the potential of NST for data generation tasks and its superiority over traditional GAN-based methods.
The NST for data generation method was implemented in Python and will be accessible at https://github.com/IDU-CVLab/NST_for_Gen under the MIT licence. © 2024 Elsevier Ltd
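As a rough illustration of the NST ingredients the abstract mentions (a content/condition term plus a Gram-matrix style term), and not the authors' released code (which lives at the GitHub URL above), the objective could be sketched as follows; all function names and weighting defaults here are assumptions:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, height*width) feature map,
    normalized by its size; it captures style statistics."""
    c, n = features.shape
    return (features @ features.T) / (c * n)

def style_loss(gen_feat, style_feat):
    """Mean squared difference between Gram matrices of the
    generated image's features and the style image's features."""
    return float(np.mean((gram_matrix(gen_feat) - gram_matrix(style_feat)) ** 2))

def content_loss(gen_feat, cond_feat):
    """Mean squared difference between raw feature maps; in the
    paper's setting the condition input is a binary segmentation
    map of the cell and wound regions."""
    return float(np.mean((gen_feat - cond_feat) ** 2))

def total_loss(gen_feat, cond_feat, style_feat, alpha=1.0, beta=1e3):
    """Weighted NST objective (alpha, beta are illustrative defaults)."""
    return alpha * content_loss(gen_feat, cond_feat) + beta * style_loss(gen_feat, style_feat)
```

In a full NST pipeline these losses are computed over CNN feature maps (typically from a pretrained network such as VGG) and minimized with respect to the generated image; the patch-based training and high-resolution inference described in the abstract sit on top of this objective rather than changing it.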
dc.description.sponsorship: Türkiye Bilimsel ve Teknolojik Araştırma Kurumu (TÜBİTAK) (119E578, FP7 PIRG08-GA-2010-27697)
dc.language.iso: en
dc.publisher: Elsevier Ltd
dc.relation.ispartof: Biomedical Signal Processing and Control
dc.rights: info:eu-repo/semantics/closedAccess
dc.subject: Biomedical image synthesis
dc.subject: Generative artificial neural network
dc.subject: Neural style transfer
dc.subject: Phase-contrast microscopy
dc.subject: Wound healing
dc.title: Novel Neural Style Transfer based data synthesis method for phase-contrast wound healing assay images
dc.type: Article
dc.authorid: Iheme, Leonardo/0000-0002-1136-3961
dc.authorid: Erdem, Yusuf Sait/0000-0002-8515-8303
dc.authorid: Morani, Kenan/0000-0002-4383-5732
dc.authorid: Yalcin-Ozuysal, Ozden/0000-0003-0552-368X
dc.authorid: Unay, Devrim/0000-0003-3478-7318
dc.authorid: Toreyin, Behcet Ugur/0000-0003-4406-2783
dc.department: Izmir Institute of Technology
dc.identifier.volume: 96
dc.identifier.wos: WOS:001257665300001
dc.identifier.scopus: 2-s2.0-85195813996
dc.relation.publicationcategory: Makale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanı [Article - International Peer-Reviewed Journal - Institutional Faculty Member]
dc.identifier.doi: 10.1016/j.bspc.2024.106514
dc.authorscopusid: 57216734973
dc.authorscopusid: 37035860200
dc.authorscopusid: 58717241700
dc.authorscopusid: 57221553749
dc.authorscopusid: 58305217200
dc.authorscopusid: 57203893476
dc.authorscopusid: 9249500700
dc.authorwosid: Yalcin Ozuysal, Ozden/D-5528-2019
dc.authorwosid: Iheme, Leonardo/AAE-2987-2022
dc.authorwosid: Erdem, Yusuf/AAE-1131-2020
dc.authorwosid: Unay, Devrim/G-6002-2010
dc.authorwosid: Morani, Kenan/ITV-5602-2023
dc.authorwosid: Toreyin, Behcet Ugur/A-6780-2012
dc.identifier.wosquality: Q2
dc.identifier.scopusquality: Q1
dc.description.woscitationindex: Science Citation Index Expanded
item.fulltext: No Fulltext
item.grantfulltext: none
item.languageiso639-1: en
item.openairecristype: http://purl.org/coar/resource_type/c_18cf
item.cerifentitytype: Publications
item.openairetype: Article
Appears in Collections:Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.