Please use this identifier to cite or link to this item: https://hdl.handle.net/11147/11404
Title: Incorporating Concreteness in Multi-Modal Language Models with Curriculum Learning
Authors: Sezerer, Erhan
Tekir, Selma
Keywords: multi-modal dataset
Wikimedia Commons
multi-modal language model
concreteness
curriculum learning
Issue Date: 2021
Publisher: MDPI
Abstract: Over the last few years, there has been an increase in studies that consider experiential (visual) information by building multi-modal language models and representations. Several studies have shown that language acquisition in humans starts with learning concrete concepts through images and then continues with learning abstract ideas through text. In this work, the curriculum learning method is used to teach the model concrete/abstract concepts through images and their corresponding captions to accomplish multi-modal language modeling/representation. We use the BERT and ResNet-152 models on each modality and combine them using attentive pooling to perform pre-training on a newly constructed dataset, which is collected from Wikimedia Commons based on concrete/abstract words. To demonstrate the performance of the proposed model, downstream tasks and ablation studies are performed. The contribution of this work is two-fold: a new dataset is constructed from Wikimedia Commons based on concrete/abstract words, and a new multi-modal pre-training approach based on curriculum learning is proposed. The results show that the proposed multi-modal pre-training approach contributes to the success of the model.
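The fusion step described in the abstract (combining BERT text features and ResNet-152 image features via attentive pooling) can be sketched as follows. This is a minimal illustrative sketch, not the authors' exact implementation: the class name, projection layers, and hidden dimension are assumptions; only the feature sizes (768 for BERT-base, 2048 for ResNet-152's pooled output) follow the standard models named in the abstract.

```python
import torch
import torch.nn as nn


class AttentivePooling(nn.Module):
    """Fuse per-modality feature vectors with learned attention weights.

    Hypothetical sketch: projections and dimensions are illustrative
    assumptions, not the paper's exact architecture.
    """

    def __init__(self, text_dim=768, image_dim=2048, hidden_dim=768):
        super().__init__()
        # Project both modalities into a shared space.
        self.text_proj = nn.Linear(text_dim, hidden_dim)
        self.image_proj = nn.Linear(image_dim, hidden_dim)
        # Learned scorer assigns an attention weight to each modality.
        self.attn = nn.Linear(hidden_dim, 1)

    def forward(self, text_feat, image_feat):
        # Stack the two projected modality vectors: (batch, 2, hidden_dim)
        stacked = torch.stack(
            [self.text_proj(text_feat), self.image_proj(image_feat)], dim=1
        )
        # Softmax over the modality axis yields weights that sum to 1.
        weights = torch.softmax(self.attn(torch.tanh(stacked)), dim=1)
        # Weighted sum over modalities -> fused vector (batch, hidden_dim)
        return (weights * stacked).sum(dim=1)


pool = AttentivePooling()
# text_feat: e.g. a BERT [CLS] embedding; image_feat: ResNet-152 pooled output
fused = pool(torch.randn(4, 768), torch.randn(4, 2048))
```

The attention weights let the model lean on the image for concrete concepts and on the text for abstract ones, which matches the curriculum motivation stated in the abstract.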
URI: https://doi.org/10.3390/app11178241
https://hdl.handle.net/11147/11404
ISSN: 2076-3417
Appears in Collections:Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection

Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.