Please use this identifier to cite or link to this item: https://hdl.handle.net/11147/5744
Full metadata record
DC Field                           Value                                                      Language
dc.contributor.author              Baştanlar, Yalın                                           -
dc.date.accessioned                2017-06-12T13:15:31Z                                       -
dc.date.available                  2017-06-12T13:15:31Z                                       -
dc.date.issued                     2014                                                       -
dc.identifier.citation             Bastanlar, Y. (2014). Reduced egomotion estimation drift using omnidirectional views. Electronic Letters on Computer Vision and Image Analysis, 13(3), 1-12. doi:10.5565/rev/elcvia.564    en_US
dc.identifier.issn                 1577-5097                                                  -
dc.identifier.uri                  https://doi.org/10.5565/rev/elcvia.564                     -
dc.identifier.uri                  http://hdl.handle.net/11147/5744                           -
dc.description.abstract            Estimation of camera motion from a given image sequence is a common task for multi-view 3D computer vision applications. Salient features (lines, corners, etc.) in the images are used to estimate the motion of the camera, also called egomotion. This estimation suffers from an error build-up as the length of the image sequence increases, which causes a drift in the estimated position. In this letter, this phenomenon is demonstrated and an approach to improve the estimation accuracy is proposed. The main idea of the proposed method is to use an omnidirectional camera (360° horizontal field of view) in addition to a conventional (perspective) camera. By taking advantage of the correspondences between the omnidirectional and perspective images, the accuracy of the camera position estimates can be improved. In our work, we adopt the sequential structure-from-motion approach, which starts by estimating the motion between the first two views and then adds the remaining views one by one. We automatically match points between omnidirectional and perspective views. The point correspondences are used for the estimation of epipolar geometry, followed by the reconstruction of 3D points with iterative linear triangulation. In addition, we calibrate our cameras using the sphere camera model, which covers both omnidirectional and perspective cameras. This enables us to treat the cameras in the same way at every step of structure-from-motion (an illustrative sketch of these shared steps follows the metadata record below). We performed simulated and real-image experiments to compare the estimation accuracy when only perspective views are used and when an omnidirectional view is added. The results show that the proposed idea of adding omnidirectional views reduces the drift in egomotion estimation.    en_US
dc.language.iso                    en                                                         en_US
dc.publisher                       Centre de Visio per Computador                             en_US
dc.relation.ispartof               Electronic Letters on Computer Vision and Image Analysis   en_US
dc.rights                          info:eu-repo/semantics/openAccess                          en_US
dc.subject                         Egomotion estimation                                       en_US
dc.subject                         Omnidirectional cameras                                    en_US
dc.subject                         Structure-from-motion                                      en_US
dc.subject                         Visual odometry                                            en_US
dc.title                           Reduced egomotion estimation drift using omnidirectional views    en_US
dc.type                            Article                                                    en_US
dc.authorid                        TR176747                                                   en_US
dc.institutionauthor               Baştanlar, Yalın                                           -
dc.department                      İzmir Institute of Technology. Computer Engineering        en_US
dc.identifier.volume               13                                                         en_US
dc.identifier.issue                3                                                          en_US
dc.identifier.startpage            1                                                          en_US
dc.identifier.endpage              12                                                         en_US
dc.identifier.scopus               2-s2.0-84919771361                                         en_US
dc.relation.publicationcategory    Article - International Refereed Journal - Institutional Faculty Member    en_US
dc.identifier.doi                  10.5565/rev/elcvia.564                                     -
dc.relation.doi                    10.5565/rev/elcvia.564                                     en_US
dc.coverage.doi                    10.5565/rev/elcvia.564                                     en_US
dc.identifier.scopusquality        Q4                                                         -
item.openairecristype              http://purl.org/coar/resource_type/c_18cf                  -
item.cerifentitytype               Publications                                               -
item.fulltext                      With Fulltext                                              -
item.languageiso639-1              en                                                         -
item.grantfulltext                 open                                                       -
item.openairetype                  Article                                                    -
crisitem.author.dept               03.04. Department of Computer Engineering                  -
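
The two steps the abstract highlights as shared by both camera types can be sketched compactly: projection with the unified sphere camera model, and iterative linear triangulation. The following is a minimal, illustrative Python/NumPy sketch, not the paper's implementation; the function names, the mirror parameter xi, and the fixed iteration count are assumptions made for illustration. Setting xi = 0 recovers a conventional perspective camera, while xi > 0 models a central omnidirectional camera, which is what allows both camera types to be handled identically throughout structure-from-motion.

import numpy as np

def sphere_project(X, K, xi):
    # Unified sphere camera model (illustrative): project 3D points X (3, N),
    # given in camera coordinates, onto the unit sphere, then apply a
    # perspective projection from a center shifted by xi along the optical axis.
    Xs = X / np.linalg.norm(X, axis=0)            # points on the unit sphere
    m = Xs[:2] / (Xs[2] + xi)                     # projection from shifted center
    m_h = np.vstack([m, np.ones(m.shape[1])])
    uv = K @ m_h                                  # apply 3x3 intrinsics K
    return uv[:2] / uv[2]

def triangulate_iterative_linear(P1, P2, x1, x2, n_iter=10):
    # Iterative linear (DLT) triangulation of one correspondence, in the
    # spirit of Hartley & Sturm: re-solve the linear system with rows
    # reweighted by the current depths p3 . X. x1, x2 are homogeneous image
    # points with last component 1; P1, P2 are 3x4 projection matrices.
    w1 = w2 = 1.0
    X = np.zeros(4)
    for _ in range(n_iter):
        A = np.vstack([
            (x1[0] * P1[2] - P1[0]) / w1,
            (x1[1] * P1[2] - P1[1]) / w1,
            (x2[0] * P2[2] - P2[0]) / w2,
            (x2[1] * P2[2] - P2[1]) / w2,
        ])
        _, _, Vt = np.linalg.svd(A)               # null vector of A
        X = Vt[-1] / Vt[-1][3]
        w1, w2 = P1[2] @ X, P2[2] @ X             # updated depth weights
    return X[:3]

if __name__ == "__main__":
    # Hypothetical intrinsics and a simple two-view setup for a quick check.
    K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
    X = np.array([[0.5], [-0.2], [4.0]])
    print(sphere_project(X, K, xi=0.0))           # perspective camera
    print(sphere_project(X, K, xi=1.0))           # omnidirectional-style camera
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
    Xh = np.vstack([X, [[1.0]]])
    x1 = (P1 @ Xh).ravel()
    x1 = x1 / x1[2]
    x2 = (P2 @ Xh).ravel()
    x2 = x2 / x2[2]
    print(triangulate_iterative_linear(P1, P2, x1, x2))   # recovers (0.5, -0.2, 4.0)
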
Appears in Collections:Computer Engineering / Bilgisayar Mühendisliği
Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
Files in This Item:
File         Description    Size       Format
5744.pdf     Article        2.25 MB    Adobe PDF
Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.