Please use this identifier to cite or link to this item:
https://hdl.handle.net/11147/9571
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Köksal, Ali | - |
dc.contributor.author | Özuysal, Mustafa | - |
dc.date.accessioned | 2020-07-25T22:17:41Z | - |
dc.date.available | 2020-07-25T22:17:41Z | - |
dc.date.issued | 2019 | - |
dc.identifier.issn | 1751-9632 | - |
dc.identifier.issn | 1751-9640 | - |
dc.identifier.uri | https://doi.org/10.1049/iet-cvi.2018.5613 | - |
dc.identifier.uri | https://hdl.handle.net/11147/9571 | - |
dc.description.abstract | The authors propose a novel approach to describing objects from the contours in their images using real-valued feature vectors. The approach is particularly suitable when objects of interest yield high-contrast, texture-free images, or when texture variation is so high that textural cues become nuisance factors for classification. The proposed descriptor is suited to nearest neighbour classification, which remains popular in embedded vision applications where power considerations outweigh performance requirements. The authors describe object outlines purely from histograms of contour tangent directions, mimicking many of the design heuristics of texture-based descriptors such as the scale-invariant feature transform (SIFT). Unlike SIFT and its variants, however, the proposed approach is designed to work directly with contour data, and it is robust to variations inside and outside the object outline as well as to the sampling of the contour itself. They show that relying on tangent direction estimation, as opposed to gradient computation, yields a more robust description and higher nearest neighbour classification rates across a variety of classification problems. | en_US |
dc.language.iso | en | en_US |
dc.publisher | Institution of Engineering and Technology | en_US |
dc.relation.ispartof | IET Computer Vision | en_US |
dc.rights | info:eu-repo/semantics/openAccess | en_US |
dc.subject | Image classification | en_US |
dc.subject | Feature extraction | en_US |
dc.subject | Gradient methods | en_US |
dc.subject | Textural cues | en_US |
dc.subject | Embedded vision applications | en_US |
dc.subject | SIFT | en_US |
dc.subject | Nearest neighbour classification | en_US |
dc.title | Curve description by histograms of tangent directions | en_US |
dc.type | Article | en_US |
dc.institutionauthor | Köksal, Ali | - |
dc.institutionauthor | Özuysal, Mustafa | - |
dc.department | İzmir Institute of Technology. Computer Engineering | en_US |
dc.identifier.volume | 13 | en_US |
dc.identifier.issue | 5 | en_US |
dc.identifier.startpage | 507 | en_US |
dc.identifier.endpage | 514 | en_US |
dc.identifier.wos | WOS:000479306100008 | en_US |
dc.identifier.scopus | 2-s2.0-85070439212 | en_US |
dc.relation.publicationcategory | Makale - Uluslararası Hakemli Dergi - Kurum Öğretim Elemanı | en_US |
dc.identifier.doi | 10.1049/iet-cvi.2018.5613 | - |
dc.identifier.wosquality | Q3 | - |
dc.identifier.scopusquality | Q3 | - |
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | - |
item.languageiso639-1 | en | - |
item.cerifentitytype | Publications | - |
item.grantfulltext | open | - |
item.openairetype | Article | - |
item.fulltext | With Fulltext | - |
crisitem.author.dept | 01. Izmir Institute of Technology | - |
crisitem.author.dept | 03.04. Department of Computer Engineering | - |
Appears in Collections:
- Computer Engineering / Bilgisayar Mühendisliği
- Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection
- WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection
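The abstract describes a descriptor built from histograms of contour tangent directions. As an illustrative sketch only — not the authors' published method, whose binning, weighting, and invariance details are in the paper — the core idea can be shown with tangents estimated by central differences and angles binned modulo π:

```python
import math

def tangent_direction_histogram(points, bins=8):
    """Histogram of tangent directions along a closed contour (sketch).

    Tangents are estimated by central differences between neighbouring
    samples; angles are taken modulo pi (a tangent line has no preferred
    sign) and accumulated into `bins` equal-width bins, then normalised
    to sum to one.
    """
    n = len(points)
    hist = [0.0] * bins
    for i in range(n):
        x0, y0 = points[(i - 1) % n]
        x1, y1 = points[(i + 1) % n]
        angle = math.atan2(y1 - y0, x1 - x0) % math.pi
        hist[int(angle / math.pi * bins) % bins] += 1.0
    total = sum(hist)
    return [h / total for h in hist] if total else hist

# Tangent directions around a circle are uniformly distributed,
# so every bin receives roughly the same mass.
circle = [(math.cos(2 * math.pi * i / 100), math.sin(2 * math.pi * i / 100))
          for i in range(100)]
print(tangent_direction_histogram(circle, bins=4))  # each bin ≈ 0.25
```

Working on the sampled contour itself, rather than on image gradients, is what the abstract credits for the descriptor's robustness to texture inside and outside the outline.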
Files in This Item:
File | Size | Format | |
---|---|---|---|
IET Computer Vision.pdf | 2.21 MB | Adobe PDF | View/Open |
SCOPUS™ Citations: 1 (checked on Oct 25, 2024)
Page view(s): 65,378 (checked on Oct 28, 2024)
Download(s): 182 (checked on Oct 28, 2024)
Items in GCRIS Repository are protected by copyright, with all rights reserved, unless otherwise indicated.