Please use this identifier to cite or link to this item:
https://hdl.handle.net/11147/14856
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Yonder, Veli Mustafa | - |
dc.contributor.author | Dogan, Fehmi | - |
dc.contributor.author | Cavka, Hasan Burak | - |
dc.contributor.author | Tayfur, Gokmen | - |
dc.contributor.author | Dulgeroglu, Ozum | - |
dc.date.accessioned | 2024-10-25T23:18:36Z | - |
dc.date.available | 2024-10-25T23:18:36Z | - |
dc.date.issued | 2023 | - |
dc.identifier.isbn | 9789491207341 | - |
dc.identifier.issn | 2684-1843 | - |
dc.identifier.uri | https://hdl.handle.net/11147/14856 | - |
dc.description.abstract | People spend a considerable amount of time in public spaces for a variety of reasons, albeit at different times of the day and across seasons. It is therefore of utmost importance for both urban designers and local authorities to understand the architectural qualities of these spaces. Within the scope of this study, squares and green parks in Izmir, the third largest city in Turkey, were analyzed in terms of their dimensions, landscape characteristics, the quality of their semi-open spaces, their landmarks, accessibility, and overall aesthetic quality. Soft computing models, comprising a linear predictor, general regression neural networks, multilayer feed-forward neural networks (with 2, 3, 4, 5, and 6 hidden nodes), and genetic algorithms, were trained on the results of these analyses. In parallel, a visibility graph analysis and an axial map analysis were conducted using space syntax methods. The training results (i.e., root mean square error, mean absolute error, bad prediction rates for the testing and training phases, and standard deviation of absolute error) were compiled into a comparative table ordered by training time and root mean square error. According to this benchmarking table, the 2-node MLFNN predicts the aesthetic score most accurately, whereas the 6-node MLFNN is the least successful network. (An illustrative training sketch follows this metadata record.) | en_US |
dc.language.iso | en | en_US |
dc.publisher | eCAADe (Education and Research in Computer Aided Architectural Design in Europe) | en_US |
dc.relation.ispartof | 41st Conference on Education and Research in Computer Aided Architectural Design in Europe (ECAADE) -- SEP 18-23, 2023 -- Graz Univ Technol, Graz, AUSTRIA | en_US |
dc.relation.ispartofseries | eCAADe Proceedings | - |
dc.rights | info:eu-repo/semantics/closedAccess | en_US |
dc.subject | Multilayer Perceptron | en_US |
dc.subject | Architectural Aesthetics | en_US |
dc.subject | General Regression Neural Net | en_US |
dc.subject | Spatial Configuration | en_US |
dc.title | Decoding and Predicting the Attributes of Urban Public Spaces with Soft Computing Models and Space Syntax Approaches | en_US |
dc.type | Conference Object | en_US |
dc.department | Izmir Institute of Technology | en_US |
dc.identifier.startpage | 761 | en_US |
dc.identifier.endpage | 768 | en_US |
dc.identifier.wos | WOS:001235623100076 | - |
dc.relation.publicationcategory | Konferans Öğesi - Uluslararası - Kurum Öğretim Elemanı / Conference Item - International - Institutional Academic Staff | en_US |
dc.identifier.doi | [WOS-DOI-BELIRLENECEK-4] | - |
dc.authorwosid | Çavka, Hasan/R-1698-2019 | - |
dc.identifier.wosquality | N/A | - |
dc.identifier.scopusquality | Q4 | - |
dc.description.woscitationindex | Conference Proceedings Citation Index - Social Science & Humanities | - |
item.fulltext | No Fulltext | - |
item.grantfulltext | none | - |
item.languageiso639-1 | en | - |
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | - |
item.cerifentitytype | Publications | - |
item.openairetype | Conference Object | - |
crisitem.author.dept | 02.02. Department of Architecture | - |
crisitem.author.dept | 02.02. Department of Architecture | - |
crisitem.author.dept | 02.02. Department of Architecture | - |
crisitem.author.dept | 03.03. Department of Civil Engineering | - |
Appears in Collections: | WoS İndeksli Yayınlar Koleksiyonu / WoS Indexed Publications Collection |
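
The abstract above describes training a small multilayer feed-forward neural network (MLFNN) to predict an aesthetic score and reporting RMSE, MAE, and the standard deviation of the absolute error. The following is a minimal sketch of that setup, assuming a 2-hidden-node network trained with plain gradient descent on NumPy; the features and synthetic scores are hypothetical placeholders for illustration, not the study's dataset or code.

```python
# Minimal 2-node MLFNN sketch (hypothetical data; not the study's dataset).
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical predictors standing in for the analysed attributes
# (e.g. dimensions, green coverage, landmark count, accessibility index).
X = rng.uniform(0.0, 1.0, size=(120, 4))
# Hypothetical aesthetic scores, generated for illustration only.
y = (6.0 * X[:, 1] + 2.0 * X[:, 3] + rng.normal(0.0, 0.5, 120)).reshape(-1, 1)

# Simple 80/20 train/test split.
X_train, X_test = X[:96], X[96:]
y_train, y_test = y[:96], y[96:]

# One hidden layer with 2 tanh nodes and a linear output node.
W1 = rng.normal(0.0, 0.5, size=(4, 2))
b1 = np.zeros((1, 2))
W2 = rng.normal(0.0, 0.5, size=(2, 1))
b2 = np.zeros((1, 1))

lr = 0.05
for epoch in range(5000):
    # Forward pass.
    h = np.tanh(X_train @ W1 + b1)
    y_hat = h @ W2 + b2
    err = y_hat - y_train

    # Backpropagation of the mean-squared-error loss.
    grad_y = 2.0 * err / len(X_train)
    grad_W2 = h.T @ grad_y
    grad_b2 = grad_y.sum(axis=0, keepdims=True)
    grad_h = grad_y @ W2.T * (1.0 - h ** 2)
    grad_W1 = X_train.T @ grad_h
    grad_b1 = grad_h.sum(axis=0, keepdims=True)

    W1 -= lr * grad_W1
    b1 -= lr * grad_b1
    W2 -= lr * grad_W2
    b2 -= lr * grad_b2

# Metrics named in the abstract: RMSE, MAE, std. dev. of absolute error.
pred = np.tanh(X_test @ W1 + b1) @ W2 + b2
abs_err = np.abs(pred - y_test)
rmse = float(np.sqrt(np.mean((pred - y_test) ** 2)))
mae = float(np.mean(abs_err))
std_abs_err = float(np.std(abs_err))
print(f"RMSE={rmse:.3f}  MAE={mae:.3f}  std(|err|)={std_abs_err:.3f}")
```

Comparing such runs across hidden-layer sizes (2 to 6 nodes), training times, and error metrics is the kind of benchmarking table the abstract refers to; the study's actual models, data, and tooling are not reproduced here.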