Please use this identifier to cite or link to this item:
http://cmuir.cmu.ac.th/jspui/handle/6653943832/76351
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Nattapat Karaket | en_US |
dc.contributor.author | Sansanee Auephanwiriyakul | en_US |
dc.contributor.author | Nipon Theera-Umpon | en_US |
dc.date.accessioned | 2022-10-16T07:08:38Z | - |
dc.date.available | 2022-10-16T07:08:38Z | - |
dc.date.issued | 2021-01-01 | en_US |
dc.identifier.issn | 21903026 | en_US |
dc.identifier.issn | 21903018 | en_US |
dc.identifier.other | 2-s2.0-85105943493 | en_US |
dc.identifier.other | 10.1007/978-981-33-6757-9_46 | en_US |
dc.identifier.uri | https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85105943493&origin=inward | en_US |
dc.identifier.uri | http://cmuir.cmu.ac.th/jspui/handle/6653943832/76351 | - |
dc.description.abstract | To support education-assistive technology in the automobile industry, one of the best approaches is to create an augmented reality view of each car part in the engine room. To do so, the location of each part must be identified. In this paper, the Multi-layer Multi-model Images Classifier Ensemble (MICE) is utilized in the recognition process. The data were collected from the engine room of a Toyota Vios 2017 at different angles and under different lighting conditions. The recognition rate on the best validation test set is 91.25%, although similarity between classes remains a problem. | en_US |
dc.subject | Computer Science | en_US |
dc.subject | Decision Sciences | en_US |
dc.title | Automobile Parts Localization Using Multi-layer Multi-model Images Classifier Ensemble | en_US |
dc.type | Book Series | en_US |
article.title.sourcetitle | Smart Innovation, Systems and Technologies | en_US |
article.volume | 212 | en_US |
article.stream.affiliations | Chiang Mai University | en_US |
Appears in Collections: CMUL: Journal Articles
Files in This Item:
There are no files associated with this item.
Items in CMUIR are protected by copyright, with all rights reserved, unless otherwise indicated.