Please use this identifier to cite or link to this item: http://cmuir.cmu.ac.th/jspui/handle/6653943832/52444
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Sansanee Auephanwiriyakul (en_US)
dc.contributor.author: Suwannee Phitakwinai (en_US)
dc.contributor.author: Wattanapong Suttapak (en_US)
dc.contributor.author: Phonkrit Chanda (en_US)
dc.contributor.author: Nipon Theera-Umpon (en_US)
dc.date.accessioned: 2018-09-04T09:25:21Z
dc.date.available: 2018-09-04T09:25:21Z
dc.date.issued: 2013-05-28 (en_US)
dc.identifier.issn: 01678655 (en_US)
dc.identifier.other: 2-s2.0-84878059124 (en_US)
dc.identifier.other: 10.1016/j.patrec.2013.04.017 (en_US)
dc.identifier.uri: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=84878059124&origin=inward (en_US)
dc.identifier.uri: http://cmuir.cmu.ac.th/jspui/handle/6653943832/52444
dc.description.abstract: Visual communication is important for a deaf and/or mute person. It is also one of the tools for communication between humans and machines. In this paper, we develop an automatic Thai sign language translation system that is able to translate sign language other than finger-spelling sign language. In particular, we utilize the Scale Invariant Feature Transform (SIFT) to match a test frame with observation symbols from keypoint descriptors collected in the signature library. These keypoint descriptors are computed from several keyframes recorded at different times of day over several days from five subjects. Hidden Markov Models (HMMs) are then used to translate observation sequences into words. We also collect Thai sign language videos from 20 subjects for testing. The system achieves approximately 86-95% on average in the signer-dependent experiment, 79.75% on average in the signer-semi-independent experiment (same subjects as used in the HMM training only), and 76.56% on average in the signer-independent experiment. These results are from the constrained system, in which each signer wears a long-sleeved shirt in front of a dark background. The unconstrained system, in which each signer does not wear a long-sleeved shirt and stands in front of various natural backgrounds, yields a good result of around 74% on average in the signer-independent experiment. An important feature of the proposed system is the consideration of the shapes and positions of the fingers, in addition to hand information. This feature gives the system the ability to recognize hand sign words that have similar gestures. © 2013 Elsevier B.V. All rights reserved. (en_US) (see the illustrative pipeline sketch after the metadata record below)
dc.subject: Computer Science (en_US)
dc.title: Thai sign language translation using Scale Invariant Feature Transform and Hidden Markov Models (en_US)
dc.type: Journal (en_US)
article.title.sourcetitle: Pattern Recognition Letters (en_US)
article.volume: 34 (en_US)
article.stream.affiliations: Chiang Mai University (en_US)
article.stream.affiliations: University of Phayao (en_US)
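The abstract above describes a two-stage pipeline: SIFT keypoints of each test frame are matched against a signature library to produce observation symbols, and per-word HMMs then decode the symbol sequence into words. The Python code below is a minimal sketch of that idea only; it is not the authors' implementation. The library layout (library_descriptors), the nearest-symbol assignment rule, the per-word HMM parameters (start_p, trans_p, emit_p as NumPy arrays), and all function names are hypothetical, and only standard OpenCV and NumPy calls are used.

import numpy as np
import cv2


def frame_to_symbol(frame_gray, library_descriptors, ratio=0.75):
    """Assign a frame to the library symbol with the most good SIFT matches."""
    sift = cv2.SIFT_create()
    _, desc = sift.detectAndCompute(frame_gray, None)
    if desc is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    best_symbol, best_count = None, -1
    for symbol, lib_desc in library_descriptors.items():
        pairs = matcher.knnMatch(desc, lib_desc, k=2)
        # Lowe's ratio test keeps only distinctive keypoint matches.
        good = [p for p in pairs
                if len(p) == 2 and p[0].distance < ratio * p[1].distance]
        if len(good) > best_count:
            best_symbol, best_count = symbol, len(good)
    return best_symbol


def forward_log_likelihood(obs, start_p, trans_p, emit_p):
    """Log-likelihood of a discrete observation sequence under one HMM."""
    alpha = np.log(start_p) + np.log(emit_p[:, obs[0]])
    for o in obs[1:]:
        # Sum over previous states in log space, then apply the emission term.
        alpha = (np.logaddexp.reduce(alpha[:, None] + np.log(trans_p), axis=0)
                 + np.log(emit_p[:, o]))
    return np.logaddexp.reduce(alpha)


def translate(obs_sequence, word_models):
    """Pick the sign word whose HMM best explains the observation symbols."""
    scores = {word: forward_log_likelihood(obs_sequence, *params)
              for word, params in word_models.items()}
    return max(scores, key=scores.get)


# Example wiring (hypothetical data): map each video frame to a symbol index,
# then score the index sequence against every word model:
#   symbols = [frame_to_symbol(f, library_descriptors) for f in frames]
#   obs = [symbol_index[s] for s in symbols if s is not None]
#   word = translate(obs, word_models)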
Appears in Collections:CMUL: Journal Articles

Files in This Item:
There are no files associated with this item.


Items in CMUIR are protected by copyright, with all rights reserved, unless otherwise indicated.