Please use this identifier to cite or link to this item: http://cmuir.cmu.ac.th/jspui/handle/6653943832/56955
Title: A novel string grammar unsupervised possibilistic C-medians algorithm for sign language translation systems
Authors: Atcharin Klomsae
Sansanee Auephanwiriyakul
Nipon Theera-Umpon
Keywords: Chemistry;Computer Science;Mathematics;Physics and Astronomy
Issue Date: 1-Dec-2017
Abstract: © 2017 by the authors. Sign language is a basic method for solving communication problems between deaf and hearing people. To communicate, deaf and hearing people normally use hand gestures, which combine hand positioning, hand shapes, and hand movements. Thai Sign Language is the communication method for Thai hearing-impaired people. Our objective is to improve dynamic Thai Sign Language translation using a video captioning technique that requires no prior hand region detection and segmentation, based on the Scale Invariant Feature Transform (SIFT) and the String Grammar Unsupervised Possibilistic C-Medians (sgUPCMed) algorithm. This work is the first to propose the sgUPCMed algorithm to cope with the unsupervised generation of multiple prototypes in the possibilistic sense for string data. In our experiments, the Thai Sign Language data set (10 isolated sign language words) was collected from 25 subjects. Within a constrained environment, the best average result on the blind test data sets was 89-91% for signer-dependent cases and 81-85% for signer semi-independent cases. For the blind test data sets of signer-independent cases, the best average classification rate was 77-80%. Without a constrained environment, the average result was around 62-80% for the signer-independent experiments. To show that the proposed algorithm can be applied to other sign languages, the American Sign Language (RWTH-BOSTON-50) data set, consisting of 31 isolated American Sign Language words, was also used in the experiments. The system achieves 88.56% on the validation set alone and 91.35% on the combined training and validation sets.
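The abstract names the sgUPCMed algorithm but does not spell out its update rules. The sketch below is therefore only a hedged illustration of the general idea it builds on: possibilistic C-medians clustering over string data, here using Levenshtein distance and prototype strings drawn from the data set. The function names, the typicality update (Krishnapuram-Keller style), the eta estimate, and the toy data are all assumptions for illustration, not the authors' published method.

```python
# Illustrative sketch only: generic possibilistic C-medians clustering over
# strings with Levenshtein distance. All details (typicality update, eta
# estimate, median search restricted to the data set) are assumptions, not
# the sgUPCMed algorithm as published.

import random


def levenshtein(a: str, b: str) -> int:
    """Edit distance between two symbol strings (dynamic programming)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1, curr[j - 1] + 1, prev[j - 1] + cost))
        prev = curr
    return prev[-1]


def possibilistic_c_medians(strings, c=2, m=2.0, iters=20, seed=0):
    """Cluster strings into c groups; return (prototype strings, typicalities)."""
    rng = random.Random(seed)
    prototypes = rng.sample(strings, c)
    n = len(strings)
    u = [[1.0 / c] * n for _ in range(c)]  # initial typicalities
    for _ in range(iters):
        d = [[levenshtein(p, s) for s in strings] for p in prototypes]
        # eta_i: weighted mean squared distance within cluster i (one common choice).
        eta = []
        for i in range(c):
            num = sum((u[i][k] ** m) * (d[i][k] ** 2) for k in range(n))
            den = sum(u[i][k] ** m for k in range(n)) or 1.0
            eta.append(max(num / den, 1e-6))
        # Possibilistic typicality update: u = 1 / (1 + (d^2 / eta)^(1/(m-1))).
        for i in range(c):
            for k in range(n):
                u[i][k] = 1.0 / (1.0 + (d[i][k] ** 2 / eta[i]) ** (1.0 / (m - 1.0)))
        # Median update: the string minimizing the typicality-weighted distance sum.
        for i in range(c):
            prototypes[i] = min(
                strings,
                key=lambda s: sum(
                    (u[i][k] ** m) * levenshtein(s, strings[k]) for k in range(n)
                ),
            )
    return prototypes, u


if __name__ == "__main__":
    # Toy strings standing in for symbol sequences derived from per-frame SIFT codes.
    data = ["abbc", "abbcc", "abc", "xyz", "xyyz", "xxyz"]
    protos, typ = possibilistic_c_medians(data, c=2)
    print("prototypes:", protos)
```

In a sign-language pipeline along the lines described in the abstract, each video would first be converted into a string of symbols (e.g., codebook indices of SIFT descriptors per frame); the clustering would then produce multiple string prototypes per word for matching at recognition time.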
URI: https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85040053527&origin=inward
http://cmuir.cmu.ac.th/jspui/handle/6653943832/56955
ISSN: 2073-8994
Appears in Collections:CMUL: Journal Articles

Files in This Item:
There are no files associated with this item.


Items in CMUIR are protected by copyright, with all rights reserved, unless otherwise indicated.