Please use this identifier to cite or link to this item: http://cmuir.cmu.ac.th/jspui/handle/6653943832/77592
Full metadata record
DC Field | Value | Language
dc.contributor.author | Kornprom Pikulkaew | en_US
dc.contributor.author | Varin Chouvatut | en_US
dc.date.accessioned | 2022-10-16T07:48:47Z | -
dc.date.available | 2022-10-16T07:48:47Z | -
dc.date.issued | 2023-01-01 | en_US
dc.identifier.issn | 23673389 | en_US
dc.identifier.issn | 23673370 | en_US
dc.identifier.other | 2-s2.0-85135091457 | en_US
dc.identifier.other | 10.1007/978-981-19-2394-4_67 | en_US
dc.identifier.uri | https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85135091457&origin=inward | en_US
dc.identifier.uri | http://cmuir.cmu.ac.th/jspui/handle/6653943832/77592 | -
dc.description.abstract | Nowadays, facial expression technology is widespread. For instance, 2D pain detection is utilized in hospitals; nevertheless, it has some disadvantages that should be considered. Our goal was to design a 3D pain detection system, supporting all face orientations, that anybody may use before coming to the hospital. We utilized a dataset from the University of Northern British Columbia (UNBC) as the training set in this study. Our system classifies pain as not hurting, becoming painful, or painful. The system's effectiveness was established by comparing its results with those of highly trained medical professionals and of two-dimensional pain identification. In conclusion, our study has developed an uncomplicated, cost-effective, and easy-to-comprehend alternative tool for pre-admission pain screening for the general public and health providers. | en_US
dc.subject | Computer Science | en_US
dc.subject | Engineering | en_US
dc.title | Pain Detection Using Deep Learning Method from 3D Facial Expression and Movement of Motion | en_US
dc.type | Book Series | en_US
article.title.sourcetitle | Lecture Notes in Networks and Systems | en_US
article.volume | 464 | en_US
article.stream.affiliations | Chiang Mai University | en_US
Appears in Collections: CMUL: Journal Articles

Files in This Item:
There are no files associated with this item.
Items in CMUIR are protected by copyright, with all rights reserved, unless otherwise indicated.