Please use this identifier to cite or link to this item: http://cmuir.cmu.ac.th/jspui/handle/6653943832/80146
Title: Detection of emotion types from electroencephalography signals using a string grammar method
Other Titles: Emotion detection from electroencephalography using string grammar methods
Authors: กัมปนาท สุทธิจิระพันธ์
Authors: ศันสนีย์ เอื้อพันธ์วิริยะกุล
Issue Date: Sep-2024
Publisher: Chiang Mai : Graduate School, Chiang Mai University
Abstract: This research aims to develop a technique for classifying human emotions from electroencephalography (EEG) signals, which are inherently complex and significantly affected by noise. The technique employs string grammar in conjunction with the K-Nearest Neighbors (K-NN) algorithm to enhance the accuracy of emotion classification. Standard datasets, including SEED, SEED-IV, and DEAP, were utilized in this study. These datasets underwent decomposition into low- and high-frequency components using the Discrete Wavelet Transform (DWT), and statistical moments were calculated to generate feature vectors. The feature vectors were then clustered using K-means clustering and converted into character strings, which were classified into emotions with the K-NN algorithm. The experimental results indicate that the developed model achieved a maximum accuracy of 93.75% in Experiment 1 when using the SEED-IV dataset. In Experiment 2, the maximum accuracies were 71.85% (SEED), 80.56% (SEED-IV), and 27.73% (DEAP). In Experiment 3, they were 71.85% (SEED), 79.17% (SEED-IV), and 28.91% (DEAP). These findings demonstrate the effectiveness of the proposed technique in classifying emotions from EEG signals. However, the method performs well only on datasets similar to the SEED dataset.
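The pipeline described in the abstract (DWT decomposition, statistical moments, K-means symbolization, string-based K-NN) can be sketched as follows. This is a minimal illustration under assumed settings, not the thesis's actual implementation: the one-level Haar wavelet, the four moments, the cluster count, the alphabet, and the Levenshtein distance used for string K-NN are all illustrative choices.

```python
import numpy as np

def haar_dwt(signal):
    """One-level Haar DWT: returns (low-frequency, high-frequency) coefficients."""
    s = np.asarray(signal, dtype=float)[: len(signal) // 2 * 2]  # even length
    approx = (s[0::2] + s[1::2]) / np.sqrt(2)   # low-frequency band
    detail = (s[0::2] - s[1::2]) / np.sqrt(2)   # high-frequency band
    return approx, detail

def moments(x):
    """Statistical-moment features of one sub-band: mean, std, skewness, kurtosis."""
    mu, sd = x.mean(), x.std() + 1e-12
    z = (x - mu) / sd
    return np.array([mu, sd, (z ** 3).mean(), (z ** 4).mean()])

def features(window):
    """Feature vector for one EEG window: moments of both DWT sub-bands."""
    a, d = haar_dwt(window)
    return np.concatenate([moments(a), moments(d)])

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's K-means; returns the cluster centers."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def to_string(windows, centers):
    """Map each window to the character of its nearest cluster center."""
    X = np.array([features(w) for w in windows])
    labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    return "".join(chr(ord("a") + int(l)) for l in labels)

def levenshtein(s, t):
    """Edit distance between two symbol strings."""
    prev = list(range(len(t) + 1))
    for i, cs in enumerate(s, 1):
        cur = [i]
        for j, ct in enumerate(t, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                           prev[j - 1] + (cs != ct)))
        prev = cur
    return prev[-1]

def knn_predict(query, train_strings, train_labels, k=3):
    """Majority vote over the k training strings nearest in edit distance."""
    dists = sorted((levenshtein(query, s), y)
                   for s, y in zip(train_strings, train_labels))
    votes = [y for _, y in dists[:k]]
    return max(set(votes), key=votes.count)
```

In this sketch, each EEG recording becomes a string of cluster symbols, so classification reduces to nearest-neighbor search under a string distance, which is the general shape of the string-grammar approach the abstract describes.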
URI: http://cmuir.cmu.ac.th/jspui/handle/6653943832/80146
Appears in Collections:ENG: Theses



Items in CMUIR are protected by copyright, with all rights reserved, unless otherwise indicated.