Human Emotion Recognition Based on EEG Signal Using Fast Fourier Transform and K-Nearest Neighbor

2020 ◽  
Vol 5 (6) ◽  
pp. 1082-1088
Author(s):  
Anton Yudhana ◽  
Akbar Muslim ◽  
Dewi Eko Wati ◽  
Intan Puspitasari ◽  
Ahmad Azhari ◽  
...  
2017 ◽  
Vol 8 (2) ◽  
Author(s):  
Ranny Ranny ◽  
Yustinus Eko Soelistio ◽  
Ni Made Satvika

The local fruit industry is developing rapidly, but its products are less competitive than imported fruit. Indonesian fruit is highly varied, yet supporting technology in this industry has still not been implemented, which keeps the local fruit industry from competing with imported fruit. The purpose of this research is to develop a technology that can increase the use of technology in the fruit industry, with a focus on fruit sweetness measurement. This research uses the fruit tapping sound: the Fast Fourier Transform is applied as the sound feature extraction method, and based on the extracted features the fruit sweetness level can be predicted using k-Nearest Neighbor (kNN). The experiment in this research is divided into two parts, one of which uses the training data to predict the sweetness level of the fruits. The results show that the correlation between the tapping sound and the sweetness level can be used to predict the sweetness level of the fruit. Index Terms: Sweetness Degree, Brix, k Nearest Neighbor, Fast Fourier Transform.
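As a rough illustration of the pipeline this abstract describes, the Python sketch below extracts coarse FFT magnitude features from a tapping sound and classifies sweetness with kNN. The sampling rate, the number of frequency bands, the recordings, and the label values are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: FFT magnitude features from a fruit-tapping sound,
# classified into a sweetness (Brix) class with k-Nearest Neighbors.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def fft_features(signal, n_bins=64):
    """Average the FFT magnitude spectrum into n_bins coarse bands."""
    spectrum = np.abs(np.fft.rfft(signal))
    bands = np.array_split(spectrum, n_bins)
    return np.array([band.mean() for band in bands])

# Hypothetical training data: one tapping recording per fruit sample,
# each labeled with a Brix-derived sweetness class.
rng = np.random.default_rng(0)
recordings = [rng.standard_normal(22050) for _ in range(20)]  # ~1 s at 22.05 kHz
sweetness_labels = ["low", "medium", "high", "high"] * 5

X_train = np.array([fft_features(r) for r in recordings])
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X_train, sweetness_labels)

new_tap = rng.standard_normal(22050)          # a new, unlabeled tapping sound
print(knn.predict([fft_features(new_tap)]))   # predicted sweetness class
```

In practice the random arrays above would be replaced by actual tapping recordings, and k would be tuned against held-out Brix measurements.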


2021 ◽  
Vol 10 (1) ◽  
pp. 15-22
Author(s):  
Eko Budi Setiawan ◽  
Al Ghani Iqbal Dzulfiqar

This research was conducted to facilitate interaction between radio broadcasters and radio listeners during the song request process. It was triggered by the broadcasters' difficulty in monitoring song requests from listeners, so the system is built to accommodate all song requests from listeners. The application produced in this study uses speech emotion recognition technology based on a person's mood as derived from the spoken words. This technology classifies the voice into one of four mood categories: neutral, angry, sad, and afraid. The k-Nearest Neighbor method is used to obtain song-title recommendations by measuring the closeness between the listener's mood and the available song playlist; kNN is used because the method suits user-based collaborative filtering problems. kNN recommends three songs, which the broadcasters then offer to listeners. Based on tests conducted with broadcasters and radio listeners, this study has produced a song request application that recommends song titles according to the listener's mood and supports text messages, song search, song requests, and details of the songs that have been requested. The functional test scored 100% because all test components succeeded as expected.
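The abstract does not publish the recommendation code, but a minimal sketch of the kNN matching step could look like the following. The mood-to-feature mapping and the playlist mood profiles are invented for illustration; only the four mood categories and the three-song recommendation come from the abstract.

```python
# Illustrative sketch: map the detected mood to a feature vector and let
# k-Nearest Neighbors return the three closest songs in the playlist.
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Hypothetical mood profiles for the playlist: (valence, energy) per song.
playlist = {
    "Song A": (0.9, 0.8),   # upbeat
    "Song B": (0.2, 0.3),   # melancholic
    "Song C": (0.5, 0.5),   # neutral
    "Song D": (0.1, 0.9),   # intense
    "Song E": (0.6, 0.4),   # calm
}
titles = list(playlist)
profiles = np.array(list(playlist.values()))

# Assumed mapping from the four recognized moods into the same space.
mood_vectors = {
    "neutral": (0.5, 0.5),
    "angry":   (0.2, 0.9),
    "sad":     (0.1, 0.2),
    "afraid":  (0.3, 0.7),
}

knn = NearestNeighbors(n_neighbors=3).fit(profiles)
listener_mood = "sad"  # output of the speech emotion recognition step
_, idx = knn.kneighbors([mood_vectors[listener_mood]])
print([titles[i] for i in idx[0]])  # three recommended song titles
```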


2020 ◽  
Author(s):  
Aras Masood Ismael ◽  
Ömer F. Alçin ◽  
Karmand H. Abdalla ◽  
Abdulkadir K. Sengur

Abstract In this paper, a novel approach based on two-stepped majority voting is proposed for efficient EEG-based emotion classification. Emotion recognition is important for human-machine interaction. Approaches based on facial features and body gestures have generally been proposed for emotion recognition; recently, EEG-based approaches have become more popular. In the proposed approach, the raw EEG signals are initially low-pass filtered for noise removal, and band-pass filters are used for rhythm extraction. For each rhythm, the best-performing EEG channels are determined based on wavelet-based entropy features and fractal-dimension-based features. The k-nearest neighbor (KNN) classifier is used for classification. The best five EEG channels take part in majority voting to obtain the prediction for each EEG rhythm. In the second majority voting step, the predictions from all rhythms are combined into a final prediction. The DEAP dataset is used in the experiments, and classification accuracy, sensitivity, and specificity serve as the performance evaluation metrics. The experiments classify the emotions into two binary tasks: high valence (HV) vs. low valence (LV) and high arousal (HA) vs. low arousal (LA). They show that 86.3% HV vs. LV discrimination accuracy and 85.0% HA vs. LA discrimination accuracy are obtained. The obtained results are also compared with some existing methods; the comparisons show that the proposed method has potential for EEG-based emotion classification.
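A minimal sketch of the two-stepped majority voting scheme is given below, with random placeholder arrays standing in for the wavelet-entropy and fractal-dimension features and for the channel selection step; the five-channel/five-rhythm structure and the KNN classifier follow the abstract.

```python
# Sketch: per-channel KNN classifiers vote within each rhythm (step 1),
# then the per-rhythm decisions vote for the final label (step 2).
import numpy as np
from collections import Counter
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
rhythms = ["delta", "theta", "alpha", "beta", "gamma"]
n_channels, n_train = 5, 40                  # best five channels per rhythm

y_train = rng.integers(0, 2, n_train)        # 0 = LV, 1 = HV (binary task)

def rhythm_prediction(x_test_per_channel, X_train_per_channel):
    """First voting step: majority over the five channel classifiers."""
    votes = []
    for ch in range(n_channels):
        clf = KNeighborsClassifier(n_neighbors=5)
        clf.fit(X_train_per_channel[ch], y_train)
        votes.append(clf.predict([x_test_per_channel[ch]])[0])
    return Counter(votes).most_common(1)[0][0]

# Second voting step: majority over the per-rhythm predictions.
rhythm_votes = []
for _ in rhythms:
    X_train_ch = rng.standard_normal((n_channels, n_train, 8))  # dummy features
    x_test_ch = rng.standard_normal((n_channels, 8))
    rhythm_votes.append(rhythm_prediction(x_test_ch, X_train_ch))

final = Counter(rhythm_votes).most_common(1)[0][0]
print("HV" if final == 1 else "LV")
```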


2020 ◽  
Vol 49 (3) ◽  
pp. 285-298
Author(s):  
Jian Zhang ◽  
Yihou Min

Human emotion recognition is of vital importance to realizing human-computer interaction (HCI), and with the development of brain-computer interfaces (BCI), multichannel electroencephalogram (EEG) signals are gradually replacing other physiological signals as the main basis of emotion recognition research. However, the accuracy of emotion classification based on EEG signals under video stimulation is not stable, which may be related to the characteristics of the EEG signals before the stimulation is received. In this study, we extract the change in Differential Entropy (DE) before and after stimulation, based on the wavelet packet transform (WPT), to identify the individual emotional state. Using the EEG emotion database DEAP, we divide the experimental EEG data equally into 15 sets and extract their differential entropy on the basis of the WPT. We then calculate the change in DE for each separated EEG signal set. Finally, we divide the emotions into four categories in the two-dimensional valence-arousal emotional space by combining these features with the ensemble algorithm Random Forest (RF). The simulation results show that the WPT-RF model established by this method greatly improves the recognition rate of EEG signals, with an average classification accuracy of 87.3%. In addition, when the WPT-RF model is trained on individual subjects, the classification accuracy reaches 97.7%.
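A condensed sketch of the WPT-RF idea follows, using PyWavelets for the wavelet packet decomposition and the Gaussian closed form DE = 0.5 * log(2 * pi * e * sigma^2) for the differential entropy of each node. The segment length, the db4 wavelet, and the decomposition depth are assumptions for illustration, not parameters reported in the paper.

```python
# Sketch: differential entropy of wavelet packet nodes, with the change
# in DE between pre- and post-stimulus segments fed to a Random Forest.
import numpy as np
import pywt
from sklearn.ensemble import RandomForestClassifier

def wpt_de(segment, wavelet="db4", level=3):
    """Gaussian differential entropy of each level-3 wavelet packet node."""
    wp = pywt.WaveletPacket(segment, wavelet, maxlevel=level)
    return np.array([0.5 * np.log(2 * np.pi * np.e * np.var(node.data))
                     for node in wp.get_level(level)])

rng = np.random.default_rng(2)
n_trials = 60
# Dummy pre-/post-stimulus EEG segments; the feature is the DE change.
X = np.array([wpt_de(rng.standard_normal(512)) - wpt_de(rng.standard_normal(512))
              for _ in range(n_trials)])
y = rng.integers(0, 4, n_trials)  # four valence-arousal quadrants

rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print(rf.predict(X[:3]))  # predicted emotion quadrants for three trials
```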


2019 ◽  
Vol 7 (10) ◽  
pp. 43-47
Author(s):  
Shinde Ashok R. ◽  
Agnihotri Prashant P. ◽  
Raut S.D. ◽  
Khanale Prakash B.
