Emotion Recognition from EEG Using Rhythm Synchronization Patterns with Joint Time-Frequency-Space Correlation

Author(s): Hongzhi Kuai, Hongxia Xu, Jianzhuo Yan

2021, Vol 0 (0)
Author(s): Ahmet Mert, Hasan Huseyin Celik

Abstract The feasibility of time–frequency (TF) ridge estimation on multi-channel electroencephalogram (EEG) signals is investigated for emotion recognition. The aim is to extract informative components at low computational cost using multivariate ridge estimation without reducing valence/arousal recognition accuracy. An advanced TF representation technique, the multivariate synchrosqueezing transform (MSST), is used to obtain well-localized components of the multi-channel EEG signals. The maximum-energy components in the 2D TF distribution are located by TF-ridge estimation to extract instantaneous frequency and instantaneous amplitude. Statistical descriptors of the estimated ridges serve as feature vectors for machine learning algorithms; component information in the multi-channel EEG signals is thereby captured and compressed into a low-dimensional space for emotion recognition. The mean and variance of the five maximum-energy ridges in the MSST-based TF distribution are adopted as the feature vector: properties of the five TF ridges in the frequency and energy planes (mean frequency, frequency deviation, mean energy, and energy deviation over time) are computed to obtain a 20-dimensional feature space. The proposed method is evaluated on the DEAP emotional EEG recordings for benchmarking, yielding recognition rates of up to 71.55% for high/low arousal and 70.02% for high/low valence.
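The ridge-based feature extraction described in the abstract can be sketched as follows. This is a simplified illustration, not the authors' implementation: it assumes a precomputed non-negative TF energy matrix (e.g., the magnitude of an MSST), and uses a greedy per-time argmax with band suppression as a stand-in for proper ridge estimation. The function name, the `exclusion` band width, and the greedy heuristic are all assumptions for illustration.

```python
import numpy as np

def extract_ridge_features(tfr, freqs, n_ridges=5, exclusion=3):
    """Greedy extraction of max-energy ridges from a TF energy matrix.

    tfr      : (n_freqs, n_times) non-negative TF energy distribution,
               e.g. |MSST| of a multi-channel EEG segment (assumed given).
    freqs    : (n_freqs,) frequency axis in Hz.
    n_ridges : number of maximum-energy ridges to extract (5 in the paper).
    Returns a 4*n_ridges feature vector: for each ridge, the mean
    frequency, frequency deviation, mean energy, and energy deviation
    over time (20 dimensions for n_ridges=5).
    """
    energy = tfr.astype(float).copy()
    n_freqs, n_times = energy.shape
    feats = []
    for _ in range(n_ridges):
        idx = np.argmax(energy, axis=0)                 # max-energy bin per time step
        inst_freq = freqs[idx]                          # instantaneous frequency
        inst_amp = energy[idx, np.arange(n_times)]      # instantaneous amplitude
        feats.extend([inst_freq.mean(), inst_freq.std(),
                      inst_amp.mean(), inst_amp.std()])
        # Suppress a band around the extracted ridge so the next pass
        # finds a different component (simplified exclusion heuristic).
        for t in range(n_times):
            lo = max(0, idx[t] - exclusion)
            hi = min(n_freqs, idx[t] + exclusion + 1)
            energy[lo:hi, t] = 0.0
    return np.asarray(feats)
```

The resulting 20-dimensional vector would then be fed to an ordinary classifier (e.g., SVM or k-NN) for high/low arousal and valence prediction, as in the benchmarking on DEAP.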


