EEG Signals
Recently Published Documents


TOTAL DOCUMENTS: 4287 (FIVE YEARS: 1794)

H-INDEX: 82 (FIVE YEARS: 19)

2022 ◽  
Vol 73 ◽  
pp. 103418
Author(s):  
Fatma Krikid ◽  
Ahmad Karfoul ◽  
Sahbi Chaibi ◽  
Amar Kachenoura ◽  
Anca Nica ◽  
...  

2022 ◽  
Vol 74 ◽  
pp. 103479
Author(s):  
Meiyan Zhang ◽  
Dan Liu ◽  
Qisong Wang ◽  
Boqi Zhao ◽  
Ou Bai ◽  
...  

2022 ◽  
Vol 73 ◽  
pp. 103417
Author(s):  
Afshin Shoeibi ◽  
Navid Ghassemi ◽  
Marjane Khodatars ◽  
Parisa Moridian ◽  
Roohallah Alizadehsani ◽  
...  

Author(s):  
I Made Agus Wirawan ◽  
Retantyo Wardoyo ◽  
Danang Lelono

Electroencephalogram (EEG) signals offer several advantages for emotion recognition. However, success in this area is strongly influenced by: i) the distribution of the data used, ii) differences in participant characteristics, and iii) the characteristics of the EEG signals themselves. In response to these issues, this study examines three important points that affect the success of emotion recognition, framed as research questions: i) What factors need to be considered when generating and distributing EEG data? ii) How can EEG signals be processed while accounting for differences in participant characteristics? iii) How do the characteristics present among EEG signal features affect emotion recognition? The results indicate several important challenges for further study in EEG-based emotion recognition research: i) determining robust methods for imbalanced EEG data, ii) determining an appropriate smoothing method to eliminate disturbances in the baseline signals, iii) determining the best baseline reduction methods to reduce participant-specific differences in the EEG signals, and iv) determining a robust capsule network architecture that overcomes the loss of knowledge information and applies to more diverse data sets.
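The baseline-reduction idea raised in this abstract — subtracting a participant's pre-stimulus resting activity from the trial signal to dampen inter-participant differences — can be illustrated with a minimal NumPy sketch. The function name, segment length, and subtract-the-segment-averaged-baseline strategy are illustrative assumptions, not the authors' exact method:

```python
import numpy as np

def baseline_reduce(trial, baseline, seg_len):
    """Subtract the segment-averaged baseline from each trial segment.

    trial:    (n_channels, n_trial_samples)  post-stimulus EEG
    baseline: (n_channels, n_base_samples)   pre-stimulus resting EEG
    seg_len:  samples per segment (hypothetical choice, e.g. one second)
    """
    n_channels = trial.shape[0]
    # average the baseline over fixed-length segments -> per-channel reference
    n_base = baseline.shape[1] // seg_len
    base_segs = baseline[:, :n_base * seg_len].reshape(n_channels, n_base, seg_len)
    base_mean = base_segs.mean(axis=1)                # (n_channels, seg_len)
    # subtract that reference from every trial segment
    n_seg = trial.shape[1] // seg_len
    segs = trial[:, :n_seg * seg_len].reshape(n_channels, n_seg, seg_len)
    return (segs - base_mean[:, None, :]).reshape(n_channels, n_seg * seg_len)
```

A participant-specific constant offset present in both the baseline and the trial cancels out entirely, which is the intended effect of this family of methods.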


2022 ◽  
Vol 72 ◽  
pp. 103292
Author(s):  
Christos Stergiadis ◽  
Vasiliki-Despoina Kostaridou ◽  
Manousos A. Klados
Keyword(s):  

PLoS ONE ◽  
2022 ◽  
Vol 17 (1) ◽  
pp. e0262417
Author(s):  
Cédric Simar ◽  
Robin Petit ◽  
Nichita Bozga ◽  
Axelle Leroy ◽  
Ana-Maria Cebolla ◽  
...  

Objective: Different visual stimuli are classically used to trigger visual evoked potentials comprising well-defined components linked to the content of the displayed image. These evoked components result from averaging ongoing EEG signals, in which additive and oscillatory mechanisms contribute to the component morphology. Event-related potentials often result from a mixed situation (power variation and phase-locking), making basic and clinical interpretation difficult. Moreover, the grand-average methodology produces artificial constructs that do not reflect individual peculiarities. This has motivated new approaches based on single-trial analysis, as recently used in the brain-computer interface field. Approach: We hypothesize that EEG signals may include specific information about the visual features of the displayed image, and that such distinctive traits can be identified by state-of-the-art classification algorithms based on Riemannian geometry. The same classification algorithms are also applied to the dipole sources estimated by sLORETA. Main results and significance: We show that our classification pipeline can effectively discriminate between the display of different visual items (checkerboard versus 3D navigational image) in single EEG trials across multiple subjects. The method reaches a single-trial classification accuracy of about 84% and 93% for inter-subject and intra-subject classification, respectively, using surface EEG. Interestingly, the classification algorithms trained on sLORETA source estimates fail to generalize across subjects (63%), which may be due to either the average head model used by sLORETA or the subsequent spatial filtering failing to extract discriminative information, but they reach an intra-subject classification accuracy of 82%.
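The Riemannian-geometry classification idea described in this abstract can be sketched in minimal form: represent each trial by its spatial covariance matrix and assign it to the class whose mean covariance is nearest. The class and function names below are illustrative, and the log-Euclidean metric is an assumption for simplicity; the paper's actual pipeline may use the affine-invariant metric and a different classifier:

```python
import numpy as np

def spd_logm(C):
    # matrix logarithm of a symmetric positive-definite matrix via eigendecomposition
    w, V = np.linalg.eigh(C)
    return (V * np.log(w)) @ V.T

def covariances(trials):
    # trials: (n_trials, n_channels, n_samples) -> per-trial spatial covariance
    return np.array([X @ X.T / X.shape[1] for X in trials])

class LogEuclideanMDM:
    """Minimum-distance-to-mean classifier under the log-Euclidean metric."""

    def fit(self, covs, y):
        self.classes_ = np.unique(y)
        # log-Euclidean class mean: average of matrix logs per class
        self.means_ = {c: np.mean([spd_logm(C) for C in covs[y == c]], axis=0)
                       for c in self.classes_}
        return self

    def predict(self, covs):
        logs = [spd_logm(C) for C in covs]
        # nearest class mean in Frobenius norm of the matrix-log domain
        return np.array([min(self.classes_,
                             key=lambda c: np.linalg.norm(L - self.means_[c]))
                         for L in logs])
```

On two synthetic classes whose channel covariances differ in scale, this nearest-mean rule separates single trials cleanly, which conveys why covariance-based Riemannian classifiers work well for single-trial EEG.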

