EEG Image Features for Classification of Emotional States

Author(s):  
Firgan N. Feradov
2021
pp. 1-11
Author(s):  
Yaning Liu ◽  
Lin Han ◽  
Hexiang Wang ◽  
Bo Yin

Papillary thyroid carcinoma (PTC) is a common carcinoma of the thyroid. Because many benign thyroid nodules have a papillary structure that is easily confused with PTC in morphology, pathologists must spend considerable time on the differential diagnosis of PTC, relying on personal diagnostic experience; the process is therefore subjective, and consistency among observers is difficult to achieve. To address this issue, we applied deep learning to the differential diagnosis of PTC and propose a histological image classification method based on an Inception-Residual convolutional neural network (IRCNN) and a support vector machine (SVM). First, to expand the dataset and resolve color inconsistency across histological images, a pre-processing module was constructed that includes color transfer and mirror transforms. Then, to alleviate overfitting of the deep learning model, the convolutional neural network was optimized by combining the Inception and Residual networks to extract image features. Finally, the SVM was trained on the image features extracted by the IRCNN to perform the classification task. Experimental results show the effectiveness of the proposed method in the classification of PTC histological images.
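
As a rough illustration of the pipeline described above, the sketch below pairs a pretrained Inception-ResNet backbone (a stand-in for the paper's IRCNN, whose exact architecture and training are not given here) with an SVM trained on pooled convolutional features. Dataset handling, the color-transfer pre-processing, and all hyperparameters are assumptions.

```python
# Hedged sketch: deep features from an Inception-ResNet backbone feed an SVM,
# loosely mirroring the IRCNN+SVM pipeline described above. InceptionResNetV2
# pretrained on ImageNet is only a stand-in feature extractor; in the paper the
# network is adapted to histology images.
import numpy as np
from tensorflow.keras.applications import InceptionResNetV2
from tensorflow.keras.applications.inception_resnet_v2 import preprocess_input
from sklearn.svm import SVC

# Stand-in backbone: global-average-pooled convolutional features.
backbone = InceptionResNetV2(include_top=False, weights="imagenet", pooling="avg")

def extract_features(images):
    """images: float array of shape (n, 299, 299, 3) with values in [0, 255]."""
    return backbone.predict(preprocess_input(images.copy()), verbose=0)

def train_svm(train_images, train_labels):
    """train_labels holds the PTC / benign classes; SVC settings are assumed."""
    clf = SVC(kernel="rbf", C=1.0)
    clf.fit(extract_features(train_images), train_labels)
    return clf
```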


2020
Author(s):  
Eleonora De Filippi ◽  
Mara Wolter ◽  
Bruno Melo ◽  
Carlos J. Tierra-Criollo ◽  
Tiago Bortolini ◽  
...  

Abstract
During the last decades, neurofeedback training for emotional self-regulation has received significant attention from both the scientific and clinical communities. However, most studies have focused on broader emotional states such as "negative vs. positive", primarily due to our poor understanding of the functional anatomy of more complex emotions at the electrophysiological level. Our proof-of-concept study aims at investigating the feasibility of classifying two complex emotions that have been implicated in mental health, namely tenderness and anguish, using features extracted from the electroencephalogram (EEG) signal in healthy participants. Electrophysiological data were recorded from fourteen participants during a block-designed experiment consisting of emotional self-induction trials combined with a multimodal virtual scenario. For the within-subject classification, a linear Support Vector Machine was trained with two sets of samples: 1) random cross-validation over the sliding windows of all trials; and 2) strategic cross-validation, assigning all the windows of one trial to the same fold. Spectral features, together with frontal-alpha asymmetry, were extracted using Complex Morlet Wavelet analysis. Classification with these features reached an average accuracy of 79.3% with random cross-validation and 73.3% with strategic cross-validation. A second set of features, extracted from amplitude time-series correlation analysis, significantly enhanced random cross-validation accuracy while showing performance similar to the spectral features under strategic cross-validation. These results suggest that complex emotions have distinct electrophysiological correlates, which paves the way for future EEG-based, real-time neurofeedback training of complex emotional states.
Significance statement
There is still little understanding of the correlates of high-order emotions (i.e., anguish and tenderness) in the physiological signals recorded with the EEG. Most studies have investigated emotions using functional magnetic resonance imaging (fMRI), including its real-time application in neurofeedback training. However, concerning therapeutic application, EEG is a more suitable tool with regard to cost and practicability. Therefore, our proof-of-concept study aims at establishing a method for classifying complex emotions that can later be used for EEG-based neurofeedback on emotion regulation. We recorded EEG signals during a multimodal, near-immersive emotion-elicitation experiment. The results demonstrate that intraindividual classification of discrete emotions with features extracted from the EEG is feasible and may be implemented in real time to enable neurofeedback.
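
The distinction between the two evaluation schemes above can be sketched with scikit-learn: a shuffled K-fold corresponds to the random cross-validation over sliding windows, while a grouped K-fold keyed on the trial index reproduces the strategic scheme, keeping all windows of a trial in the same fold. Variable names (X, y, trial_ids) and the choice of five folds are illustrative, not taken from the study.

```python
# Hedged sketch of random vs. strategic cross-validation for per-window
# EEG features (e.g., band powers and frontal-alpha asymmetry).
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import KFold, GroupKFold, cross_val_score

def evaluate(X, y, trial_ids, n_folds=5):
    """X: (n_windows, n_features); y: emotion labels; trial_ids: trial of each window."""
    clf = LinearSVC(max_iter=10000)

    # 1) Random cross-validation: windows from the same trial may end up in
    #    both the training and the test fold.
    random_acc = cross_val_score(clf, X, y, cv=KFold(n_splits=n_folds, shuffle=True))

    # 2) Strategic cross-validation: GroupKFold keeps every window of a trial
    #    in the same fold, avoiding within-trial leakage.
    strategic_acc = cross_val_score(clf, X, y, groups=trial_ids,
                                    cv=GroupKFold(n_splits=n_folds))

    return random_acc.mean(), strategic_acc.mean()
```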


Entropy
2021
Vol 23 (11)
pp. 1502
Author(s):  
Ben Wilkes ◽  
Igor Vatolkin ◽  
Heinrich Müller

We present a multi-modal genre recognition framework that considers the modalities audio, text, and image through features extracted from audio signals, album cover images, and lyrics of music tracks. In contrast to related work that learns features purely with a neural network, handcrafted features designed for each modality are also integrated, allowing for higher interpretability of the created models and further theoretical analysis of the impact of individual features on genre prediction. Genre recognition is performed by binary classification of a music track with respect to each genre, based on combinations of elementary features. For feature combination, a two-level technique is used that combines aggregation into fixed-length feature vectors with confidence-based fusion of classification results. Extensive experiments were conducted for three classifier models (Naïve Bayes, Support Vector Machine, and Random Forest) and numerous feature combinations. The results are presented visually, with data reduction for improved perceptibility achieved by multi-objective analysis and restriction to non-dominated data. Feature- and classifier-related hypotheses are formulated based on the data, and their statistical significance is formally analyzed. The statistical analysis shows that combining two modalities almost always leads to a significant increase in performance, and combining three modalities does so in several cases.
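
A minimal sketch of the second fusion level, confidence-based combination of per-modality classifiers for a single binary genre decision, might look as follows; the specific fusion rule used in the paper is not reproduced here, and averaging the classifiers' class probabilities is an assumed simplification.

```python
# Hedged sketch of confidence-based late fusion across modalities for one
# binary genre decision. Each modality (audio, lyrics, cover image) is assumed
# to already yield a fixed-length feature vector per track.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def fit_per_modality(features_by_modality, y):
    """features_by_modality: dict mapping modality name -> (n_tracks, d) array."""
    return {name: RandomForestClassifier(n_estimators=200).fit(X, y)
            for name, X in features_by_modality.items()}

def fuse_predict(models, features_by_modality):
    # Average the positive-class probabilities (the "confidences") of the
    # per-modality classifiers, then threshold at 0.5.
    probs = [models[name].predict_proba(X)[:, 1]
             for name, X in features_by_modality.items()]
    return (np.mean(probs, axis=0) >= 0.5).astype(int)
```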


Author(s):  
Marion Neumann ◽  
Lisa Hallau ◽  
Benjamin Klatt ◽  
Kristian Kersting ◽  
Christian Bauckhage

Modern communication and sensor technology, coupled with powerful pattern recognition algorithms for information extraction and classification, allow the development and use of integrated systems to tackle environmental problems. This integration is particularly promising for applications in crop farming, where such systems can help control growth and improve yields while minimizing harmful environmental impacts. Thus, the vision of sustainable agriculture for anybody, anytime, and anywhere in the world can be brought within reach. This chapter reviews and presents approaches to plant disease classification based on cell phone images, a novel way to supply farmers with personalized information and processing recommendations in real time. Several statistical image features and a novel scheme for measuring local textures of leaf spots are introduced. The classification of disease symptoms caused by various fungi or bacteria is evaluated for two important agricultural crop varieties, wheat and sugar beet.
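
For a concrete, if generic, picture of statistical texture descriptors of the kind mentioned above, the sketch below computes local-binary-pattern histograms and gray-level co-occurrence statistics for a leaf-spot patch with scikit-image. It is not the chapter's own texture scheme, and all parameter choices are assumptions.

```python
# Hedged sketch: generic statistical texture features for a leaf-spot region.
import numpy as np
from skimage.feature import local_binary_pattern, graycomatrix, graycoprops

def texture_features(gray_patch):
    """gray_patch: 2-D uint8 array cropped around one leaf spot."""
    # Uniform local binary patterns, summarized as a normalized histogram.
    lbp = local_binary_pattern(gray_patch, P=8, R=1, method="uniform")
    hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)

    # Gray-level co-occurrence matrix statistics (contrast, homogeneity).
    glcm = graycomatrix(gray_patch, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    stats = [graycoprops(glcm, prop)[0, 0] for prop in ("contrast", "homogeneity")]

    return np.concatenate([hist, stats])
```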


Author(s):  
Gustavo Assunção ◽  
Paulo Menezes ◽  
Fernando Perdigão

The idea of recognizing human emotion through speech (SER) has recently received considerable attention from the research community, mostly due to the current machine learning trend. Nevertheless, even the most successful methods are still rather lacking in terms of adaptation to specific speakers and scenarios, evidently reducing their performance when compared to humans. In this paper, we evaluate a large-scale machine learning model for classification of emotional states. This model has been trained for speaker identification but is instead used here as a front-end for extracting robust features from emotional speech. We aim to verify that SER improves when some speaker's emotional prosody cues are considered. Experiments using various state-of-the-art classifiers are carried out, using the Weka software, so as to evaluate the robustness of the extracted features. Considerable improvement is observed when comparing our results with other SER state-of-the-art techniques.
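
The front-end idea can be sketched as follows: a network pretrained for speaker identification produces utterance embeddings, and a conventional classifier is then trained on those embeddings for emotion labels. The `speaker_embedding` helper is a hypothetical placeholder for whichever pretrained speaker-ID model is used, and scikit-learn stands in for the Weka classifiers mentioned in the abstract.

```python
# Hedged sketch: embeddings from a speaker-identification front-end feed a
# conventional emotion classifier.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def speaker_embedding(waveform, sample_rate):
    """Hypothetical placeholder: return a fixed-length utterance embedding
    from a pretrained speaker-identification network."""
    raise NotImplementedError("plug in a pretrained speaker-ID model here")

def train_emotion_classifier(waveforms, sample_rate, emotion_labels):
    # Stack one embedding per utterance, then fit an assumed RBF-kernel SVM.
    X = np.vstack([speaker_embedding(w, sample_rate) for w in waveforms])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X, emotion_labels)
    return clf
```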


1964
Vol 110 (467)
pp. 561-570
Author(s):  
Walter J. Johannsen ◽  
Samuel H. Friedman ◽  
John V. Liccione

Although much effort has gone into the description and classification of schizophrenic behaviour patterns, little attempt has been made to detail the course of the disorder. The present paper is an effort to trace the effects of chronicity on perception by use of a cross-sectional method. The specific tasks were selected to maximize autochthonous factors in perception. At the same time a demonstrable sensitivity to emotional states was desired, since changes in the schizophrenic condition are often considered reflections of reaction to stress. A wide enough area of behaviour was encompassed so that a general pattern, if one emerged, could be considered typical of perception as a whole.

