artifact rejection
Recently Published Documents


TOTAL DOCUMENTS: 110 (five years: 26)

H-INDEX: 14 (five years: 3)

Entropy ◽  
2021 ◽  
Vol 23 (8) ◽  
pp. 1030
Author(s):  
Ana Tost ◽  
Carolina Migliorelli ◽  
Alejandro Bachiller ◽  
Inés Medina-Rivera ◽  
Sergio Romero ◽  
...  

Rett syndrome is a disease that involves acute cognitive impairment and, consequently, a complex and varied symptomatology. This study evaluates the EEG signals of twenty-nine patients and classifies them according to the level of movement artifact. The main goal is to achieve an artifact rejection strategy that performs well on all signals, regardless of artifact level. Two different methods have been studied: one based on the data distribution and the other based on an energy function, with entropy as its main component. The method based on the data distribution shows poor performance on signals containing high-amplitude outliers. By contrast, the method based on the energy function is more robust to outliers: as it does not depend on the data distribution, it is not affected by artifactual events. A double rejection strategy has been chosen, applied first to a motion signal (accelerometer, or EEG low-pass filtered between 1 and 10 Hz) and then to the EEG signal. The results showed higher performance when both artifact rejection methods were combined: the energy-based method to isolate motion artifacts, and the data-distribution-based method to eliminate the remaining lower-amplitude artifacts. In conclusion, a new method that proves robust for all types of signals is designed.
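The contrast the abstract draws between the two rejection strategies can be sketched as follows. This is an illustrative toy implementation, not the authors' code: the window length, the mean + z·SD rule for the distribution-based detector, and the median/MAD rule for the energy-based detector are all assumptions chosen to show why an amplitude-distribution threshold is inflated by large outliers while a robust energy criterion is not.

```python
import numpy as np

def reject_by_distribution(x, fs, win_s=1.0, z=3.0):
    """Flag windows whose peak amplitude exceeds mean + z * SD of the
    whole signal. Large outliers inflate the SD, so this detector can
    miss (or mis-scale around) high-amplitude artifacts."""
    w = int(win_s * fs)
    n = len(x) // w
    segs = x[: n * w].reshape(n, w)
    peaks = np.abs(segs).max(axis=1)
    thr = x.mean() + z * x.std()
    return peaks > thr  # True = artifact window

def reject_by_energy(x, fs, win_s=1.0, k=5.0):
    """Flag windows by short-time energy against a robust median/MAD
    baseline, so extreme outliers do not distort the threshold."""
    w = int(win_s * fs)
    n = len(x) // w
    segs = x[: n * w].reshape(n, w)
    energy = (segs ** 2).sum(axis=1)
    med = np.median(energy)
    mad = np.median(np.abs(energy - med)) + 1e-12
    return energy > med + k * mad
```

A double rejection in this spirit would run `reject_by_energy` on the motion channel first, then a distribution-based pass on the remaining EEG windows.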


2021 ◽  
Author(s):  
Alexa Danielle Monachino ◽  
Kelsie Lynn Lopez ◽  
Lara J. Pierce ◽  
Laurel Joy Gabard-Durnam

Event-related potential (ERP) designs are a common method for interrogating neurocognitive function with electroencephalography (EEG). However, the gold standard for preprocessing ERP data is manual editing, a subjective, time-consuming process. A number of automated pipelines have recently been created to address the need for standardization, automation, and quantification of EEG data processing; however, few are optimized for ERP analyses (especially in developmental or clinical populations). To fill this need, we propose and validate the HAPPE plus Event-Related (HAPPE+ER) software, a standardized and automated processing pipeline optimized for ERP analyses of EEG data. HAPPE+ER processes event-related potential data from raw files through a series of filtering, line noise reduction, bad channel detection, artifact rejection from continuous data, segmentation, and bad segment rejection steps. HAPPE+ER also includes post-processing reports of both data quality and pipeline quality metrics to facilitate evaluating and reporting data processing in a standardized manner. Finally, HAPPE+ER includes a post-processing script to facilitate generating ERP figures and measures for statistical analysis. We describe multiple approaches with both adult and developmental data to optimize and validate pipeline performance. HAPPE+ER software is freely available under the terms of the GNU General Public License at https://github.com/PINE-Lab/HAPPE.
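The segmentation and bad-segment-rejection stages the pipeline lists can be illustrated with a minimal sketch. This is not HAPPE+ER's actual code: the epoch window, baseline interval, and the peak-to-peak rejection criterion (`reject_uV`) are illustrative assumptions standing in for the pipeline's optimized steps.

```python
import numpy as np

def epoch_and_average(eeg, events, fs, tmin=-0.1, tmax=0.5,
                      reject_uV=100.0):
    """Cut epochs around event samples, baseline-correct on the
    pre-stimulus interval, drop epochs whose peak-to-peak amplitude
    exceeds reject_uV, and average the survivors into an ERP."""
    pre, post = int(-tmin * fs), int(tmax * fs)
    epochs = []
    for ev in events:
        if ev - pre < 0 or ev + post > eeg.shape[-1]:
            continue  # event too close to the recording edge
        seg = eeg[ev - pre : ev + post].copy()
        seg -= seg[:pre].mean()            # baseline correction
        if seg.max() - seg.min() <= reject_uV:
            epochs.append(seg)
    if not epochs:
        raise ValueError("all epochs rejected")
    return np.mean(epochs, axis=0), len(epochs)
```

Reporting `len(epochs)` alongside the ERP mirrors the pipeline's emphasis on standardized data-quality metrics (here, trials retained after rejection).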


2021 ◽  
Author(s):  
Kelsie Lynn Lopez ◽  
Alexa Danielle Monachino ◽  
Santiago Morales ◽  
Stephanie Leach ◽  
Maureen Bowers ◽  
...  

Low-density electroencephalography (EEG) recordings (e.g. fewer than 32 electrodes) are widely used in research and clinical practice and enable scalable brain function measurement across a variety of settings and populations. Though a number of automated pipelines have recently been proposed to standardize and optimize EEG preprocessing for high-density systems with state-of-the-art methods, few solutions have emerged that are compatible with low-density systems. However, low-density data often include long recording times and/or large sample sizes that would benefit from similar standardization and automation with contemporary methods. To address this need, we propose the HAPPE In Low Electrode Electroencephalography (HAPPILEE) pipeline as a standardized, automated pipeline optimized for EEG recordings with low-density channel layouts of any size. HAPPILEE processes task-free (e.g. resting-state), task-related, and event-related potential (ERP) EEG data, from raw files through a series of processing steps including filtering, line noise reduction, bad channel detection, artifact rejection from continuous data, segmentation, and bad segment rejection that have all been optimized for low-density data. HAPPILEE also includes post-processing reports of data and pipeline quality metrics to facilitate evaluating and reporting data quality and processing-related changes to the data in a standardized manner. We describe multiple approaches with both recorded and simulated EEG data to optimize and validate pipeline performance. The HAPPILEE pipeline is freely available as part of HAPPE 2.0 software under the terms of the GNU General Public License at: https://github.com/PINE-Lab/HAPPE.
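Bad channel detection is one of the steps that must work differently with few electrodes, since there is no dense neighborhood to interpolate from. A minimal sketch of one common low-density-friendly heuristic, flagging flat channels and channels poorly correlated with the rest of the montage, is below; the thresholds and the correlation-with-the-mean criterion are illustrative assumptions, not HAPPILEE's actual detection logic.

```python
import numpy as np

def detect_bad_channels(data, flat_tol=1e-8, corr_thresh=0.2):
    """data: (n_channels, n_samples). Flag channels that are flat
    (near-zero variance) or poorly correlated with the mean of the
    remaining channels."""
    bads = []
    for i in range(data.shape[0]):
        ch = data[i]
        if ch.std() < flat_tol:
            bads.append(i)          # dead/disconnected electrode
            continue
        others = np.delete(data, i, axis=0).mean(axis=0)
        r = np.corrcoef(ch, others)[0, 1]
        if r < corr_thresh:
            bads.append(i)          # channel does not share montage signal
    return bads
```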


2021 ◽  
Author(s):  
Phillip M. Alday ◽  
Jeroen van Paridon

Traditionally, artifacts are handled in one of two ways in ERP studies: (1) rejection of affected segments or (2) correction via e.g. ICA. Threshold-based rejection is problematic because of the arbitrariness of the chosen limits and of the particular threshold criterion (e.g. peak-to-peak, absolute, slope), resulting in large researcher degrees of freedom. Manual rejection may suffer from low inter-rater reliability and is often done without appropriate blinding. Additionally, rejections are typically applied to an entire trial, even if the ERP measure of interest is not impacted by the artifact in question (e.g. a motion artifact at the end of the trial). Moreover, fixed thresholds cannot distinguish between non-artifactual extreme values (i.e. those arising from brain activity, which carry some 'signal' along with some 'noise') and truly artifactual values (e.g. those arising from muscle activity or the electrical environment, which are essentially pure 'noise'). These problems become particularly acute when analyzing EEG recorded under more naturalistic conditions, such as free dialogue in hyperscanning or virtual reality. By using modern, robust statistical methods, we can avoid setting arbitrary thresholds and let the statistical model extract the signal from the noise. To demonstrate this, we re-analyzed data from a multimodal virtual-reality N400 paradigm. We created two versions of the dataset, one using traditional threshold-based peak-to-peak artifact rejection (150 µV) and one without artifact rejection, and examined the mean voltage at 250-350 ms after stimulus onset. We then analyzed the data with both robust and traditional techniques from both a frequentist and a Bayesian perspective. The non-robust models yielded different effect estimates when fit to dirty data than when fit to cleaned data, as well as different estimates of the residual variation. The robust models, meanwhile, estimated similar effect sizes for the dirty and cleaned data, with only slightly different estimates of the residual variation. In other words, the robust model worked equally well with or without artifact rejection and did not require setting any arbitrary thresholds, whereas the standard, non-robust model was sensitive to the degree of data cleaning. This suggests that robust methods should become the standard in ERP analysis, regardless of the data cleaning procedure.
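The core point, that a robust estimator downweights artifactual trials instead of requiring a threshold, can be demonstrated with a toy example. This is not the authors' model (they fit regression models to the N400 data); it is a minimal Huber M-estimator of a condition effect on simulated single-trial voltages, with all numbers invented for illustration.

```python
import numpy as np

def huber_mean(x, k=1.345, n_iter=50):
    """Huber M-estimate of location via iteratively reweighted least
    squares; scale fixed to the (normal-consistent) MAD."""
    mu = np.median(x)
    scale = np.median(np.abs(x - mu)) / 0.6745 + 1e-12
    for _ in range(n_iter):
        r = (x - mu) / scale
        w = np.where(np.abs(r) <= k, 1.0, k / np.abs(r))  # downweight outliers
        mu = np.sum(w * x) / np.sum(w)
    return mu

# Simulated single-trial amplitudes: a true condition difference of 2,
# plus a handful of large movement artifacts left in the "dirty" data.
rng = np.random.default_rng(1)
cond_a = rng.normal(0.0, 1.0, 200)
cond_b = rng.normal(2.0, 1.0, 200)
cond_b[:5] += 100.0  # 5 artifactual trials

naive_effect = cond_b.mean() - cond_a.mean()              # pulled off target
robust_effect = huber_mean(cond_b) - huber_mean(cond_a)   # near the true 2
```

Without any rejection threshold, the robust estimate stays near the true effect, while the ordinary mean is dragged upward by the five artifactual trials, mirroring the abstract's dirty-vs-cleaned comparison.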


2021 ◽  
Vol 15 ◽  
Author(s):  
Omid Abbasi ◽  
Nadine Steingräber ◽  
Joachim Gross

Recording brain activity during speech production using magnetoencephalography (MEG) can help us understand the dynamics of speech production. However, these measurements are challenging due to induced artifacts from several sources, such as facial muscle activity, lower jaw movements, and head movements. Here, we aimed to characterize speech-related artifacts, focusing on head movements, and subsequently present an approach to remove these artifacts from MEG data. We recorded MEG from 11 healthy participants while they pronounced various syllables at different loudness levels. Head positions and orientations were extracted during speech production to investigate their role in MEG distortions. Finally, we present an artifact rejection approach combining regression analysis and signal space projection (SSP) to remove the induced artifacts from the MEG data. Our results show that louder speech leads to stronger head movements and stronger MEG distortions, and that the proposed artifact rejection approach successfully removes the speech-related artifact and retrieves the underlying neurophysiological signals. By removing artifacts arising from the head movements induced by overt speech, this approach will facilitate research addressing the neural basis of speech production with MEG.
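The regression half of the authors' two-stage approach amounts to projecting nuisance traces (here, head position/orientation) out of each MEG channel by least squares. A minimal sketch under that assumption is below; it is not the authors' implementation and omits the SSP stage entirely.

```python
import numpy as np

def regress_out(meg, nuisance):
    """Remove the least-squares projection of nuisance regressors
    (e.g. head position/orientation traces) from each MEG channel.
    meg: (n_channels, n_samples); nuisance: (n_regressors, n_samples)."""
    # Design matrix: nuisance traces plus an intercept column.
    X = np.vstack([nuisance, np.ones(nuisance.shape[-1])])
    beta, *_ = np.linalg.lstsq(X.T, meg.T, rcond=None)
    return meg - (X.T @ beta).T   # residual = data minus fitted nuisance
```

In the full pipeline, SSP would then be applied to the residual to suppress remaining spatially stereotyped artifact components.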


2021 ◽  
Vol 11 (6) ◽  
pp. 743
Author(s):  
Hani M. Bu-Omer ◽  
Akio Gofuku ◽  
Kenji Sato ◽  
Makoto Miyakoshi

The sense of agency (SoA) is part of the psychophysiological modules related to the self. Disturbed SoA is found in several clinical conditions, hence understanding the neural correlates of the SoA is useful for diagnosis and for determining proper treatment strategies. Although there are several neuroimaging studies on SoA, it is desirable to translate this knowledge to more accessible and inexpensive EEG-based biomarkers for the sake of applicability. However, SoA has not been widely investigated using EEG. To address this issue, we designed an EEG experiment on healthy adults (n = 15) to determine the sensitivity of EEG to an SoA paradigm using hand movement with parametrically delayed visual feedback. We calculated the power spectral density over the traditional EEG frequency bands for ten delay conditions relative to a no-delay condition. Independent component analysis and equivalent current dipole modeling were applied to address artifact rejection, volume conduction, and source localization to determine the effect of interest. The results revealed that alpha and low-beta EEG power increased in the parieto-occipital regions in proportion to the reduced SoA reported by the subjects. We conclude that parieto-occipital alpha and low-beta EEG power reflect the sense of agency.
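The band-power computation behind this result can be sketched in a few lines. This is a generic periodogram-based estimate, not the authors' pipeline (which additionally used ICA and dipole modeling), and the band edges are common conventions, not necessarily those used in the study.

```python
import numpy as np

BANDS = {"alpha": (8.0, 13.0), "low_beta": (13.0, 20.0)}

def band_power(x, fs, band):
    """Mean periodogram power of x within a frequency band [lo, hi)."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * len(x))
    lo, hi = band
    sel = (freqs >= lo) & (freqs < hi)
    return psd[sel].mean()
```

Condition contrasts like the study's delay-vs-no-delay comparison would then be ratios (or log differences) of such band powers across conditions, per channel.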


SLEEP ◽  
2021 ◽  
Vol 44 (Supplement_2) ◽  
pp. A311-A311
Author(s):  
Miranda Lim ◽  
Christina Reynolds ◽  
Carolyn Jones ◽  
Sophia Lambert ◽  
Nadir Balba ◽  
...  

Introduction: A bidirectional relationship exists between sleep disruption and neuropathology in Alzheimer’s disease (AD). The sleep electroencephalogram (EEG) is a highly stereotyped, direct neurophysiological window into brain function; prior studies have identified abnormalities in EEG slow waves in early AD. EEG coherence across channels during sleep, a normally highly coherent brain state, could be an indicator of network coordination across brain regions. Accordingly, altered slow wave coherence during sleep may be an early indicator of cognitive decline. Methods: EEG was collected during an attended overnight polysomnogram (PSG) from a community-based cohort of older subjects (n=44, average age 71), approximately 25% of whom met criteria for mild cognitive impairment or early AD. Files were exported to EDF, and a slow wave peak detector with automated artifact rejection was implemented in MATLAB to count the number of slow wave oscillations across the 6 EEG leads standard for PSG (C3, C4, F3, F4, O1, and O2). Slow wave coherence was inferred when slow waves occurred in temporal synchrony across channels within 100 ms. Results: Subjects with cognitive impairment showed significantly reduced total sleep time and time spent in rapid eye movement (REM) sleep compared to age-matched controls. EEG slow wave coherence was reliably quantified during wake, non-REM stages N1, N2, and N3, and REM vigilance states, as well as during transition periods between sleep stages. Using this algorithm, specific signatures of slow wave propagation during sleep were identified, including increased variability in slow wave activity and coherence, that appeared more prominent in subjects with impaired cognition. Conclusion: EEG slow wave coherence during sleep and wake states can be calculated by applying automated algorithms to PSG data, and may be associated with cognitive impairment. Support: NIH R01 AG059507
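The peak detection and 100 ms synchrony rule described in the Methods can be sketched as follows. The original detector was implemented in MATLAB; this Python sketch is illustrative only, and the FFT-masking band-pass and default threshold are assumptions, not the study's parameters.

```python
import numpy as np

def slow_wave_peaks(x, fs, band=(0.5, 4.0), thresh=None):
    """Detect slow wave peaks: band-pass via FFT masking, then local
    maxima above a threshold (default: 2 SD of the filtered trace)."""
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    spec[(freqs < band[0]) | (freqs > band[1])] = 0  # crude band-pass
    sw = np.fft.irfft(spec, n=len(x))
    if thresh is None:
        thresh = 2.0 * sw.std()
    peaks = np.flatnonzero(
        (sw[1:-1] > sw[:-2]) & (sw[1:-1] > sw[2:]) & (sw[1:-1] > thresh)) + 1
    return peaks  # sample indices of detected slow wave peaks

def coherent_events(peaks_a, peaks_b, fs, window_s=0.1):
    """Count peaks in channel A that have a peak in channel B within
    the synchrony window (100 ms by default)."""
    tol = window_s * fs
    return sum(np.any(np.abs(peaks_b - p) <= tol) for p in peaks_a)
```

Repeating `coherent_events` over all channel pairs (e.g. C3-C4, F3-F4, O1-O2) and sleep stages would yield the per-stage coherence measures the abstract reports.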


Author(s):  
Bahman Abdi-Sargezeh ◽  
Reza Foodeh ◽  
Vahid Shalchyan ◽  
Mohammad Reza Daliri

2021 ◽  
Vol 14 ◽  
Author(s):  
Makoto Miyakoshi ◽  
Lauren M. Schmitt ◽  
Craig A. Erickson ◽  
John A. Sweeney ◽  
Ernest V. Pedapati
