Software and resources for experiments and data analysis of MEG and EEG data

2019 ◽  
Author(s):  
Lau M. Andersen

Data from magnetoencephalography (MEG) and electroencephalography (EEG) are extremely rich and multifaceted. For example, in a standard MEG recording with 306 sensors and a sampling rate of 1,000 Hz, 306,000 data points are sampled every second. Answering the research question that was the ultimate reason for acquiring the data thus necessitates efficient data handling. Luckily, several software packages have been developed for handling MEG and/or EEG data; some of the most popular are MNE-Python, FieldTrip, Brainstorm, EEGLAB, and SPM. These are all available under free and open-source licences, meaning that they can be run, shared, and modified by anyone. Commercial software released under proprietary licences includes BESA and CURRY. It is important to be aware that the clinical diagnosis of, for example, epilepsy requires certified software; FieldTrip, MNE-Python, Brainstorm, EEGLAB, and SPM cannot be used for that purpose. In this chapter, the emphasis will be on MNE-Python and FieldTrip. This will allow users of both Python and MATLAB (or, alternatively, GNU Octave) to code along as the chapter unfolds. As a general remark, all that MNE-Python can do, FieldTrip can do and vice versa, though with some small differences. A full analysis going from raw data to a source reconstruction will be presented, illustrated with both code and figures, with the aim of providing newcomers to the field with a stepping stone towards doing their own analyses of their own datasets.
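To give a concrete flavour of such a pipeline, the following is a minimal sketch in MNE-Python that goes from a raw recording to an averaged evoked response, using the tutorial "sample" dataset that ships with the package; the file name, stimulus channel, and event ID below are specific to that dataset and would differ for other recordings.

```python
import mne

# Download (on first use) and locate MNE-Python's bundled tutorial dataset.
sample_dir = mne.datasets.sample.data_path()  # a pathlib.Path in recent versions
raw_file = sample_dir / "MEG" / "sample" / "sample_audvis_raw.fif"

# Read the raw recording and find the trigger events on the stimulus channel.
raw = mne.io.read_raw_fif(raw_file)
events = mne.find_events(raw, stim_channel="STI 014")

# Band-pass filter, then cut the continuous data into epochs around the
# left-auditory events (event ID 1 in this dataset).
raw.load_data().filter(l_freq=1.0, h_freq=40.0)
epochs = mne.Epochs(raw, events, event_id={"auditory/left": 1},
                    tmin=-0.2, tmax=0.5, baseline=(None, 0))

# Average the epochs into an evoked response and plot it.
evoked = epochs.average()
evoked.plot()
```

An equivalent chain exists in FieldTrip, where the corresponding steps are configuration-driven function calls rather than object methods.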


2019 ◽  
Author(s):  
Philip Held ◽  
Randy A Boley ◽  
Walter G Faig ◽  
John A O'Toole ◽  
Imran Desai ◽  
...  

Electronic health records (EHRs) offer opportunities for research and improvements in patient care. However, challenges exist in using data from EHRs due to the volume of information contained within clinical notes, which can be labor intensive and costly to transform into usable data with existing strategies. This case report details the collaborative development and implementation of the postencounter form (PEF) system into the EHR at the Road Home Program at Rush University Medical Center in Chicago, IL, to address these concerns with limited burden to clinical workflows. The PEF system proved to be an effective tool, with over 98% of all clinical encounters including a completed PEF within 5 months of implementation. In addition, the system has generated over 325,188 unique, readily accessible data points in under 4 years of use. The PEF system has since been deployed to other settings, demonstrating that it may have broader clinical utility.


2014 ◽  
Vol 20 (5) ◽  
pp. 1392-1403 ◽  
Author(s):  
Irina Kolotuev

Transmission electron microscopy (TEM) is an important tool for studies in cell biology, and is essential to address research questions from bacteria to animals. Recent technological innovations have advanced the entire field of TEM, yet classical techniques still prevail for most present-day studies. Indeed, the majority of cell and developmental biology studies that use TEM do not require cutting-edge methodologies, but rather fast and efficient data generation. Although access to state-of-the-art equipment is frequently problematic, standard TEM microscopes are typically available, even in modest research facilities. However, a major unmet need in standard TEM is the ability to quickly prepare and orient a sample to identify a region of interest. Here, I provide a detailed step-by-step method for a positional correlative anatomy approach to flat-embedded samples. These modifications make the TEM preparation and analytic procedures faster and more straightforward, supporting a higher sampling rate. To illustrate the modified procedures, I provide numerous examples addressing research questions in Caenorhabditis elegans and Drosophila. This method can be equally applied to address questions of cell and developmental biology in other small multicellular model organisms.


2012 ◽  
Vol 239-240 ◽  
pp. 865-868 ◽  
Author(s):  
Chun Fu Li ◽  
Yan Qin Li

A data acquisition system for electronic automatic transmission testing, built around a National Instruments (NI) data acquisition card, is designed based on virtual instrument technology. The data collected by the system include automatic transmission oil pressure, solenoid duty cycle, turbine and transmission output-shaft speeds, and engine throttle position. The key problems in the system design are the measurement method for the automatic transmission solenoid duty cycle, the signal conditioning circuit design, and how to improve the system's sampling rate. The test platform is an armored vehicle equipped with a high-power electronic automatic transmission. The automatic transmission's shift control strategy and throttle oil pressure characteristics, among others, can be obtained from the acquired test data.
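As an illustration of the duty-cycle measurement problem mentioned above, the following Python sketch (not the authors' implementation, which is virtual-instrument based) estimates the duty cycle of a sampled solenoid drive signal by thresholding the waveform and averaging the on-fraction over whole periods; the threshold rule and test waveform are invented for the example.

```python
import numpy as np

def duty_cycle(signal, threshold=None):
    """Estimate the duty cycle of a roughly periodic on/off signal."""
    signal = np.asarray(signal, dtype=float)
    if threshold is None:
        # Simple midpoint threshold between the signal extremes.
        threshold = 0.5 * (signal.min() + signal.max())
    on = signal > threshold
    # Restrict the estimate to whole periods (first to last rising edge)
    # so that partial periods at the ends do not bias the result.
    rising = np.flatnonzero(np.diff(on.astype(int)) == 1)
    if len(rising) < 2:
        return on.mean()  # too few edges; fall back to the raw on-fraction
    return on[rising[0] + 1:rising[-1] + 1].mean()

# Example: a 50 Hz PWM waveform with 30% on-time, sampled at 10 kHz.
t = np.arange(0, 0.1, 1e-4)
pwm = ((t * 50) % 1.0 < 0.3).astype(float)
print(duty_cycle(pwm))  # prints approximately 0.30
```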


2010 ◽  
Vol 20 (06) ◽  
pp. 1703-1721 ◽  
Author(s):  
FRANÇOIS LAURENT ◽  
MICHEL BESSERVE ◽  
LINE GARNERO ◽  
MATTHIEU PHILIPPE ◽  
GENEVIÈVE FLORENCE ◽  
...  

We classified performance-related mental states from EEG-derived measurements. We investigated the usefulness of massively distributed source reconstruction, comparing the scalp and cortical scales. This approach provides a more detailed picture of the functional brain networks underlying changes related to the mental state of interest. Local and distant synchrony measurements (coherence, phase locking value) were computed for both scalp measurements and cortical current density sources, and were fed into an SVM-based classifier. We designed two simulations in which classification scores increased when our 60-electrode scalp measurements were reconstructed on 60 sources and on a 500-source cortex. Source reconstruction appeared to be most useful in these simulations when distant synchronies were involved and local synchronies did not prevail. Despite the simplicity of the model used, certain flaws in the accuracy of the localization of informative activities were observed, due to the relationship between amplitude and phase for mixed signals. Our results with real EEG data suggested that the phenomenon of interest was characterized not merely by modulations in local amplitudes, but also by the strength of distant couplings. After source reconstruction, classification rates also increased for real EEG data when seeking distant phase-related couplings. When reconstructing a large number of sources, the regularization coefficient should be carefully selected on a subject-by-subject basis. We showed that training classifiers on such high-dimensional data is useful for localizing discriminating patterns of activity.
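For readers unfamiliar with the synchrony features involved, the sketch below shows a generic single-trial phase locking value (PLV) estimate fed into an SVM on toy data; it illustrates the type of feature used, not the authors' pipeline, and all signal parameters are invented for the example.

```python
import numpy as np
from scipy.signal import hilbert
from sklearn.svm import SVC

def plv(x, y):
    """Single-trial phase locking: mean resultant of the phase difference."""
    dphi = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * dphi)))

# Toy data: in state 1 the two channels share a strong common 10 Hz
# component (high PLV); in state 0 independent noise dominates (low PLV).
rng = np.random.default_rng(0)
features, labels = [], []
t = np.arange(500) / 250.0  # 2 s at 250 Hz
for state in (0, 1):
    for _ in range(50):
        common = np.sin(2 * np.pi * 10 * t + rng.uniform(0, 2 * np.pi))
        noise = 0.5 if state else 2.0
        a = common + noise * rng.standard_normal(t.size)
        b = common + noise * rng.standard_normal(t.size)
        features.append([plv(a, b)])
        labels.append(state)

# A linear SVM separates the two states on this one-dimensional feature.
clf = SVC(kernel="linear").fit(features, labels)
print(clf.score(features, labels))
```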


2010 ◽  
Vol 28 (7) ◽  
pp. 1409-1418 ◽  
Author(s):  
T. Nygrén ◽  
Th. Ulich

The standard method of calculating the spectrum of a digital signal is based on the Fourier transform, which gives the amplitude and phase spectra at a set of equidistant frequencies from signal samples taken at equal intervals. In this paper a different method based on stochastic inversion is introduced. It does not assume a fixed sampling rate, and therefore it is useful in analysing geophysical signals which may be unequally sampled or may have missing data points. This could not be done by means of the Fourier transform without preliminary interpolation. Another feature of the inversion method is that it allows unequal frequency steps in the spectrum, although this property is not needed in practice. The method is closely related to methods based on least-squares fitting of sinusoidal functions to the signal. However, the number of frequency bins is not limited by the number of signal samples. With the Fourier transform this can only be achieved by appending zero-valued samples, but no such extra samples are used in this method. Finally, if the standard deviation of the samples is known, the method can also provide error limits for the spectrum. This helps in recognising signal peaks in noisy spectra.
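The least-squares idea that the method is related to can be illustrated compactly: for each trial frequency, fit a cosine, a sine, and an offset to the unequally spaced samples and report the fitted amplitude. The sketch below shows this plain least-squares illustration only, not the stochastic-inversion estimator itself, and the signal and frequency grid are invented for the example.

```python
import numpy as np

def ls_spectrum(t, x, freqs):
    """Amplitude spectrum of samples x at times t, for arbitrary frequencies."""
    amps = []
    for f in freqs:
        # Design matrix: one cosine, one sine, and a constant offset.
        A = np.column_stack([np.cos(2 * np.pi * f * t),
                             np.sin(2 * np.pi * f * t),
                             np.ones_like(t)])
        coef, *_ = np.linalg.lstsq(A, x, rcond=None)
        amps.append(np.hypot(coef[0], coef[1]))  # fitted amplitude at f
    return np.array(amps)

# A noisy 3 Hz sine observed at irregular times; no interpolation needed.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 10, 300))
x = np.sin(2 * np.pi * 3.0 * t) + 0.3 * rng.standard_normal(t.size)
freqs = np.linspace(0.1, 5.0, 200)  # the frequency grid is a free choice
spec = ls_spectrum(t, x, freqs)
print(freqs[spec.argmax()])  # peaks near 3.0 Hz
```

Note that nothing constrains the times t to be equidistant, which is exactly why such fitting-based estimators suit unequally sampled or gappy records.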


Author(s):  
Francesco Buonamici ◽  
Monica Carfagni

Reverse Engineering (RE), also known as "CAD reconstruction", aims at the reconstruction of 3D geometric models of objects/mechanical parts, starting from 3D measured data (points/mesh). In recent years, considerable developments in RE have been achieved thanks to both academic and industrial research (e.g. RE software packages). The aim of this work is to provide an overview of state-of-the-art techniques and approaches presented in recent years, considering at the same time the tools and methods provided by commercial CAD software and RE systems. In particular, this article focuses on the "constrained fitting" approach, which considers geometrical constraints between the generated surfaces, improving the reconstruction result. On the basis of the overview, possible theoretical principles are drafted with the aim of suggesting new strategies to make the CAD reconstruction process more effective and to obtain more readily usable CAD models. Finally, a new RE framework is briefly outlined: the proposed approach hypothesizes a tool built within the environment of an existing CAD system and considers the fitting of a custom-built archetypal model, defined with all the a priori known dimensions and constraints, to the scanned data.
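The "constrained fitting" idea can be illustrated with a toy example: fit two planes to two scanned point clouds while a penalty term enforces a perpendicularity constraint between their normals. The sketch below is a schematic illustration of the principle, not one of the surveyed algorithms; the penalty weight and synthetic data are arbitrary choices.

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(p, pts1, pts2, weight=100.0):
    # Parameters: plane normals n1, n2 and offsets d1, d2 (plane: n . x = d).
    n1 = p[0:3] / np.linalg.norm(p[0:3])
    n2 = p[4:7] / np.linalg.norm(p[4:7])
    r1 = pts1 @ n1 - p[3]  # signed point-to-plane distances, cloud 1
    r2 = pts2 @ n2 - p[7]  # signed point-to-plane distances, cloud 2
    # Penalty enforcing the geometric constraint n1 . n2 = 0 (perpendicular).
    return np.concatenate([r1, r2, [weight * np.dot(n1, n2)]])

# Synthetic "scan": noisy points on the planes z = 0 and x = 1.
rng = np.random.default_rng(2)
pts1 = np.column_stack([rng.uniform(0, 1, 100), rng.uniform(0, 1, 100),
                        0.01 * rng.standard_normal(100)])
pts2 = np.column_stack([1 + 0.01 * rng.standard_normal(100),
                        rng.uniform(0, 1, 100), rng.uniform(0, 1, 100)])

p0 = np.array([0.1, 0.1, 1.0, 0.0, 1.0, 0.1, 0.1, 1.0])  # rough initial guess
fit = least_squares(residuals, p0, args=(pts1, pts2))
n1 = fit.x[0:3] / np.linalg.norm(fit.x[0:3])
n2 = fit.x[4:7] / np.linalg.norm(fit.x[4:7])
print(np.dot(n1, n2))  # close to 0: the fitted planes are perpendicular
```

Fitting both surfaces jointly, rather than one at a time, is what lets the constraint shape the result, which is the essence of the approach discussed above.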


Entropy ◽  
2018 ◽  
Vol 20 (8) ◽  
pp. 579 ◽  
Author(s):  
Samira Ahmadi ◽  
Nariman Sepehri ◽  
Christine Wu ◽  
Tony Szturm

Sample entropy (SampEn) has been used to quantify the regularity or predictability of human gait signals. There are studies on the appropriate use of this measure for inter-stride spatio-temporal gait variables. However, the sensitivity of this measure to preprocessing of the signal and to varying values of template size (m), tolerance size (r), and sampling rate has not been studied when applied to "whole" gait signals, that is, the entire time series obtained from force or inertial sensors. This study systematically investigates the sensitivity of SampEn of the center of pressure displacement in the mediolateral direction (ML COP-D) to varying parameter values and two preprocessing methods: filtering out the high-frequency components and resampling the signals to have the same average number of data points per stride. The discriminatory ability of SampEn is studied by comparing treadmill walking only (WO) to a dual-task (DT) condition. The results suggest that SampEn maintains the directional difference between the two walking conditions across varying parameter values, showing a significant increase from the WO to the DT condition, especially when signals are low-pass filtered. Moreover, when gait speed differs between test conditions, signals should be low-pass filtered and resampled to have the same average number of data points per stride.
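For reference, a compact implementation of sample entropy following the standard Richman-Moorman definition is sketched below, with the template length m and tolerance r (as a fraction of the signal's standard deviation) named as in the abstract; it is an illustrative sketch, not the study's code.

```python
import numpy as np

def sampen(x, m=2, r=0.2):
    """Sample entropy with template length m and tolerance r (fraction of SD)."""
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)

    def matches(length):
        # Use N - m templates for both lengths, per the standard definition.
        templates = np.array([x[i:i + length] for i in range(len(x) - m)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance from template i to all later templates.
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(d <= tol)
        return count

    B = matches(m)      # template matches of length m
    A = matches(m + 1)  # template matches of length m + 1
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

# A regular signal (sine) yields a much lower SampEn than white noise.
rng = np.random.default_rng(3)
print(sampen(np.sin(np.linspace(0, 20 * np.pi, 1000))))  # low
print(sampen(rng.standard_normal(1000)))                 # high
```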


Entropy ◽  
2018 ◽  
Vol 20 (10) ◽  
pp. 764 ◽  
Author(s):  
John McCamley ◽  
William Denton ◽  
Andrew Arnold ◽  
Peter Raffalt ◽  
Jennifer Yentes

Sample entropy (SE) shows relative consistency for biologically derived, discrete data with >500 data points. For certain populations, collecting this quantity of data is not feasible, and continuous data have been used instead. The effect of using continuous versus discrete data on SE is unknown, as are the relative effects of sampling rate and of the input parameters m (comparison vector length) and r (tolerance). Eleven subjects walked for 10 minutes and continuous joint angles (480 Hz) were calculated for each lower-extremity joint. Data were downsampled (240, 120, 60 Hz) and discrete range of motion was calculated. SE was quantified for joint angles and range of motion at all sampling rates and for multiple combinations of parameters. A differential relationship between joints was observed between range of motion and joint angles: range-of-motion SE showed no difference between joints, whereas joint angle SE significantly decreased from ankle to knee to hip. To confirm the findings from biological data, continuous signals with manipulations to frequency, amplitude, and both were generated and underwent an analysis similar to that of the biological data. In general, changes to m, r, and sampling rate had a greater effect on continuous than on discrete data. Discrete data were robust to sampling rate and m. It is recommended that different data types not be compared and that discrete data be used for SE.
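A sensitivity sweep of the kind described, downsampling a continuous joint-angle-like signal and recomputing SE across combinations of m and r, might look like the sketch below; it assumes the sampen() helper from the previous sketch is available, and the signal parameters are invented for the example.

```python
import numpy as np
from scipy.signal import decimate

# A continuous joint-angle-like signal: 10 s of a 1 Hz oscillation
# plus noise, "recorded" at 480 Hz as in the study.
rng = np.random.default_rng(4)
t = np.arange(0, 10, 1 / 480)
angle = np.sin(2 * np.pi * 1.0 * t) + 0.1 * rng.standard_normal(t.size)

# Downsample (with anti-alias filtering) and sweep m and r.
for factor, rate in ((1, 480), (2, 240), (4, 120), (8, 60)):
    sig = angle if factor == 1 else decimate(angle, factor)
    for m in (2, 3):
        for r in (0.15, 0.20, 0.25):
            print(rate, m, r, sampen(sig, m=m, r=r))  # sampen from prior sketch
```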

