The Decision Decoding ToolBOX (DDTBOX) – A Multivariate Pattern Analysis Toolbox for Event-Related Potentials

2018 ◽  
Vol 17 (1) ◽  
pp. 27-42 ◽  
Author(s):  
Stefan Bode ◽  
Daniel Feuerriegel ◽  
Daniel Bennett ◽  
Phillip M. Alday

2017 ◽  
Vol 55 ◽  
pp. 46-58 ◽  
Author(s):  
William Francis Turner ◽  
Phillip Johnston ◽  
Kathleen de Boer ◽  
Carmen Morawetz ◽  
Stefan Bode

2017 ◽  
Author(s):  
Stefan Bode ◽  
Daniel Feuerriegel ◽  
Daniel Bennett ◽  
Phillip M. Alday

Abstract
In recent years, neuroimaging research in cognitive neuroscience has increasingly used multivariate pattern analysis (MVPA) to investigate higher cognitive functions. Here we present DDTBOX, an open-source MVPA toolbox for electroencephalography (EEG) data. DDTBOX runs under MATLAB and is well integrated with the EEGLAB/ERPLAB and FieldTrip toolboxes (Delorme and Makeig, 2004; Lopez-Calderon and Luck, 2014; Oostenveld et al., 2011). It trains support vector machines (SVMs) on patterns of event-related potential (ERP) amplitude data, following or preceding an event of interest, for classification or regression of experimental variables. These amplitude patterns can be extracted across space/electrodes (spatial decoding), time (temporal decoding), or both (spatiotemporal decoding). DDTBOX can also extract SVM feature weights, generate empirical chance distributions based on shuffled-label decoding for group-level statistical testing, provide estimates of the prevalence of decodable information in the population, and perform a variety of corrections for multiple comparisons. It also includes plotting functions for single-subject and group-level results. DDTBOX complements conventional analyses of ERP components, as it can detect subtle multivariate patterns that would be overlooked in standard analyses. It further allows for a more exploratory search for information when no ERP component is known to be specifically linked to a cognitive process of interest. In summary, DDTBOX is an easy-to-use, open-source toolbox for characterising the time course of information related to various perceptual and cognitive processes. It can be applied to data from a large number of experimental paradigms and could therefore be a valuable tool for the neuroimaging community.
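As an illustration of the decoding approach described in the abstract, the following is a minimal Python sketch rather than the DDTBOX MATLAB interface: it trains a linear SVM on spatiotemporal ERP amplitude patterns from a single analysis window and builds an empirical chance distribution by repeating the analysis with shuffled condition labels. All data shapes, variable names, and parameter values here are assumptions made for the example.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Simulated single-subject data: trials x electrodes x time points,
# plus a binary condition label for each trial (assumed layout).
n_trials, n_electrodes, n_times = 120, 64, 50
epochs = rng.standard_normal((n_trials, n_electrodes, n_times))
labels = rng.integers(0, 2, n_trials)

# Spatiotemporal decoding: the full electrode-by-time amplitude pattern
# within one analysis window serves as the feature vector for each trial.
window = slice(20, 30)  # hypothetical post-stimulus analysis window
X = epochs[:, :, window].reshape(n_trials, -1)

clf = SVC(kernel="linear", C=1.0)
observed = cross_val_score(clf, X, labels, cv=5).mean()

# Empirical chance distribution: repeat the same analysis with shuffled labels.
n_permutations = 100
chance = np.array([
    cross_val_score(clf, X, rng.permutation(labels), cv=5).mean()
    for _ in range(n_permutations)
])
p_value = (np.sum(chance >= observed) + 1) / (n_permutations + 1)
print(f"accuracy = {observed:.3f}, permutation p = {p_value:.3f}")

In the toolbox itself, the same logic is applied to each subject's data, and the resulting accuracies and shuffled-label chance estimates feed into the group-level statistics and multiple-comparison corrections mentioned above.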


2015 ◽  
Vol 27 (9) ◽  
pp. 1823-1839 ◽  
Author(s):  
Matthew R. Johnson ◽  
Gregory McCarthy ◽  
Kathleen A. Muller ◽  
Samuel N. Brudner ◽  
Marcia K. Johnson

Refreshing is the component cognitive process of directing reflective attention to one of several active mental representations. Previous studies using fMRI suggested that refresh tasks involve a component process of initiating refreshing as well as the top–down modulation of representational regions central to refreshing. However, those studies were limited by fMRI's low temporal resolution. In this study, we used EEG to examine the time course of refreshing on the scale of milliseconds rather than seconds. ERP analyses showed that a typical refresh task does have a distinct electrophysiological response as compared to a control condition and includes at least two main temporal components: an earlier (∼400 msec) positive peak reminiscent of a P3 response and a later (∼800–1400 msec) sustained positivity over several sites reminiscent of the late directing attention positivity. Overall, the evoked potentials for refreshing representations from three different visual categories (faces, scenes, words) were similar, but multivariate pattern analysis showed that some category information was nonetheless present in the EEG signal. When related to previous fMRI studies, these results are consistent with a two-phase model, with the first phase dominated by frontal control signals involved in initiating refreshing and the second by the top–down modulation of posterior perceptual cortical areas that constitutes refreshing a representation. This study also lays the foundation for future studies of the neural correlates of reflective attention at a finer temporal resolution than is possible using fMRI.
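As a rough illustration of the multivariate pattern analysis mentioned at the end of the abstract, the Python sketch below (simulated data; all names, shapes, and settings are assumptions, not the authors' pipeline) trains a separate linear SVM at each time point to decode stimulus category (face, scene, word) from the spatial pattern of EEG amplitudes, one common way of asking when category information is present in the signal.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Simulated epochs: trials x electrodes x time points, with one category
# label per trial (0 = face, 1 = scene, 2 = word); layout is assumed.
n_trials, n_electrodes, n_times = 180, 64, 60
epochs = rng.standard_normal((n_trials, n_electrodes, n_times))
category = rng.integers(0, 3, n_trials)

# Time-resolved decoding: cross-validate a separate classifier at each
# time point, using the amplitudes across electrodes as features.
clf = SVC(kernel="linear")
accuracy = np.array([
    cross_val_score(clf, epochs[:, :, t], category, cv=5).mean()
    for t in range(n_times)
])

# Latencies where accuracy reliably exceeds the 1/3 chance level suggest
# that category information is present in the EEG signal at that time.
print(accuracy.round(2))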

