ORACA – an open-source software for Online Real-time neural Activity extraction and offline Cross-session Analysis

2020 ◽  
Author(s):  
Weihao Sheng ◽  
Xueyang Zhao ◽  
Yang Yang

Abstract: The idea of combining in vivo functional imaging with optogenetic stimulation to achieve closed-loop, all-optical recording and manipulation of neurons and neural circuits is appealing yet challenging. Beyond the necessary hardware, it requires analysis software fast enough to extract neural activity from imaging data in real time. Here we present ORACA (Online Real-time Activity extraction and offline Cross-session Analysis), an open-source, integrative image-processing toolbox that provides solutions for both fast online and accurate offline data analysis. We developed fast GPU-based algorithms that complete raw image registration, automatic neuron identification, and activity extraction within seconds of image acquisition. The offline analysis pipeline features a new cross-session alignment algorithm that accounts for angle differences across imaging sessions, which is useful when microscopes are shared or imaging intervals are long. Modular and user-friendly, ORACA can be used as a complete package or as independent modules, and it effectively facilitates image analysis, especially for all-optical closed-loop control and long-term repeated imaging.
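Cross-session alignment of the kind described above must account for both translation and rotation between fields of view. The sketch below is one common approach (an illustration under our own assumptions, not ORACA's actual implementation): estimate the shift by phase correlation over a set of candidate rotation angles applied to the mean-intensity image of the new session, then keep the angle with the sharpest correlation peak.

```python
import numpy as np
from scipy.ndimage import rotate

def phase_correlation_shift(a, b):
    """Estimate the (row, col) shift aligning image b to image a via phase correlation."""
    cross = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-9)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap indices beyond half the image size to negative shifts.
    shift = [p - s if p > s // 2 else p for p, s in zip(peak, corr.shape)]
    return np.array(shift, dtype=float), corr.max()

def align_sessions(ref_img, new_img, angles=np.arange(-10, 10.5, 0.5)):
    """Brute-force search over candidate rotation angles; return the best (angle, shift)."""
    best_angle, best_shift, best_score = None, None, -np.inf
    for ang in angles:
        rotated = rotate(new_img, ang, reshape=False, mode="nearest")
        shift, score = phase_correlation_shift(ref_img, rotated)
        if score > best_score:
            best_angle, best_shift, best_score = ang, shift, score
    return best_angle, best_shift
```

Once the angle and shift are known, the same rigid transform can be applied to neuron masks so that cells can be matched across sessions.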

2018 ◽  
Author(s):  
Alessio Paolo Buccino ◽  
Mikkel Elle Lepperød ◽  
Svenn-Arne Dragly ◽  
Philipp Häfliger ◽  
Marianne Fyhn ◽  
...  

Abstract. Objective: A major goal in systems neuroscience is to determine the causal relationship between neural activity and behavior. To this end, methods that combine monitoring of neural activity, behavioral tracking, and targeted manipulation of neurons in closed loop are powerful tools. However, commercial systems that allow these types of experiments are usually expensive and rely on non-standardized data formats and proprietary software, which may hinder user modifications for specific needs. To promote reproducibility and data sharing in science, transparent software and standardized data formats are an advantage. Here, we present an open-source, low-cost, adaptable, and easy-to-set-up system for combined behavioral tracking, electrophysiology, and closed-loop stimulation. Approach: Based on the Open Ephys system (www.open-ephys.org), we developed multiple modules to include real-time tracking and behavior-based closed-loop stimulation. We describe the equipment and provide a step-by-step guide to set up the system. Combining the open-source software Bonsai (bonsai-rx.org) for analyzing camera images in real time with the newly developed modules in Open Ephys, we acquire position information, visualize tracking, and perform tracking-based closed-loop stimulation experiments. To analyze the acquired data, we provide an open-source file-reading package in Python. Main results: The system robustly visualizes real-time tracking and reliably recovers tracking information recorded at a range of sampling frequencies (30–1000 Hz). We combined electrophysiology with the newly developed tracking modules in Open Ephys to record place cell and grid cell activity in the hippocampus and the medial entorhinal cortex, respectively. Moreover, we present a case in which we used the system for closed-loop optogenetic stimulation of entorhinal grid cells. Significance: Expanding the Open Ephys system to include animal tracking and behavior-based closed-loop stimulation extends the availability of high-quality, low-cost experimental setups with standardized data formats to the neuroscience community.
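As a conceptual illustration of the behavior-based closed-loop logic described above (not the authors' actual Open Ephys plugin code), the sketch below triggers a stimulation pulse whenever a streamed (x, y) position enters a circular region of interest. The `read_position` and `send_ttl` helpers are hypothetical stand-ins for the tracking and stimulation interfaces.

```python
import math
import time

STIM_CENTER = (0.5, 0.5)   # region-of-interest center in normalized camera coordinates
STIM_RADIUS = 0.1          # trigger radius
REFRACTORY_S = 0.5         # minimum interval between stimulation pulses

def in_roi(x, y):
    """Return True if the tracked position falls inside the circular ROI."""
    return math.hypot(x - STIM_CENTER[0], y - STIM_CENTER[1]) <= STIM_RADIUS

def closed_loop(read_position, send_ttl):
    """Poll tracked positions and emit a TTL trigger when the animal enters the ROI."""
    last_stim = -math.inf
    while True:
        x, y = read_position()          # hypothetical: next tracked sample from the camera stream
        now = time.monotonic()
        if in_roi(x, y) and now - last_stim >= REFRACTORY_S:
            send_ttl()                  # hypothetical: pulse to the stimulation hardware
            last_stim = now
```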


2021 ◽  
Author(s):  
Mark Schatza ◽  
Ethan Blackwood ◽  
Sumedh Nagrale ◽  
Alik S Widge

Closing the loop between brain activity and behavior is one of the most active areas of development in neuroscience. There is particular interest in developing closed-loop control of neural oscillations. Many studies report correlations between oscillations and functional processes; oscillation-informed closed-loop experiments could determine whether these relationships are causal and would provide important mechanistic insights that may lead to new therapeutic tools. Such closed-loop perturbations require accurate estimates of oscillatory phase and amplitude, which are challenging to compute in real time. We developed an easy-to-implement, fast, and accurate Toolkit for Oscillatory Real-time Tracking and Estimation (TORTE). TORTE operates with the open-source Open Ephys GUI (OEGUI) system, making it immediately compatible with a wide range of acquisition systems and experimental preparations. TORTE efficiently extracts oscillatory phase and amplitude from a target signal and includes a variety of options to trigger closed-loop perturbations. Implementing these tools in existing experiments is easy and adds minimal latency to existing protocols. Most labs currently use in-house, lab-specific approaches, limiting replication and extension of their experiments by other groups. The accuracy of TORTE's extracted analytic signal and of its oscillation-informed perturbations matches the results presented by these groups, yet TORTE provides access to these tools in a flexible, easy-to-use toolkit that does not require proprietary software. We hope that the availability of a high-quality, open-source, and broadly applicable toolkit will increase the number of labs able to perform oscillatory closed-loop experiments and will improve the replicability of protocols and data across labs.
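A minimal offline sketch of the analytic-signal idea underlying this kind of phase and amplitude tracking (not TORTE's real-time implementation, which must work with a causal, streaming buffer): band-pass filter the signal around the oscillation of interest, take the Hilbert transform, and read out instantaneous phase and amplitude. A phase-locked trigger then fires whenever the phase crosses a target value.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def analytic_phase_amplitude(x, fs, band=(4.0, 8.0)):
    """Band-pass filter x around `band` (Hz) and return instantaneous phase and amplitude."""
    b, a = butter(3, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, x)        # zero-phase filtering; a real-time system needs a causal filter
    analytic = hilbert(filtered)
    return np.angle(analytic), np.abs(analytic)

def phase_trigger_indices(phase, target=0.0, tol=0.1):
    """Return sample indices where the phase crosses the target value (within tol radians)."""
    wrapped = np.angle(np.exp(1j * (phase - target)))   # signed distance to the target phase
    crossings = np.where((wrapped[:-1] < 0) & (wrapped[1:] >= 0))[0] + 1
    return crossings[np.abs(wrapped[crossings]) < tol]

# Example: detect peaks (phase 0) of a noisy 6 Hz oscillation
fs = 1000.0
t = np.arange(0, 5, 1 / fs)
signal = np.sin(2 * np.pi * 6 * t) + 0.3 * np.random.randn(t.size)
phase, amp = analytic_phase_amplitude(signal, fs, band=(4, 8))
triggers = phase_trigger_indices(phase, target=0.0)
```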


Author(s):  
Gonçalo Lopes ◽  
Karolina Farrell ◽  
Edward A. B. Horrocks ◽  
Chi-Yu Lee ◽  
Mai M. Morimoto ◽  
...  

Real-time rendering of closed-loop visual environments is necessary for next-generation understanding of brain function and behaviour, but it is prohibitively difficult for non-experts to implement and is limited to a few laboratories worldwide. We developed BonVision as an easy-to-use, open-source software package for the display of virtual or augmented reality, as well as standard visual stimuli. Because the architecture is based on the open-source Bonsai graphical programming language, BonVision benefits from native integration with experimental hardware. BonVision therefore enables easy implementation of closed-loop experiments, including real-time interaction with deep neural networks and communication with behavioural and physiological measurement and manipulation devices.


2018 ◽  
Author(s):  
Rodrigo Amaducci ◽  
Manuel Reyes-Sanchez ◽  
Irene Elices ◽  
Francisco B. Rodriguez ◽  
Pablo Varona

Abstract: Closed-loop technologies provide novel ways of online observation, control, and bidirectional interaction with the nervous system, which help to study complex, non-linear, and partially observable neural dynamics. These protocols are often difficult to implement because of the temporal precision required when interacting with biological components, which in many cases can only be achieved with real-time technology. In this paper we introduce RTHybrid (www.github.com/GNB-UAM/RTHybrid), free and open-source software that includes a neuron and synapse model library for building hybrid circuits with living neurons in a wide variety of experimental contexts. In an effort to encourage the standardization of real-time software technology in neuroscience research, we compared different open-source real-time operating system patches, RTAI, Xenomai 3, and Preempt-RT, according to their performance and usability. RTHybrid runs on Linux operating systems with either the Xenomai 3 or the Preempt-RT real-time patch, allowing easy deployment in any laboratory. We report a set of validation tests and latency benchmarks for the construction of hybrid circuits using this library. With this work we aim to promote the dissemination of standardized, user-friendly, and open-source software tools for open- and closed-loop experimental neuroscience.
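To make the hybrid-circuit idea concrete, here is a minimal, non-real-time sketch (not RTHybrid's actual code, which runs under a Linux real-time patch with hardware I/O): at each time step, the recorded membrane voltage of a living neuron drives a simple threshold synapse onto a simulated Izhikevich neuron, and the model's spiking is converted into a current command injected back into the biological cell. `read_voltage_mV` and `write_current_pA` are hypothetical stand-ins for the data-acquisition interface.

```python
def izhikevich_step(v, u, I, dt=0.1, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Advance an Izhikevich model neuron by one time step of dt milliseconds."""
    v = v + dt * (0.04 * v * v + 5 * v + 140 - u + I)
    u = u + dt * a * (b * v - u)
    spiked = v >= 30.0
    if spiked:
        v, u = c, u + d
    return v, u, spiked

def hybrid_loop(read_voltage_mV, write_current_pA, n_steps=100000, dt=0.1,
                g_syn=0.5, E_syn=0.0, v_th=-20.0):
    """Threshold synapse from the living neuron onto the model, and a current command back."""
    v, u = -65.0, -13.0
    for _ in range(n_steps):
        v_bio = read_voltage_mV()                       # hypothetical DAQ read
        syn_in = g_syn * (E_syn - v) if v_bio > v_th else 0.0
        v, u, spiked = izhikevich_step(v, u, syn_in, dt)
        write_current_pA(50.0 if spiked else 0.0)       # hypothetical DAQ write
```

In a real experiment the loop period is fixed by the real-time scheduler, which is exactly what the latency benchmarks in the paper quantify.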


eLife ◽  
2021 ◽  
Vol 10 ◽  
Author(s):  
Gonçalo Lopes ◽  
Karolina Farrell ◽  
Edward A B Horrocks ◽  
Chi Yu Lee ◽  
Mai M Morimoto ◽  
...  

Real-time rendering of closed-loop visual environments is important for next-generation understanding of brain function and behaviour, but it is often prohibitively difficult for non-experts to implement and is limited to a few laboratories worldwide. We developed BonVision as an easy-to-use, open-source software package for the display of virtual or augmented reality, as well as standard visual stimuli. BonVision has been tested on humans and mice, and is capable of supporting new experimental designs in other animal models of vision. Because the architecture is based on the open-source Bonsai graphical programming language, BonVision benefits from native integration with experimental hardware. BonVision therefore enables easy implementation of closed-loop experiments, including real-time interaction with deep neural networks, and communication with behavioural and physiological measurement and manipulation devices.
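BonVision itself is built from Bonsai workflows rather than scripts, but the core closed-loop rendering pattern it enables can be sketched generically (an illustration under our own assumptions, not BonVision's API): on every frame, read the latest behavioural measurement, update the state of the virtual environment, and redraw. The `read_running_speed` and `draw_scene` callables are hypothetical.

```python
import time

FRAME_RATE_HZ = 60.0
GAIN = 1.0   # how far the virtual scene moves per unit of measured running speed

def closed_loop_render(read_running_speed, draw_scene):
    """Per-frame closed loop: measured behaviour updates the rendered scene position."""
    position = 0.0
    frame_interval = 1.0 / FRAME_RATE_HZ
    next_frame = time.monotonic()
    while True:
        speed = read_running_speed()    # hypothetical: e.g. from a rotary encoder or tracker
        position += GAIN * speed * frame_interval
        draw_scene(position)            # hypothetical: render the environment at this position
        next_frame += frame_interval
        time.sleep(max(0.0, next_frame - time.monotonic()))
```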


2020 ◽  
Author(s):  
Johannes Friedrich ◽  
Andrea Giovannucci ◽  
Eftychios A. Pnevmatikakis

Abstract: In vivo calcium imaging through microendoscopic lenses enables imaging of neuronal populations deep within the brains of freely moving animals. Previously, a constrained matrix factorization approach (CNMF-E) was proposed to extract single-neuron activity from microendoscopic data. However, this approach relies on offline batch processing of the entire video and is demanding in terms of both computing and memory requirements. These drawbacks prevent its application to the analysis of large datasets and to closed-loop experimental settings. Here we address both issues by introducing two online algorithms for extracting neuronal activity from streaming microendoscopic data. Our first algorithm is an online adaptation of CNMF-E, which dramatically reduces its memory and computation requirements. Our second algorithm uses a convolution-based background model for microendoscopic data that enables even faster (real-time) processing on GPU hardware. Our approach is modular and can be combined with existing online motion-artifact correction and activity deconvolution methods to provide a highly scalable pipeline for microendoscopic data analysis. We apply our algorithms to two previously published, typical experimental datasets and show that they yield high-quality results similar to those of the popular offline approach, while outperforming it in computing time and memory requirements.

Author summary: Calcium imaging methods enable researchers to measure the activity of genetically targeted, large-scale neuronal subpopulations. Whereas previous methods required the specimen to be stable, e.g. anesthetized or head-fixed, new brain imaging techniques using microendoscopic lenses and miniaturized microscopes have enabled deep-brain imaging in freely moving mice. However, the very large background fluctuations, the inevitable movements and distortions of the imaging field, and the extensive spatial overlap of fluorescent signals complicate the goal of efficiently extracting accurate estimates of neural activity from the observed video data. Furthermore, current activity extraction methods are computationally expensive due to the complex background model and are typically applied to imaging data after the experiment is complete. Moreover, some scenarios require experiments to run in real time and closed loop, analyzing data on the fly to guide the next experimental steps or to control feedback, which calls for new methods for accurate real-time processing. Here we address both issues by adapting a popular extraction method to operate online and extending it to use GPU hardware for real-time processing. Our algorithms yield high-quality results similar to those of the original offline approach, while outperforming it in computing time and memory requirements. Our results enable faster, more scalable analysis and open the door to new closed-loop experiments in deep brain areas and in freely moving preparations.
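A toy sketch of the streaming idea behind online matrix factorization for calcium imaging (a simplification under our own assumptions, not the authors' actual algorithms): each incoming frame y is explained as A c + b, the temporal activity c is obtained by non-negative least squares against the current spatial footprints A, and sufficient statistics are accumulated so that A can be updated periodically without revisiting past frames.

```python
import numpy as np
from scipy.optimize import nnls

def process_frame(y, A, b, stats):
    """One step of a toy online factorization: frame y (pixels,) -> temporal activity c (neurons,)."""
    c, _ = nnls(A, y - b)                 # non-negative temporal activity for this frame
    # Accumulate sufficient statistics for the spatial update (no need to store past frames).
    stats["CC"] += np.outer(c, c)
    stats["YC"] += np.outer(y - b, c)
    return c

def update_footprints(A, stats, n_iter=5):
    """Multiplicative non-negative update of spatial footprints from the sufficient statistics."""
    CC, YC = stats["CC"], stats["YC"]
    for _ in range(n_iter):
        A *= YC / (A @ CC + 1e-12)
        A = np.clip(A, 0.0, None)
    return A

# Example usage with random data: 400 pixels, 10 neurons
rng = np.random.default_rng(0)
A = np.abs(rng.standard_normal((400, 10)))
b = np.zeros(400)
stats = {"CC": np.zeros((10, 10)), "YC": np.zeros((400, 10))}
for _ in range(100):
    y = np.abs(rng.standard_normal(400))
    c = process_frame(y, A, b, stats)
A = update_footprints(A, stats)
```

The real pipelines additionally handle the structured background, motion correction, and spike deconvolution, which is where most of their engineering effort lies.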


Sensors ◽  
2021 ◽  
Vol 21 (17) ◽  
pp. 5953
Author(s):  
Leslie Barreto ◽  
Ahnsei Shon ◽  
Derrick Knox ◽  
Hojun Song ◽  
Hangue Park ◽  
...  

(1) Background: Insects, which serve as model systems for many disciplines thanks to their unique advantages, have not been extensively studied in gait research because of the lack of appropriate tools and insect models for properly studying insect gaits. (2) Methods: In this study, we present a gait analysis of grasshoppers using a closed-loop, custom-designed motorized insect treadmill with an optical recording system for quantitative gait analysis. We used the eastern lubber grasshopper, a flightless and large-bodied species, as our insect model. Gait kinematics were recorded and analyzed by having three grasshoppers walk on the treadmill at speeds from 0.1 to 1.5 cm/s. (3) Results: The stance duty factor was measured as 70–95% and decreased as walking speed increased. As walking speed increased, the number of legs in contact with the substrate decreased, and a diagonal arrangement of contacting legs was observed at a walking speed of 1.1 cm/s. (4) Conclusions: This pilot study of grasshopper gait analysis using the custom-designed motorized insect treadmill with an optical recording system demonstrates the feasibility of quantitative, repeatable, and real-time insect gait analysis.
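The stance duty factor reported above is simply the fraction of each stride that a leg spends in contact with the substrate. A minimal sketch (our illustration, not the authors' analysis code) computes it from a per-frame boolean contact trace sampled at the camera frame rate.

```python
import numpy as np

def stance_duty_factor(contact, min_stride_frames=2):
    """Duty factor per stride from a boolean per-frame contact trace for one leg.

    A stride runs from one stance onset (False -> True transition) to the next;
    the duty factor is the fraction of those frames spent in stance.
    """
    contact = np.asarray(contact, dtype=bool)
    onsets = np.where(np.diff(contact.astype(int)) == 1)[0] + 1
    duty = []
    for start, end in zip(onsets[:-1], onsets[1:]):
        stride = contact[start:end]
        if stride.size >= min_stride_frames:
            duty.append(stride.mean())
    return np.array(duty)

# Example: a leg that is on the ground for ~75% of each stride
contact_trace = np.tile([1, 1, 1, 0], 25).astype(bool)
print(stance_duty_factor(contact_trace))   # ~[0.75, 0.75, ...]
```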

