PupilEXT: Flexible Open-Source Platform for High-Resolution Pupillometry in Vision Research

2021 ◽  
Vol 15 ◽  
Author(s):  
Babak Zandi ◽  
Moritz Lode ◽  
Alexander Herzog ◽  
Georgios Sakas ◽  
Tran Quoc Khanh

Human pupil behavior has gained increased attention due to the discovery of the intrinsically photosensitive retinal ganglion cells and the afferent pupil control path's role as a biomarker for cognitive processes. Diameter changes on the order of 10⁻² mm are of interest, requiring reliable, well-characterized measurement equipment to accurately detect neurocognitive effects on the pupil. Pupillometry is mostly performed with commercial measurement devices, which entails high costs. Moreover, commercial systems rely on closed software, restricting conclusions about the pupil-tracking algorithms used. Here, we developed an open-source pupillometry platform consisting of hardware and software competitive with high-end commercial stereo eye-tracking systems. Our goal was to make a professional remote pupil measurement pipeline for laboratory conditions accessible to everyone. This work's core outcome is an integrated cross-platform (macOS, Windows and Linux) pupillometry software called PupilEXT, featuring a user-friendly graphical interface covering the relevant requirements of professional pupil response research. We offer a selection of six state-of-the-art open-source pupil detection algorithms (Starburst, Swirski, ExCuSe, ElSe, PuRe and PuReST) to perform the pupil measurement. A 120-fps pupillometry demo system achieved a calibration accuracy of 0.003 mm and an average temporal pupil measurement detection accuracy of 0.0059 mm in stereo mode. The PupilEXT software offers extended features in pupil detection, measurement validation, image acquisition, data acquisition, offline pupil measurement, camera calibration, stereo vision, data visualization and system independence, all combined in a single open-source interface, available at https://github.com/openPupil/Open-PupilEXT.
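None of the six detection algorithms is reproduced here, but the step every pupillometry pipeline ends with — converting a segmented pupil region into a physical diameter via the camera calibration — can be sketched in a few lines of Python. The toy mask and the mm-per-pixel scale below are illustrative, not values from the paper:

```python
import math

def pupil_diameter_mm(mask, mm_per_px):
    """Estimate pupil diameter from a binary pupil mask, assuming a roughly
    circular pupil: area A = pi * (d/2)^2  =>  d = 2 * sqrt(A / pi)."""
    area_px = sum(sum(row) for row in mask)      # count pupil pixels
    area_mm2 = area_px * mm_per_px ** 2          # apply calibration scale
    return 2.0 * math.sqrt(area_mm2 / math.pi)

# Toy 5x5 "pupil" of 13 pixels at a hypothetical 0.1 mm/px calibration
mask = [
    [0, 0, 1, 0, 0],
    [0, 1, 1, 1, 0],
    [1, 1, 1, 1, 1],
    [0, 1, 1, 1, 0],
    [0, 0, 1, 0, 0],
]
print(round(pupil_diameter_mm(mask, 0.1), 3))   # → 0.407
```

A sub-millimeter result like this is why the calibration accuracy reported above (0.003 mm) matters: the area-to-diameter conversion amplifies any scale error.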

2020 ◽  
Author(s):  
Elan Ness-Cohn ◽  
Marta Iwanaszko ◽  
William Kath ◽  
Ravi Allada ◽  
Rosemary Braun

The circadian rhythm drives the oscillatory expression of thousands of genes across all tissues, coordinating physiological processes. The effect of this rhythm on health has generated increasing interest in discovering genes under circadian control by searching for periodic patterns in transcriptomic time-series experiments. While algorithms for detecting cycling transcripts have advanced, there remains little guidance quantifying the effect of experimental design and analysis choices on cycling detection accuracy. We present TimeTrial, a user-friendly benchmarking framework using both real and synthetic data to investigate cycle detection algorithms' performance and improve circadian experimental design. Results show that the optimal choice of analysis method depends on the sampling scheme, noise level, and shape of the waveform of interest; TimeTrial also provides guidance on the impact of sampling frequency and duration on cycling detection accuracy. The TimeTrial software is freely available for download and may also be accessed through a web interface. By supplying a tool to vary and optimize experimental design considerations, TimeTrial will enhance circadian transcriptomics studies.
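TimeTrial benchmarks full detection algorithms; as a minimal illustration of what "cycling detection" means at its simplest, a fixed-period cosinor fit can be sketched in Python. The synthetic series below is illustrative and is not one of TimeTrial's methods or datasets:

```python
import math

def cosinor_amplitude(times, values, period=24.0):
    """Least-squares fit of one cosine at a fixed period (a 'cosinor' fit):
    y(t) ~ m + a*cos(wt) + b*sin(wt).  For evenly spaced samples covering
    whole periods, a and b reduce to the discrete Fourier coefficients."""
    n = len(values)
    w = 2.0 * math.pi / period
    mean = sum(values) / n
    a = 2.0 / n * sum((y - mean) * math.cos(w * t) for t, y in zip(times, values))
    b = 2.0 / n * sum((y - mean) * math.sin(w * t) for t, y in zip(times, values))
    return math.hypot(a, b)   # amplitude of the fitted oscillation

# Synthetic transcript sampled every 2 h for 48 h with a 24-h rhythm
ts = [2 * i for i in range(24)]
ys = [10 + 3 * math.cos(2 * math.pi * t / 24 - 1.0) for t in ts]
print(round(cosinor_amplitude(ts, ys), 2))   # → 3.0
```

Sampling every 2 h for two full periods recovers the amplitude exactly; coarser or shorter sampling degrades the estimate, which is precisely the design trade-off TimeTrial quantifies.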


2019 ◽  
Author(s):  
Abigail M. Searfoss ◽  
James C. Pino ◽  
Nicole Creanza

Audio recording devices have changed significantly over the last 50 years, making large datasets of recordings of natural sounds, such as birdsong, easier to obtain. This increase in digital recordings necessitates an increase in high-throughput methods of analysis for researchers. Specifically, there is a need in the community for open-source methods that are tailored to recordings of varying qualities and from multiple species collected in nature.

We developed Chipper, a Python-based software to semi-automate both the segmentation of acoustic signals and the subsequent analysis of their frequencies and durations. For avian recordings, we provide widgets to best determine appropriate thresholds for noise and syllable similarity, which aid in calculating note measurements and determining syntax. In addition, we generated a set of synthetic songs with various levels of background noise to test Chipper's accuracy, repeatability, and reproducibility.

Chipper provides an effective way to quickly generate reproducible estimates of birdsong features. The cross-platform graphical user interface allows the user to adjust parameters and visualize the resulting spectrogram and signal segmentation, providing a simplified method for analyzing field recordings.

Chipper streamlines the processing of audio recordings with multiple user-friendly tools and is optimized for multiple species and varying recording qualities. Ultimately, Chipper supports the use of citizen-science data and increases the feasibility of large-scale multi-species birdsong studies.
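Chipper's segmentation operates on spectrograms with user-tuned thresholds; the core idea — marking a syllable wherever the signal stays above a threshold — can be sketched in Python. The function name and toy envelope below are illustrative, not Chipper's actual implementation:

```python
def segment_syllables(envelope, threshold, min_len=2):
    """Return (onset, offset) index pairs where the amplitude envelope
    stays at or above `threshold` for at least `min_len` samples."""
    segments, start = [], None
    for i, v in enumerate(envelope):
        if v >= threshold and start is None:
            start = i                          # syllable onset
        elif v < threshold and start is not None:
            if i - start >= min_len:
                segments.append((start, i))    # syllable offset (exclusive)
            start = None
    if start is not None and len(envelope) - start >= min_len:
        segments.append((start, len(envelope)))
    return segments

env = [0, 0, 5, 7, 6, 0, 0, 1, 8, 9, 8, 0]
print(segment_syllables(env, threshold=4))     # → [(2, 5), (8, 11)]
```

Raising the threshold merges noisy detections into silence; lowering it splits syllables — which is why Chipper exposes these as interactively adjustable parameters rather than fixed constants.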


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Luca Masin ◽  
Marie Claes ◽  
Steven Bergmans ◽  
Lien Cools ◽  
Lien Andries ◽  
...  

Glaucoma is a disease associated with the loss of retinal ganglion cells (RGCs), and remains one of the primary causes of blindness worldwide. Major research efforts are presently directed towards the understanding of disease pathogenesis and the development of new therapies, with the help of rodent models as an important preclinical research tool. The ultimate goal is reaching neuroprotection of the RGCs, which requires a tool to reliably quantify RGC survival. Hence, we demonstrate a novel deep learning pipeline that enables fully automated RGC quantification in the entire murine retina. This software, called RGCode (Retinal Ganglion Cell quantification based On DEep learning), provides a user-friendly interface that requires the input of RBPMS-immunostained flatmounts and returns the total RGC count, retinal area and density, together with output images showing the computed counts and isodensity maps. The counting model was trained on RBPMS-stained healthy and glaucomatous retinas, obtained from mice subjected to microbead-induced ocular hypertension and optic nerve crush injury paradigms. RGCode demonstrates excellent performance in RGC quantification as compared to manual counts. Furthermore, we convincingly show that RGCode has potential for wider application, by retraining the model with a minimal set of training data to count FluoroGold-traced RGCs.
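RGCode's detections come from a trained deep network; the final counting step over a segmented mask, however, is classical connected-component labeling, which can be illustrated in plain Python. This is a stand-in for the counting stage only, not RGCode's model:

```python
def count_cells(mask):
    """Count connected components (4-connectivity) in a binary mask --
    a stand-in for counting segmented cell detections."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1                     # new cell found
                stack = [(r, c)]               # flood-fill its pixels
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and mask[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count

mask = [
    [1, 1, 0, 0, 1],
    [1, 0, 0, 0, 1],
    [0, 0, 1, 1, 0],
]
print(count_cells(mask))   # → 3
```

The hard part a deep network solves is producing a mask in which touching or overlapping cells are separated; once that is done, counting reduces to labeling as above.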


2016 ◽  
Vol 147 ◽  
pp. 50-56 ◽  
Author(s):  
Ana C. Dordea ◽  
Mark-Anthony Bray ◽  
Kaitlin Allen ◽  
David J. Logan ◽  
Fei Fei ◽  
...  

2016 ◽  
Author(s):  
Damien M. O’Halloran

Background: Node.js is an open-source, cross-platform environment that provides a JavaScript codebase for back-end, server-side applications. JavaScript has been used to develop very fast and user-friendly front-end tools for bioinformatic and phylogenetic analyses. However, no such toolkits are available using Node.js to conduct comprehensive molecular phylogenetic analysis.

Results: To address this problem, I have developed phylo-node, a stable and scalable Node.js toolkit that allows the user to perform diverse molecular and phylogenetic tasks. phylo-node can execute the analysis and process the resulting outputs from a suite of software options providing tools for read processing and genome alignment, sequence retrieval, multiple sequence alignment, primer design, evolutionary modeling, and phylogeny reconstruction. Furthermore, phylo-node enables the user to deploy server-dependent applications, and provides simple integration and interoperation with other Node modules and languages using Node inheritance patterns, as well as a customized piping module to support the production of diverse pipelines.

Conclusions: phylo-node is open-source and freely available to all users without sign-up or login requirements. All source code and user guidelines are openly available at the GitHub repository: https://github.com/dohalloran/phylo-node
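phylo-node itself is written for Node.js; as a language-neutral sketch of the piping pattern the abstract describes — chaining analysis steps so each consumes the previous step's output — here is a minimal Python analogue. All step names are hypothetical and the steps are toys, not phylo-node's modules:

```python
def pipe(*steps):
    """Compose analysis steps left-to-right: pipe(f, g)(x) == g(f(x))."""
    def run(data):
        for step in steps:
            data = step(data)   # each step consumes the previous output
        return data
    return run

# Hypothetical toy steps standing in for real tools (cleaner, aligner, ...)
clean   = lambda seqs: [s.upper().replace('-', '') for s in seqs]
lengths = lambda seqs: [len(s) for s in seqs]

pipeline = pipe(clean, lengths)
print(pipeline(['ac-gt', 'AATT--CC']))   # → [4, 6]
```

The appeal of such a piping module is that each external tool wrapper stays independent while pipelines of any shape are assembled declaratively from the same parts.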


2020 ◽  
pp. 9-24
Author(s):  
Peter Bodrogi ◽  
Xue Guo ◽  
Tran Quoc Khanh

The brightness perception of a large (41°) uniform visual field was investigated in a visual psychophysical experiment. Subjects assessed the brightness of 20 light source spectra of different chromaticities at two luminance levels, Lv = 267.6 cd/m² and Lv = 24.8 cd/m². The resulting mean subjective brightness scale values were modelled by a combination of the signals of retinal mechanisms: S-cones, rods, intrinsically photosensitive retinal ganglion cells (ipRGCs) and the difference between the L-cone and M-cone signals. A new quantity, "relative spectral blue content", was also considered for modelling. This quantity was defined as the spectral radiance of the light stimulus integrated over the range 380–520 nm, relative to luminance. The "relative spectral blue content" model could describe the subjective brightness perception of the observers with reasonable accuracy.
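The definition translates directly into a numerical integral over the measured spectrum. A minimal Python sketch, assuming a tabulated spectral radiance (the flat spectrum and sampling below are hypothetical, not the paper's stimuli):

```python
def relative_blue_content(wavelengths_nm, radiance, luminance):
    """'Relative spectral blue content': spectral radiance integrated over
    380-520 nm (trapezoidal rule), divided by the stimulus luminance."""
    total = 0.0
    for (w0, r0), (w1, r1) in zip(zip(wavelengths_nm, radiance),
                                  zip(wavelengths_nm[1:], radiance[1:])):
        if 380 <= w0 and w1 <= 520:                # keep intervals inside the band
            total += 0.5 * (r0 + r1) * (w1 - w0)   # trapezoid on [w0, w1]
    return total / luminance

# Hypothetical flat spectrum of 0.002 W/(sr*m^2*nm) sampled every 20 nm
wl = list(range(380, 541, 20))
sp = [0.002] * len(wl)
print(round(relative_blue_content(wl, sp, luminance=24.8), 5))   # → 0.01129
```

Dividing by luminance makes the quantity independent of overall stimulus intensity, so it isolates how "blue-weighted" a spectrum is — the property the model relates to perceived brightness.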


Author(s):  
Kyril I. Kuznetsov ◽  
Vitaliy Yu. Maslov ◽  
Svetlana A. Fedulova ◽  
Nikolai S. Veselovsky
