Automated Recording and Semantics-Aware Replaying of High-Speed Eye Tracking and Interaction Data to Support Cognitive Studies of Software Engineering Tasks

Author(s):  
Vlas Zyrianov ◽  
Drew T. Guarnera ◽  
Cole S. Peterson ◽  
Bonita Sharif ◽  
Jonathan I. Maletic


Sensors ◽
2020 ◽  
Vol 20 (3) ◽  
pp. 891 ◽  
Author(s):  
Malik M. Naeem Mannan ◽  
M. Ahmad Kamran ◽  
Shinil Kang ◽  
Hak Soo Choi ◽  
Myung Yung Jeong

Steady-state visual evoked potentials (SSVEPs) have been extensively used to develop brain–computer interfaces (BCIs) because of their robustness, large command sets, high classification accuracy, and high information transfer rates (ITRs). However, presenting many simultaneously flickering stimuli often causes considerable user discomfort, tiredness, annoyance, and fatigue. Here we propose a stimuli-responsive hybrid speller that combines electroencephalography (EEG) with video-based eye tracking to improve user comfort when large numbers of stimuli flicker simultaneously. A canonical correlation analysis (CCA)-based framework identifies the target frequency from only 1 s of flickering signal. Whereas basic SSVEP BCI-spellers require as many stimulation frequencies as targets, our proposed BCI-speller uses only six frequencies to classify forty-eight targets, greatly increasing the ITR. Using this speller, we obtained an average classification accuracy of 90.35 ± 3.597% with an average ITR of 184.06 ± 12.761 bits per minute in a cued-spelling task and an ITR of 190.73 ± 17.849 bits per minute in a free-spelling task. The proposed speller is therefore superior to comparable spellers in the number of targets classified, classification accuracy, and ITR, while producing less fatigue, annoyance, tiredness, and discomfort. Together, eye tracking and the SSVEP BCI make the proposed hybrid system a truly high-speed communication channel.
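The abstract names two standard, reproducible ingredients: CCA against sinusoidal reference signals to identify the attended flicker frequency, and the Wolpaw formula for ITR in bits per minute. The sketch below illustrates both under generic assumptions; the sampling rate, channel count, harmonic count, candidate frequencies, and selection time are illustrative choices, not values from the paper, and the code is not the authors' pipeline (which additionally fuses eye tracking to map six frequencies onto forty-eight targets).

```python
# Hedged sketch: CCA-based SSVEP frequency detection and Wolpaw ITR.
# All signal parameters below are assumptions, not figures from the paper.
import numpy as np
from sklearn.cross_decomposition import CCA

FS = 250          # assumed EEG sampling rate (Hz)
WINDOW_S = 1.0    # 1 s flicker window, as stated in the abstract
N_HARMONICS = 2   # assumed number of harmonics in the reference set


def reference_signals(freq, fs=FS, duration=WINDOW_S, n_harmonics=N_HARMONICS):
    """Sin/cos reference matrix Y_f for one candidate frequency."""
    t = np.arange(0, duration, 1.0 / fs)
    refs = []
    for h in range(1, n_harmonics + 1):
        refs.append(np.sin(2 * np.pi * h * freq * t))
        refs.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(refs)          # shape: (samples, 2 * n_harmonics)


def cca_correlation(eeg_segment, freq):
    """Canonical correlation between an EEG segment (samples x channels) and Y_f."""
    Y = reference_signals(freq)
    cca = CCA(n_components=1)
    cca.fit(eeg_segment, Y)
    u, v = cca.transform(eeg_segment, Y)
    return abs(np.corrcoef(u[:, 0], v[:, 0])[0, 1])


def classify_frequency(eeg_segment, candidate_freqs):
    """Pick the stimulation frequency whose reference set correlates best."""
    scores = [cca_correlation(eeg_segment, f) for f in candidate_freqs]
    return candidate_freqs[int(np.argmax(scores))]


def wolpaw_itr(n_targets, accuracy, selection_time_s):
    """Standard Wolpaw ITR in bits per minute (accuracy strictly below 1.0)."""
    p, n = accuracy, n_targets
    bits = np.log2(n) + p * np.log2(p) + (1 - p) * np.log2((1 - p) / (n - 1))
    return bits * (60.0 / selection_time_s)


if __name__ == "__main__":
    # Synthetic demo: a noisy 10 Hz response on 8 channels (purely illustrative).
    rng = np.random.default_rng(0)
    t = np.arange(0, WINDOW_S, 1.0 / FS)
    eeg = (np.outer(np.sin(2 * np.pi * 10.0 * t), rng.uniform(0.5, 1.0, 8))
           + 0.5 * rng.standard_normal((t.size, 8)))
    freqs = [8.0, 9.0, 10.0, 11.0, 12.0, 13.0]   # six frequencies, as in the abstract
    print("detected:", classify_frequency(eeg, freqs), "Hz")
    # 48 targets at the reported ~90.35% accuracy; the 1.5 s selection time
    # is an assumption, not a figure from the paper.
    print("ITR ~", round(wolpaw_itr(48, 0.9035, 1.5), 1), "bits/min")
```

As a sanity check on the reported numbers, the Wolpaw formula with 48 targets, roughly 90% accuracy, and an assumed selection time of about 1.5 s comes out near 184 bits per minute, which is how a six-frequency, forty-eight-target design can plausibly reach the ITR quoted above.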


Genes ◽  
2020 ◽  
Vol 11 (10) ◽  
pp. 1157
Author(s):  
Ahmed Salman ◽  
Samuel B. Hutton ◽  
Tutte Newall ◽  
Jennifer A. Scott ◽  
Helen L. Griffiths ◽  
...  

In this study, we seek to exclude other pathophysiological mechanisms by which Frmd7 knock-down may cause Idiopathic Infantile Nystagmus (IIN), using the Frmd7.tm1a and Frmd7.tm1b murine models. We used a combination of genetic, histological, and visual-function techniques to characterize the role of the Frmd7 gene in IIN in a novel murine model of the disease. We demonstrate that the Frmd7.tm1b allele represents a more robust model of Frmd7 knock-out at the mRNA level. Frmd7 expression was investigated using both antibody staining and X-gal staining, confirming previous reports that retinal Frmd7 expression is restricted to starburst amacrine cells and demonstrating that X-gal staining recapitulates this expression pattern in the model, making it a useful tool for further expression studies. We also show that gross retinal morphology and electrophysiology are unchanged in these Frmd7 mutant models compared with wild-type mice. High-speed eye-tracking recordings of Frmd7 mutant mice confirm a specific horizontal optokinetic reflex defect. In summary, our study supports a likely role for Frmd7 in the murine optokinetic reflex mediated by starburst amacrine cells. We show that the Frmd7.tm1b model provides a more robust knock-out than the Frmd7.tm1a model at the mRNA level, although the functional consequence is unchanged. Finally, we establish a robust eye-tracking technique in mice that can be used in a variety of future studies with this and other models. Although our data highlight an optokinetic reflex deficit associated with starburst amacrine cells in the retina, this does not rule out the involvement of other Frmd7-expressing cells, in the brain or the retina, in the pathophysiology of IIN.


2020 ◽  
Vol 25 (5) ◽  
pp. 3128-3174 ◽  
Author(s):  
Zohreh Sharafi ◽  
Bonita Sharif ◽  
Yann-Gaël Guéhéneuc ◽  
Andrew Begel ◽  
Roman Bednarik ◽  
...  
