mouse cursor
Recently Published Documents


TOTAL DOCUMENTS

126
(FIVE YEARS 44)

H-INDEX

12
(FIVE YEARS 2)

2022 ◽  
Author(s):  
David Kim ◽  
Joseph Valacich ◽  
Jeff Jenkins ◽  
Manasvi Kumar ◽  
Alan Dennis

2021 ◽  
Vol 3 (4) ◽  
pp. 336-346
Author(s):  
Judy Simon

A human–computer interface (HCI) requires proper coordination and definition of the features that serve as input to the system. The parameters of saccadic and smooth eye-movement tracking are observed, and a comparison is drawn for HCI. This methodology is further combined with Pupil, OpenCV, and Microsoft Visual Studio for image processing to identify the position of the pupil and observe the direction of pupil movement in real time. Once the direction is identified, the accurate cursor position moving towards the target can be determined. To quantify the differences between the step-change tracking of saccadic eye movement and the incremental tracking of smooth eye movement, the test was conducted on two users. With incremental tracking of smooth eye movement, an accuracy of 90% is achieved. Incremental tracking requires an average time of 7.21 s, while step-change tracking takes just 2.82 s. Based on these observations, smooth eye-movement tracking is over four times more accurate than saccadic eye-movement tracking. Smooth eye tracking was therefore found to be more accurate, precise, reliable, and predictable for driving the mouse cursor than saccadic eye-movement tracking.
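
A minimal sketch of the image-processing step this abstract describes: locating the pupil in each camera frame with OpenCV and reporting its movement direction. The camera index, threshold value, and minimum contour area are illustrative assumptions, not values from the paper.

```python
# Pupil localization sketch: find the darkest blob in an eye image and
# track how its centroid moves between frames.
import cv2

THRESH = 40      # assumed intensity threshold for the dark pupil region
MIN_AREA = 100   # assumed minimum contour area to reject noise

def pupil_center(gray):
    """Return (x, y) of the pupil centroid in a grayscale eye image, or None."""
    blurred = cv2.GaussianBlur(gray, (7, 7), 0)
    # The pupil is the darkest region; isolate it with an inverted threshold.
    _, mask = cv2.threshold(blurred, THRESH, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    candidates = [c for c in contours if cv2.contourArea(c) > MIN_AREA]
    if not candidates:
        return None
    m = cv2.moments(max(candidates, key=cv2.contourArea))
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

cap = cv2.VideoCapture(0)  # assumed eye-facing camera
prev = None
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    center = pupil_center(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    if center and prev:
        dx, dy = center[0] - prev[0], center[1] - prev[1]
        print(f"pupil moved dx={dx:+.1f}, dy={dy:+.1f}")  # movement direction
    prev = center or prev
cap.release()
```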


Appetite ◽  
2021 ◽  
pp. 105890
Author(s):  
Katharina Naomi Eichin ◽  
Claudio Georgii ◽  
Ann-Kathrin Arend ◽  
Zoé van Dyck ◽  
Jens Blechert

2021 ◽  
Author(s):  
A. Hampel ◽  
et al.

Figure 5 is interactive. Place the mouse cursor over the names or color-filled circles of the scarp profiles in A to view the related scarp profiles and detailed location maps in B.

Figure 7B is interactive. Use the radio buttons in the legend to view the S_z values from all profiles (gray curve through data points with highest vertical slip) or separately from the different groups (blue, yellow, and red curves, respectively).


2021 ◽  
Vol 12 (1) ◽  
pp. 53
Author(s):  
Triadi Triadi ◽  
Inung Wijayanto ◽  
Sugondo Hadiyoso

This study designs a prototype system to control the movement of a computer mouse cursor using an electrooculogram (EOG) signal. The EOG signal generated by eye movement was digitized by a microcontroller's analog-to-digital converter, which communicates with the computer through a USB port. The signal was decomposed using the continuous wavelet transform (CWT), followed by feature extraction using statistical measures, and then classified with K-Nearest Neighbors (k-NN) to decide the movement and direction of the mouse cursor. The test was carried out with 110 EOG signals, split evenly into training and test data, across eight categories of directional movement patterns: up, down, right, left, top right, top left, bottom right, and bottom left. The highest accuracy, achieved using the CWT bump wavelet with the kurtosis feature, is 100%, while the time needed to translate an eye movement into a cursor movement is 1.9792 seconds. It is hoped that the proposed system can serve as an assistive device, particularly for Amyotrophic Lateral Sclerosis (ALS) sufferers.
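
A rough sketch of the CWT → statistical features → k-NN pipeline the abstract describes. Synthetic signals stand in for real EOG recordings, and PyWavelets' Morlet wavelet is used here as a stand-in for the bump wavelet named in the paper; the scale range and neighbor count are illustrative assumptions.

```python
# Classify synthetic "EOG" signals into 8 cursor directions via
# CWT decomposition, per-scale kurtosis features, and k-NN.
import numpy as np
import pywt
from scipy.stats import kurtosis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
N_SIGNALS, N_SAMPLES, N_CLASSES = 110, 256, 8  # 8 cursor directions

# Placeholder EOG data: one noisy sinusoid per direction class.
labels = rng.integers(0, N_CLASSES, N_SIGNALS)
t = np.linspace(0, 1, N_SAMPLES)
signals = np.array(
    [np.sin(2 * np.pi * (2 + c) * t) + 0.3 * rng.standard_normal(N_SAMPLES)
     for c in labels])

def features(sig):
    """Kurtosis of each CWT scale band, as a compact statistical feature vector."""
    coeffs, _ = pywt.cwt(sig, scales=np.arange(1, 17), wavelet="morl")
    return kurtosis(coeffs, axis=1)

X = np.array([features(s) for s in signals])
# 50/50 train/test split, matching the paper's protocol.
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.5, random_state=0)

clf = KNeighborsClassifier(n_neighbors=3).fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.2f}")
```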


2021 ◽  
Vol 12 ◽  
pp. 180-189
Author(s):  
Ata Jedari Golparvar ◽  
Murat Kaya Yapici

The study of eye movements and the measurement of the resulting biopotential, referred to as electrooculography (EOG), may find increasing use in activity recognition, context awareness, mobile human–computer and human–machine interaction (HCI/HMI), and personal medical devices, provided that seamless sensing and processing of eye activity is achieved by a truly wearable, low-cost, and accessible technology. The present study demonstrates an alternative to bulky and expensive camera-based eye-tracking systems and reports, for the first time, the development of a graphene textile-based personal assistive device. This self-contained wearable prototype comprises a headband with soft graphene textile electrodes that overcome the limitations of conventional "wet" electrodes, along with miniaturized, portable readout electronics with real-time signal-processing capability that can stream data to a remote device over Bluetooth. The potential of graphene textiles in wearable eye tracking and eye-operated remote object interaction is demonstrated by controlling an on-screen mouse cursor for typing with a virtual keyboard and by navigating a four-wheeled robot through a maze, all using five different eye motions acquired from a single-channel EOG. Typing speeds of up to six characters per minute without prediction algorithms, and guidance of the robot through a maze with four 180° turns, were achieved with pattern-detection accuracies of 100% and 98%, respectively.
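
A minimal sketch of the command-mapping layer such an interface needs: translating five detected eye-motion classes into cursor actions. The five motion labels, the step size, and the classify() stub are illustrative assumptions; a real system would replace classify() with the paper's EOG pattern detector.

```python
# Map detected eye motions to cursor commands via a small dispatch table.
import pyautogui  # cross-platform cursor-control library

STEP = 25  # assumed cursor step size in pixels per detected eye motion

COMMANDS = {
    "look_up":    lambda: pyautogui.moveRel(0, -STEP),
    "look_down":  lambda: pyautogui.moveRel(0, STEP),
    "look_left":  lambda: pyautogui.moveRel(-STEP, 0),
    "look_right": lambda: pyautogui.moveRel(STEP, 0),
    "blink":      lambda: pyautogui.click(),  # blink as the "select" action
}

def classify(eog_window):
    """Stub: return one of the five motion labels for a window of EOG samples."""
    raise NotImplementedError("replace with the trained EOG pattern detector")

def dispatch(label):
    """Translate a detected eye motion into a cursor command."""
    action = COMMANDS.get(label)
    if action is not None:
        action()
```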


Author(s):  
Martin Schoemann ◽  
Denis O’Hora ◽  
Rick Dale ◽  
Stefan Scherbaum

Mouse cursor tracking has become a prominent method for characterizing cognitive processes, used in a wide variety of domains of psychological science. Researchers have demonstrated considerable ingenuity in the application of the approach, but the methodology has not undergone systematic analysis to facilitate the development of best practices. Furthermore, recent research has demonstrated effects of experimental design features on a number of mouse-tracking outcomes. We conducted a systematic review of the mouse-tracking literature to survey the reporting and spread of mouse variables (Cursor speed, Sampling rate, Training), physical characteristics of the experiments (Stimulus position, Response box position), and response requirements (Start procedure, Response procedure, Response deadline). This survey reveals that there is room for improvement in reporting practices, especially of subtler design features that researchers may have assumed would not impact research results (e.g., Cursor speed). We provide recommendations for future best practices in mouse-tracking studies and consider how best to standardize the mouse-tracking literature without excessively constraining the methodological flexibility that is essential to the field.
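
A minimal sketch of a mouse-tracking recorder that makes one of the surveyed design variables, the sampling rate, explicit and therefore easy to report. The 100 Hz rate, trial duration, and the use of pynput are illustrative assumptions, not a standard prescribed by the review.

```python
# Record timestamped cursor positions at a fixed sampling rate to CSV.
import csv
import time
from pynput.mouse import Controller

SAMPLING_RATE_HZ = 100  # report this value in the methods section
DURATION_S = 2.0        # length of one recorded trial

def record_trial(path="trial.csv"):
    mouse = Controller()
    interval = 1.0 / SAMPLING_RATE_HZ
    t0 = time.perf_counter()
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t", "x", "y"])
        while (now := time.perf_counter() - t0) < DURATION_S:
            x, y = mouse.position
            writer.writerow([f"{now:.4f}", x, y])
            time.sleep(interval)

record_trial()
```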


2020 ◽  
Author(s):  
Moein Razavi ◽  
Takashi Yamauchi ◽  
Vahid Janfaza ◽  
Anton Leontyev ◽  
Shanle Longmire-Monford ◽  
...  

The human mind is multimodal, yet most behavioral studies rely on century-old measures of behavior: task accuracy and latency (response time). Multimodal and multisensory analysis of human behavior creates a better understanding of how the mind works. The problem is that designing and implementing these experiments is technically complex and costly. This paper introduces versatile and economical means of developing multimodal-multisensory human experiments. We provide an experimental design framework that automatically integrates and synchronizes measures including electroencephalogram (EEG), galvanic skin response (GSR), eye tracking, virtual reality (VR), body movement, mouse/cursor motion, and response time. Unlike proprietary systems (e.g., iMotions), our system is free and open source; it integrates PsychoPy, Unity, and Lab Streaming Layer (LSL). The system embeds LSL inside PsychoPy/Unity to synchronize multiple sensory signals (gaze motion, EEG, GSR, mouse/cursor movement, and body motion) with low-cost consumer-grade devices, in a simple behavioral task designed in PsychoPy and a virtual reality environment designed in Unity. This tutorial shows a step-by-step process by which a complex multimodal-multisensory experiment can be designed and implemented in a few hours. When the experiment is run, all data synchronization and recording of the data to disk is done automatically.
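
A minimal sketch of the synchronization idea: publishing cursor positions as a Lab Streaming Layer stream so other modalities (EEG, GSR) can be time-aligned on the same LSL clock. The stream name and 60 Hz rate are illustrative assumptions.

```python
# Publish mouse positions as a 2-channel LSL stream via pylsl.
import time
from pylsl import StreamInfo, StreamOutlet
from pynput.mouse import Controller

# Declare a 2-channel float stream (x, y) so consumers can discover it
# and align it with other LSL streams recorded in the same session.
info = StreamInfo(name="MouseCursor", type="Position",
                  channel_count=2, nominal_srate=60,
                  channel_format="float32", source_id="mouse-demo")
outlet = StreamOutlet(info)

mouse = Controller()
while True:
    x, y = mouse.position
    outlet.push_sample([float(x), float(y)])  # timestamped by LSL automatically
    time.sleep(1 / 60)
```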

