Eye-hand coordination patterns of intermediate and novice surgeons in a simulation-based endoscopic surgery training environment

2018, Vol 11 (6)
Authors: Damla Topalli, Nergiz Ercil Cagiltay

Endoscopic surgery procedures require the development of specific skills, such as eye-hand coordination. Current education programs struggle to provide appropriate skill-improvement and assessment methods in this field. This study aims to propose objective metrics for hand-movement skills and to assess eye-hand coordination. An experimental study is conducted with 15 surgical residents to test the newly proposed measures. Two computer-based, two-handed endoscopic surgery practice scenarios are developed in a simulation environment to gather the participants' eye-gaze data with the help of an eye tracker, as well as the related hand movement data through haptic interfaces. Additionally, participants' eye-hand coordination skills are analyzed. The results indicate higher correlations in the intermediates' eye-hand movements compared to the novices'. In the intermediates, increased visual concentration leads to smoother hand movements, whereas the novices' hand movements tend to remain at a standstill. After the first round of practice, all participants' eye-hand coordination skills improve on the specific task targeted in this study. According to these results, the proposed metrics can potentially provide additional insight into trainees' eye-hand coordination skills and help instructional system designers better address training requirements.
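To make the flavor of such metrics concrete, here is a minimal sketch of one plausible eye-hand coordination measure: the correlation between gaze speed and instrument-tip speed over a trial. The function names, the 60 Hz sampling rate, and the synthetic data are illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch of one plausible eye-hand coordination metric: the
# correlation between gaze speed and instrument-tip speed over a trial.
import numpy as np

def speeds(positions: np.ndarray, dt: float) -> np.ndarray:
    """Frame-to-frame speed from an (N, 2) or (N, 3) position series."""
    return np.linalg.norm(np.diff(positions, axis=0), axis=1) / dt

def eye_hand_correlation(gaze_xy: np.ndarray, hand_xyz: np.ndarray,
                         dt: float = 1 / 60) -> float:
    """Pearson correlation between gaze speed and hand speed."""
    g, h = speeds(gaze_xy, dt), speeds(hand_xyz, dt)
    n = min(len(g), len(h))          # guard against off-by-one lengths
    return float(np.corrcoef(g[:n], h[:n])[0, 1])

# Example with synthetic data: a coordinated trial should correlate higher.
rng = np.random.default_rng(0)
t = np.linspace(0, 5, 300)
hand = np.c_[np.sin(t), np.cos(t), t] + 0.01 * rng.normal(size=(300, 3))
gaze = np.c_[np.sin(t), np.cos(t)] + 0.05 * rng.normal(size=(300, 2))
print(f"eye-hand speed correlation: {eye_hand_correlation(gaze, hand):.2f}")
```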

2017, Vol 29 (5), pp. 919-927
Authors: Ngoc Hung Pham, Takashi Yoshimi

This paper describes a process for adaptive learning of hand movements from human demonstrations for robot manipulation actions, using the Dynamic Movement Primitives (DMP) framework. The process includes 1) tracking the hand movement in the human demonstration, 2) segmenting the hand movement, and 3) adaptive learning with the DMP framework. We implement an extended DMP model with a modified formulation for hand movement data observed from human demonstration, including 3D hand position, orientation, and finger distance. We evaluate the movements generated by the DMP model, either reproduced without changes or adapted to a change in the goal of the movement. The adapted movement data are used to control a robot arm through the spatial position and orientation of its end-effector, which carries a parallel gripper.
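For readers unfamiliar with the framework, below is a minimal sketch of a one-dimensional discrete DMP illustrating the learn-then-adapt workflow the paper builds on: fit forcing-term weights from a single demonstration, then reproduce the movement unchanged or adapted to a new goal. The gains, basis count, and toy trajectory are assumptions; the authors' extended formulation for 3D position, orientation, and finger distance is not reproduced here.

```python
# A minimal sketch of a 1-D discrete Dynamic Movement Primitive (DMP).
import numpy as np

class DMP1D:
    def __init__(self, n_basis=20, alpha=48.0, beta=12.0, alpha_s=4.0):
        self.alpha, self.beta, self.alpha_s = alpha, beta, alpha_s
        self.c = np.exp(-alpha_s * np.linspace(0, 1, n_basis))  # basis centers
        self.h = 1.0 / np.gradient(self.c) ** 2                 # basis widths
        self.w = np.zeros(n_basis)

    def _forcing(self, s):
        psi = np.exp(-self.h * (s - self.c) ** 2)
        return s * psi @ self.w / (psi.sum() + 1e-10)

    def fit(self, y, dt):
        """Learn forcing-term weights from one demonstrated trajectory."""
        self.y0, self.g, self.tau = y[0], y[-1], len(y) * dt
        yd = np.gradient(y, dt); ydd = np.gradient(yd, dt)
        s = np.exp(-self.alpha_s * np.arange(len(y)) * dt / self.tau)
        f_target = self.tau**2 * ydd - self.alpha * (
            self.beta * (self.g - y) - self.tau * yd)
        # locally weighted regression, one weight per basis function
        for i in range(len(self.w)):
            psi = np.exp(-self.h[i] * (s - self.c[i]) ** 2)
            self.w[i] = (s * psi @ f_target) / (s**2 @ psi + 1e-10)

    def rollout(self, goal=None, dt=0.01):
        """Reproduce the movement, optionally adapted to a new goal."""
        g = self.g if goal is None else goal
        y, yd, s, out = self.y0, 0.0, 1.0, []
        for _ in range(int(self.tau / dt)):
            ydd = (self.alpha * (self.beta * (g - y) - self.tau * yd)
                   + self._forcing(s)) / self.tau**2
            yd += ydd * dt; y += yd * dt
            s += -self.alpha_s * s / self.tau * dt
            out.append(y)
        return np.array(out)

dmp = DMP1D()
demo = np.sin(np.linspace(0, np.pi / 2, 200))   # toy demonstrated motion
dmp.fit(demo, dt=0.01)
same = dmp.rollout()             # reproduction without changes
adapted = dmp.rollout(goal=1.5)  # adapted to a new movement goal
```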


2020, Vol 7 (1)
Authors: John-Ross Rizzo, Mahya Beheshti, Tahereh Naeimi, Farnia Feiz, Girish Fatterpekar, et al.

Background: Eye-hand coordination (EHC) is a sophisticated act that requires interconnected processes governing synchronization of ocular and manual motor systems. Precise, timely and skillful movements, such as reaching for and grasping small objects, depend on the acquisition of high-quality visual information about the environment and simultaneous eye and hand control. Multiple areas in the brainstem and cerebellum, as well as some frontal and parietal structures, have critical roles in the control of eye movements and their coordination with the head. Although both cortex and cerebellum contribute critical elements to normal eye-hand function, differences in these contributions suggest that there may be separable deficits following injury.

Method: As a preliminary assessment of this perspective, we compared eye and hand movement control in a patient with cortical stroke relative to a patient with cerebellar stroke.

Result: We found the onset of eye and hand movements to be temporally decoupled, with significant decoupling variance in the patient with cerebellar stroke. In contrast, the patient with cortical stroke displayed increased hand spatial errors and less significant temporal decoupling variance. The increased decoupling variance in the patient with cerebellar stroke was primarily due to unstable timing of rapid eye movements (saccades).

Conclusion: These findings highlight a perspective in which facets of eye-hand dyscoordination are dependent on lesion location and may or may not co-occur to varying degrees. Broadly speaking, the results corroborate the general notion that the cerebellum is instrumental to temporal prediction for eye and hand movements, while the cortex is instrumental to spatial prediction, both of which are critical aspects of functional movement control.
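As an illustration of the kind of temporal-decoupling measure described, the sketch below computes per-trial eye-hand onset asynchrony and summarizes it by its variance across trials. Velocity-threshold onset detection and the threshold values are illustrative assumptions, not the authors' pipeline.

```python
# A minimal sketch of a temporal-decoupling measure: per-trial asynchrony
# between eye and hand movement onsets, summarized by its variance.
import numpy as np

def onset_time(speed: np.ndarray, dt: float, thresh: float) -> float:
    """First time the speed trace exceeds a fixed threshold."""
    idx = np.argmax(speed > thresh)
    return idx * dt if speed[idx] > thresh else np.nan

def decoupling_stats(eye_speeds, hand_speeds, dt=0.002,
                     eye_thresh=30.0, hand_thresh=5.0):
    """Mean and variance of eye-hand onset asynchrony across trials."""
    asynchrony = np.array([
        onset_time(h, dt, hand_thresh) - onset_time(e, dt, eye_thresh)
        for e, h in zip(eye_speeds, hand_speeds)])
    asynchrony = asynchrony[~np.isnan(asynchrony)]
    return asynchrony.mean(), asynchrony.var(ddof=1)

# Toy trials: eye onset at sample 50, hand onset at sample 80.
rng = np.random.default_rng(0)
eye = [np.r_[np.zeros(50), 80 + rng.normal(0, 5, 100)] for _ in range(10)]
hand = [np.r_[np.zeros(80), 20 + rng.normal(0, 2, 100)] for _ in range(10)]
print(decoupling_stats(eye, hand))
```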


Motor Control
2016, Vol 20 (3), pp. 316-336
Authors: Uta Sailer, Florian Güldenpfennig, Thomas Eggert

This study investigated the effect of hand movements on behavioral and electrophysiological parameters of saccade preparation. Event-related potentials were recorded in 17 subjects while they performed saccades to a visual target, either together with a hand movement in the same direction, a hand movement in the opposite direction, a hand movement in a third, independent direction, or without any accompanying hand movement. Saccade latencies increased with any kind of accompanying hand movement. Both saccade and manual latencies were largest when the two movements aimed in opposite directions. In contrast, saccade-related potentials indicating preparatory activity were mainly affected by hand movements in the same direction. The data suggest that concomitant hand movements interfere with saccade preparation, particularly when the two movements involve motor preparations that access the same visual stimulus. This indicates that saccade preparation is continually informed about hand movement preparation.
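As a concrete illustration of the latency measure central to this design, here is a minimal sketch of saccade-latency extraction: the time from target onset to the first sample whose eye velocity exceeds a criterion. The 30 deg/s threshold and the synthetic trace are assumptions; the paper's detection criterion may differ.

```python
# A minimal sketch of saccade-latency extraction via a velocity threshold.
import numpy as np

def saccade_latency(eye_deg: np.ndarray, fs: float,
                    target_onset_s: float, vel_thresh: float = 30.0):
    """Latency (s) of the first saccade after target onset, or None."""
    vel = np.abs(np.gradient(eye_deg)) * fs        # deg/s
    start = int(target_onset_s * fs)
    above = np.flatnonzero(vel[start:] > vel_thresh)
    return above[0] / fs if above.size else None

fs = 500.0                        # assumed sampling rate, Hz
eye = np.r_[np.zeros(100), np.linspace(0, 10, 20), np.full(80, 10.0)]
print(saccade_latency(eye, fs, target_onset_s=0.1))   # ~0.1 s latency
```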


2016, Vol 115 (5), pp. 2470-2484
Authors: Atul Gopal, Aditya Murthy

Voluntary control has been extensively studied in the context of eye and hand movements made in isolation, yet little is known about the nature of control during eye-hand coordination. We probed this with a redirect task, in which subjects made reaching/pointing movements accompanied by coordinated eye movements but had to change their plans when the target changed its position on a subset of trials. Using a race model framework, we found that separate, effector-specific mechanisms may be recruited to control eye and hand movements executed in isolation, but that when the same effectors are coordinated, a unitary mechanism controlling the coordinated eye-hand movement is employed. Specifically, we found that performance curves were distinct for the eye and hand when these movements were executed in isolation but were comparable when they were executed together. Second, the time to switch motor plans, called the target step reaction time, differed between the eye-alone and hand-alone conditions but was similar in the coordinated condition, under the assumption of a ballistic stage of ∼40 ms, on average. Interestingly, the existence of this ballistic stage could predict the extent of eye-hand dissociations seen in individual subjects. Finally, when subjects were explicitly instructed to control a single effector (eye or hand), redirecting one effector had a strong effect on the performance of the other. Taken together, these results suggest that a common control signal and a ballistic stage are recruited when coordinated eye-hand movement plans require alteration.
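To illustrate how a target step reaction time can be read off a performance curve, the sketch below interpolates the target-step delay at which the probability of failing to redirect crosses 50%. The delays and error rates are toy values, not the paper's data, and linear interpolation is one simple estimator among several used in race-model analyses.

```python
# A minimal sketch of estimating target-step reaction time (TSRT) from a
# compensation (performance) curve by interpolating the 50% crossing.
import numpy as np

def tsrt(delays_ms: np.ndarray, p_error: np.ndarray) -> float:
    """Interpolate the delay where the error probability crosses 0.5."""
    return float(np.interp(0.5, p_error, delays_ms))

delays = np.array([50, 100, 150, 200, 250])        # target-step delays (ms)
p_err = np.array([0.05, 0.20, 0.55, 0.80, 0.95])   # P(failure to redirect)
print(f"estimated TSRT ~ {tsrt(delays, p_err):.0f} ms")
```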


2018, Vol 120 (2), pp. 539-552
Authors: Marcel Jan de Haan, Thomas Brochier, Sonja Grün, Alexa Riehle, Frédéric V. Barthélemy

Large-scale network dynamics in multiple visuomotor areas are of great interest in the study of eye-hand coordination in both humans and monkeys. To explore this, it is essential to develop a setup that allows precise tracking of eye and hand movements. Ideally, the setup should also generate mechanical or visual perturbations of hand trajectories so that eye-hand coordination can be studied under a variety of conditions. Simple solutions exist that satisfy these requirements for hand movements performed in the horizontal plane while visual stimuli and hand feedback are presented in the vertical plane. However, this spatial dissociation requires cognitive rules for eye-hand coordination different from those for eye-hand movements performed in the same space, as is the case in most natural conditions. Here we present an innovative solution for the precise tracking of eye and hand movements in a single reference frame. Importantly, our solution allows behavioral explorations under normal and perturbed conditions in both humans and monkeys. It is based on the integration of two noninvasive, commercially available systems to achieve online control and synchronous recording of eye (EyeLink) and hand (KINARM) positions during interactive visuomotor tasks. We also present an eye calibration method, compatible with different eye trackers, that compensates for nonlinearities caused by the system's geometry. Our setup monitors the two effectors in real time with high spatial and temporal resolution and simultaneously outputs behavioral and neuronal data to an external data acquisition system using a common data format.

NEW & NOTEWORTHY: We developed a new setup for studying eye-hand coordination in humans and monkeys that monitors the two effectors in real time in a common reference frame. Our eye calibration method allows us to track gaze positions relative to visual stimuli presented in the horizontal workspace of the hand movements. This method compensates for nonlinearities caused by the system's geometry and transforms kinematic signals from the eye tracker into the same coordinate system as the hand and targets.
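As a rough illustration of the calibration idea, the sketch below fits a least-squares second-order polynomial map from raw eye-tracker coordinates to the hand workspace frame, using fixations on known calibration targets. The polynomial order, target layout, and synthetic distortion are assumptions, not the authors' exact method.

```python
# A minimal sketch of polynomial gaze calibration into a workspace frame.
import numpy as np

def design(raw: np.ndarray) -> np.ndarray:
    """Second-order polynomial features of raw (N, 2) gaze samples."""
    x, y = raw[:, 0], raw[:, 1]
    return np.c_[np.ones_like(x), x, y, x * y, x**2, y**2]

def fit_calibration(raw_fix: np.ndarray, targets: np.ndarray) -> np.ndarray:
    """Solve for coefficients mapping raw gaze to workspace positions."""
    coef, *_ = np.linalg.lstsq(design(raw_fix), targets, rcond=None)
    return coef                                    # shape (6, 2)

def apply_calibration(raw: np.ndarray, coef: np.ndarray) -> np.ndarray:
    return design(raw) @ coef

# Toy calibration: 9 known targets, raw gaze distorted nonlinearly.
targets = np.array([[x, y] for x in (-1, 0, 1) for y in (-1, 0, 1)], float)
raw = targets + 0.1 * targets**2                   # synthetic distortion
coef = fit_calibration(raw, targets)
corrected = apply_calibration(raw, coef)           # close to targets
```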


2021, Vol 7 (2), p. 15
Authors: Tomohiro Shimizu, Ryo Hachiuma, Hiroki Kajita, Yoshifumi Takatsume, Hideo Saito

Detecting surgical tools is an essential task for the analysis and evaluation of surgical videos. However, in open surgery such as plastic surgery, detection is difficult because some surgical tools have similar shapes, such as scissors and needle holders. Unlike in endoscopic surgery, the tips of the tools are often hidden in the operating field and are not captured clearly due to low camera resolution, whereas the movements of the tools and hands can be captured. Because each tool's use requires different hand movements, hand movement data can be used to classify the two types of tools. We combined three modules, for localization, selection, and classification, to detect the two tools. In the localization module, we employed Faster R-CNN to detect surgical tools and target hands; in the classification module, we extracted hand movement information by combining ResNet-18 and an LSTM to classify the two tools. We created a dataset in which seven different types of open surgery were recorded, and we provide annotations for surgical tool detection. Our experiments show that our approach successfully detected the two different tools and outperformed two baseline methods.
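To make the classification stage concrete, here is a minimal sketch of an LSTM that maps per-frame hand-region feature sequences to one of the two tool classes. The feature dimension, hidden size, and random inputs are illustrative assumptions; in the paper this module sits downstream of Faster R-CNN localization and ResNet-18 feature extraction.

```python
# A minimal sketch of the classification stage: per-frame hand features
# fed to an LSTM that outputs one of two tool classes.
import torch
import torch.nn as nn

class HandMotionClassifier(nn.Module):
    def __init__(self, feat_dim=512, hidden=128, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, feats):                 # feats: (B, T, feat_dim)
        _, (h_n, _) = self.lstm(feats)        # h_n: (1, B, hidden)
        return self.head(h_n[-1])             # logits: (B, n_classes)

model = HandMotionClassifier()
clip = torch.randn(4, 30, 512)   # 4 clips x 30 frames x 512-d features
logits = model(clip)             # (4, 2): scissors vs. needle holder
```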


2020, Vol 132 (5), pp. 1358-1366
Authors: Chao-Hung Kuo, Timothy M. Blakely, Jeremiah D. Wander, Devapratim Sarma, Jing Wu, et al.

OBJECTIVE: The activation of the sensorimotor cortex as measured by electrocorticographic (ECoG) signals has been correlated with contralateral hand movements in humans, as precisely as the level of individual digits. However, the relationship between individual and multiple synergistic finger movements and the neural signal as detected by ECoG has not been fully explored. The authors used intraoperative high-resolution micro-ECoG (µECoG) on the sensorimotor cortex to link neural signals to finger movements across several context-specific motor tasks.

METHODS: Three neurosurgical patients with cortical lesions over eloquent regions participated. During awake craniotomy, a sensorimotor cortex area of hand movement was localized by high-frequency responses measured by an 8 × 8 µECoG grid with 3-mm interelectrode spacing. Patients performed a flexion movement of the thumb or index finger, or a pinch movement of both, based on a visual cue. High-gamma (HG; 70–230 Hz) filtered µECoG was used to identify dominant electrodes associated with thumb and index movement. Hand movements were recorded by a dataglove simultaneously with µECoG recording.

RESULTS: In all 3 patients, the electrodes controlling thumb and index finger movements were identifiable approximately 3–6 mm apart by the HG-filtered µECoG signal. For HG power of cortical activation measured with µECoG, the thumb and index signals in the pinch movement were similar to those observed during thumb-only and index-only movement, respectively (all p > 0.05). Index finger movements, measured by the dataglove joint angles, were similar in both the index-only and pinch movements (p > 0.05). However, despite similar activation across the conditions, markedly decreased thumb movement was observed in pinch relative to independent thumb-only movement (all p < 0.05).

CONCLUSIONS: HG-filtered µECoG signals effectively identify dominant regions associated with thumb and index finger movement. For pinch, the µECoG signal comprises a combination of the signals from individual thumb and index movements. However, while the relationship between the index finger joint angle and the HG-filtered signal remains consistent between conditions, there is no fixed relationship for thumb movement. Although the HG-filtered µECoG signal is similar in the thumb-only and pinch conditions, the actual thumb movement is markedly smaller in the pinch condition than in the thumb-only condition. This implies a nonlinear relationship between the cortical signal and the motor output for some, but importantly not all, movement types. This analysis provides insight into the tuning of the motor cortex toward specific types of motor behaviors.
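For readers who want to see what HG filtering involves in practice, here is a minimal sketch that band-passes a signal to 70–230 Hz with a zero-phase Butterworth filter and takes the Hilbert envelope as instantaneous power. The filter order and 1 kHz sampling rate are assumptions, not details from the study.

```python
# A minimal sketch of extracting a high-gamma (70-230 Hz) power envelope
# from a single channel using a zero-phase band-pass and Hilbert envelope.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def high_gamma_power(sig: np.ndarray, fs: float,
                     band=(70.0, 230.0), order=4) -> np.ndarray:
    """Band-pass to high gamma, then return the analytic power envelope."""
    b, a = butter(order, [band[0] / (fs / 2), band[1] / (fs / 2)],
                  btype="band")
    filtered = filtfilt(b, a, sig)              # zero-phase filtering
    return np.abs(hilbert(filtered)) ** 2       # instantaneous power

fs = 1000.0                                     # assumed sampling rate, Hz
t = np.arange(0, 2, 1 / fs)
sig = np.sin(2 * np.pi * 100 * t) + 0.3 * np.random.randn(t.size)
hg = high_gamma_power(sig, fs)
```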


2021, Vol 11 (1)
Authors: Andrea Cavallo, Luca Romeo, Caterina Ansuini, Francesca Battaglia, Lino Nobili, et al.

Failure to develop prospective motor control has been proposed to be a core phenotypic marker of autism spectrum disorders (ASD). However, whether genuine differences in prospective motor control permit discriminating between ASD and non-ASD profiles over and above individual differences in motor output remains unclear. Here, we combined high-precision measures of hand movement kinematics and rigorous machine learning analyses to determine the true power of prospective movement data to differentiate between children with autism and typically developing children. Our results show that while movement is unique to each individual, variations in the kinematic patterning of sequential grasping movements genuinely differentiate children with autism from typically developing children. These findings provide quantitative evidence for a prospective motor control impairment in autism and indicate the potential to draw inferences about autism on the basis of movement kinematics.
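As an illustration of the general analysis pattern (not the authors' pipeline), the sketch below runs cross-validated classification of kinematic feature vectors into two groups. The scikit-learn SVM, feature dimensionality, and synthetic data are all assumptions.

```python
# A minimal sketch of cross-validated group classification from
# kinematic feature vectors.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 16))        # 120 children x 16 kinematic features
y = rng.integers(0, 2, size=120)      # group labels (toy, uninformative)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```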


Electronics
2021, Vol 10 (9), p. 1051
Authors: Si Jung Kim, Teemu H. Laine, Hae Jung Suk

Presence refers to the emotional state of users in which their motivation for thinking and acting arises from the perception of entities in a virtual world. Users' immersion level can vary as they interact with different media content, which may result in different levels of presence, especially in a virtual reality (VR) environment. This study investigates how user characteristics, such as gender, immersion level, and emotional valence in VR, are related to three elements of presence effects (attention, enjoyment, and memory). A VR story was created and used as an immersive stimulus in an experiment, presented through a head-mounted display (HMD) equipped with an eye tracker that collected the participants' eye gaze data during the experiment. A total of 53 university students (26 females, 27 males), aged 20 to 29 years (mean 23.8), participated in the experiment. A set of pre- and post-questionnaires served as a subjective measure to support the evidence of relationships among the presence effects and user characteristics. The results showed that user characteristics such as gender, immersion level, and emotional valence affected the level of presence; however, there was no evidence that attention is associated with enjoyment or memory.
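As one example of a gaze-derived attention measure such a study could compute from HMD eye-tracking data, the sketch below reports the fraction of gaze samples dwelling inside a region of interest. The ROI representation and sample format are assumptions.

```python
# A minimal sketch of a dwell-time attention measure from gaze samples.
import numpy as np

def dwell_fraction(gaze_xy: np.ndarray, roi: tuple) -> float:
    """Share of gaze samples falling inside an (x0, y0, x1, y1) ROI."""
    x0, y0, x1, y1 = roi
    inside = ((gaze_xy[:, 0] >= x0) & (gaze_xy[:, 0] <= x1) &
              (gaze_xy[:, 1] >= y0) & (gaze_xy[:, 1] <= y1))
    return float(inside.mean())

gaze = np.random.default_rng(2).uniform(0, 1, size=(1000, 2))
print(dwell_fraction(gaze, roi=(0.4, 0.4, 0.6, 0.6)))   # ~0.04 expected
```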


1979, Vol 48 (1), pp. 207-214
Authors: Luis R. Marcos

Sixteen subordinate bilingual subjects produced 5-min. monologues in their nondominant languages, i.e., English or Spanish. Hand-movement activity manifested during the videotaped monologues was scored and related to measures of fluency in the nondominant language. The hand-movement behavior categorized as Groping Movement was significantly related to all of the nondominant-language fluency measures. These correlations support the assumption that Groping Movement may have a function in the process of verbal encoding. The results are discussed in terms of the possibility of monitoring central cognitive processes through the study of “visible” motor behavior.

