Locating arbitrary noise sources in three‐dimensional space in real time.

2009 ◽  
Vol 125 (4) ◽  
pp. 2567-2567
Author(s):  
Na Zhu ◽  
Sean Wu

2014 ◽  
Author(s):  
Assaf Levanon ◽  
Yitzhak Yitzhaky ◽  
Natan S. Kopeika ◽  
Daniel Rozban ◽  
Amir Abramovich

2010 ◽  
Vol 104 (5) ◽  
pp. 2654-2666 ◽  
Author(s):  
Gregory A. Apker ◽  
Timothy K. Darling ◽  
Christopher A. Buneo

Reaching movements are subject to noise in both the planning and execution phases of movement production. How these noise sources interact during natural movements is not well understood, despite its importance for understanding movement variability in neurologically intact and impaired individuals. Here we examined the interaction of planning- and execution-related noise during the production of unconstrained reaching movements. Subjects performed sequences of two movements to targets arranged in three vertical planes separated in depth. The starting position for each sequence was also varied in depth with the target plane; thus, the required movement sequences were largely contained within the vertical plane of the targets. Each final target in a sequence was approached from two different directions, and these movements were made with or without visual feedback of the moving hand. Together, these aspects of the design allowed us to probe the interaction of execution- and planning-related noise with respect to reach endpoint variability. In agreement with previous studies, we found that reach endpoint distributions were highly anisotropic. The principal axes of movement variability were largely aligned with the depth axis, i.e., the axis along which visual planning-related noise would be expected to dominate, and were not generally well aligned with the direction of the movement vector. Our results suggest that visual planning-related noise plays a dominant role in determining anisotropic patterns of endpoint variability in three-dimensional space, with execution noise adding to this variability in a movement direction-dependent manner.
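
The analysis described above hinges on fitting a covariance ellipsoid to the reach endpoints and comparing its principal axis with the depth axis. The sketch below shows one plausible way to compute those quantities; the array layout, the z-as-depth convention, and the synthetic data are assumptions for illustration, not the authors' actual pipeline.

```python
import numpy as np

def endpoint_anisotropy(endpoints, depth_axis=np.array([0.0, 0.0, 1.0])):
    """endpoints: (n_trials, 3) array of reach endpoints; z is taken as depth."""
    cov = np.cov(endpoints, rowvar=False)        # 3x3 endpoint covariance
    evals, evecs = np.linalg.eigh(cov)           # eigenvalues in ascending order
    principal = evecs[:, -1]                     # axis of greatest variability
    # Angle between the principal axis and the depth axis (0 deg = parallel)
    cos_t = np.clip(abs(principal @ depth_axis), 0.0, 1.0)
    angle_deg = np.degrees(np.arccos(cos_t))
    aspect = np.sqrt(evals[-1] / evals[0])       # elongation of the ellipsoid
    return principal, angle_deg, aspect

# Synthetic endpoints that are noisier in depth, mimicking the reported pattern:
rng = np.random.default_rng(0)
pts = rng.normal(scale=[0.5, 0.5, 2.0], size=(200, 3))
axis, angle, aspect = endpoint_anisotropy(pts)
print(f"principal axis {axis}, {angle:.1f} deg from depth, aspect {aspect:.2f}")
```

A strongly anisotropic distribution aligned with depth would show a small angle and a large aspect ratio, which is the signature the abstract reports.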


2012 ◽  
Vol 107 (1) ◽  
pp. 90-102 ◽  
Author(s):  
Gregory A. Apker ◽  
Christopher A. Buneo

Reaching movements are subject to noise associated with planning and execution, but precisely how these noise sources interact to determine patterns of endpoint variability in three-dimensional space is not well understood. For frontal plane movements, variability is largest along the depth axis (the axis along which visual planning noise is greatest), with execution noise contributing to this variability along the movement direction. Here we tested whether these noise sources interact in a similar way for movements directed in depth. Subjects performed sequences of two movements from a single starting position to targets that were either both contained within a frontal plane (“frontal sequences”) or in which the first movement was within the frontal plane and the second was directed in depth (“depth sequences”). For both sequence types, movements were performed with or without visual feedback of the hand. When visual feedback was available, endpoint distributions for frontal and depth sequences were generally anisotropic, with the principal axes of variability strongly aligned with the depth axis. Without visual feedback, endpoint distributions for frontal sequences were relatively isotropic and movement direction dependent, whereas those for depth sequences were similar to those with visual feedback. Overall, the results suggest that in the presence of visual feedback, endpoint variability is dominated by uncertainty associated with planning and updating visually guided movements. In addition, the results suggest that without visual feedback, increased uncertainty in hand position estimation effectively unmasks the effect of execution-related noise, resulting in patterns of endpoint variability that are highly movement direction dependent.
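
The contrast drawn above, depth-aligned variability with vision versus movement-direction-dependent variability without it, can be quantified by comparing each condition's principal axis of endpoint variability against both reference directions. The following sketch illustrates that comparison on synthetic data; the condition labels, noise magnitudes, and movement direction are invented for the example.

```python
import numpy as np

def principal_axis(endpoints):
    """Direction of greatest endpoint variability (largest-eigenvalue vector)."""
    evals, evecs = np.linalg.eigh(np.cov(endpoints, rowvar=False))
    return evecs[:, -1]

def angle_deg(u, v):
    """Unsigned angle between two directions, ignoring sign flips."""
    c = abs(u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(c, 0.0, 1.0)))

rng = np.random.default_rng(1)
depth = np.array([0.0, 0.0, 1.0])
move_dir = np.array([1.0, 0.0, 1.0]) / np.sqrt(2.0)  # a movement directed in depth

# Fabricated endpoint clouds: depth-aligned noise "with vision",
# movement-direction-aligned noise "without vision".
conditions = {
    "with vision":    rng.normal(scale=[0.4, 0.4, 1.5], size=(150, 3)),
    "without vision": rng.normal(scale=0.4, size=(150, 3))
                      + np.outer(rng.normal(scale=1.5, size=150), move_dir),
}
for name, pts in conditions.items():
    p = principal_axis(pts)
    print(f"{name}: {angle_deg(p, depth):5.1f} deg from depth axis, "
          f"{angle_deg(p, move_dir):5.1f} deg from movement direction")
```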


2008 ◽  
Vol 5 (27) ◽  
pp. 1181-1191 ◽  
Author(s):  
Dhruv Grover ◽  
John Tower ◽  
Simon Tavaré

This paper presents the design of a real-time image acquisition system for tracking the movement of Drosophila in three-dimensional space. The system uses three calibrated and synchronized cameras to detect multiple flies and integrates the detected fly silhouettes to construct a three-dimensional visual hull model of each fly. We used an extended Kalman filter to estimate the state of each fly, given past positions from the reconstructed visual hulls. The results show that our approach constructs the three-dimensional visual hull of each fly from the detected image silhouettes and robustly tracks the flies at real-time rates. The system is suitable for more detailed analyses of fly behaviour.
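
As a rough illustration of the tracking stage described above, the sketch below runs a constant-velocity Kalman filter on per-frame 3-D centroids such as a visual hull reconstruction might produce. With a linear motion model the extended Kalman filter reduces to the ordinary Kalman filter, and the frame rate and noise covariances here are assumptions, not values from the paper.

```python
import numpy as np

dt = 1.0 / 30.0                                  # assumed camera frame rate
F = np.eye(6); F[:3, 3:] = dt * np.eye(3)        # state: [x y z vx vy vz]
H = np.hstack([np.eye(3), np.zeros((3, 3))])     # we observe position only
Q = 1e-3 * np.eye(6)                             # process noise (assumed)
R = 1e-2 * np.eye(3)                             # measurement noise (assumed)

def kf_step(x, P, z):
    """One predict/update cycle; z is the fly's visual-hull centroid this frame."""
    x, P = F @ x, F @ P @ F.T + Q                # predict under constant velocity
    S = H @ P @ H.T + R                          # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    x = x + K @ (z - H @ x)                      # correct with the measurement
    P = (np.eye(6) - K @ H) @ P
    return x, P

# Track one fly through a few synthetic frames:
x, P = np.zeros(6), np.eye(6)
for t in range(5):
    z = np.array([0.1 * t, 0.0, 0.05 * t])       # fake centroid measurements
    x, P = kf_step(x, P, z)
print("estimated position:", x[:3], "velocity:", x[3:])
```

In a multi-fly setting, one such filter would be run per fly, with measurements assigned to filters by a data-association step that the paper's tracker handles and this sketch omits.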

