A distributed virtual reality framework for Korea-Japan high-speed network test bed

Author(s):  
Hiroaki Nishino ◽  
Shinji Yamabiraki ◽  
Tsuneo Kagawa ◽  
Kouichi Utsumiya ◽  
Yong Moo Kwon ◽  
...  


Robotica ◽  
1992 ◽  
Vol 10 (5) ◽  
pp. 461-467 ◽  
Author(s):  
Robert Stone

SUMMARY The UK Advanced Robotics Research Centre's VERDEX Project (Virtual Environment Remote Driving EXperiment) is an experimental test bed for investigating telepresence and virtual reality technologies in the design of human-system interfaces for telerobots. The achievements of the Project to date include the transformation of scanning laser rangefinder output to stereo virtual imagery (viewed using the VPL EyePhoneTM), the Teletact® Tactile Feedback Glove (for use with the VPL DataGloveTM), a high-speed, head-slaved stereo TV system, and a T800/i860 SuperVisionTM graphics/video parallel processing system.


Queue ◽  
2021 ◽  
Vol 19 (1) ◽  
pp. 77-93
Author(s):  
Niklas Blum ◽  
Serge Lachapelle ◽  
Harald Alvestrand

In this time of pandemic, the world has turned to Internet-based RTC (real-time communication) as never before. The number of RTC products has exploded over the past decade, in large part because of cheaper high-speed network access and more powerful devices, but also because of an open, royalty-free platform called WebRTC. WebRTC is growing from enabling useful experiences to being essential in allowing billions to continue their work and education, and to keep vital human contact during a pandemic. The opportunities and impact that lie ahead for WebRTC are intriguing indeed.


1999 ◽  
Author(s):  
Yutaka Ando ◽  
Masayuki Kitamura ◽  
Nobuhiro Tsukamoto ◽  
Osamu Kawaguchi ◽  
Etsuo Kunieda ◽  
...  

2021 ◽  
pp. 1-18
Author(s):  
Sicong Liu ◽  
Jillian M. Clements ◽  
Elayna P. Kirsch ◽  
Hrishikesh M. Rao ◽  
David J. Zielinski ◽  
...  

Abstract The fusion of immersive virtual reality, kinematic movement tracking, and EEG offers a powerful test bed for naturalistic neuroscience research. Here, we combined these elements to investigate the neuro-behavioral mechanisms underlying precision visual–motor control as 20 participants completed a three-visit, visual–motor, coincidence-anticipation task, modeled after Olympic trap shooting and performed in immersive and interactive virtual reality. Analyses of the kinematic metrics demonstrated learning of more efficient movements, with significantly faster hand reaction times, earlier trigger response times, and higher spatial precision, leading to an average 13% improvement in shot scores across the visits. As revealed through spectral and time-locked analyses of the EEG beta band (13–30 Hz), power measured prior to target launch and visual-evoked potential amplitudes measured immediately after target launch correlate with subsequent reactive kinematic performance in the shooting task. Moreover, both launch-locked and shot/feedback-locked visual-evoked potentials became earlier and more negative with practice, pointing to neural mechanisms that may contribute to the development of visual–motor proficiency. Collectively, these findings illustrate EEG and kinematic biomarkers of precision motor control and changes in the neurophysiological substrates that may underlie motor learning.
