Implementation of a Real-Time Telerobotic System for Generating an Assistive Force Feedback for Rehabilitation Applications

Author(s):  
Eduardo Veras ◽  
Karan Khokar ◽  
Kathryn De Laurentis ◽  
Rajiv Dubey

In this paper we describe the implementation of a system that provides assistive force feedback to a remote user in a teleoperation-based environment. The force feedback helps the user perform trajectory-following exercises for stroke rehabilitation. Exercises can be performed either in virtual environments on a PC or in the real world with a remote robotic arm that follows a trajectory in real time as the user moves the master device. Such real-world exercises, augmented with real-time force feedback, can be more effective because the user receives force assistance along the desired path. Moreover, because the system is teleoperated and uses Internet-based protocols, it can be applied to remote therapy, where the therapist is away from the user. The assistive force feedback is implemented using simple sensors, namely a camera and a laser, together with a PC-based real-time multithreaded control system. Real-time force feedback from the remote robot to the master device is achieved through effective multithreading strategies in the control-system design and novel sensor integration. The system also supports autonomous and supervisory control of the remote robot, and it is modular with respect to integrating different master devices for stroke rehabilitation exercises.
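The abstract does not give the control law, but assistance along a desired path is commonly realized as a virtual spring pulling the master device toward the nearest point on the trajectory. A minimal sketch of that idea, with a hypothetical `assistive_force` helper and the path approximated by waypoints (both assumptions, not the authors' implementation):

```python
import math

def assistive_force(position, path, gain=2.0):
    """Compute a spring-like force pulling the master device toward
    the desired trajectory.

    position: (x, y) current end-effector position
    path: list of (x, y) waypoints approximating the desired trajectory
    gain: virtual spring stiffness
    Returns an (fx, fy) force proportional to the deviation from the path.
    """
    # Find the waypoint closest to the current position.
    nearest = min(path, key=lambda p: math.dist(p, position))
    # Restoring force toward the path; a full controller would add
    # damping or impedance terms and run this in a real-time thread.
    return (gain * (nearest[0] - position[0]),
            gain * (nearest[1] - position[1]))
```

In a multithreaded control system such as the one described, a loop like this would run in its own high-rate thread, reading sensor data and writing forces to the haptic master device.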

2021 ◽  
Author(s):  
Stephen Thompson

<p>This thesis presents a novel system for enabling remote collaboration within a mixed reality environment. With the growing availability of virtual and augmented reality headsets, interest in improving remote collaboration has increased. Systems have been proposed that use 3D geometry or 360° video to provide remotely collaborating users with a view of the local, real-world environment. However, many systems support only limited interaction in the local environment and couple the views of all users rather than simulating face-to-face interaction, or they place the remote user in a virtual environment, losing visual realism. The presented system enables a user in a remote location to join a local user and collaborate on a task. Video from an omni-directional camera is streamed to the remote user in real time to provide a live view of the local space. The 360° video is also used to provide believable lighting when compositing virtual objects into the real world. Remote users are displayed to local users as an abstracted avatar that conveys basic body gestures and social presence, and voice chat is provided for verbal communication. The system has been evaluated for technical performance and user experience. The evaluation found that the performance of the system was suitable for real-time collaboration. Remote and local users reported similar satisfaction with the system, experiencing high levels of presence, social presence, and tele-presence. Shared cinematic and remote presentations are suggested as possible applications to guide further development of the system.</p>
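The thesis does not detail its lighting method, but using a 360° frame to light composited virtual objects is typically a form of image-based lighting. A minimal sketch of the simplest variant, estimating an ambient light colour by averaging an equirectangular frame (the `ambient_from_360` helper and the nested-list frame format are illustrative assumptions, not the system's actual pipeline):

```python
def ambient_from_360(frame):
    """Estimate an ambient light colour for virtual objects by
    averaging all pixels of an equirectangular 360-degree frame.

    frame: rows of (r, g, b) tuples with channel values in 0..255
    Returns the mean (r, g, b) as floats.
    """
    n = 0
    totals = [0, 0, 0]
    for row in frame:
        for r, g, b in row:
            totals[0] += r
            totals[1] += g
            totals[2] += b
            n += 1
    # A production renderer would instead sample directional radiance
    # (e.g. spherical harmonics or an environment map), but a mean
    # colour already ties virtual objects to the real scene's lighting.
    return tuple(t / n for t in totals)
```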



2018 ◽  
Author(s):  
Kyle Plunkett

This manuscript provides two demonstrations of how Augmented Reality (AR), the projection of virtual information onto a real-world object, can be applied in the classroom and in the laboratory. Using only a smartphone and the free HP Reveal app, content-rich AR notecards were prepared. The physical notecards are based on Organic Chemistry I reactions and show only a reagent and substrate. Upon interacting with the HP Reveal app, an AR video projection shows the product of the reaction as well as a real-time, hand-drawn curved-arrow mechanism of how the product is formed. Thirty AR notecards based on common Organic Chemistry I reactions and mechanisms are provided in the Supporting Information and are available for widespread use. In addition, the HP Reveal app was used to create AR video projections onto laboratory instrumentation so that a virtual expert can guide the user during equipment setup and operation.

