Exploring the Dimensions of Haptic Feedback Support in Manual Control

Author(s):  
D. A. Abbink ◽  
M. Mulder

A promising way to support operators in a manual control task is to provide them with guiding feedback forces on the control device (e.g., the steering wheel). These additional forces suggest a safe course of action, which operators can follow or overrule. This paper explores the idea that the feedback forces can be designed to depend not only on a calculated error (i.e., force feedback) but also on the control device position (i.e., stiffness feedback). First, the fundamental properties of force and stiffness feedback are explained, and important parameters for designing beneficial haptic feedback are discussed. Then, in an experiment, unassisted control of a second-order system (perturbed by a multisine disturbance) is compared with the same control task supported by four haptic feedback systems: weak and strong force feedback, each with and without additional stiffness feedback. Time- and frequency-domain analyses are used to understand the changes in human control behavior. The experimental results indicate that, when well designed, stiffness feedback can improve error-rejection performance at the same level of control activity as unassisted control. The findings may aid the design of haptic feedback systems for automotive and aerospace applications, where human attention is still required in a visually overloaded environment.
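A minimal sketch of the two guidance terms distinguished above, assuming a simple linear form: a force-feedback term driven by the calculated error and a stiffness-feedback term driven by the control-device position. The gains and the neutral position are illustrative placeholders, not values from the paper.

```python
def haptic_guidance_torque(error, device_pos, k_force=1.0, k_stiff=0.5, neutral_pos=0.0):
    """Guidance torque on the control device (illustrative form only).

    error      : calculated task error (e.g., predicted path deviation)
    device_pos : current control-device position (e.g., steering-wheel angle)
    """
    force_term = -k_force * error                            # force feedback: depends on the error
    stiffness_term = -k_stiff * (device_pos - neutral_pos)   # stiffness feedback: depends on position
    return force_term + stiffness_term
```

The operator feels the sum of both terms on the control device and can either comply with it or overrule it by applying a larger torque.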

2005 ◽  
Vol 128 (2) ◽  
pp. 216-226 ◽  
Author(s):  
M. A. Vitrani ◽  
J. Nikitczuk ◽  
G. Morel ◽  
C. Mavroidis ◽  
B. Weinberg

Force-feedback mechanisms have been designed to simplify and enhance the human-vehicle interface. The increase in secondary controls within vehicle cockpits has created a need for a simpler, more efficient human-vehicle interface. By consolidating various controls into a single haptic feedback control device, information can be transmitted to the operator without requiring the driver's visual attention. In this paper, experimental closed-loop torque control of electro-rheological fluid (ERF)-based resistive actuators for haptic applications is performed. ERFs are liquids that respond mechanically to electric fields by electroactively changing their properties, such as viscosity and shear stress. Using the electrically controlled rheological properties of ERFs, we developed resistive actuators for haptic devices that can resist human operator forces in a controlled and tunable fashion. In this study, the analytical model of the ERF resistive actuator is derived and experimentally verified, and accurate closed-loop torque control is achieved using a nonlinear proportional-integral (PI) controller with a feedforward loop.
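As a rough illustration of the control structure named above (a PI correction combined with a feedforward term), the sketch below shows a discrete-time PI-plus-feedforward torque loop. The gains, sample time, and feedforward map are placeholder assumptions; the paper's controller additionally uses a nonlinear PI law and an identified ERF actuator model.

```python
class PIWithFeedforward:
    """Discrete-time PI controller with a feedforward term (illustrative sketch)."""

    def __init__(self, kp, ki, dt, feedforward):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.feedforward = feedforward   # maps desired torque -> nominal actuator command
        self.integral = 0.0

    def update(self, torque_desired, torque_measured):
        error = torque_desired - torque_measured
        self.integral += error * self.dt
        # feedforward supplies the bulk of the command; the PI term corrects the residual error
        return self.feedforward(torque_desired) + self.kp * error + self.ki * self.integral

# placeholder inverse actuator model: command roughly proportional to desired torque
controller = PIWithFeedforward(kp=2.0, ki=10.0, dt=0.001, feedforward=lambda tau: 0.8 * tau)
```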


2011 ◽  
Author(s):  
Yukio Horiguchi ◽  
Keisuke Yasuda ◽  
Hiroaki Nakanishi ◽  
Tetsuo Sawaragi

Machines ◽  
2020 ◽  
Vol 8 (3) ◽  
pp. 47 ◽  
Author(s):  
Luca Salvati ◽  
Matteo d’Amore ◽  
Anita Fiorentino ◽  
Arcangelo Pellegrino ◽  
Pasquale Sena ◽  
...  

In recent years, driving simulators have been widely used by automotive manufacturers and researchers in human-in-the-loop experiments, because they reduce time and prototyping costs and provide unlimited parametrization, greater safety, and higher repeatability. Simulators play an important role in studies of driver behaviour in operating conditions or with unstable vehicles. The aim of this research is to study the effects that the force feedback (f.f.b.) provided to the steering wheel by a lane-keeping-assist (LKA) system has on a driver's response in simulators. The steering force feedback system is tested by reproducing critical conditions for the LKA system, in order to minimize the distance required to recover driving stability as a function of the set f.f.b. intensity and speed. The results, obtained in three specific critical conditions, show that the behaviour of the LKA system, as reproduced in the simulator, is not immediately understood by the driver and is sometimes in opposition to the interventions performed by the driver to ensure driving safety. The results also compare the performance of the subjects, both overall and classified into subgroups, with reference to their perception of the LKA system, evaluated by means of a questionnaire. The proposed experimental methodology is intended as a contribution toward integrating acceptance tests into the evaluation of automation systems.
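For orientation only, the sketch below shows one common way a lane-keeping torque can be shaped, as a saturated function of lateral deviation; the actual f.f.b. law, gain, and saturation limit used in the study are not specified in the abstract, and the numbers here are placeholders.

```python
def lka_feedback_torque(lateral_error_m, gain_nm_per_m=3.0, max_torque_nm=2.5):
    """Corrective steering torque growing with lateral deviation, clipped to a maximum."""
    torque = gain_nm_per_m * lateral_error_m
    return max(-max_torque_nm, min(max_torque_nm, torque))
```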


2021 ◽  
Vol 9 (2) ◽  
pp. 142-150
Author(s):  
Ivan Guschin ◽  
Anton Leschinskiy ◽  
Andrey Zhukov ◽  
Alexander Zarukin ◽  
Vyacheslav Kiryukhin ◽  
...  

The results of the development of a radiation-tolerant robotic complex, URS-2, for operation in hot cells at nuclear enterprises are presented. The robotic complex consists of several original components: a robotic arm, a control device with force feedback, a control panel with hardware buttons and a touch screen, a control computer with system and application software, and a control-and-power cabinet. The robotic manipulator has 6 degrees of freedom and replaceable pneumatic grippers, and is characterized by high radiation tolerance, similar to that of mechanical master-slave manipulators. The original design of the control device, based on a delta-robot model, which implements a copying mode of manual control of the robotic complex with force feedback, is presented. The hardware and software solutions developed have made it possible to create a virtual simulator of the robotic complex for testing innovative methods of remote robot control, as well as for training operators to perform technological tasks in hot cells. The experimental model of the robotic complex has demonstrated the ability to perform basic technological tasks in a demonstration hot cell, in both manual and automatic modes.


2000 ◽  
Author(s):  
Michael L. Turner ◽  
Ryan P. Findley ◽  
Weston B. Griffin ◽  
Mark R. Cutkosky ◽  
Daniel H. Gomez

This paper describes the development of a system for dexterous telemanipulation and presents the results of tests involving simple manipulation tasks. The user wears an instrumented glove augmented with an arm-grounded haptic feedback apparatus. A linkage attached to the user's wrist measures gross motions of the arm. The movements of the user are transferred to a two-fingered dexterous robot hand mounted on the end of a 4-DOF industrial robot arm. Forces measured at the robot fingers can be transmitted back to the user via the haptic feedback apparatus. The results obtained in block-stacking and object-rolling experiments indicate that the addition of force feedback did not improve the speed of task execution. In fact, in some cases the presence of incomplete force information was detrimental to performance speed compared with no force information. There are indications that the presence of force feedback did aid task learning.
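A schematic of the force-reflection path described above, in which forces measured at the robot fingers are scaled and displayed at the user's hand; the scaling factor and data layout are illustrative assumptions, not the authors' implementation.

```python
FORCE_SCALE = 1.0  # robot fingertip force -> force displayed by the arm-grounded apparatus

def reflect_finger_forces(finger_forces_n, scale=FORCE_SCALE):
    """Map forces measured at the robot fingers to per-finger haptic-display commands."""
    return [scale * f for f in finger_forces_n]

# e.g., 2.0 N and 1.5 N measured at the two fingers of the robot hand
display_forces = reflect_finger_forces([2.0, 1.5])
```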


2018 ◽  
Vol 120 (6) ◽  
pp. 3187-3197 ◽  
Author(s):  
Marissa J. Rosenberg ◽  
Raquel C. Galvan-Garza ◽  
Torin K. Clark ◽  
David P. Sherwood ◽  
Laurence R. Young ◽  
...  

Precise motion control is critical to human survival on Earth and in space. Motion sensation is inherently imprecise, and the functional implications of this imprecision are not well understood. We studied a “vestibular” manual control task in which subjects attempted to keep themselves upright with a rotational hand controller (i.e., joystick) to null out pseudorandom, roll-tilt motion disturbances of their chair in the dark. Our first objective was to study the relationship between intersubject differences in manual control performance and sensory precision, determined by measuring vestibular perceptual thresholds. Our second objective was to examine the influence of altered gravity on manual control performance. Subjects performed the manual control task while supine during short-radius centrifugation, with roll tilts occurring relative to centripetal accelerations of 0.5, 1.0, and 1.33 GC (1 GC = 9.81 m/s²). Roll-tilt vestibular precision was quantified with roll-tilt vestibular direction-recognition perceptual thresholds, the minimum movement that one can reliably distinguish as leftward vs. rightward. A significant intersubject correlation was found between manual control performance (defined as the standard deviation of chair tilt) and thresholds, consistent with sensory imprecision negatively affecting functional precision. Furthermore, compared with 1.0 GC, manual control was more precise in 1.33 GC (−18.3%, P = 0.005) and less precise in 0.5 GC (+39.6%, P < 0.001). The decrement in manual control performance observed in 0.5 GC and in subjects with high thresholds suggests potential risk factors for piloting and locomotion, both on Earth and during human exploration missions to the Moon (0.16 G) and Mars (0.38 G). NEW & NOTEWORTHY The functional implications of imprecise motion sensation are not well understood. We found a significant correlation between subjects’ vestibular perceptual thresholds and performance in a manual control task (using a joystick to keep their chair upright), consistent with sensory imprecision negatively affecting functional precision. Furthermore, using an altered-gravity centrifuge configuration, we found that manual control precision was improved in “hypergravity” and degraded in “hypogravity.” These results have potential relevance for postural control, aviation, and spaceflight.
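A small sketch of the two per-subject quantities related above: control performance taken as the standard deviation of recorded chair tilt, and its intersubject correlation with the vestibular direction-recognition threshold. The data below are made-up placeholders, and the use of a Pearson correlation is an assumption.

```python
import numpy as np
from scipy import stats

def control_performance(chair_tilt_deg):
    """Performance metric: standard deviation of chair roll tilt over a trial (lower is better)."""
    return float(np.std(chair_tilt_deg))

# one value per subject (hypothetical example data)
performance_sd_deg = np.array([2.1, 3.4, 1.8, 2.9])   # SD of chair tilt
threshold_deg      = np.array([0.8, 1.6, 0.6, 1.2])   # direction-recognition threshold

r, p = stats.pearsonr(threshold_deg, performance_sd_deg)  # intersubject correlation
```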


2018 ◽  
Vol 35 (2) ◽  
pp. 149-160 ◽  
Author(s):  
Mustufa H. Abidi ◽  
Abdulrahman M. Al-Ahmari ◽  
Ali Ahmad ◽  
Saber Darmoul ◽  
Wadea Ameen

The design and verification of assembly operations is essential for planning product production operations. Recently, virtual prototyping has witnessed tremendous progress and has reached a stage where current environments enable rich and multi-modal interaction between designers and models through stereoscopic visuals, surround sound, and haptic feedback. The benefits of building and using Virtual Reality (VR) models in assembly process verification are discussed in this paper. We present the virtual assembly (VA) of an aircraft turbine engine. The assembly parts and sequences are explained using a virtual reality design system. The system enables stereoscopic visuals, surround sound, and ample and intuitive interaction with the developed models. A special software architecture is suggested to describe the assembly parts and assembly sequence in VR. A collision detection mechanism is employed that provides visual feedback to check the interference between components. The system is tested for virtual prototyping and assembly sequencing of a turbine engine. We show that the developed system is comprehensive in terms of VR feedback mechanisms, which include visual, auditory, and tactile as well as force feedback. The system is shown to be effective and efficient for validating assembly design, part design, and operations planning.


Sensors ◽  
2021 ◽  
Vol 22 (1) ◽  
pp. 42
Author(s):  
Lichao Yang ◽  
Mahdi Babayi Semiromi ◽  
Yang Xing ◽  
Chen Lv ◽  
James Brighton ◽  
...  

In conditionally automated driving, the engagement in non-driving activities (NDAs) can be regarded as the main factor that affects the driver’s take-over performance, and its investigation is of great importance to the design of an intelligent human–machine interface for a safe and smooth control transition. This paper introduces a 3D convolutional neural network-based system to recognize six types of driver behaviour (four types of NDAs and two types of driving activities) from two video feeds capturing head and hand movement. Based on the interaction between driver and object, the selected NDAs are divided into an active mode and a passive mode. The proposed recognition system achieves 85.87% accuracy for the classification of the six activities. The impact of NDAs on the driver’s situation awareness and take-over quality is further investigated in terms of both activity type and interaction mode. The results show that, at a similar level of maximum lateral error, engagement in NDAs demands more time for drivers to accomplish the control transition, especially engagement in active-mode NDAs, which is more mentally demanding and reduces drivers’ sensitivity to changes in the driving situation. Moreover, the haptic feedback torque from the steering wheel could help to reduce the duration of the transition process and can be regarded as a productive assistance system for the take-over process.
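A minimal PyTorch sketch of a 3D convolutional classifier over short video clips, to illustrate the type of network described above; the layer sizes, single-stream layout, and clip dimensions are assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class Simple3DCNN(nn.Module):
    """Toy 3D-CNN that maps a video clip to logits over six activity classes."""

    def __init__(self, num_classes=6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, clip):                 # clip: (batch, channels, frames, height, width)
        x = self.features(clip).flatten(1)
        return self.classifier(x)

logits = Simple3DCNN()(torch.randn(1, 3, 16, 112, 112))  # one 16-frame RGB clip
```

In the paper's setup, two such video feeds (head and hand) drive the recognition system; how the streams are fused is not stated in the abstract.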


2019 ◽  
Vol 121 (4) ◽  
pp. 1398-1409 ◽  
Author(s):  
Vonne van Polanen ◽  
Robert Tibold ◽  
Atsuo Nuruki ◽  
Marco Davare

Lifting an object requires precise scaling of fingertip forces based on a prediction of object weight. At object contact, a series of tactile and visual events arise that need to be rapidly processed online to fine-tune the planned motor commands for lifting the object. The brain mechanisms underlying multisensory integration serially at transient sensorimotor events, a general feature of actions requiring hand-object interactions, are not yet understood. In this study we tested the relative weighting between haptic and visual signals when they are integrated online into the motor command. We used a new virtual reality setup to desynchronize visual feedback from haptics, which allowed us to probe the relative contribution of haptics and vision in driving participants’ movements when they grasped virtual objects simulated by two force-feedback robots. We found that visual delay changed the profile of fingertip force generation and led participants to perceive objects as heavier than when lifts were performed without visual delay. We further modeled the effect of vision on motor output by manipulating the extent to which delayed visual events could bias the force profile, which allowed us to determine the specific weighting the brain assigns to haptics and vision. Our results show for the first time how visuo-haptic integration is processed at discrete sensorimotor events for controlling object-lifting dynamics and further highlight the organization of multisensory signals online for controlling action and perception. NEW & NOTEWORTHY Dexterous hand movements require rapid integration of information from different senses, in particular touch and vision, at different key time points as movement unfolds. The relative weighting between vision and haptics for object manipulation is unknown. We used object lifting in virtual reality to desynchronize visual and haptic feedback and find out their relative weightings. Our findings shed light on how rapid multisensory integration is processed over a series of discrete sensorimotor control points.
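The relative weighting discussed above can be written, under the standard reliability-weighted cue-combination assumption, as a convex mixture of the haptic and visual estimates. This generic form is given for orientation only and is not necessarily the specific model fitted in the paper.

```latex
\hat{s} = w_h\,\hat{s}_{\mathrm{haptic}} + w_v\,\hat{s}_{\mathrm{visual}},
\qquad w_h + w_v = 1,
\qquad w_h = \frac{1/\sigma_h^{2}}{1/\sigma_h^{2} + 1/\sigma_v^{2}}
```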


Author(s):  
Jinling Wang ◽  
Wen F. Lu

Virtual reality technology plays an important role in the fields of product design, computer animation, medical simulation, cloth motion, and many others. Especially with the emergence of haptics technology, virtual simulation systems provide an intuitive way of human-computer interaction, allowing the user to feel and touch the virtual environment. A real-time simulation system requires a physically based deformable model that includes complex material properties at high resolution. However, such a deformable model hardly satisfies the update rate of interactive haptic rendering, which exceeds 1 kHz. To tackle this challenge, a real-time volumetric model with haptic feedback is developed in this paper. This model, named the Adaptive S-chain model, extends the S-chain model and integrates an energy-based wave propagation method through the proposed adaptive re-meshing method to achieve realistic graphic and haptic deformation results. The implementation results show that nonlinear, heterogeneous, anisotropic, shape-retaining material properties and large-range deformation are well modeled. In a case study, the proposed Adaptive S-chain model generates force feedback that is quite close to the experimental data.
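The 1 kHz requirement mentioned above translates into a budget of roughly 1 ms per force update. A minimal sketch of such a haptic servo loop is given below; the model and device objects are placeholders for the deformable model and the haptic interface.

```python
import time

HAPTIC_RATE_HZ = 1000
PERIOD_S = 1.0 / HAPTIC_RATE_HZ   # ~1 ms budget for each force computation

def haptic_loop(model, device, steps=10_000):
    """Fixed-rate loop: read probe position, compute deformation force, display it."""
    for _ in range(steps):
        t0 = time.perf_counter()
        position = device.read_position()       # probe/tool position from the haptic device
        force = model.compute_force(position)   # deformable-model response (must fit in ~1 ms)
        device.apply_force(force)
        remaining = PERIOD_S - (time.perf_counter() - t0)
        if remaining > 0:
            time.sleep(remaining)
```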

