An admittance-controlled wheeled mobile manipulator for mobility assistance: Human–robot interaction estimation and redundancy resolution for enhanced force exertion ability

Mechatronics, 2021, Vol. 74, p. 102497
Author(s): Hongjun Xing, Ali Torabi, Liang Ding, Haibo Gao, Zongquan Deng, ...

2021
Author(s): Sheila Sutjipto, Jon Woolfrey, Marc G. Carmichael, Gavin Paul

Author(s): David A. Lopez, Jared A. Frank, Vikram Kapila

As mobile robots become increasingly commercialized, the development of intuitive interfaces for human–robot interaction is paramount to promoting the pervasive adoption of such robots in society. Although smart devices can be useful for operating robots, prior research has not fully investigated which interaction elements (e.g., touch, gestures, or built-in sensors) render an effective human–robot interface. This paper provides an overview of a mobile manipulator and a tablet-based application for operating it. In particular, the mobile manipulator is designed to navigate an obstacle course and to pick and place objects around the course, all under the control of a human operator using the tablet-based application. The application provides the user with live video captured and streamed by a camera onboard the robot and by an overhead camera. To remotely operate the mobile manipulator, the application also offers the user a menu of four interface elements: virtual buttons, virtual joysticks, touchscreen gestures, and tilting the device. To evaluate the intuitiveness of the four interface elements, a user study is conducted in which participants' performance is monitored as they operate the mobile manipulator using the designed interfaces. The analysis of the user study shows that the tablet-based application allows even inexperienced users to operate the mobile manipulator without extensive training.
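The abstract does not specify how the tilt-based interface element converts device orientation into robot motion. A common approach is to map the tablet's pitch and roll angles to linear and angular velocity commands, with a dead zone to ignore small unintentional tilts and saturation at a maximum tilt. The sketch below is purely illustrative: the function names, angle thresholds, and velocity limits are assumptions, not details from the paper.

```python
import math

# Assumed limits for the sketch; the paper does not report its control law.
MAX_LINEAR = 0.5              # m/s, assumed top driving speed
MAX_ANGULAR = 1.0             # rad/s, assumed top turning rate
DEAD_ZONE = math.radians(5)   # tilts smaller than this are ignored
FULL_TILT = math.radians(30)  # tilt angle that commands full speed

def _scale(angle: float) -> float:
    """Map a tilt angle (radians) to [-1, 1] with dead zone and saturation."""
    if abs(angle) < DEAD_ZONE:
        return 0.0
    sign = 1.0 if angle > 0 else -1.0
    magnitude = min((abs(angle) - DEAD_ZONE) / (FULL_TILT - DEAD_ZONE), 1.0)
    return sign * magnitude

def tilt_to_velocity(pitch: float, roll: float) -> tuple[float, float]:
    """Convert device pitch/roll (radians) into (linear, angular) velocity.

    Tilting forward/back drives the base; tilting left/right turns it.
    """
    linear = _scale(pitch) * MAX_LINEAR
    angular = -_scale(roll) * MAX_ANGULAR  # sign flipped so right tilt turns right
    return linear, angular
```

A dead zone of this kind is a standard choice for tilt interfaces, since holding a tablet perfectly level is difficult and small drifts would otherwise make the robot creep.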


2014
Author(s): Mitchell S. Dunfee, Tracy Sanders, Peter A. Hancock

Author(s): Rosemarie Yagoda, Michael D. Coovert

2009
Author(s): Matthew S. Prewett, Kristin N. Saboe, Ryan C. Johnson, Michael D. Coovert, Linda R. Elliott

2010
Author(s): Eleanore Edson, Judith Lytle, Thomas McKenna

2020
Author(s): Agnieszka Wykowska, Jairo Pérez-Osorio, Stefan Kopp

This booklet is a collection of the position statements accepted for the HRI'20 conference workshop "Social Cognition for HRI: Exploring the relationship between mindreading and social attunement in human-robot interaction" (Wykowska, Perez-Osorio & Kopp, 2020). Unfortunately, due to the rapid spread of the novel coronavirus at the beginning of the year, the conference, and consequently our workshop, were canceled. In light of these events, we decided to put together the position statements accepted for the workshop. The contributions collected in these pages highlight the role of attributing mental states to artificial agents in human-robot interaction, and specifically the quality and presence of the social attunement mechanisms that are known to make human interaction smooth, efficient, and robust. These papers also accentuate the importance of a multidisciplinary approach to advancing our understanding of the factors shaping, and the consequences of, social interactions with artificial agents.

