Improving HRI with Force Sensing

Machines, 2021, Vol 10 (1), pp. 15
Author(s): Akiyoshi Hayashi, Liz Katherine Rincon-Ardila, Gentiane Venture

Human–robot interaction (HRI) will be an important field of research in a future society where robots and humans live together. While most HRI studies focus on appearance and dialogue, touch communication has received little attention despite the important role it plays in human–human communication. This paper investigates how and where humans touch an inorganic, non-zoomorphic robot arm. Based on these results, we install touch sensors on the robot arm and conduct experiments to collect data on users' impressions of the robot when touching it. Our results suggest two main things. First, the touch gestures collected with the two sensors can be analyzed using machine learning to classify the gestures. Second, communication between humans and robots using touch can improve the user's impression of the robots.
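
The abstract does not specify the features or the classifier used, but a minimal sketch of the gesture-classification step, assuming windowed readings from the two touch sensors and an off-the-shelf random forest (all feature choices and labels below are illustrative, not the authors'), could look like this:

```python
# Hypothetical sketch: classifying touch gestures (e.g. pat, stroke, grab)
# from windowed readings of two touch sensors. The paper does not state the
# features or classifier; a random forest is used purely for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def extract_features(window):
    """window: (n_samples, 2) array of readings from the two touch sensors."""
    return np.hstack([
        window.mean(axis=0),                           # mean contact intensity per sensor
        window.std(axis=0),                            # variability (stroking vs. static touch)
        window.max(axis=0),                            # peak intensity (e.g. a firm pat)
        np.abs(np.diff(window, axis=0)).mean(axis=0),  # average rate of change
    ])

# Placeholder data standing in for the labelled touch recordings from the study.
rng = np.random.default_rng(0)
X_raw = [rng.random((100, 2)) for _ in range(60)]      # 60 sensor windows
y = rng.choice(["pat", "stroke", "grab"], size=60)     # hypothetical gesture labels

X = np.array([extract_features(w) for w in X_raw])
clf = RandomForestClassifier(n_estimators=100, random_state=0)
print("cross-validated gesture accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```

In practice the placeholder arrays would be replaced by the labelled touch recordings collected in the experiments.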

2020, Vol 10 (22), pp. 7992
Author(s): Jinseok Woo, Yasuhiro Ohyama, Naoyuki Kubota

This paper presents a robot partner development platform based on smart devices. Humans communicate with others based on the basic motivations of human cooperation and have communicative motives grounded in social attributes. Understanding and applying these communicative motives is important in the development of socially embedded robot partners, so it is becoming more important to develop robots that can be adapted to different needs while taking these elements of human communication into consideration. The role of a robot partner matters not only in the industrial sector but also in households; however, it seems that it will take time to disseminate such robots. In the field of service robots, development according to various needs is important, and the system integration of hardware and software becomes crucial. Therefore, in this paper, we propose a robot partner development platform for human-robot interaction. First, we propose a modularized architecture of robot partners using a smart device to realize flexible updates based on the re-usability of hardware and software modules, and we show examples of implementing a robot system using the proposed architecture. Next, we focus on the development of various robots using the modular robot partner system. Finally, we discuss the effectiveness of the proposed robot partner system through social implementation and experiments.
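
As an illustration of the kind of modularity described (a sketch under assumed interfaces, not the authors' platform), hardware and software modules might share a small update interface so that a smart-device "brain" can be reused across different robot bodies:

```python
# Illustrative sketch only: hardware and software modules share one interface,
# and the robot partner is composed from interchangeable modules.
from abc import ABC, abstractmethod

class Module(ABC):
    @abstractmethod
    def update(self, context: dict) -> dict:
        """Read from and write to a shared interaction context."""

class SpeechRecognitionModule(Module):   # software module (runs on the smart device)
    def update(self, context):
        context["utterance"] = "hello"   # placeholder for a speech-recognition result
        return context

class MobileBaseModule(Module):          # hardware module (specific to one robot body)
    def update(self, context):
        if context.get("utterance") == "hello":
            context["motion"] = "approach_user"
        return context

class RobotPartner:
    """Composes interchangeable modules; swapping a module updates the robot."""
    def __init__(self, modules):
        self.modules = modules

    def step(self):
        context = {}
        for module in self.modules:
            context = module.update(context)
        return context

robot = RobotPartner([SpeechRecognitionModule(), MobileBaseModule()])
print(robot.step())
```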


Symmetry, 2018, Vol 10 (12), pp. 680
Author(s): Ethan Jones, Winyu Chinthammit, Weidong Huang, Ulrich Engelke, Christopher Lueg

Control of robot arms is often required in engineering and can be performed using different methods. This study examined and symmetrically compared the use of a controller, an eye gaze tracker, and a combination thereof in a multimodal setup for control of a robot arm. Tasks of different complexities were defined, and twenty participants completed an experiment using these interaction modalities to solve the tasks. More specifically, there were three tasks: the first was to navigate a chess piece from one square to another pre-specified square; the second was the same as the first task, but required more moves to complete; and the third task was to move multiple pieces to reach a pre-defined arrangement of the pieces. Further, while gaze control has the potential to be more intuitive than a hand controller, it suffers from limitations with regard to spatial accuracy and target selection. The multimodal setup aimed to mitigate the weaknesses of the eye gaze tracker, creating a superior system without simply relying on the controller. The experiment shows that the multimodal setup improves performance over the eye gaze tracker alone (p < 0.05) and was competitive with the controller-only setup, although it did not outperform it (p > 0.05).
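
A rough sketch of how such a multimodal setup could combine the two inputs (the square size, function names, and the nudge-and-confirm scheme are assumptions for illustration, not the paper's implementation): gaze snaps to the nearest chessboard square, and the controller refines and confirms the selection.

```python
# Hypothetical fusion of gaze and controller input for selecting a board square.
from dataclasses import dataclass

SQUARE = 40.0  # assumed chessboard square size in mm

@dataclass
class GazeSample:
    x: float
    y: float  # gaze point on the board plane, in mm

def gaze_to_square(g: GazeSample) -> tuple[int, int]:
    """Snap a noisy gaze point to the nearest square (coarse selection)."""
    return int(g.x // SQUARE), int(g.y // SQUARE)

def fuse(gaze: GazeSample, controller_nudge: tuple[int, int], confirmed: bool):
    """The controller refines the gaze-selected square and issues the final command."""
    col, row = gaze_to_square(gaze)
    col += controller_nudge[0]
    row += controller_nudge[1]
    return (col, row) if confirmed else None

# Example: gaze lands near square (3, 5); the user nudges one column right and confirms.
print(fuse(GazeSample(x=130.0, y=210.0), controller_nudge=(1, 0), confirmed=True))
```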


AI Magazine, 2011, Vol 32 (4), pp. 53-63
Author(s): Andrea L. Thomaz, Crystal Chao

Turn-taking is a fundamental part of human communication. Our goal is to devise a turn-taking framework for human-robot interaction that, like the human skill, represents something fundamental about interaction that is generic to context or domain. We propose a model of turn-taking and conduct an experiment with human subjects to inform this model. Our findings from this study suggest that information flow is an integral part of human floor-passing behavior. Following this, we implement autonomous floor relinquishing on a robot and discuss our insights into the nature of a general turn-taking model for human-robot interaction.
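
One way to picture autonomous floor relinquishing (a toy state machine, not the authors' model) is a robot that yields the floor once the information it intended to convey has been delivered:

```python
# Toy floor-passing state machine: the robot keeps the floor while it still has
# information to deliver, then relinquishes it, echoing the finding that
# information flow drives floor passing.
from enum import Enum, auto

class Floor(Enum):
    ROBOT = auto()
    HUMAN = auto()

class TurnTaker:
    def __init__(self, planned_utterances):
        self.queue = list(planned_utterances)
        self.floor = Floor.ROBOT

    def step(self, human_wants_floor: bool) -> str:
        if self.floor is Floor.HUMAN:
            return "listening"
        if self.queue:
            said = self.queue.pop(0)
            if not self.queue or human_wants_floor:
                self.floor = Floor.HUMAN      # relinquish once the message is delivered
            return f"say: {said}"
        self.floor = Floor.HUMAN
        return "relinquish floor"

robot = TurnTaker(["I found the red block", "Shall I hand it to you?"])
print(robot.step(human_wants_floor=False))
print(robot.step(human_wants_floor=False))
print(robot.step(human_wants_floor=True))
```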


2021, Vol 8
Author(s): Hua Minh Tuan, Filippo Sanfilippo, Nguyen Vinh Hao

Collaborative robots (or cobots) are robots that can safely work together or interact with humans in a shared space, and they are gradually becoming more common. Compliant actuators are highly relevant for the design of cobots: this type of actuation scheme mitigates the damage caused by unexpected collisions, so elastic joints are considered to outperform rigid joints when operating in a dynamic environment. However, most of the available elastic robots are relatively costly or difficult to construct. To give researchers a solution that is inexpensive, easily customisable, and fast to fabricate, a newly designed low-cost, open-source elastic joint is presented in this work. Based on this elastic joint, a highly compliant multi-purpose 2-DOF robot arm for safe human-robot interaction is also introduced. The mechanical design of the robot and a position control algorithm are presented, and the mechanical prototype is 3D-printed. The control algorithm is a two-loop scheme: the inner loop is a model reference adaptive controller (MRAC) that handles uncertainties in the system parameters, while the outer loop uses a fuzzy proportional-integral (PI) controller to reduce the effect of external disturbances on the load. The control algorithm is first validated in simulation, and the effectiveness of the controller is then demonstrated through experiments on the mechanical prototype.
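
A toy sketch of the two-loop structure (the plant, reference model, gains, and the gain-scheduled stand-in for the fuzzy PI are all invented for illustration and are not the paper's tuned controller): the outer loop turns the load-position error into a reference for the inner loop, and the inner loop adapts via a simplified MIT-rule MRAC.

```python
# Illustrative cascade: outer (pseudo-)fuzzy PI loop + inner MRAC loop (simplified MIT rule).
# All numerical values are assumptions for the sketch, not the paper's parameters.
dt = 0.001
a, b = 2.0, 1.0                 # assumed joint/load plant: y' = -a*y + b*u
am, bm = 4.0, 4.0               # inner-loop reference model: ym' = -am*ym + bm*r
gamma = 0.5                     # MRAC adaptation gain

def fuzzy_pi_gain(error):
    """Crude stand-in for the fuzzy PI: larger proportional gain for large errors."""
    return 3.0 if abs(error) > 0.2 else 1.5

y = ym = 0.0                    # plant output and reference-model output
theta_r = theta_y = 0.0         # adaptive controller parameters
integral = 0.0
load_target = 1.0               # desired load position

for _ in range(10000):          # 10 s of simulated time
    # Outer loop: PI on the load error produces the inner-loop reference r.
    load_error = load_target - y
    integral += load_error * dt
    r = fuzzy_pi_gain(load_error) * load_error + 0.8 * integral

    # Inner loop: MRAC with the simplified MIT rule, u = theta_r*r - theta_y*y.
    u = theta_r * r - theta_y * y
    e = y - ym                          # model-following error
    theta_r += -gamma * e * r * dt
    theta_y += gamma * e * y * dt

    # Forward-Euler integration of the plant and the reference model.
    y += (-a * y + b * u) * dt
    ym += (-am * ym + bm * r) * dt

print(f"load position after 10 s: {y:.3f} (target {load_target}), model error: {y - ym:.3f}")
```

The sketch is meant to show the structure of the cascade, not tuned performance; in the paper the inner loop runs on the elastic joint and the outer loop acts on the load side.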


2021, Vol 3
Author(s): Alberto Martinetti, Peter K. Chemweno, Kostas Nizamis, Eduard Fosch-Villaronga

Policymakers need to consider the impacts that robots and artificial intelligence (AI) technologies have on humans beyond physical safety. Traditionally, the definition of safety has been interpreted to apply exclusively to risks with a physical impact on persons, such as mechanical or chemical risks. However, the integration of AI in cyber-physical systems such as robots increases interconnectivity with devices and cloud services and intensifies human-robot interaction, challenging this narrow conceptualisation of safety. To address safety comprehensively, AI demands a broader understanding of safety that extends beyond physical interaction to cover aspects such as cybersecurity and mental health. Moreover, the expanding use of machine learning techniques will more frequently demand evolving safety mechanisms to safeguard against the substantial modifications taking place over time as robots embed more AI features. In this sense, our contribution brings forward the different dimensions of the concept of safety, including interaction (physical and social), psychosocial, cybersecurity, temporal, and societal. These dimensions aim to help policy and standard makers redefine the concept of safety in light of robots and AI's increasing capabilities, including human-robot interactions, cybersecurity, and machine learning.


2022, Vol 59 (1), pp. 102750
Author(s): Jingyao Wang, Manas Ranjan Pradhan, Nallappan Gunasekaran
