SLAM-Based Multistate Tracking System for Mobile Human-Robot Interaction

Author(s):  
Thorsten Hempel ◽  
Ayoub Al-Hamadi


2021 ◽  
Author(s):  
Adrian Bulzacki

This work describes the implementation of various human-robot interaction systems in a functioning mobile robot. The project integrates a tracking system for human faces and objects, face recognition, gesture recognition, body tracking, stereo vision, speech synthesis, and voice recognition. Most of these systems were custom designed for this project; their designs are explained in detail throughout this report. A unique vector-based approach is used for gesture recognition. The focus is not on the mechanics and electronics of the robot but on its information processing. Combining many information-processing systems allows robots to interact with human users more naturally and provides a natural conduit for future cooperative human-robot efforts. This project lays the groundwork for a large collaborative effort aimed at creating one of the most advanced human-interactive robots in the world.
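The abstract does not detail the vector-based gesture recognition approach, but the general idea can be sketched: represent a pose as unit direction vectors between consecutive body keypoints and compare it to a gesture template by cosine similarity. All names, keypoint choices, and thresholds below are illustrative assumptions, not the author's method.

```python
import math

def direction_vectors(keypoints):
    """Unit vectors between consecutive 2D keypoints (e.g. shoulder->elbow->wrist)."""
    vecs = []
    for (x1, y1), (x2, y2) in zip(keypoints, keypoints[1:]):
        dx, dy = x2 - x1, y2 - y1
        norm = math.hypot(dx, dy) or 1.0  # guard against coincident points
        vecs.append((dx / norm, dy / norm))
    return vecs

def gesture_similarity(pose_a, pose_b):
    """Mean cosine similarity between the direction vectors of two poses."""
    va, vb = direction_vectors(pose_a), direction_vectors(pose_b)
    sims = [ax * bx + ay * by for (ax, ay), (bx, by) in zip(va, vb)]
    return sum(sims) / len(sims)

# Hypothetical raised-arm template and a candidate pose (shoulder, elbow, wrist).
template = [(0.0, 0.0), (0.1, 1.0), (0.2, 2.0)]
candidate = [(0.0, 0.0), (0.15, 1.0), (0.25, 2.0)]
print(gesture_similarity(template, candidate) > 0.95)  # near-identical poses match
```

Because the comparison uses only limb directions, it is invariant to the person's position in the frame and, with unit vectors, to limb length as well.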


Sensors ◽  
2020 ◽  
Vol 20 (15) ◽  
pp. 4088
Author(s):  
Aleš Vysocký ◽  
Stefan Grushko ◽  
Petr Oščádal ◽  
Tomáš Kot ◽  
Ján Babjak ◽  
...  

In this analysis, we present results from measurements performed to determine the stability of a hand tracking system and the accuracy of the detected palm and finger positions. The measurements evaluated the sensor for use in an industrial robot-assisted assembly scenario. Human–robot interaction is a relevant topic in collaborative robotics. Intuitive and straightforward tools for robot navigation and program flow control are essential for effective use in production scenarios without unnecessary slowdowns caused by the operator. Hand tracking and gesture-based control require knowledge of the sensor's accuracy, and for gesture recognition with a moving target the sensor must provide stable tracking results. This paper evaluates the sensor's real-world performance by measuring the localisation deviations of the tracked hand as it moves through the workspace.
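The core quantity in such an evaluation, the localisation deviation, can be sketched as the Euclidean distance between each measured hand position and its reference. The sample values and units below are illustrative assumptions, not data from the paper.

```python
import statistics

def localisation_deviations(measured, reference):
    """Euclidean deviation of each measured 3D position from its reference."""
    return [
        ((mx - rx) ** 2 + (my - ry) ** 2 + (mz - rz) ** 2) ** 0.5
        for (mx, my, mz), (rx, ry, rz) in zip(measured, reference)
    ]

# Illustrative samples in millimetres: a static reference point and noisy readings.
reference = [(100.0, 50.0, 300.0)] * 4
measured = [(101.2, 50.3, 299.5), (99.1, 50.8, 300.4),
            (100.6, 49.2, 300.9), (100.2, 50.1, 299.8)]

devs = localisation_deviations(measured, reference)
print(f"mean deviation: {statistics.mean(devs):.2f} mm, "
      f"stdev: {statistics.stdev(devs):.2f} mm")
```

The mean deviation characterises accuracy, while the standard deviation over repeated readings of a fixed target characterises tracking stability.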



Sensors ◽  
2021 ◽  
Vol 21 (17) ◽  
pp. 5976
Author(s):  
Inês Soares ◽  
Marcelo Petry ◽  
António Paulo Moreira

The world is living through the fourth industrial revolution, marked by the increasing intelligence and automation of manufacturing systems. Nevertheless, some tasks are too complex or too expensive to be fully automated; it would be more efficient if machines could work with humans, not only sharing the same workspace but also acting as useful collaborators. A possible solution lies in human–robot interaction systems, together with an understanding of where they are helpful to implement and what challenges they face. This work proposes an industrial prototype of a human–machine interaction system based on Augmented Reality, whose objective is to enable an industrial operator without any programming experience to program a robot. The system is divided into two parts: the tracking system, which records the operator's hand movement, and the translator system, which writes the program sent to the robot that executes the task. To demonstrate the concept, the user drew geometric figures, and the robot replicated the operator's recorded path.
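The translator stage described above can be sketched as turning a recorded hand path into a sequence of linear move commands, with a minimum-step filter to drop hand jitter. The `MOVL` command format and the `min_step` threshold are hypothetical placeholders; the paper does not specify the target robot language.

```python
def translate_path(waypoints, min_step=5.0):
    """Turn recorded 3D hand waypoints into linear move commands,
    skipping points closer than min_step (mm) to the last kept one."""
    program, last = [], None
    for point in waypoints:
        if last is not None:
            dist = sum((a - b) ** 2 for a, b in zip(point, last)) ** 0.5
            if dist < min_step:
                continue  # jitter: too close to the previous waypoint
        program.append(f"MOVL X={point[0]:.1f} Y={point[1]:.1f} Z={point[2]:.1f}")
        last = point
    return program

# A noisy recorded edge of a drawn figure: the jittery points are filtered out.
recorded = [(0, 0, 100), (1, 0, 100), (10, 0, 100), (11, 1, 100), (20, 0, 100)]
for line in translate_path(recorded):
    print(line)
```

Filtering before translation keeps the generated program short and the robot's motion smooth, at the cost of discarding detail finer than the chosen step.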


2014 ◽  
Author(s):  
Mitchell S. Dunfee ◽  
Tracy Sanders ◽  
Peter A. Hancock

Author(s):  
Rosemarie Yagoda ◽  
Michael D. Coovert

2009 ◽  
Author(s):  
Matthew S. Prewett ◽  
Kristin N. Saboe ◽  
Ryan C. Johnson ◽  
Michael D. Coovert ◽  
Linda R. Elliott
