Evaluation of Local Descriptors for Vision-Based Localization of Humanoid Robots

Author(s):
Noé G. Aldana-Murillo
Jean-Bernard Hayet
Héctor M. Becerra

Author(s):
Jeakweon Han
Dennis Hong

Besides the difficulties in control and gait generation, designing a full-sized (taller than 1.3 m) bipedal humanoid robot that can walk on two legs is a very challenging task, mainly due to the large torque requirements at the joints combined with the need to keep the actuators small and light. Most of the handful of successful humanoid robots in this size class that exist today utilize harmonic drives for gear reduction to obtain high torque in a compact package. However, this makes the cost of such a robot too high and thus puts it out of reach of most of those who want to use it for general research, education, and outreach activities. Besides the cost, the heavy weight of the robot also causes difficulties in handling and raises safety concerns. In this paper we present the design of a new class of full-sized bipedal humanoid robots that is lightweight and low cost. This is achieved by utilizing spring-assisted parallel four-bar linkages with synchronized actuation in the lower body to reduce the torque requirements of the individual actuators, which also enables the use of off-the-shelf components to further reduce the cost significantly. The resulting savings in weight not only make the operation of the robot safer, but also allow it to forgo expensive force/torque sensors at the ankles and achieve stable bipedal walking using only feedback from the IMU (Inertial Measurement Unit). CHARLI-L (Cognitive Humanoid Autonomous Robot with Learning Intelligence - Lightweight) was developed using this approach and successfully demonstrated untethered bipedal locomotion using ZMP (Zero Moment Point) based control, stable omnidirectional gaits, and autonomous task execution using vision-based localization.
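The abstract only names the control approach; as a minimal, illustrative sketch of the ZMP criterion it refers to (not CHARLI-L's actual controller), the Python snippet below computes the ZMP from the center-of-mass state with the standard cart-table approximation and checks whether it falls inside the support polygon. The function names, the sample CoM values, and the rectangular foot polygon are assumptions made for the example.

```python
import numpy as np

GRAVITY = 9.81  # m/s^2

def zmp_from_com(com_xy, com_acc_xy, com_height):
    """Cart-table approximation: ZMP = CoM - (z_c / g) * CoM acceleration,
    projected onto the ground plane (x-y)."""
    return np.asarray(com_xy) - (com_height / GRAVITY) * np.asarray(com_acc_xy)

def zmp_inside_support(zmp_xy, support_polygon):
    """Return True if the ZMP lies inside the convex support polygon,
    given as an (N, 2) list of foot-contact vertices in CCW order."""
    p = np.asarray(zmp_xy)
    verts = np.asarray(support_polygon, dtype=float)
    for i in range(len(verts)):
        a, b = verts[i], verts[(i + 1) % len(verts)]
        edge = b - a
        # For a CCW polygon, the cross product must be non-negative on every edge.
        if edge[0] * (p[1] - a[1]) - edge[1] * (p[0] - a[0]) < 0.0:
            return False
    return True

# Illustrative numbers: CoM 0.8 m high, accelerating forward at 0.5 m/s^2,
# single-support rectangle roughly the size of one foot.
zmp = zmp_from_com(com_xy=[0.02, 0.0], com_acc_xy=[0.5, 0.0], com_height=0.8)
foot = [[-0.05, -0.10], [0.15, -0.10], [0.15, 0.10], [-0.05, 0.10]]
print("ZMP:", zmp, "inside support polygon:", zmp_inside_support(zmp, foot))
```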


2018
Vol. 24 (3)
pp. 471-481
Author(s):
Noé G. Aldana-Murillo
Jean-Bernard Hayet
Héctor M. Becerra

2013
Vol. 10 (03)
pp. 1350019
Author(s):
Omid Mohareri
Ahmad B. Rad

In this paper, we present a vision-based localization system using mobile augmented reality (MAR) and mobile audio augmented reality (MAAR) techniques, applicable to the indoor navigation of both humans and humanoid robots. In the first stage, we propose a system that recognizes the location of a user from an image sequence of an indoor environment captured by an onboard camera. The location information is added to the user's view via augmented reality (AR), in the form of 3D objects and audio cues carrying location information and navigation instructions. The location is recognized using prior knowledge of the layout of the environment and the positions of the AR markers. The image sequence can be obtained with a smartphone camera, and the marker detection, 3D object placement, and audio augmentation are performed by the phone's processor and graphics/audio modules. This system greatly reduces the hardware complexity of such navigation systems, since it replaces a setup consisting of a mobile PC, a wireless camera, a head-mounted display (HMD), and a remote PC with a single camera-equipped smartphone. In the second stage, the same algorithm is employed as a novel vision-based localization and navigation approach for autonomous humanoid robots. The proposed technique is implemented on a NAO humanoid robot and improves the robot's navigation and localization performance, previously achieved with an extended Kalman filter (EKF), by presenting location-based information to the robot through different AR markers placed in its environment.
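The abstract does not detail the marker pipeline; the sketch below only illustrates the basic idea of recognizing a location from markers whose placement is known in advance, using OpenCV's ArUco module (the ArucoDetector API of OpenCV 4.7+). The MARKER_LOCATIONS table and the camera index are hypothetical placeholders, not details taken from the paper.

```python
import cv2

# Hypothetical prior knowledge: which marker ID is mounted at which place.
MARKER_LOCATIONS = {
    0: "corridor entrance",
    1: "laboratory door",
    2: "elevator lobby",
}

def recognize_location(frame_bgr):
    """Detect ArUco markers in one camera frame and return the location label
    of the first recognized marker ID, or None if no known marker is visible."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
    corners, ids, _rejected = detector.detectMarkers(gray)
    if ids is None:
        return None
    for marker_id in ids.flatten():
        label = MARKER_LOCATIONS.get(int(marker_id))
        if label is not None:
            return label
    return None

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # stands in for the smartphone or NAO camera stream
    ok, frame = cap.read()
    cap.release()
    print("Recognized location:", recognize_location(frame) if ok else "no frame")
```

On the robot, such a lookup would feed the recognized location label into the navigation layer alongside (or in place of) the EKF estimate mentioned in the abstract.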


Author(s):  
Francisco Arthur Bonfim Azevedo
Daniela Vacarini de Faria
Marcos Maximo
Mauricio Donadon

Author(s):  
Adrian David Cheok ◽  
Kasun Karunanayaka ◽  
Emma Yann Zhang

Intimate relationships, such as love and sex, between humans and machines, especially robots, have long been a theme of science fiction. However, this topic had never been treated academically until recently. It was first raised and discussed by David Levy in his book Love and Sex with Robots (2007). Since then, researchers have come up with many implementations of robot companions, such as sex robots, emotional robots, humanoid robots, and artificially intelligent systems that can simulate human emotions. This chapter presents a summary of significant recent activity in this field, predicts how the field is likely to develop, and discusses ethical and legal issues. We also discuss our research on physical devices for human–robot love and sex communication.

