Convolutional neural network based tracking for human following mobile robot with LQG based control system

Author(s):  
Sudip C Gupta ◽  
Jharna Majumdar


2021 ◽  
pp. 1-15
Author(s):  
Qinyu Mei ◽  
Ming Li

Aiming at the construction of a decision-making system for sports-assisted teaching and training, this article first presents a deep convolutional neural network model for sports-assisted teaching and training decision-making. Subsequently, to meet athletes' needs for assisted physical exercise, a squat training robot is built from a self-developed modular flexible cable drive unit, and its control system is designed to assist athletes in squat training. First, the mechanics of the human squat are analyzed and the overall structure of the robot is determined. Second, the robot's force servo control strategy is designed, comprising flexible cable traction force planning, lateral force compensation, and a passive force controller for a single flexible cable. Finally, to verify the training effect of the robot, a single flexible cable force control experiment and a human–robot squat training experiment were carried out. In the single flexible cable force control experiment, excess force was suppressed by more than 50%. In the squat experiment under a 200 N load, the standard deviation of the system loading force was 7.52 N and the dynamic accuracy was above 90.2%. The experimental results show that the robot has a reasonable configuration, a small footprint, a stable control system, and high loading accuracy, and can assist squat training in physical education.
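
As a rough illustration of the passive force control idea described above (not the authors' actual controller), the Python sketch below shows a single-cable force loop: a PI term tracks the planned traction force while a velocity feedforward term pays out cable as the human moves, which is one common way to suppress motion-induced excess force. The class name, gains, and the control law itself are assumptions for illustration only.

```python
# Minimal sketch (not the paper's controller): a PI-type passive force
# controller for one flexible cable, with a velocity feedforward term that
# compensates motion-induced "excess force". All names, gains and the
# control law itself are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class CableForceController:
    kp: float = 0.02       # proportional gain [m/s per N] (assumed)
    ki: float = 0.05       # integral gain [m/s per N*s]   (assumed)
    kff: float = 1.0       # feedforward gain on cable-end velocity (assumed)
    integral: float = 0.0  # accumulated force error

    def update(self, f_ref: float, f_meas: float, v_human: float, dt: float) -> float:
        """Return a winch velocity command [m/s] that tracks the planned
        traction force f_ref while paying out cable as the human end moves
        (v_human), so the motion does not appear as excess force."""
        err = f_ref - f_meas
        self.integral += err * dt
        v_fb = self.kp * err + self.ki * self.integral  # force feedback term
        v_ff = self.kff * v_human                       # motion feedforward term
        return v_fb + v_ff


# Example: 200 N squat loading while the human end moves down at 0.3 m/s
ctrl = CableForceController()
cmd = ctrl.update(f_ref=200.0, f_meas=192.5, v_human=0.3, dt=0.001)
print(f"winch velocity command: {cmd:.4f} m/s")
```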


Sensors ◽  
2020 ◽  
Vol 20 (3) ◽  
pp. 797 ◽  
Author(s):  
Yili Gu ◽  
Zhiqiang Li ◽  
Zhen Zhang ◽  
Jun Li ◽  
Liqing Chen

Because of the narrow row spacing of corn and the lack of light in the field caused by occluding branches, leaves, and weeds in the middle and late stages of corn growth, it is generally difficult for machinery to move between rows, and corn growth cannot be observed in real time. To solve this problem, a robot for collecting corn interline information is designed. First, the mathematical model of the robot is established together with its control system. Second, an improved convolutional neural network model is proposed for training and learning, and the driving path is fitted by detecting and identifying corn rhizomes. Next, the multi-body dynamics simulation software RecurDyn/Track is used to establish a dynamic model of the robot moving on soft soil, and a control system is developed in MATLAB/Simulink for joint simulation experiments. The simulation results show that sliding-mode variable structure control achieves better control performance. Finally, experiments on the ground and in a simulated field environment show that the robot for field information collection runs stably with little deviation. The robot can be applied to field plant protection and the control of corn diseases and insect pests, realizing human–machine separation.
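
To make the path-fitting and control steps more concrete, here is a minimal Python sketch (not the paper's implementation): the center line of the corn interline is fitted by least squares from CNN-detected rhizome points, and a simple saturated sliding-mode law steers toward it. The point format, sliding surface, and gains are all assumed for illustration.

```python
# Minimal sketch, not the paper's implementation: fit a driving path from
# CNN-detected corn rhizome points (left/right rows) and steer toward it
# with a simple sliding-mode law. Point format, sliding surface and gains
# are illustrative assumptions.

import numpy as np

def fit_center_line(left_pts: np.ndarray, right_pts: np.ndarray):
    """left_pts/right_pts: (N, 2) arrays of (x, y) rhizome detections in the
    robot frame. Fit y = a*x + b to each row and average the two fits to get
    the center line the robot should follow."""
    a_l, b_l = np.polyfit(left_pts[:, 0], left_pts[:, 1], 1)
    a_r, b_r = np.polyfit(right_pts[:, 0], right_pts[:, 1], 1)
    return (a_l + a_r) / 2.0, (b_l + b_r) / 2.0   # slope, intercept

def sliding_mode_steer(lat_err: float, heading_err: float,
                       lam: float = 1.0, k: float = 0.5, phi: float = 0.1) -> float:
    """Sliding surface s = heading_err + lam * lat_err; a saturated switching
    term limits chattering. Returns a yaw-rate command [rad/s]."""
    s = heading_err + lam * lat_err
    return -k * float(np.clip(s / phi, -1.0, 1.0))

# Example with synthetic detections of two corn rows about 0.6 m apart
left = np.array([[0.5, 0.31], [1.0, 0.30], [1.5, 0.29]])
right = np.array([[0.5, -0.29], [1.0, -0.30], [1.5, -0.31]])
slope, intercept = fit_center_line(left, right)
cmd = sliding_mode_steer(lat_err=intercept, heading_err=float(np.arctan(slope)))
print(f"center line: y = {slope:.3f}x + {intercept:.3f}, yaw-rate cmd = {cmd:.3f} rad/s")
```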


2018 ◽  
Vol 30 (4) ◽  
pp. 540-551 ◽  
Author(s):  
Shingo Nakamura ◽  
Tadahiro Hasegawa ◽  
Tsubasa Hiraoka ◽  
Yoshinori Ochiai ◽  
...  

The Tsukuba Challenge is a competition in which autonomous mobile robots run along a route set on public roads in a real environment. The task includes not only running the course but also finding multiple specific persons at the same time. This study proposes a method to realize this person search. While many person-search algorithms combine a laser sensor and a camera, our method uses only an omnidirectional camera. The search target is detected by a convolutional neural network (CNN) that classifies the search target. Training the CNN requires a large amount of data, for which pseudo images created by composition are used. Our method was implemented on an autonomous mobile robot, and its performance was verified in the Tsukuba Challenge 2017.
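
The pseudo-image composition step can be sketched as follows; this is an assumed illustration, not the authors' pipeline. A cut-out of the search target is pasted at a random scale and position onto background frames from the omnidirectional camera, and a classifier-sized patch around the pasted target is kept as a positive training sample. File paths and augmentation parameters are placeholders.

```python
# Minimal sketch of the pseudo-image idea, not the authors' pipeline:
# paste a cut-out of the search target onto camera background frames to
# generate labelled training images for the CNN classifier. Paths, sizes
# and augmentation ranges are illustrative assumptions.

import random
from PIL import Image

def compose_pseudo_image(background: Image.Image, target_rgba: Image.Image,
                         out_size=(224, 224)) -> Image.Image:
    """Scale the target cut-out randomly, paste it at a random position on
    the background (using its alpha channel as the mask), and crop a
    classifier-sized patch centred on the pasted target."""
    bg = background.convert("RGB").copy()
    scale = random.uniform(0.3, 0.8)
    tw, th = int(target_rgba.width * scale), int(target_rgba.height * scale)
    tgt = target_rgba.resize((tw, th))
    x = random.randint(0, max(0, bg.width - tw))
    y = random.randint(0, max(0, bg.height - th))
    bg.paste(tgt, (x, y), tgt)            # alpha channel used as paste mask
    cx, cy = x + tw // 2, y + th // 2
    half_w, half_h = out_size[0] // 2, out_size[1] // 2
    return bg.crop((cx - half_w, cy - half_h, cx + half_w, cy + half_h))

# Example usage (paths are placeholders)
# bg = Image.open("panorama_frame.jpg")
# tgt = Image.open("target_person.png")   # RGBA cut-out of the search target
# compose_pseudo_image(bg, tgt).save("pseudo_positive_0001.jpg")
```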

