Robust Tracking with and Beyond Visible Spectrum: A Four-Layer Data Fusion Framework

Author(s):  
Jianru Xue ◽  
Nanning Zheng


Author(s):  
Wen Qi ◽  
Hang Su ◽  
Ke Fan ◽  
Ziyang Chen ◽  
Jiehao Li ◽  
...  

The growing application of robot-assisted minimally invasive surgery (RAMIS) promotes human-machine interaction (HMI). Identifying the various behaviors of surgeons can enhance the RAMIS procedure for a redundant robot, bridging intelligent robot control and activity recognition strategies in the operating room, including hand gestures and human activities. In this paper, to enhance identification in dynamic situations, we propose a multimodal data fusion framework that provides multiple information sources for improved accuracy. First, a multi-sensor hardware structure is designed to capture varied data from several devices, including a depth camera and a smartphone. Furthermore, the robot control mechanism can switch automatically between different surgical tasks. The experimental results demonstrate the efficiency of the multimodal framework for RAMIS by comparison with a single-sensor system. Implementation on the KUKA LWR4+ in a surgical robot environment indicates that surgical robot systems can work alongside medical staff in the future.
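The decision-level idea behind such a multimodal framework can be sketched as a weighted late fusion of per-sensor classifier outputs. The sketch below is illustrative only: the weights, gesture classes and probability values are hypothetical assumptions, not taken from the paper.

```python
def late_fusion(prob_camera, prob_imu, weights=(0.6, 0.4)):
    """Decision-level fusion: combine per-class probabilities from two
    sensor-specific classifiers by weighted averaging, then pick the
    class with the highest fused score."""
    fused = [weights[0] * p + weights[1] * q
             for p, q in zip(prob_camera, prob_imu)]
    label = max(range(len(fused)), key=fused.__getitem__)
    return label, fused

# Hypothetical per-class probabilities for three hand gestures,
# one vector from the depth camera, one from the smartphone IMU.
label, fused = late_fusion([0.2, 0.5, 0.3], [0.1, 0.2, 0.7])
```

With these example inputs the IMU's strong vote for the third gesture outweighs the camera's preference, illustrating how a second modality can correct a single-sensor decision.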


2021 ◽  
Author(s):  
Zhuo Yang ◽  
Yan Lu ◽  
Simin Li ◽  
Jennifer Li ◽  
Yande Ndiaye ◽  
...  

To accelerate the adoption of Metal Additive Manufacturing (MAM) for production, an understanding of MAM process-structure-property (PSP) relationships is indispensable for quality control. The multitude of physical phenomena involved in MAM necessitates multi-modal, in-process sensing techniques to model, monitor and control the process. The data generated by these sensors and process actuators are fused in various ways to advance our understanding of the process and to estimate both process status and part-in-progress states. This paper presents a hierarchical in-process data fusion framework for MAM, consisting of pointwise, trackwise, layerwise and partwise data analytics. Data fusion can be performed at the raw-data, feature, decision or mixed levels. The multi-scale data fusion framework is illustrated in detail using a laser powder bed fusion process for anomaly detection, material defect isolation and part quality prediction. The multi-scale data fusion can be applied generally and integrated with real-time MAM process control, near-real-time layerwise repair and buildwise decision making. The framework can be utilized by the AM research and standards community to rapidly develop and deploy interoperable tools and standards to analyze, process and exploit two or more different types of AM data. Common engineering standards for AM data fusion systems will dramatically improve the ability to detect, identify and locate part flaws, and then derive optimal policies for process control.
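The pointwise-to-partwise hierarchy described above can be sketched as a bottom-up aggregation of anomaly scores. The scores, aggregation rules and threshold below are illustrative assumptions, not the paper's actual analytics:

```python
# Hypothetical pointwise anomaly scores, nested as part -> layers -> tracks.
# Each innermost list holds per-point scores along one scan track.
part = [
    [[0.10, 0.20, 0.10], [0.90, 0.80, 0.95]],  # layer 0: second track anomalous
    [[0.05, 0.10, 0.20], [0.10, 0.15, 0.10]],  # layer 1: nominal
]

# Trackwise: peak anomaly score seen along each track.
trackwise = [[max(track) for track in layer] for layer in part]

# Layerwise: worst track within each layer.
layerwise = [max(tracks) for tracks in trackwise]

# Partwise: flag the build if any layer exceeds an assumed threshold.
partwise_flag = any(score > 0.5 for score in layerwise)
```

A real implementation would replace `max` and the fixed threshold with level-appropriate analytics (e.g. feature- or decision-level fusion per scale), but the nesting shows how each scale summarizes the one below it.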


IEEE Access ◽  
2020 ◽  
Vol 8 ◽  
pp. 180618-180632
Author(s):  
Mehdi Abdollahpour ◽  
Tohid Yousefi Rezaii ◽  
Ali Farzamnia ◽  
Ismail Saad

Author(s):  
Meisong Wang ◽  
Charith Perera ◽  
Prem Prakash Jayaraman ◽  
Miranda Zhang ◽  
Peter Strazdins ◽  
...  

The Internet of Things (IoT) has gained substantial attention recently and plays a significant role in smart city application deployments. A number of such smart city applications depend on sensor fusion capabilities in the cloud, drawing on diverse data sources. The authors introduce the concept of IoT and present in detail the ten parameters that govern their sensor data fusion evaluation framework. They then evaluate the current state of the art in sensor data fusion against this framework. The authors' main goal is to examine and survey different sensor data fusion research efforts based on the evaluation framework. Major open research issues related to sensor data fusion are also presented.


2020 ◽  
Vol 17 (2) ◽  
pp. 172988142091176
Author(s):  
Raul Dominguez ◽  
Mark Post ◽  
Alexander Fabisch ◽  
Romain Michalec ◽  
Vincent Bissonnette ◽  
...  

Multisensor data fusion plays a vital role in providing autonomous systems with the environmental information crucial for reliable functioning. In this article, we summarize the modular structure of the newly developed and released Common Data Fusion Framework and explain how it is used. Sensor data are registered and fused within the Common Data Fusion Framework to produce comprehensive 3D environment representations and pose estimations. We first give a complete overview of the framework and the reusable software components that model this process, then list the data fusion algorithms it provides, and finally exemplify the approach through the case of 3D reconstruction from 2D images. The Common Data Fusion Framework has been deployed and tested in various scenarios, including robots performing planetary rover exploration and tracking of orbiting satellites.
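A minimal stand-in for one fusion node in such a framework is inverse-variance weighting of two independent estimates of the same quantity (e.g. a pose component from two sensors). The function name, example values and scalar-variance simplification below are assumptions for illustration, not the framework's API:

```python
def fuse_estimates(x1, var1, x2, var2):
    """Fuse two independent estimates of the same vector quantity by
    inverse-variance weighting. Assumes a single scalar variance per
    estimate; a full implementation would use covariance matrices."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = [(w1 * a + w2 * b) / (w1 + w2) for a, b in zip(x1, x2)]
    fused_var = 1.0 / (w1 + w2)  # fused estimate is more certain than either input
    return fused, fused_var

# Hypothetical 2D position estimates from two sensors:
fused, fused_var = fuse_estimates([1.0, 2.0], 0.5, [1.2, 2.2], 1.0)
```

The more certain estimate (smaller variance) dominates the result, and the fused variance is always smaller than either input variance, which is the basic property that makes chaining such nodes into a larger fusion graph worthwhile.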


2014 ◽  
Vol 38 (9) ◽  
Author(s):  
O. H. Salman ◽  
M. F. A. Rasid ◽  
M. I. Saripan ◽  
S. K. Subramaniam