Research on a Solution Concentration Detection Method Based on D-S Evidence Theory

2014 ◽  
Vol 494-495 ◽  
pp. 869-872
Author(s):  
Xian Bao Wang ◽  
Shi Hai Zhao ◽  
Guo Wei

Drawing on multi-sensor information fusion technology, D-S evidence theory is used to fuse the feedback information of multiple sensors that detect solution concentration from different angles, so that the sensors reach a consistent judgment. By applying the D-S evidence theory method of multi-sensor data fusion, the system not only compensates for the disadvantages of a single sensor but also greatly reduces the uncertainty of the judgment. In addition, the system improves the speed and accuracy of solution concentration detection and broadens the application field of multi-sensor information fusion technology.
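The core fusion step described above is Dempster's rule of combination. A minimal sketch in Python follows; the frame of discernment for solution concentration and the mass values are illustrative assumptions, not data from the paper:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments (dicts mapping
    frozenset hypotheses to mass) with Dempster's rule."""
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y  # mass assigned to the empty set
    k = 1.0 - conflict         # normalization constant
    return {h: v / k for h, v in combined.items()}

# Two sensors that mostly agree the concentration is "high":
m1 = {frozenset({"high"}): 0.7, frozenset({"low", "high"}): 0.3}
m2 = {frozenset({"high"}): 0.6, frozenset({"low"}): 0.4}
fused = dempster_combine(m1, m2)
```

Mass falling on the empty set (the conflict) is discarded and the remainder renormalized; this renormalization is exactly what produces counterintuitive results under high conflict, a limitation several of the entries below address.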

2018 ◽  
Vol 14 (04) ◽  
pp. 4
Author(s):  
Xuemei Yao ◽  
Shaobo Li ◽  
Yong Yao ◽  
Xiaoting Xie

As the information measured by a single sensor cannot completely reflect the real situation of mechanical devices, a multi-sensor data fusion method based on evidence theory is introduced. Evidence theory has the advantage of dealing with uncertain information; however, it produces unreasonable conclusions when the evidence conflicts. An improved fusion method is proposed to solve this problem. The basic probability assignments of the evidence are corrected according to evidence and sensor weights, and the optimal fusion algorithm is selected by comparing an introduced threshold with the conflict factor. The effectiveness and practicability of the algorithm are tested by simulating the monitoring and diagnosis of rolling bearings. The results show that the method has better robustness.
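The weight-based BPA correction described above can be sketched with classical Shafer discounting, which scales each focal element's mass by the source's reliability and moves the remainder onto the whole frame (total ignorance). The weight, frame, and mass values below are illustrative assumptions, not the paper's data:

```python
def discount(m, weight, frame):
    """Discount a BPA by a reliability weight in [0, 1]: scale each
    focal element's mass and assign the remainder to the full frame."""
    out = {h: weight * v for h, v in m.items()}
    theta = frozenset(frame)
    out[theta] = out.get(theta, 0.0) + (1.0 - weight)
    return out

frame = {"normal", "inner_race_fault", "outer_race_fault"}
m = {frozenset({"inner_race_fault"}): 0.9,
     frozenset({"normal"}): 0.1}
m_corrected = discount(m, 0.8, frame)
# 0.2 of the total mass is redistributed to the whole frame
```

Combining the discounted BPAs instead of the raw ones prevents a single unreliable sensor from dominating the fused result.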


2012 ◽  
Vol 466-467 ◽  
pp. 1222-1226
Author(s):  
Bin Ma ◽  
Lin Chong Hao ◽  
Wan Jiang Zhang ◽  
Jing Dai ◽  
Zhong Hua Han

In this paper, we present an equipment fault diagnosis method based on multi-sensor data fusion, in order to solve problems such as uncertainty, imprecision, and low reliability caused by using a single sensor to diagnose equipment faults. A variety of sensors collect data from the diagnosed objects, the data are fused using D-S evidence theory, and faults are diagnosed according to the change in confidence and uncertainty. Experimental results show that the D-S evidence theory algorithm reduces the uncertainty of the fault diagnosis results and improves diagnostic accuracy and reliability; compared with fault diagnosis using a single sensor, this method performs better.
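The confidence and uncertainty that the diagnosis tracks can be read off a fused BPA as belief and plausibility. A minimal sketch, with an illustrative two-hypothesis frame (the hypothesis names and masses are assumptions):

```python
def belief(m, hyp):
    """Bel(hyp): total mass of focal elements contained in hyp."""
    return sum(v for a, v in m.items() if a <= hyp)

def plausibility(m, hyp):
    """Pl(hyp): total mass of focal elements intersecting hyp."""
    return sum(v for a, v in m.items() if a & hyp)

# Illustrative fused BPA over a {"fault", "normal"} frame
m = {frozenset({"fault"}): 0.6,
     frozenset({"fault", "normal"}): 0.4}
h = frozenset({"fault"})
bel, pl = belief(m, h), plausibility(m, h)
# The interval [Bel, Pl] quantifies the remaining uncertainty;
# fusing additional sensors typically narrows it.
```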


Sensors ◽  
2020 ◽  
Vol 20 (8) ◽  
pp. 2180 ◽  
Author(s):  
Prasanna Kolar ◽  
Patrick Benavidez ◽  
Mo Jamshidi

This paper focuses on data fusion, which is fundamental to one of the most important modules in any autonomous system: perception. Over the past decade, there has been a surge in the usage of smart/autonomous mobility systems. Such systems can be used in various areas of life, such as safe mobility for the disabled and senior citizens, and depend on accurate sensor information to function optimally. This information may come from a single sensor or a suite of sensors with the same or different modalities. We review various types of sensors, their data, and the need to fuse the data with each other to output the best data for the task at hand, which in this case is autonomous navigation. In order to obtain such accurate data, we need optimal technology to read the sensor data, process the data, eliminate or at least reduce the noise, and then use the data for the required tasks. We present a survey of current data processing techniques that implement data fusion using different sensors: LiDAR, which uses light-scan technology, and stereo/depth, Red Green Blue (RGB) monocular, and Time-of-Flight (TOF) cameras, which use optical technology. We also review the efficiency of using fused data from multiple sensors rather than a single sensor in autonomous navigation tasks like mapping, obstacle detection and avoidance, or localization. This survey will provide sensor information to researchers who intend to accomplish the task of motion control of a robot and detail the use of LiDAR and cameras to accomplish robot navigation.


2012 ◽  
Vol 532-533 ◽  
pp. 1006-1010 ◽  
Author(s):  
Ye Li ◽  
Yan Qing Jiang

The application of distributed multi-sensor information fusion technology to the accurate positioning of an underwater vehicle is introduced in this paper. Based on the distributed multi-sensor system structure of the AUV “T1”, this article establishes a Kalman filtering mathematical model and implements a fusion algorithm based on Kalman filtering, together with a numerical simulation. The experimental results show that the Kalman-filtering-based fusion algorithm can avoid the limitations of a single sensor, reduce the impact of its uncertainty, and increase the confidence level of the data.
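The Kalman-filter fusion step can be sketched in one dimension: each sensor's measurement update pulls the estimate toward the measurement and shrinks the estimate's variance below that of any single source. The sensor names, measurements, and noise variances below are illustrative assumptions, not values from the “T1” system:

```python
def kalman_update(x, p, z, r):
    """One scalar Kalman measurement update.
    x, p: prior state estimate and its variance
    z, r: measurement and its noise variance"""
    k = p / (p + r)          # Kalman gain
    x_new = x + k * (z - x)  # corrected estimate
    p_new = (1.0 - k) * p    # reduced uncertainty
    return x_new, p_new

x, p = 10.0, 4.0                         # prior position estimate (m), variance
x, p = kalman_update(x, p, 10.6, 1.0)    # e.g. a hypothetical DVL-derived fix
x, p = kalman_update(x, p, 9.9, 2.0)     # e.g. a hypothetical acoustic fix
# After both updates, p is smaller than either sensor's own variance.
```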


Sensors ◽  
2019 ◽  
Vol 19 (21) ◽  
pp. 4810 ◽  
Author(s):  
Md Nazmuzzaman Khan ◽  
Sohel Anwar

Multi-sensor data fusion technology is an important tool in building decision-making applications. Modified Dempster–Shafer (DS) evidence theory can handle conflicting sensor inputs and can be applied without any prior information. As a result, DS-based information fusion is very popular in decision-making applications, but the original DS theory produces counterintuitive results when combining highly conflicting evidence from multiple sensors. An effective algorithm for fusing highly conflicting information in the spatial domain is not widely reported in the literature. In this paper, a fusion algorithm is proposed which addresses these limitations of the original Dempster–Shafer (DS) framework. A novel entropy function based on Shannon entropy is proposed, which captures uncertainty better than Shannon and Deng entropy. An 8-step algorithm is developed which can eliminate the inherent paradoxes of classical DS theory. Multiple examples are presented to show that the proposed method is effective in handling conflicting information in the spatial domain. Simulation results show that the proposed algorithm has a competitive convergence rate and accuracy compared to other methods in the literature.
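For context, the Deng entropy baseline against which the paper's novel entropy function is compared can be sketched as follows; higher values indicate less committed evidence, and the example BPAs are illustrative:

```python
import math

def deng_entropy(m):
    """Deng entropy of a BPA: -sum m(A) * log2(m(A) / (2^|A| - 1)),
    where |A| is the cardinality of focal element A."""
    return -sum(v * math.log2(v / (2 ** len(a) - 1))
                for a, v in m.items() if v > 0)

focused = {frozenset({"a"}): 1.0}        # fully committed evidence
vague = {frozenset({"a", "b"}): 1.0}     # total ignorance over {a, b}
# deng_entropy(focused) == 0.0
# deng_entropy(vague) == log2(3), about 1.585
```

Unlike Shannon entropy over singletons, Deng entropy rewards mass placed on larger focal elements, so it distinguishes committed evidence from vague evidence even when each has a single focal element.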


2013 ◽  
Vol 385-386 ◽  
pp. 601-604
Author(s):  
Han Min Ye ◽  
Zun Ding Xiao

The information fusion method is introduced into transformer fault diagnosis. Sensors acquire each state parameter of the transformer in operation; two parallel BP neural networks perform local diagnosis, and D-S evidence theory globally fuses the local diagnostic results. This realizes accurate diagnosis when one or several faults occur in the transformer at the same time. The experiments demonstrate that the credibility of the diagnosis results is significantly improved and uncertainty is clearly reduced, which fully shows that the method is effective.


2014 ◽  
Vol 940 ◽  
pp. 280-283
Author(s):  
Chong Fa Liu ◽  
Zheng Xi Xie ◽  
Jie Min Yang ◽  
Zhi Jun Gao

Fault diagnosis based on multi-sensor information fusion technology processes multi-source information and data from the monitoring system in various ways, such as detection, parallel and correlated processing, estimation, and comprehensive treatment, so as to make maximum use of the system knowledge and of the information provided by the system's measurable quantities. Compared with a single sensor, multi-sensor information fusion has obvious advantages in reducing information uncertainty, improving the accuracy of the information obtained by the system, and increasing system reliability and fault tolerance. Because the accuracy of traditional fault diagnosis methods is not high, and considering the characteristics of faults in the electric starting system of a self-propelled gun, a fault diagnosis method based on networked information fusion technology is presented here. The diagnostic process is divided into two levels: subsystem level and system level. The subsystem level adopts a BP neural network for fault-mode classification, while at the system level D-S evidence theory is used for synthetic decision evaluation of the entire system's malfunctions, ensuring accurate and fast fault diagnosis and greatly shortening the corrective maintenance time.


2021 ◽  
Vol 2136 (1) ◽  
pp. 012036
Author(s):  
Chaoyu Wang ◽  
Zhi Liu ◽  
Yakun Wang

Intelligent fault diagnosis technology has become a focus of research in various fields. Its realization depends on sensors acquiring the state of the equipment. Because the fault information provided by a single sensor is limited and cannot fully reflect the fault state of the tested object, multiple sensors are needed to collect and fuse the fault information of rolling bearings to ensure the accuracy of intelligent fault diagnosis. On this basis, this paper analyzes the application of the fuzzy rules of multi-sensor information fusion technology to the fault diagnosis of bearings in an optoelectronic pod, so as to provide a reference for realizing intelligent fault diagnosis of each structure in the optoelectronic pod.


Author(s):  
Yu-Jin Zhang

Human perception of the outside world is the result of interaction between the brain and many organs. For example, the intelligent robots currently under investigation can have many sensors for vision, hearing, taste, smell, touch, pain, heat, force, slide, and proximity (Luo, 2002). All these sensors provide different profiles of the scene in the same environment. Theories and methods of multi-sensor fusion are required to coordinate the various sensors with suitable techniques and combine the information they obtain. Multi-sensor information fusion is a basic ability of human beings. A single sensor can only provide incomplete, inaccurate, vague, or uncertain information; sometimes, information obtained by different sensors can even be contradictory. Human beings have the ability to combine the information obtained by different organs and then make estimates and decisions about the environment and events. Using computers to perform multi-sensor information fusion can be considered a simulation of how the human brain treats complex problems. Multi-sensor information fusion operates on the data coming from various sensors and obtains results that are more comprehensive, accurate, and robust than those obtained from a single sensor. Fusion can be defined as the process of jointly treating data acquired from multiple sensors, as well as sorting, optimizing, and conforming these data, to increase the ability to extract information and improve decision capability. Fusion can extend the coverage of spatial and temporal information, reduce fuzziness, increase the reliability of decision-making, and improve the robustness of systems. Image fusion is a particular type of multi-sensor fusion that takes images as its operating objects.
In the more general sense of image engineering (Zhang, 2006), the combination of multi-resolution images can also be counted as a fusion process. In this article, however, the emphasis is on the information fusion of multi-sensor images.

