Research of Flood Prediction Based on Subjective/Objective Evidences Fusion Model

2012 ◽  
Vol 532-533 ◽  
pp. 1272-1276
Author(s):  
Wei Wu ◽  
Yan Ming Chen

This paper presents a model that combines a BP neural network with DS evidential reasoning, which not only achieves feature-level fusion of subjective and objective evidence across domains and layers, but also lets distinct models complement each other. Experiments show that the method improves classification precision by 7.9 percent and reduces the algorithm's time complexity. The model addresses problems such as the high algorithmic complexity and low classification accuracy that arise when flood prediction relies on a single model.
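
A minimal sketch of the decision stage such a fusion relies on: Dempster's rule of combination applied to two hypothetical basic probability assignments over a {flood, no_flood} frame, as might be produced from subjective and objective evidence (the mass values and names are illustrative assumptions, not the paper's figures).

```python
def dempster_combine(m1, m2, hypotheses):
    """Combine two basic probability assignments (BPAs) with Dempster's rule.

    m1, m2: dicts mapping frozenset focal elements -> mass (each sums to 1).
    hypotheses: all focal elements to keep in the combined result.
    """
    combined = {h: 0.0 for h in hypotheses}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] += ma * mb
            else:
                conflict += ma * mb           # mass assigned to conflicting evidence
    norm = 1.0 - conflict                     # normalize by (1 - K)
    return {h: v / norm for h, v in combined.items()}

# Hypothetical BPAs standing in for subjective / objective evidence sources
FLOOD, SAFE = frozenset({"flood"}), frozenset({"no_flood"})
THETA = FLOOD | SAFE                          # full frame (ignorance)
m_subjective = {FLOOD: 0.6, SAFE: 0.3, THETA: 0.1}
m_objective  = {FLOOD: 0.7, SAFE: 0.2, THETA: 0.1}

print(dempster_combine(m_subjective, m_objective, [FLOOD, SAFE, THETA]))
```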

2021 ◽  
Vol 5 (4) ◽  
pp. 229-250
Author(s):  
Chetana Kamlaskar ◽  
Aditya Abhyankar

Reliable and accurate multimodal biometric person verification demands an effective discriminant feature representation and fusion of the relevant information extracted across multiple biometric modalities. In this paper, we propose feature-level fusion based on canonical correlation analysis (CCA) to fuse iris and fingerprint feature sets of the same person. The uniqueness of this approach is that it extracts maximally correlated features from the feature sets of both modalities as effective discriminant information. CCA is therefore suitable for analyzing the underlying relationship between two feature spaces and generates more powerful feature vectors by removing redundant information. We demonstrate that efficient multimodal recognition can be achieved with a significant reduction in feature dimensions, lower computational complexity, and a recognition time of less than one second by exploiting CCA-based joint feature fusion and optimization. To evaluate the performance of the proposed system, the left and right irises and the thumb fingerprints of both hands from the SDUMLA-HMT multimodal dataset are considered. We show that our proposed approach significantly outperforms unimodal recognition in terms of equal error rate (EER), and that CCA-based feature fusion also outperforms match-score-level fusion. Further, the correlation between right iris and left fingerprint images (EER of 0.1050%) and between left iris and right fingerprint images (EER of 1.4286%) is explored to assess the effect of feature dominance and laterality of the selected modalities on a robust multimodal biometric system.
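
A minimal sketch of CCA-based feature-level fusion using scikit-learn, with randomly generated placeholder matrices standing in for the iris and fingerprint feature sets (dimensions, component count, and the concatenation/summation fusion strategies are illustrative assumptions, not the paper's exact pipeline).

```python
import numpy as np
from sklearn.cross_decomposition import CCA

# Placeholder feature matrices: rows are subjects, columns are iris /
# fingerprint descriptors (real features would come from SDUMLA-HMT).
rng = np.random.default_rng(0)
n_samples, d_iris, d_finger = 200, 64, 48
X_iris = rng.normal(size=(n_samples, d_iris))
X_finger = rng.normal(size=(n_samples, d_finger))

# Project both modalities onto maximally correlated canonical subspaces
cca = CCA(n_components=10)
Z_iris, Z_finger = cca.fit_transform(X_iris, X_finger)

# Two common feature-level fusion strategies over the canonical variates:
fused_concat = np.hstack([Z_iris, Z_finger])  # serial fusion (concatenation)
fused_sum = Z_iris + Z_finger                 # parallel fusion (summation)

print(fused_concat.shape, fused_sum.shape)    # (200, 20) (200, 10)
```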


Author(s):  
S. Daneshtalab ◽  
H. Rastiveis ◽  
B. Hosseiny

Abstract. Land-cover classification of remote sensing (RS) data in urban areas has always been a challenging task due to the complicated relations between different objects. Recently, the fusion of aerial imagery and light detection and ranging (LiDAR) data has attracted great attention in the RS community. Meanwhile, convolutional neural networks (CNNs) have proven their power in extracting high-level (deep) descriptors that improve RS data classification. In this paper, a CNN-based feature-level framework is proposed to integrate LiDAR data and aerial imagery for object classification in urban areas. In our method, after generating low-level descriptors and fusing them at the feature level by layer-stacking, the proposed framework employs a novel CNN to extract spectral-spatial features for the classification process, which is performed using a fully connected multilayer perceptron (MLP) network. The experimental results reveal that the proposed deep fusion model provides about a 10% improvement in overall accuracy (OA) compared with other conventional feature-level fusion techniques.
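
A minimal PyTorch sketch of feature-level fusion by layer-stacking under assumed inputs (patch size, band counts, network depth, and class count are illustrative, not the paper's architecture): LiDAR-derived rasters are concatenated with the aerial bands along the channel axis and passed to a small CNN with an MLP head.

```python
import torch
import torch.nn as nn

# Assumed co-registered inputs: a 3-band aerial patch and two LiDAR rasters
# (e.g. nDSM and intensity) on the same 64x64 grid.
aerial = torch.rand(1, 3, 64, 64)
lidar = torch.rand(1, 2, 64, 64)

# Feature-level fusion by layer-stacking: concatenate along the band axis
stacked = torch.cat([aerial, lidar], dim=1)   # shape (1, 5, 64, 64)

# Small CNN feature extractor followed by a fully connected MLP classifier
model = nn.Sequential(
    nn.Conv2d(5, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 128), nn.ReLU(),  # MLP head
    nn.Linear(128, 6),                        # e.g. 6 urban land-cover classes
)
logits = model(stacked)
print(logits.shape)                           # torch.Size([1, 6])
```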


2012 ◽  
Vol 263-266 ◽  
pp. 1947-1952
Author(s):  
Chun Ming Pei ◽  
Ling Li

An assessment method for the external insulation of contaminated insulators is proposed in this paper, based on a tri-level data fusion model that combines principal component analysis (PCA), an artificial neural network (ANN), and evidence theory. When partial discharge (PD) occurs on contaminated insulators, the effective information contained in the sound emitted during PD is synthesized to evaluate the external insulation strength of insulators in operation. First, nine characteristic parameters that rapidly reflect the PD process are selected for image-level fusion by PCA to reduce dimensionality, yielding two new parameters. These new parameters are then fed into the ANN for feature-level fusion. Finally, the feature-level fusion output is used as the input of decision-level fusion and combined by means of D-S evidence theory to further reduce the uncertainty of the assessment. Artificial contamination experiments were conducted to verify the proposed method. The results indicate that the proposed model is more precise than the ANN model alone under the same conditions.
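
A minimal Python sketch of the first two fusion stages under placeholder data (the acoustic-emission features, labels, and network size are illustrative assumptions, not the paper's): PCA compresses nine PD characteristic parameters into two fused parameters, which then feed an ANN; the ANN's class probabilities could in turn serve as masses for the D-S decision stage.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

# Placeholder dataset: 9 PD characteristic parameters per sample and a
# discrete insulation-strength label (values are synthetic stand-ins).
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 9))            # 9 characteristic parameters
y = rng.integers(0, 3, size=300)         # e.g. 3 insulation-strength levels

# Stage 1: PCA reduces the 9 parameters to 2 new fused parameters
pca = PCA(n_components=2)
X_fused = pca.fit_transform(X)

# Stage 2: feed the fused parameters to an ANN for feature-level fusion
ann = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=1)
ann.fit(X_fused, y)

# Stage 3 (not shown): the class probabilities below would act as basic
# probability assignments for D-S evidence combination at the decision level.
print(ann.predict_proba(X_fused[:1]))
```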


2021 ◽  
Vol 781 (3) ◽  
pp. 032022
Author(s):  
Li Guan ◽  
Yifei Tong ◽  
Jingwei Li ◽  
Shaofeng Wu ◽  
Dongbo Li

2014 ◽  
Vol 543-547 ◽  
pp. 1223-1226
Author(s):  
Jian Cao ◽  
Cong Yan

After the information fusion model has been established, a feature-level fusion algorithm based on a fuzzy neural network and an expert system is proposed, in which the expert system is embedded into the fuzzy neural network so that it can choose the membership functions and adjust the network structure. At the same time, two new code phase discriminator algorithms based on the DLL structure are proposed for the code tracking loop. Evidence theory is applied to achieve decision-level fusion. The performance of the two algorithms is studied theoretically and experimentally, using analog IF signal data and actual IF signal data respectively. The results of the feature-level fusion are then taken as evidence to construct the frame of discernment. The research results show that the information fusion process has adaptive and self-learning abilities.
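
The paper's two new code phase discriminators are not reproduced here; as a point of reference, below is a minimal Python sketch of the classical normalized early-minus-late power discriminator commonly used in a DLL code tracking loop (the correlator values are illustrative assumptions).

```python
def eml_power_discriminator(i_e, q_e, i_l, q_l):
    """Classical normalized early-minus-late power discriminator for a DLL.

    i_e, q_e: in-phase / quadrature early correlator outputs
    i_l, q_l: in-phase / quadrature late correlator outputs
    Returns an estimate proportional to the code phase error.
    """
    early = i_e ** 2 + q_e ** 2   # early correlator power
    late = i_l ** 2 + q_l ** 2    # late correlator power
    return 0.5 * (early - late) / (early + late)

# Hypothetical correlator outputs for a slightly early local code replica
print(eml_power_discriminator(1.2, 0.1, 0.8, 0.05))
```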


2010 ◽  
Vol 2 (1) ◽  
pp. 28-38 ◽  
Author(s):  
K. Kannan ◽  
S. Arumuga Perumal ◽  
K. Arulmozhi
