Generic and Concurrent Computation of Belief Combination Rules
Author(s): Frédéric Dambreville

1983 ◽ Author(s): R. M. King ◽ E. W. Mayr ◽ C. Green

2021 ◽ Vol 10 (6) ◽ pp. 377
Author(s): Chiao-Ling Kuo ◽ Ming-Hua Tsai

The importance of road characteristics has been highlighted, as they are fundamental structures supporting many transportation-related services. However, there is still considerable room for improvement in both the types of road characteristics detected and detection performance. Taking advantage of geographically tiled maps, with their high update rates, remarkable accessibility, and increasing availability, this paper proposes a simple, novel deep-learning-based approach: joint convolutional neural networks (CNNs) adopting adaptive squares with combination rules to detect road characteristics from roadmap tiles. The joint CNNs are responsible for foreground/background image classification and for classifying the type of road characteristic in each foreground image, raising detection accuracy. The adaptive squares with combination rules help focus efficiently on road characteristics, improving detection and providing optimal detection results. Five types of road characteristics are considered: crossroads, T-junctions, Y-junctions, corners, and curves. Experimental results demonstrate successful outcomes with outstanding real-world performance. The location and type of each detected road characteristic are thus converted from human-readable to machine-readable form, and the results will benefit many applications, such as feature-point reminders, road-condition reports, or alert detection for users, drivers, and even autonomous vehicles. We believe this approach will also open a new path for object detection and geospatial information extraction from valuable map tiles.
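The abstract describes a two-stage pipeline: one network separates foreground tiles from background, and a second assigns a road-characteristic type to each foreground tile. A minimal sketch of that control flow, with hypothetical `fg_clf` and `type_clf` stand-ins for the trained CNNs (the actual network architectures are not given in the abstract):

```python
def detect_road_characteristics(tiles, fg_clf, type_clf):
    """Two-stage detection sketch.

    Stage 1: fg_clf(tile) -> bool keeps only foreground tiles
             (tiles that contain some road characteristic).
    Stage 2: type_clf(tile) -> str labels each kept tile with one of
             the five types (crossroad, T-junction, Y-junction,
             corner, curve).
    Both classifiers are hypothetical placeholders for trained CNNs.
    """
    results = []
    for tile in tiles:
        if fg_clf(tile):                       # foreground/background CNN
            results.append((tile, type_clf(tile)))  # type-classification CNN
    return results


# Usage with dummy classifiers standing in for the CNNs:
tiles = ["t1", "t2", "t3"]
is_foreground = lambda t: t != "t2"            # pretend t2 is background
classify_type = lambda t: "crossroad"          # pretend all are crossroads
print(detect_road_characteristics(tiles, is_foreground, classify_type))
```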


2021 ◽ Vol 11 (3) ◽ pp. 906
Author(s): Payam Tehrani ◽ Denis Mitchell

The seismic responses of continuous multi-span reinforced concrete (RC) bridges were predicted using inelastic time history analyses (ITHA) and incremental dynamic analysis (IDA). Some important issues in ITHA were studied in this research, including: the effects of using artificial and natural records on predictions of the mean seismic demands, effects of displacement directions on predictions of the mean seismic response, the use of 2D analysis with combination rules for prediction of the response obtained using 3D analysis, and prediction of the maximum radial displacement demands compared to the displacements obtained along the principal axes of the bridges. In addition, IDA was conducted and predictions were obtained at different damage states. These issues were investigated for the case of regular and irregular bridges using three different sets of natural and artificial records. The results indicated that the use of natural and artificial records typically resulted in similar predictions for the cases studied. The effect of displacement direction was important in predicting the mean seismic response. It was shown that 2D analyses with the combination rules resulted in good predictions of the radial displacement demands obtained from 3D analyses. The use of artificial records in IDA resulted in good prediction of the median collapse capacity.
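The abstract does not name the specific combination rules used to recover 3D radial demands from 2D analyses; the SRSS (square root of the sum of squares) rule and the 100%-30% percentage rule are the standard choices in seismic design codes, so a hedged sketch of those two is shown here:

```python
import math

def srss(dx, dy):
    """SRSS rule: combine orthogonal displacement demands as
    sqrt(dx^2 + dy^2)."""
    return math.hypot(dx, dy)

def rule_100_30(dx, dy):
    """100%-30% percentage rule: take 100% of the demand in one
    direction plus 30% of the demand in the orthogonal direction,
    and keep the larger of the two orderings."""
    return max(abs(dx) + 0.3 * abs(dy), 0.3 * abs(dx) + abs(dy))


# Usage: orthogonal demands of 3 and 4 (arbitrary units)
print(srss(3.0, 4.0))         # 5.0
print(rule_100_30(3.0, 4.0))  # max(3 + 1.2, 0.9 + 4) = 4.9
```

Which rule better matches the radial demand from a full 3D analysis is exactly the kind of question the study investigates; the code above only illustrates the arithmetic of the rules themselves.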


Entropy ◽ 2021 ◽ Vol 23 (7) ◽ pp. 820
Author(s): Jingyu Liu ◽ Yongchuan Tang

The multi-agent information fusion (MAIF) system can alleviate the limitations of a single expert system in dealing with complex situations, as it allows multiple agents to cooperate in order to solve problems in complex environments. Dempster–Shafer (D-S) evidence theory has important applications in multi-source data fusion, pattern recognition, and other fields. However, the traditional Dempster combination rule may produce counterintuitive results when dealing with highly conflicting data. A conflict data fusion method for multi-agent systems based on the base basic probability assignment (bBPA) and evidence distance is proposed in this paper. Firstly, the new bBPA and the reconstructed BPA are used to construct the initial belief degree of each agent. Then, the information volume of each evidence group is obtained by calculating the evidence distance, so as to modify the reliability and obtain more reasonable evidence. Lastly, the final evidence is fused with the Dempster combination rule to obtain the result. Numerical examples show the effectiveness and applicability of the proposed method, which improves the accuracy of the identification process of the MAIF system.
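For context, a minimal sketch of the classical Dempster combination rule that the paper's final fusion step relies on (this is the standard rule only, not the paper's bBPA/evidence-distance modification). Mass functions are represented as dicts mapping a `frozenset` of hypotheses to a mass; the example reproduces Zadeh's well-known highly conflicting case, where the rule's counterintuitive behavior appears:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions with Dempster's rule.

    m1, m2: dicts mapping frozenset (focal element) -> mass in [0, 1].
    Products of masses whose focal elements intersect are accumulated;
    products with empty intersection form the conflict K, which is
    normalized out by dividing by (1 - K).
    """
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    if conflict >= 1.0:
        raise ValueError("Total conflict: Dempster's rule is undefined")
    norm = 1.0 - conflict
    return {s: v / norm for s, v in combined.items()}


# Zadeh's highly conflicting example: two experts almost fully
# disagree (A vs. C), yet the rule assigns all mass to B, which
# both considered nearly impossible.
A, B, C = frozenset("A"), frozenset("B"), frozenset("C")
m1 = {A: 0.99, B: 0.01}
m2 = {C: 0.99, B: 0.01}
print(dempster_combine(m1, m2))  # {frozenset({'B'}): 1.0}
```

This is precisely the pathology that conflict-handling methods such as the one in the paper aim to repair before the final Dempster fusion step.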


2008 ◽ Vol 275 (1649) ◽ pp. 2299-2308
Author(s): M. To ◽ P. G. Lovell ◽ T. Troscianko ◽ D. J. Tolhurst

Natural visual scenes are rich in information, and any neural system analysing them must piece together the many messages from large arrays of diverse feature detectors. It is known how threshold detection of compound visual stimuli (sinusoidal gratings) is determined by their components' thresholds. We investigate whether similar combination rules apply to the perception of the complex and suprathreshold visual elements in naturalistic visual images. Observers gave magnitude estimations (ratings) of the perceived differences between pairs of images made from photographs of natural scenes. Images in some pairs differed along one stimulus dimension, such as object colour, location, size or blur. For other image pairs, however, there were composite differences along two dimensions (e.g. both colour and object location might change). We examined whether the ratings for such composite pairs could be predicted from the two ratings for the respective pairs in which only one stimulus dimension had changed. We found a pooling relationship similar to that proposed for simple stimuli: Minkowski summation with exponent 2.84 yielded the best predictive power (r = 0.96), an exponent similar to that generally reported for compound grating detection. This suggests that theories based on detecting simple stimuli can encompass visual processing of complex, suprathreshold stimuli.
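The pooling relationship reported here can be stated compactly: if $r_1$ and $r_2$ are the ratings for the two single-dimension changes, the composite rating is predicted as $(r_1^p + r_2^p)^{1/p}$ with $p = 2.84$. A short sketch of that formula (the specific rating values below are illustrative, not data from the study):

```python
def minkowski_pool(ratings, p=2.84):
    """Minkowski summation: predict a composite difference rating
    from single-dimension ratings as (sum r_i**p)**(1/p).
    p = 2.84 is the best-fitting exponent reported in the abstract."""
    return sum(r ** p for r in ratings) ** (1.0 / p)


# Illustrative only: a colour-change rating of 5.0 and a
# location-change rating of 3.0 pool to a composite prediction
# slightly above the larger single rating, as p > 1 implies.
composite = minkowski_pool([5.0, 3.0])
print(composite)
```

Note that with $p = 2$ the rule reduces to Euclidean (energy) summation, and as $p \to \infty$ it approaches winner-take-all (the larger single rating alone); $p \approx 2.84$ sits between these two regimes.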

