UNCERTAINTY HANDLING IN DISASTER MANAGEMENT USING HIERARCHICAL ROUGH SET GRANULATION

Author(s):  
H. Sheikhian ◽  
M. R. Delavar ◽  
A. Stein

Uncertainty is one of the main concerns in geospatial data analysis and affects different stages of decision making based on such data. In this paper, a new methodology for handling uncertainty in multi-criteria decision making problems is proposed. It integrates hierarchical rough granulation and rule extraction to build an accurate classifier. Rough granulation provides information granules with a detailed quality assessment. The granules are the basis for rule extraction in granular computing, which applies quality measures to the rules to obtain the best set of classification rules. The proposed methodology is applied to assess seismic physical vulnerability in Tehran. Six effective criteria, reflecting building age, height and material, topographic slope and the earthquake intensity of the North Tehran fault, were tested. The criteria were discretized and the data set was granulated using a hierarchical rough method, where the best-describing granules are determined according to the quality measures. The granules are fed into the granular computing algorithm, resulting in classification rules that provide the highest prediction quality. This detailed uncertainty management resulted in 84% prediction accuracy on a training data set. The method was then applied to the whole study area to obtain the seismic vulnerability map of Tehran. A sensitivity analysis showed that earthquake intensity is the most effective criterion in the seismic vulnerability assessment of Tehran.
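
A minimal sketch (Python, not the authors' implementation) of the kind of rough granulation and granule quality assessment described above: records with identical values on a chosen subset of discretized criteria form a granule, and each granule can be scored, for example, by its class purity. The attribute names, values and the purity measure are illustrative assumptions.

```python
from collections import defaultdict

def granulate(records, attributes):
    """Group records into granules: equivalence classes of records that
    share the same values on the selected (discretized) attributes."""
    granules = defaultdict(list)
    for rec in records:
        key = tuple(rec[a] for a in attributes)
        granules[key].append(rec)
    return granules

def granule_purity(granule, label="vulnerability"):
    """Illustrative quality measure: fraction of the granule belonging
    to its majority class."""
    counts = defaultdict(int)
    for rec in granule:
        counts[rec[label]] += 1
    return max(counts.values()) / len(granule)

# Toy discretized records (hypothetical attribute names and values).
records = [
    {"age": "old", "height": "high", "slope": "steep", "vulnerability": "high"},
    {"age": "old", "height": "high", "slope": "steep", "vulnerability": "high"},
    {"age": "new", "height": "low", "slope": "flat", "vulnerability": "low"},
]

# A hierarchy arises by granulating with coarser or finer attribute subsets.
for attrs in (["age"], ["age", "height"], ["age", "height", "slope"]):
    granules = granulate(records, attrs)
    scores = {k: granule_purity(g) for k, g in granules.items()}
    print(attrs, scores)
```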

2018 ◽  
Vol 24 (3) ◽  
pp. 367-382
Author(s):  
Nassau de Nogueira Nardez ◽  
Cláudia Pereira Krueger ◽  
Rosana Sueli da Motta Jafelice ◽  
Marcio Augusto Reolon Schmidt

Knowledge of the Phase Center Offset (PCO) is an important aspect of GNSS antenna calibration and has a direct influence on the quality of high-precision positioning. Studies show that there is a correlation between meteorological variables and the determination of the north (N), east (E) and vertical up (H) components of the PCO. This article presents results from the application of Fuzzy Rule-Based Systems (FRBS) to determine the position of these components. The Adaptive Neuro-Fuzzy Inference System (ANFIS) was used to generate the FRBS, with the PCO components as output variables. Environmental variables such as temperature, relative humidity and precipitation were used as input data, along with variables obtained from the antenna calibration process, such as the Positional Dilution of Precision and the multipath effect. An FRBS was constructed for each of the planimetric components N and E of the carriers L1 and L2, using a training data set processed by means of ANFIS. Once the FRBS were defined, a verification data set was applied and the components obtained by the FRBS were compared with those from the Antenna Calibration Base at the Federal University of Paraná. For the planimetric components, the difference was less than 1.00 mm, which shows the applicability of the method for the horizontal components.
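
As a rough illustration (not the authors' ANFIS setup), the sketch below evaluates a zero-order Takagi-Sugeno fuzzy rule base of the kind ANFIS produces: each rule has Gaussian membership functions over the inputs (e.g., temperature and humidity) and a constant consequent, and the output is the firing-strength-weighted average of the consequents. All membership parameters and rule consequents are hypothetical.

```python
import numpy as np

def gauss(x, mean, sigma):
    """Gaussian membership degree of x."""
    return np.exp(-0.5 * ((x - mean) / sigma) ** 2)

def ts_fis(inputs, rules):
    """Zero-order Takagi-Sugeno inference: weighted average of rule
    consequents, weighted by each rule's firing strength (product t-norm)."""
    strengths, consequents = [], []
    for rule in rules:
        w = 1.0
        for var, (mean, sigma) in rule["antecedent"].items():
            w *= gauss(inputs[var], mean, sigma)
        strengths.append(w)
        consequents.append(rule["consequent"])
    strengths = np.array(strengths)
    return float(np.dot(strengths, consequents) / strengths.sum())

# Hypothetical rule base mapping temperature/humidity to a PCO component (mm).
rules = [
    {"antecedent": {"temp": (15.0, 5.0), "humidity": (40.0, 15.0)}, "consequent": 0.8},
    {"antecedent": {"temp": (25.0, 5.0), "humidity": (70.0, 15.0)}, "consequent": 1.2},
]

print(ts_fis({"temp": 22.0, "humidity": 65.0}, rules))
```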


2018 ◽  
Vol 7 (11) ◽  
pp. 444 ◽  
Author(s):  
Mohsen Alizadeh ◽  
Mazlan Hashim ◽  
Esmaeil Alizadeh ◽  
Himan Shahabi ◽  
Mohammad Karami ◽  
...  

Earthquakes are among the most catastrophic natural geo-hazards worldwide and endanger numerous lives annually. It is therefore vital to evaluate seismic vulnerability beforehand to decrease future fatalities. The aim of this research is to assess the seismic vulnerability of residential houses in an urban region on the basis of a Multi-Criteria Decision Making (MCDM) model, combining the analytic hierarchy process (AHP) and a geographical information system (GIS). Tabriz city, located adjacent to the North Tabriz Fault (NTF) in north-west Iran, was selected as a case study. The NTF is one of the major seismogenic faults in the north-western part of Iran. First, parameters such as distance to fault, slope percentage and geology layers were used to develop a geotechnical map. In addition, the structural construction materials, building materials, size of building blocks, quality of buildings and number of building floors were used as key factors impacting the structural vulnerability of buildings in residential areas. Subsequently, the AHP technique was adopted to derive the priority ranking, criteria weights (layers) and alternative weights (classes) of every criterion through pairwise comparison at all levels. Lastly, the geotechnical and structural layers were superimposed to produce the seismic vulnerability map of buildings in the residential area of Tabriz city. The results showed that the south and south-east areas of Tabriz city exhibit low to moderate vulnerability, while some regions of the north-eastern area are under severe vulnerability conditions. In conclusion, the suggested approach offers a practical and effective Seismic Vulnerability Assessment (SVA) and provides valuable information that could assist urban planners during the mitigation and preparatory phases in less examined areas in many other regions around the world.
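
A brief sketch (Python, illustrative only) of the AHP step described above: a pairwise comparison matrix on Saaty's 1-9 scale is converted into criterion weights via the principal eigenvector, and the consistency ratio checks whether the judgments are acceptably coherent. The example criteria and judgment values are assumptions, not those of the study.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a pairwise comparison matrix
    (principal right eigenvector, normalized to sum to 1)."""
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    return w / w.sum(), vals[k].real

def consistency_ratio(pairwise, lambda_max):
    """Saaty's consistency ratio; CR < 0.1 is conventionally acceptable."""
    n = pairwise.shape[0]
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]  # random indices
    ci = (lambda_max - n) / (n - 1)
    return ci / ri

# Hypothetical judgments for three criteria:
# distance to fault, slope, building quality.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

weights, lmax = ahp_weights(A)
print("weights:", weights.round(3), "CR:", round(consistency_ratio(A, lmax), 3))
```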


2012 ◽  
Vol 17 (Suppl1) ◽  
pp. 94-101 ◽  
Author(s):  
James Guest ◽  
James S. Harrop ◽  
Bizhan Aarabi ◽  
Robert G. Grossman ◽  
James W. Fawcett ◽  
...  

The North American Clinical Trials Network (NACTN) includes 9 clinical centers funded by the US Department of Defense and the Christopher Reeve Paralysis Foundation. Its purpose is to accelerate clinical testing of promising therapeutics in spinal cord injury (SCI) through the development of a robust interactive infrastructure. This structure includes key committees that provide longitudinal guidance to the Network. These committees include the Executive, Data Management, and Neurological Outcome Assessments Committees, and the Therapeutic Selection Committee (TSC), which is the subject of this manuscript. The NACTN brings unique elements to the SCI field. The Network's stability is not restricted to a single clinical trial. Network members have diverse expertise and include experts in clinical care, clinical trial design and methodology, pharmacology, preclinical and clinical research, and advanced rehabilitation techniques. Frequent, systematic communication is assigned a high value, as are democratic process, fairness and efficiency of decision making, and resource allocation. This article focuses on how decision making occurs within the TSC to rank alternative therapeutics according to 2 main variables: quality of the preclinical data set, and fit with the Network's aims and capabilities. This selection process is important because if the Network's resources are committed to a therapeutic, alternatives cannot be pursued. A proposed methodology includes a multicriteria decision analysis that uses a Multi-Attribute Global Inference of Quality matrix to quantify the process. To rank therapeutics, the TSC uses a series of consensus steps designed to reduce individual and group bias and limit subjectivity. Given the difficulties encountered by industry in completing clinical trials in SCI, stable collaborative not-for-profit consortia, such as the NACTN, may be essential to clinical progress in SCI. The evolution of the NACTN also offers substantial opportunity to refine decision making and group dynamics. Making the best possible decisions concerning therapeutics selection for trial testing is a cornerstone of the Network's function.
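
One common way to quantify such a multicriteria ranking, broadly in the spirit of a MAGIQ-style matrix (the exact procedure used by the TSC is not detailed here), is to convert criterion and alternative rankings into rank-order-centroid weights and combine them into an overall score per therapeutic. The criterion and candidate names in this Python sketch are hypothetical.

```python
def roc_weights(n):
    """Rank-order-centroid weights for n ranked items (rank 1 = most important)."""
    return [sum(1.0 / j for j in range(i, n + 1)) / n for i in range(1, n + 1)]

def overall_scores(criteria_ranks, alternative_ranks):
    """Combine criterion weights with per-criterion alternative weights.

    criteria_ranks: criterion names ordered from most to least important.
    alternative_ranks: dict criterion -> list of alternatives, best first.
    Returns dict alternative -> overall score.
    """
    c_w = dict(zip(criteria_ranks, roc_weights(len(criteria_ranks))))
    scores = {}
    for crit, ranking in alternative_ranks.items():
        for alt, w in zip(ranking, roc_weights(len(ranking))):
            scores[alt] = scores.get(alt, 0.0) + c_w[crit] * w
    return scores

# Hypothetical example: two criteria, three candidate therapeutics.
criteria = ["preclinical evidence", "fit with Network capabilities"]
rankings = {
    "preclinical evidence": ["drug A", "drug B", "drug C"],
    "fit with Network capabilities": ["drug B", "drug A", "drug C"],
}
print(sorted(overall_scores(criteria, rankings).items(), key=lambda kv: -kv[1]))
```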


1999 ◽  
Vol 121 (4) ◽  
pp. 727-732 ◽  
Author(s):  
Y. Chen ◽  
E. Orady

Sensor fusion aims to identify useful information that facilitates decision making using data from multiple sensors. Signals from each sensor are usually processed, through feature extraction, into different indices by which knowledge can be better represented. However, caution should be exercised in decision making when multiple indices are used, since each index may carry different information or reflect different aspects of the knowledge about the process/system under study. To this end, a practical scheme for index evaluation based on entropy and information gain is presented. This procedure is useful when index ranking is needed in designing a classifier for a complex system or process. Both regional entropy and class entropy are introduced on the basis of a set of training data. Application of this scheme is illustrated using a data set from a tapping process.
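
A compact sketch (Python; the paper's regional-entropy formulation is not reproduced here) of ranking candidate indices by information gain: the class entropy of the training labels minus the weighted entropy after partitioning the data on each index, with higher gain indicating a more informative index. The index names, labels and threshold split are illustrative assumptions.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(values, labels, threshold):
    """Gain from splitting the samples on a single index at `threshold`."""
    left = [l for v, l in zip(values, labels) if v <= threshold]
    right = [l for v, l in zip(values, labels) if v > threshold]
    n = len(labels)
    remainder = sum(len(part) / n * entropy(part) for part in (left, right) if part)
    return entropy(labels) - remainder

# Toy data: two candidate indices derived from sensor features,
# labels indicate tool condition (hypothetical).
labels = ["ok", "ok", "worn", "worn", "ok", "worn"]
index_a = [0.2, 0.3, 0.8, 0.9, 0.25, 0.85]   # separates the classes well
index_b = [0.5, 0.7, 0.6, 0.4, 0.65, 0.55]   # mostly noise

for name, vals in (("index_a", index_a), ("index_b", index_b)):
    print(name, round(information_gain(vals, labels, threshold=0.5), 3))
```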


2011 ◽  
Vol 2 (1) ◽  
pp. 49-58
Author(s):  
Periasamy Vivekanandan ◽  
Raju Nedunchezhian

The genetic algorithm (GA) is a search technique based on the process of natural evolution. It is widely used by the data mining community for classification rule discovery in complex domains. During the learning process, it makes several passes over the data set to determine the accuracy of the potential rules, which makes it an extremely I/O-intensive and slow process. It is particularly difficult to apply a GA when the training data set becomes too large or is not fully available. This paper proposes an incremental genetic algorithm based on boosting, which constructs a weak ensemble of classifiers in a fast, incremental manner and thereby reduces the learning cost considerably.
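
The paper's algorithm is not reproduced here; the Python sketch below only illustrates the general idea under stated assumptions: data arrive in chunks, a small GA evolves a weak threshold rule per chunk, and AdaBoost-style weights combine the weak classifiers into an ensemble. All data, GA settings and the stump representation are hypothetical.

```python
import math
import random

def evolve_stump(X, y, w, generations=20, pop_size=12):
    """Tiny GA: evolve a (feature, threshold, polarity) stump that
    minimizes weighted classification error on one data chunk."""
    n_features = len(X[0])

    def random_stump():
        f = random.randrange(n_features)
        t = random.choice([row[f] for row in X])
        return (f, t, random.choice([1, -1]))

    def error(stump):
        f, t, s = stump
        return sum(wi for row, yi, wi in zip(X, y, w)
                   if (s if row[f] > t else -s) != yi)

    pop = [random_stump() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=error)
        pop = pop[: pop_size // 2]                       # keep the fittest half
        pop += [(p[0], p[1], -p[2]) if random.random() < 0.3 else random_stump()
                for p in pop]                            # mutate or refill
    best = min(pop, key=error)
    return best, error(best)

def boost_chunk(X, y):
    """Build one boosted weak classifier per chunk (AdaBoost-style weight)."""
    w = [1.0 / len(y)] * len(y)
    stump, err = evolve_stump(X, y, w)
    err = max(min(err, 0.499), 1e-9)
    alpha = 0.5 * math.log((1 - err) / err)
    return stump, alpha

def predict(ensemble, row):
    total = sum(a * (s if row[f] > t else -s) for (f, t, s), a in ensemble)
    return 1 if total >= 0 else -1

# Toy stream of two chunks with labels in {-1, +1} (hypothetical data).
chunks = [
    ([[0.1, 1.0], [0.9, 1.2], [0.2, 0.8], [0.8, 0.7]], [-1, 1, -1, 1]),
    ([[0.3, 0.5], [0.7, 0.6], [0.25, 0.9], [0.85, 1.1]], [-1, 1, -1, 1]),
]
ensemble = [boost_chunk(X, y) for X, y in chunks]
print([predict(ensemble, r) for r in [[0.15, 0.9], [0.9, 1.0]]])
```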


Author(s):  
H. Sheikhian ◽  
M. R. Delavar ◽  
A. Stein

Tehran, the capital of Iran, is surrounded by the North Tehran fault, the Mosha fault and the Rey fault. This exposes the city to potentially huge earthquakes followed by dramatic human losses and physical damage, in particular because it contains a large number of non-standard constructions and aged buildings. Estimation of the likely consequences of an earthquake facilitates mitigation of these losses. Mitigation of earthquake fatalities may be achieved by promoting awareness of earthquake vulnerability and implementing seismic vulnerability reduction measures. In this research, granular computing is applied, using generality and absolute support for rule extraction and coverage and entropy for rule prioritization. The extracted rules are combined to form a granule tree that shows their order and relations. In this way the seismic physical vulnerability is assessed, integrating the effects of the three major known faults. The effective parameters considered in the physical seismic vulnerability assessment are slope, seismic intensity, and the height and age of the buildings. Experts were asked to predict seismic vulnerability for 100 randomly selected samples among more than 3000 statistical units in Tehran. The integrated experts' points of view serve as input to the granular computing. Non-redundant covering rules preserve consistency in the model, which resulted in 84% accuracy in the seismic vulnerability assessment, based on validation of the predictions for the test data against the expected vulnerability degrees. The study concluded that granular computing is a useful method to assess the effects of earthquakes in an earthquake-prone area.
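
For orientation, a small Python sketch of the rule quality measures named above, as they are commonly defined in granular computing (not necessarily the exact formulation used in this study): for a rule "antecedent implies class", generality is the fraction of the universe matching the antecedent, absolute support is the fraction of antecedent-matching cases that also satisfy the consequent, and coverage is the fraction of consequent cases captured by the rule; the conditional entropy of the class within the antecedent granule can serve for prioritization. The column names and toy records are assumptions.

```python
import math
from collections import Counter

def rule_measures(records, antecedent, consequent, label="vulnerability"):
    """Quality measures for the rule `antecedent => label == consequent`."""
    phi = [r for r in records if all(r[k] == v for k, v in antecedent.items())]
    psi = [r for r in records if r[label] == consequent]
    both = [r for r in phi if r[label] == consequent]

    generality = len(phi) / len(records)
    abs_support = len(both) / len(phi) if phi else 0.0        # a.k.a. confidence
    coverage = len(both) / len(psi) if psi else 0.0

    # Conditional entropy of the class inside the antecedent granule.
    counts = Counter(r[label] for r in phi)
    n = sum(counts.values())
    cond_entropy = -sum(c / n * math.log2(c / n) for c in counts.values()) if n else 0.0
    return generality, abs_support, coverage, cond_entropy

records = [
    {"age": "old", "intensity": "high", "vulnerability": "high"},
    {"age": "old", "intensity": "high", "vulnerability": "high"},
    {"age": "old", "intensity": "low",  "vulnerability": "moderate"},
    {"age": "new", "intensity": "low",  "vulnerability": "low"},
]

print(rule_measures(records, {"age": "old", "intensity": "high"}, "high"))
```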


Author(s):  
Alina Köchling ◽  
Shirin Riazy ◽  
Marius Claus Wehner ◽  
Katharina Simbeck

The study aims to identify whether algorithmic decision making leads to unfair (i.e., unequal) treatment of certain protected groups in the recruitment context. Firms increasingly implement algorithmic decision making to save costs and increase efficiency. Moreover, algorithmic decision making is often considered fairer than human decisions, which are subject to social prejudices. Recent publications, however, imply that the fairness of algorithmic decision making is not necessarily given. To investigate this further, highly accurate algorithms were used to analyze a pre-existing data set of 10,000 video clips of individuals in self-presentation settings. The analysis shows that under-representation with respect to gender and ethnicity in the training data set leads to an unpredictable overestimation and/or underestimation of the likelihood of inviting representatives of these groups to a job interview. Furthermore, the algorithms replicate the existing inequalities in the data set. Firms have to be careful when implementing algorithmic video analysis during recruitment, as biases occur if the underlying training data set is unbalanced.
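
As a simple illustration of the kind of fairness check implied above (not the study's own analysis), the Python sketch below compares per-group selection rates for a hypothetical "invite to interview" decision and reports the disparate-impact ratio, i.e. the smallest group rate divided by the largest; values well below 1 signal unequal treatment. Group labels and decisions are made up.

```python
from collections import defaultdict

def selection_rates(groups, decisions):
    """Fraction of positive decisions (1 = invited) per protected group."""
    pos, tot = defaultdict(int), defaultdict(int)
    for g, d in zip(groups, decisions):
        tot[g] += 1
        pos[g] += d
    return {g: pos[g] / tot[g] for g in tot}

def disparate_impact(rates):
    """Ratio of the lowest to the highest group selection rate."""
    return min(rates.values()) / max(rates.values())

# Hypothetical algorithmic decisions for applicants from two groups.
groups = ["A", "A", "A", "B", "B", "B", "B", "A"]
decisions = [1, 1, 0, 0, 0, 1, 0, 1]

rates = selection_rates(groups, decisions)
print(rates, "disparate impact:", round(disparate_impact(rates), 2))
```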


2014 ◽  
Vol 11 (2) ◽  
Author(s):  
Pavol Král’ ◽  
Lukáš Sobíšek ◽  
Mária Stachová

Data quality can be seen as a very important factor for the validity of information extracted from data sets using statistical or data mining procedures. In this paper we propose a description of data quality that allows us to characterize the quality of the whole data set as well as of particular variables and individual cases. On the basis of the proposed description, we define a distance-based measure of data quality for individual cases as the distance of each case from the ideal one. Such a measure can be used as additional information for the preparation of a training data set, for fitting models, for decision making based on the results of analyses, etc. It can be utilized in different ways, ranging from a simple weighting function to belief functions.
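
A minimal Python sketch of the idea (the paper's exact distance and quality indicators are not specified here): each case is described by per-variable quality indicators scaled to [0, 1], the ideal case scores 1 on every indicator, and a case's quality is derived from its Euclidean distance to that ideal, e.g. for use as a weight when fitting models. The indicator names are assumptions.

```python
import math

def case_quality(indicators):
    """Quality of one case from per-variable quality indicators in [0, 1]:
    1 minus its normalized Euclidean distance from the ideal case (all ones)."""
    ideal = [1.0] * len(indicators)
    dist = math.sqrt(sum((i - q) ** 2 for i, q in zip(ideal, indicators)))
    return 1.0 - dist / math.sqrt(len(indicators))   # 1 = ideal, 0 = worst

# Hypothetical indicators per case: completeness, plausibility, timeliness.
cases = [
    [1.0, 1.0, 1.0],   # ideal case
    [0.8, 0.9, 1.0],
    [0.2, 0.5, 0.4],
]
weights = [case_quality(c) for c in cases]
print([round(w, 3) for w in weights])   # could serve as weights in model fitting
```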

