quantification model
Recently Published Documents

TOTAL DOCUMENTS: 98 (five years: 35)
H-INDEX: 12 (five years: 2)

Author(s): Jakub Bijak, Jason Hilton

Abstract: Better understanding of the behaviour of agent-based models, aimed at embedding them in the broader, model-based line of scientific enquiry, requires a comprehensive framework for analysing their results. Seeing models as tools for experimenting in silico, this chapter discusses the basic tenets and techniques of uncertainty quantification and experimental design, both of which can help shed light on the workings of complex systems embedded in computational models. In particular, we look at: relationships between model inputs and outputs, various types of experimental design, methods of analysis of simulation results, assessment of model uncertainty and sensitivity, which helps identify the parts of the model that matter in the experiments, as well as statistical tools for calibrating models to the available data. We focus on the role of emulators, or meta-models – high-level statistical models approximating the behaviour of the agent-based models under study – and in particular, on Gaussian processes (GPs). The theoretical discussion is illustrated by applications to the Routes and Rumours model of migrant route formation introduced before.
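The chapter's central device, a Gaussian-process emulator approximating an expensive simulator, can be sketched in a few lines. The sketch below is a minimal illustration, not the chapter's implementation: it restricts the design to exactly two training runs so the kernel matrix can be inverted in closed form without a linear-algebra library, and the function names and hyperparameter values are illustrative assumptions.

```python
import math

def rbf(x1, x2, length_scale=1.0):
    """Squared-exponential kernel, a standard choice for GP emulators."""
    return math.exp(-0.5 * ((x1 - x2) / length_scale) ** 2)

def gp_posterior_mean(x_train, y_train, x_new, length_scale=1.0, noise=1e-6):
    """Posterior mean of a zero-mean GP at x_new, given two simulator runs.

    Restricted to two design points so the 2x2 kernel matrix can be
    inverted by hand; real emulators use many runs and a matrix solver.
    """
    (xa, xb), (ya, yb) = x_train, y_train
    kaa = rbf(xa, xa, length_scale) + noise   # noise adds a small jitter
    kbb = rbf(xb, xb, length_scale) + noise
    kab = rbf(xa, xb, length_scale)
    det = kaa * kbb - kab * kab
    # alpha = K^{-1} y via the closed-form 2x2 inverse
    alpha_a = (kbb * ya - kab * yb) / det
    alpha_b = (kaa * yb - kab * ya) / det
    # posterior mean is k(x_new, X) . alpha
    return (rbf(x_new, xa, length_scale) * alpha_a
            + rbf(x_new, xb, length_scale) * alpha_b)
```

Evaluated at a design point, the posterior mean reproduces the observed simulator output (up to the noise jitter), which is the interpolating behaviour that makes GPs attractive as emulators.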


2021, Vol. 15
Author(s): Xinjun Suo, Lining Guo, Dianxun Fu, Hao Ding, Yihong Li, ...

Comparative studies evaluating the quantification accuracy of pyramidal tracts (PT) and PT branches tracked with the four mainstream diffusion models are currently lacking. The present study evaluates these four models using the high-quality Human Connectome Project (HCP) dataset. Diffusion tensor imaging (DTI), diffusion spectrum imaging (DSI), generalized Q-space sampling imaging (GQI), and Q-ball imaging (QBI) were used to construct the PT and PT branches in 50 healthy volunteers from the HCP. False and true PT fibers were identified based on anatomic information. One-way repeated-measures analysis of variance and post hoc paired-sample t-tests were performed to identify the best PT and PT branch quantification model. The number, percentage, and density of true PT fibers obtained with GQI and QBI were significantly larger than those obtained with DTI and DSI (all p < 0.0005, Bonferroni corrected), whereas false fibers showed the opposite pattern (all p < 0.0005, Bonferroni corrected). More fibers were present in the trunk branch (PTtrunk) than in the upper-limb (PTUlimb), lower-limb (PTLlimb), and cranial (PTcranial) branches across the four diffusion models. In addition, significantly more true fibers were obtained in PTtrunk, PTUlimb, and PTLlimb with GQI and QBI than with DTI and DSI (all p < 0.0005, Bonferroni corrected). Finally, GQI-based group probabilistic maps showed that the four PT branches exhibited relatively distinct spatial distributions. GQI and QBI therefore represent the better diffusion models for the PT and PT branches. The group probabilistic maps of the PT branches have been shared publicly to facilitate more precise studies of plasticity of, and damage to, the motor pathway.
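The Bonferroni correction applied throughout these comparisons simply divides the significance level by the number of tests, so each comparison must clear a stricter threshold. A minimal sketch (the function name and p-values are illustrative, not the study's data):

```python
def bonferroni_reject(p_values, alpha=0.05):
    """Return, for each of m comparisons, whether it remains significant
    after Bonferroni correction: reject only if p < alpha / m."""
    m = len(p_values)
    threshold = alpha / m
    return [p < threshold for p in p_values]
```

With three tests at alpha = 0.05, each p-value must fall below 0.05 / 3 ≈ 0.0167 to survive the correction.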


Mathematics, 2021, Vol. 9 (23), pp. 3010
Author(s): Lorenzo A. Ricciardi, Christie Alisa Maddock, Massimiliano Vasile

This paper presents a novel method for multi-objective optimisation under uncertainty developed to study a range of mission trade-offs, and the impact of uncertainties on the evaluation of launch system mission designs. A memetic multi-objective optimisation algorithm, named MODHOC, which combines the Direct Finite Elements in Time transcription method with Multi Agent Collaborative Search, is extended to account for model uncertainties. An Unscented Transformation is used to capture the first two statistical moments of the quantities of interest. A quantification model of the uncertainty was developed for the atmospheric model parameters. An optimisation under uncertainty was run for the design of descent trajectories for a spaceplane-based two-stage launch system.
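The Unscented Transformation mentioned above propagates a small set of deterministically chosen sigma points through the nonlinear model and recombines them with fixed weights to recover the first two statistical moments. A minimal one-dimensional sketch (the scaling parameter `lam` and function names are illustrative assumptions; the paper's setting is multivariate):

```python
import math

def unscented_transform_1d(mean, var, f, lam=2.0):
    """Propagate a scalar Gaussian (mean, var) through a nonlinearity f
    using the unscented transform; returns (mean_y, var_y)."""
    n = 1  # state dimension
    spread = math.sqrt((n + lam) * var)
    sigma = [mean, mean + spread, mean - spread]   # 2n+1 sigma points
    w0 = lam / (n + lam)
    wi = 1.0 / (2.0 * (n + lam))
    weights = [w0, wi, wi]
    ys = [f(x) for x in sigma]
    mean_y = sum(w * y for w, y in zip(weights, ys))
    var_y = sum(w * (y - mean_y) ** 2 for w, y in zip(weights, ys))
    return mean_y, var_y
```

For a linear map the transform is exact: pushing a zero-mean, unit-variance Gaussian through f(x) = 2x + 1 returns mean 1 and variance 4.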


Author(s): Ricardo Ochoa, Diego Ticse, Emilio Herrera, Jose Vargas

Recycling, 2021, Vol. 6 (3), pp. 62
Author(s): Rocío Quiñones, Carmen Llatas, Maria Victoria Montes, Isidro Cortés

Construction waste (CW) is a prime contributor to the stream of total waste worldwide. One of the biggest challenges of the construction industry is to minimise CW and to develop practices of a more sustainable nature for its management and recycling in order to promote its transition towards a more effective circular economy. The implementation of these practices contributes towards mitigating the scarcity of natural resources and the environmental impact of CW. Thus, a preceding and essential step is the estimation of CW during building design, which will allow the adoption of measures for its early reduction and optimisation. For this purpose, Building Information Modelling (BIM) has become a useful methodology to predict waste during the early stages of design. There remains, however, a lack of instrumental development. Therefore, this study proposes a BIM-based method to estimate CW during building design by integrating a consolidated construction waste quantification model in three different BIM platforms. For its validation, the method is applied to the structural system of a Spanish residential building. The results provide evidence that the proposed method is vendor-neutral and enables the automatic identification and quantification of the waste generated by each building element during the design stage in multiple BIM platforms.
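Whatever the specific coefficients, the core of a BIM-based waste estimate is a pass over the model's element quantities, applying a waste factor per element and aggregating per material. A minimal sketch with hypothetical element fields and placeholder waste factors (the study's consolidated quantification model defines its own):

```python
def estimate_waste(elements):
    """Aggregate construction-waste estimates per material.

    Each element dict mimics quantities a BIM model would expose:
    material, net quantity (kg), and a waste factor (the fraction of
    the net quantity expected to become waste). Values are placeholders.
    """
    totals = {}
    for e in elements:
        waste = e["quantity_kg"] * e["waste_factor"]
        totals[e["material"]] = totals.get(e["material"], 0.0) + waste
    return totals
```

Running this over the elements of a structural system yields per-material waste totals at design time, before construction begins.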


Author(s): Omer Perry, Eli Jaffe, Yuval Bitan

Objective: To develop a new model that quantifies information management dynamically and identifies factors that lead to information gaps. Background: Information management is a core task for emergency medical service (EMS) team leaders during the prehospital phase of a mass-casualty incident (MCI). Lessons learned from past MCIs indicate that poor information management can lead to increased mortality. Various instruments are used to evaluate information management during MCI training simulations, but the challenge of measuring and improving team leaders’ abilities to manage information remains. Method: The Dynamic Communication Quantification (DCQ) model was developed based on the knowledge representation typology. Using multiple point-of-view synchronized video, the model quantifies and visualizes information management. It was applied to six MCI simulations between 2014 and 2019 to identify factors that led to information gaps, and compared with other evaluation methods. Results: Of the three methods applied, only the DCQ model revealed two factors that led to information gaps: first, consolidation of numerous casualties from different areas, and second, tracking of casualty arrivals to the medical treatment area and departures from the MCI site. Conclusion: The DCQ model allows information management to be objectively quantified. It thus reveals a new layer of knowledge, presenting information gaps during an MCI. Because the model is applicable to all MCI team leaders, it can make MCI simulations more effective. Application: The DCQ model quantifies information management dynamically during MCI training simulations.
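The abstract does not give the DCQ model's internals, but the notion of an information gap can be illustrated as the number of on-scene casualties the team leader has not yet been told about, tracked per time step. This is a deliberately simplified stand-in, not the published model:

```python
def information_gap(reported, actual):
    """At each time step, the gap is how many on-scene casualties the
    team leader has not yet been informed of (never negative).

    reported: casualties known to the team leader at each time step.
    actual:   casualties actually present at the scene at each step.
    """
    return [max(a - r, 0) for r, a in zip(reported, actual)]
```

A persistent positive gap over consecutive steps is the kind of pattern a dynamic quantification would surface during a simulation debrief.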


2021
Author(s): Bhuvaneswari A

Abstract: The widespread practice of online social networking leads to the diffusion of trending information and the exchange of opinions among socially connected people. Streaming data extracted from social networks has become a vital communication tool and an informative platform for capturing real human voices during emergency events such as disasters. This paper proposes an effective underlying quantification model that uses a change-point detection algorithm to detect events based on the relative streaming tweet density ratio. A morphological time-series analysis is carried out to determine the dissemination of information about crisis events using information entropy. Further, the Event–Link Ratio (ELR) is estimated to obtain meaningful patterns in the identified events. This paper focuses on empirically quantifying the information dissemination of events based on users’ tweeting activities. The proposed quantification method is compared with state-of-the-art techniques in terms of event detection rate and the entropy of information spread. The accuracy of the proposed method reaches 94%, with events detected after 75 seconds. K-Center Clustering (KCC) is used, yielding a location detection accuracy of 85%.
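Two of the ingredients named here, information entropy and density-ratio change-point detection, are straightforward to sketch. The functions below are illustrative stand-ins (window size, threshold, and names are assumptions, not the paper's parameters):

```python
import math

def shannon_entropy(counts):
    """Shannon entropy (bits) of a distribution given by raw counts,
    e.g. tweet volumes per topic or per time bin."""
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log2(p) for p in probs)

def density_ratio_changepoints(tweet_counts, window=3, threshold=2.0):
    """Flag time steps where tweet density jumps relative to the mean
    of the preceding window, a simple stand-in for ratio-based event
    detection on a tweet stream."""
    points = []
    for t in range(window, len(tweet_counts)):
        baseline = sum(tweet_counts[t - window:t]) / window
        if baseline > 0 and tweet_counts[t] / baseline >= threshold:
            points.append(t)
    return points
```

A uniform distribution over four bins gives the maximum entropy of 2 bits, while a sudden five-fold jump in tweet volume is flagged as a candidate event.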


2021, Vol. 21 (2), pp. 1-31
Author(s): Karam Bou-Chaaya, Richard Chbeir, Mansour Naser Alraja, Philippe Arnould, Charith Perera, ...

In today’s highly connected cyber-physical environments, users are becoming more and more concerned about their privacy and ask for more involvement in the control of their data. However, achieving effective involvement of users requires improving their privacy decision-making. This can be achieved by: (i) raising their awareness of the direct and indirect privacy risks they accept when sharing data with consumers; (ii) helping them optimize their privacy protection decisions to meet their privacy requirements while maximizing data utility. In this article, we address the second goal by proposing a user-centric multi-objective approach for context-aware privacy management in connected environments, denoted δ-Risk. Our approach features a new privacy risk quantification model to dynamically calculate and select the best protection strategies for the user based on her preferences and contexts. Computed strategies are optimal in that they seek to closely satisfy user requirements and preferences while maximizing data utility and minimizing the cost of protection. We implemented the proposed approach and evaluated its performance and effectiveness in various scenarios. The results show that δ-Risk delivers scalability and low complexity in time and space. Besides, it handles privacy reasoning in real time, making it able to support the user in various contexts, including ephemeral ones. It also provides the user with at least one best strategy per context.
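Selecting "best" protection strategies under competing objectives typically reduces to Pareto filtering: discard any strategy that another strategy beats on every objective. A minimal sketch with hypothetical risk/cost/utility fields (δ-Risk's actual selection logic is richer than this):

```python
def pareto_front(strategies):
    """Keep strategies not dominated on (risk, cost, utility).

    Strategy a dominates b if a is no worse on all three objectives
    (lower risk, lower cost, higher utility) and strictly better on
    at least one of them.
    """
    def dominates(a, b):
        no_worse = (a["risk"] <= b["risk"] and a["cost"] <= b["cost"]
                    and a["utility"] >= b["utility"])
        strictly = (a["risk"] < b["risk"] or a["cost"] < b["cost"]
                    or a["utility"] > b["utility"])
        return no_worse and strictly

    return [s for s in strategies
            if not any(dominates(o, s) for o in strategies)]
```

Any strategy surviving the filter is a defensible recommendation; which one is "best" then depends on the user's stated preferences over the three objectives.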


2021, pp. 1-22
Author(s): Lei Jinyu, Liu Lei, Chu Xiumin, He Wei, Liu Xinglong, ...

Abstract: The ship safety domain plays a significant role in collision risk assessment. However, few studies take into account the practical considerations of implementing this method in the vicinity of bridge-waters. Therefore, historical automatic identification system data is utilised to construct and analyse ship domains considering ship–ship and ship–bridge collisions. A method for determining the closest boundary is proposed, and the boundary of the ship domain is fitted by the least squares method. The ship domains near bridge-waters are constructed as ellipse models, the characteristics of which are discussed. Novel fuzzy quaternion ship domain models are established for inland ships and bridge piers respectively, which would assist in the construction of a risk quantification model and the calculation of a grid ship collision index. A case study is carried out on the multi-bridge waterway of the Yangtze River in Wuhan, China. The results show that the size of the ship domain is highly correlated with the ship’s speed and length, and that the collision risk analysis reflects the real situation near bridge-waters. This helps demonstrate the application of the ship domain in quantifying collision risk and characterises the collision risk distribution near bridge-waters.
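An elliptical ship domain whose size grows with ship length and speed can be sketched directly. The coefficients below are illustrative placeholders, not the values fitted by least squares in the study:

```python
def ship_domain_axes(length_m, speed_kn, k_long=4.0, k_lat=1.6, k_speed=0.1):
    """Semi-axes of an elliptical ship domain that grow with ship
    length and speed. All coefficients are hypothetical placeholders."""
    scale = 1.0 + k_speed * speed_kn
    semi_major = k_long * length_m * scale   # along the heading
    semi_minor = k_lat * length_m * scale    # abeam
    return semi_major, semi_minor

def inside_domain(dx, dy, semi_major, semi_minor):
    """True if an obstacle at offset (dx, dy) from the ship's centre
    (dx along the heading) violates the elliptical domain."""
    return (dx / semi_major) ** 2 + (dy / semi_minor) ** 2 <= 1.0
```

A grid collision index of the kind described above can then be accumulated by testing each grid cell's offset against the domain of every passing ship.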


2021, Vol. ahead-of-print
Author(s): Hamed Khatibi, Suzanne Wilkinson, Mostafa Baghersad, Heiman Dianat, Hidayati Ramli, ...

Purpose: This paper aims to develop a framework that could establish and further the terminology of the smart city/resilient city discourse, in which resilience could support urban “smartness”, a term widely argued to be neither easily measured nor quantifiably assessed. Design/methodology/approach: A qualitative approach was employed; based on selected keywords, a systematic literature review was carried out to understand the main themes within the smart city and resilient city concept databases. After screening, 86 papers were synthesised through the meta-synthesis method, using both synthesis approaches, meta-aggregation and meta-ethnography, which systematically identify properties and characteristics, to build an innovative framework as an indicator-based smart/resilience quantification model. Findings: Two novel frameworks are proposed, the smart resilient city (SRC) and the resilient smart city (RSC), as regulatory guidelines that establish a city’s smartness and resilience. Research limitations/implications: The quantitative research phase is not provided, as the framework builds on an exploratory approach in which the model is proposed through the postulation of data definitions. Practical implications: Although the study’s scope was limited to the city, the proposed frameworks may be interpreted for other contexts that deal with resilience and smartness. Originality/value: The proposed framework would encourage further exploration in context, serving as an inspiration for other scholars, decision-makers, and municipalities seeking to strengthen the smart city through resilience factors.

