probabilistic setting
Recently Published Documents


Total documents: 40 (five years: 12)
H-index: 7 (five years: 1)

Dependability, 2021, Vol. 21 (4), pp. 26-30
Author(s): G. M. Volokhov, E. S. Oganian, G. I. Gajimetov, D. A. Knyazev, V. V. Chunin, ...

Aim. The most vital unit of railway rolling stock is the wheelpair, as a broken wheel or axle may have catastrophic consequences. Therefore, before the production of a high-speed flat wagon designed for operation at speeds of up to 140 km/h, which is unique for the 1520 mm gauge space, could commence, it was required to research the applicability of the standard wheelpair for high-speed movement. Ensuring the safe operation of a wheelpair involves compliance with requirements that are to be confirmed by means of an assessment of strength and durability parameters [1]. Product conformity assessment may be based on the requirements of standards, whose voluntary fulfilment ensures compliance with [1], or of other documents.

Methods. The paper describes the computational and experimental methods used for confirming the strength and estimating the life (durability) of wheelpair elements in the probabilistic setting. As experimental data, the authors used the results of full-scale bench fatigue testing of wheelpairs by the method of rotational bending, as it best approximates the loading conditions in operation. The results confirmed the endurance limits of the axle and wheel as parts of an assembled wheelpair. Using design analysis, the authors examined the stress-strain state of the wheelpair caused by installation and operational loads in various running modes.

Results. The conducted studies confirmed the wheelpair's compliance with the requirements of [1-3] in terms of safety factors of fatigue strength and endurance, which eliminates the possibility of hazardous situations in the course of high-speed flat wagon operation. The time to fatigue crack nucleation in wheelpair components was evaluated using the fatigue resistance figures of the parts and the equivalent amplitudes of dynamic stress caused by operational loads. This assessment makes it possible to establish, for the assumed probability of failure, an assigned useful life of the wheelpair axle of 32 years, which corresponds to the assigned useful life of the flat wagon according to the combined criterion. The corresponding standards and regulations required for developing the container-carrying flat wagon are being updated, and a new State Standard is being developed.

Conclusion. The conducted conformity assessment established that the flat wagon wheelpair meets the safety requirements of [1] and ensures the absence of unacceptable risks of harm to the life and health of people, to animals and plants, to the environment, and to the property of individuals and companies in the course of flat wagon operation.
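The probabilistic life assessment described in this abstract can be illustrated with a small Monte Carlo calculation. The sketch below is not the authors' procedure: it assumes a log-normal scatter of the equivalent dynamic stress amplitude and a Basquin-type S-N curve with purely hypothetical parameter values, and it reads off the life corresponding to an assumed admissible probability of failure.

```python
import numpy as np

# Illustrative only: hypothetical material and loading parameters, not values from the paper.
rng = np.random.default_rng(0)

sigma_f = 900.0        # MPa, assumed fatigue strength coefficient of the axle steel
b = -0.09              # assumed Basquin exponent
cycles_per_year = 4e7  # assumed number of loading cycles per year of service

# Equivalent dynamic stress amplitude, assumed log-normal (median 120 MPa, ~5% scatter).
sigma_a = rng.lognormal(mean=np.log(120.0), sigma=0.05, size=100_000)

# Basquin relation sigma_a = sigma_f * (2N)^b  =>  N = 0.5 * (sigma_a / sigma_f)**(1/b).
cycles_to_crack = 0.5 * (sigma_a / sigma_f) ** (1.0 / b)
years_to_crack = cycles_to_crack / cycles_per_year

# Life at an assumed admissible probability of fatigue crack nucleation (here 1%).
assigned_life = np.quantile(years_to_crack, 0.01)
print(f"Life at 1% failure probability: {assigned_life:.1f} years")
```

Because the exponent 1/b is large in magnitude, even a modest scatter in stress amplitude translates into a wide spread of predicted lives, which is why an assigned life has to be tied to an explicit probability of failure.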


2021
Author(s): Julian Gutierrez, Lewis Hammond, Anthony W. Lin, Muhammad Najib, Michael Wooldridge

Rational verification is the problem of determining which temporal logic properties will hold in a multi-agent system, under the assumption that agents in the system act rationally, by choosing strategies that collectively form a game-theoretic equilibrium. Previous work in this area has largely focussed on deterministic systems. In this paper, we develop the theory and algorithms for rational verification in probabilistic systems. We focus on concurrent stochastic games (CSGs), which can be used to model uncertainty and randomness in complex multi-agent environments. We study the rational verification problem for both non-cooperative games and cooperative games in the qualitative probabilistic setting. In the former case, we consider LTL properties satisfied by the Nash equilibria of the game and in the latter case LTL properties satisfied by the core. In both cases, we show that the problem is 2EXPTIME-complete, thus not harder than the much simpler verification problem of model checking LTL properties of systems modelled as Markov decision processes (MDPs).


2021
Author(s): Christel Baier, Martin Diller, Clemens Dubslaff, Sarah Alice Gaggl, Holger Hermanns, ...

Abstract argumentation is a prominent reasoning framework. It comes with a variety of semantics and has lately been enhanced by probabilities to enable a quantitative treatment of argumentation. While admissibility is a fundamental notion in the classical setting, it has so far received only limited attention in the probabilistic setting. In this paper, we address the quantitative treatment of argumentation based on probabilistic notions of admissibility in such a way that they form fully conservative extensions of the classical notions. In particular, our building blocks are not beliefs regarding single arguments. Instead, we start from the fairly natural idea that, whatever argumentation semantics is considered, it systematically induces constraints on the joint probability distribution over the sets of arguments. In some cases there may be many such distributions, even infinitely many; in other cases there may be one or none. Standard semantic notions are shown to induce such sets of constraints, and so do their probabilistic extensions. This allows them to be tackled by SMT solvers, as we demonstrate with a proof-of-concept implementation. We present a taxonomy of semantic notions, also in relation to published work, together with a running example illustrating our achievements.
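The idea of letting a semantics induce constraints on the joint probability distribution over sets of arguments can be made concrete with an SMT solver. The following is a minimal sketch using Z3's Python bindings, not the authors' implementation: for a two-argument framework with a single attack, one real variable per subset of arguments carries its joint probability, and the admissibility-style conditions imposed below are illustrative assumptions rather than the paper's exact notions.

```python
# Minimal sketch using the Z3 SMT solver (pip install z3-solver); constraints are illustrative.
from itertools import combinations
from z3 import Real, Solver, Sum, sat

arguments = ["a", "b"]      # toy framework: argument a attacks argument b
attacks = {("a", "b")}

# One variable per subset of arguments: the joint probability that exactly this set is accepted.
subsets = [frozenset(c) for r in range(len(arguments) + 1)
           for c in combinations(arguments, r)]
p = {S: Real("p_" + ("".join(sorted(S)) or "empty")) for S in subsets}

s = Solver()
s.add(*(p[S] >= 0 for S in subsets))        # probabilities are non-negative
s.add(Sum([p[S] for S in subsets]) == 1)    # and sum to one

def accept_prob(x):
    """Marginal probability that argument x is accepted."""
    return Sum([p[S] for S in subsets if x in S])

# Illustrative admissibility-style constraints (assumptions for this sketch):
# an attacker and its target are never jointly accepted, and the target's acceptance
# probability is bounded by the probability that its attacker is rejected.
for (x, y) in attacks:
    s.add(p[frozenset({x, y})] == 0)
    s.add(accept_prob(y) <= 1 - accept_prob(x))

if s.check() == sat:
    print(s.model())    # one joint distribution satisfying the constraints
```

Any satisfying model is one joint distribution compatible with the constraints; whether there are none, one, or infinitely many depends on the induced constraint set, exactly as described in the abstract.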


2021
Author(s): Jonas Voigt, Keith-Noah Jurke, Julius Schultz, Ulrich Römer, Jens Friedrichs

In this work, we consider a parallel compressor model (PCM), which decomposes a compressor subject to non-uniform inflow into a distorted and an undistorted subcompressor in order to determine its overall operating point. The main advantage of PCM modeling is a significantly reduced computational workload. At the same time, modeling errors are introduced, which need to be quantified together with the model input uncertainties. We therefore introduce a probabilistic setting in which the unknown parameters are modeled as random variables. We carry out a global sensitivity analysis, which allows us to reduce the complexity of the probabilistic model by setting unimportant input parameters to their nominal values. This analysis attributes portions of the model output variance (for instance, the fan efficiency) to particular input parameters or combinations of input parameters through so-called Sobol coefficients. We further include a parameter describing the PCM inflow-averaging process in the analysis, which allows us to determine the influence of specific modeling choices on the predicted efficiency. Efficient sampling methods are needed to estimate the sensitivity coefficients with reasonable computational effort. A key advantage of the global approach is that nonlinear effects are fully taken into account, the necessity of which is demonstrated by our numerical examples. The model is also compared to CFD reference simulations in order to quantify structural model errors. This comparison is based on area validation metrics comparing the stochastic distribution functions of the probabilistic PCM and the reference data.
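A variance-based sensitivity analysis of the kind described can be sketched with the SALib package. The snippet below substitutes a toy surrogate for the parallel compressor model, so the input names, bounds, and response function are hypothetical placeholders; it only illustrates how first- and total-order Sobol coefficients are estimated from a Saltelli sample.

```python
# Sketch of a global Sobol sensitivity analysis with SALib (pip install SALib).
# `fan_efficiency` is a stand-in surrogate, not the parallel compressor model of the paper.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["inflow_distortion", "mass_flow", "averaging_weight"],  # hypothetical inputs
    "bounds": [[0.0, 0.3], [0.8, 1.2], [0.0, 1.0]],
}

def fan_efficiency(x):
    d, m, w = x
    # Nonlinear toy response with an interaction term so that higher-order effects appear.
    return 0.9 - 0.5 * d**2 - 0.1 * (m - 1.0) ** 2 + 0.05 * w * d

X = saltelli.sample(problem, 1024)               # quasi-random Saltelli design
Y = np.apply_along_axis(fan_efficiency, 1, X)
Si = sobol.analyze(problem, Y)

print("First-order Sobol indices:", Si["S1"])
print("Total-order Sobol indices:", Si["ST"])
```

Inputs whose total-order index is negligible can then be fixed at their nominal values, which is the model-reduction step mentioned in the abstract.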


Author(s): Adam Brown, Omer Bobrowski, Elizabeth Munch, Bei Wang

We study the probabilistic convergence between the mapper graph and the Reeb graph of a topological space $\mathbb{X}$ equipped with a continuous function $f: \mathbb{X} \rightarrow \mathbb{R}$. We first give a categorification of the mapper graph and the Reeb graph by interpreting them in terms of cosheaves and stratified covers of the real line $\mathbb{R}$. We then introduce a variant of the classic mapper graph of Singh et al. (in: Eurographics symposium on point-based graphics, 2007), referred to as the enhanced mapper graph, and demonstrate that such a construction approximates the Reeb graph of $(\mathbb{X}, f)$ when it is applied to points randomly sampled from a probability density function concentrated on $(\mathbb{X}, f)$. Our techniques are based on the interleaving distance of constructible cosheaves and topological estimation via kernel density estimates. Following Munch and Wang (in: 32nd international symposium on computational geometry, volume 51 of Leibniz international proceedings in informatics (LIPIcs), Dagstuhl, Germany, pp. 53:1-53:16, 2016), we first show that the mapper graph of $(\mathbb{X}, f)$, a constructible $\mathbb{R}$-space (with a fixed open cover), approximates the Reeb graph of the same space. We then construct an isomorphism between the mapper of $(\mathbb{X}, f)$ and the mapper of a super-level set of a probability density function concentrated on $(\mathbb{X}, f)$. Finally, building on the approach of Bobrowski et al. (Bernoulli 23(1):288-328, 2017b), we show that, with high probability, we can recover the mapper of the super-level set given a sufficiently large sample. Our work is the first to consider the mapper construction using the theory of cosheaves in a probabilistic setting. It is part of an ongoing effort to combine sheaf theory, probability, and statistics to support topological data analysis with random data.
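For readers unfamiliar with the construction, the classical mapper graph is easy to sketch: cover the range of $f$ with overlapping intervals, cluster the points of each preimage, take the clusters as nodes, and connect clusters that share points. The toy example below (an illustration of the classical construction only, not of the enhanced mapper graph or the convergence results) recovers a cycle from points sampled near a circle; all parameter choices are arbitrary.

```python
# Toy mapper graph of a noisy circle with the height function f(x, y) = y.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(1)
theta = rng.uniform(0, 2 * np.pi, 500)
points = np.c_[np.cos(theta), np.sin(theta)] + rng.normal(scale=0.05, size=(500, 2))
f = points[:, 1]                                  # filter function: height

# Overlapping interval cover of the range of f.
n_intervals, overlap = 6, 0.3
lo, hi = f.min(), f.max()
length = (hi - lo) / (n_intervals - (n_intervals - 1) * overlap)
node_members = []
for i in range(n_intervals):
    a = lo + i * length * (1 - overlap)
    idx = np.where((f >= a) & (f <= a + length))[0]
    if idx.size == 0:
        continue
    labels = DBSCAN(eps=0.3, min_samples=3).fit_predict(points[idx])
    for lab in set(labels) - {-1}:                # nodes: clusters within each preimage slice
        node_members.append(set(idx[labels == lab]))

# Edges: pairs of nodes whose point sets intersect.
edges = [(u, v) for u in range(len(node_members)) for v in range(u + 1, len(node_members))
         if node_members[u] & node_members[v]]
print(f"{len(node_members)} nodes, {len(edges)} edges")  # for a circle, expect a single cycle
```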


2020, Vol. 2020 (1)
Author(s): Xin Tan, Yanan Wang, Lu Sun, Xingfeng Shao, Guanggui Chen

2020
Author(s): Alexander J. Winkler, Ranga B. Myneni, Alexis Hannart, Victor Brovkin

Satellite data reveal widespread changes in the vegetation cover of Earth's land surfaces. Regions intensively attended to by humans are mostly greening due to land management. Natural vegetation, on the other hand, is exhibiting patterns of both greening and browning on all continents. Factors linked to anthropogenic carbon emissions, such as CO₂ fertilization, climate change and consequent episodic disturbances (e.g. fires and droughts), are hypothesized to be key drivers of changes in natural vegetation. A rigorous regional attribution at biome level that can be scaled into a global picture of what is behind the observed changes is currently lacking.

Therefore, we analyze here the longest available satellite record of global leaf area index (LAI, 1981-2017) and identify several clusters of significant long-term changes at the biome scale. Using process-based model simulations (the fully coupled MPI-M Earth system model and 13 stand-alone land surface models), we disentangle the effects of rising CO₂ on LAI in a probabilistic setting, applying Causal Counterfactual Theory.

Our analysis reveals a slowing down of greening and a strengthening of browning trends, particularly in the last two decades (2000-2017). The decreases in LAI are primarily concentrated in regions of high LAI (i.e. tropical forests), whereas the increases are in low-LAI regions (i.e. northern and arid lands). These opposing trends are reducing the LAI texture of natural vegetation at the global scale. The analysis prominently indicates the effects of climate change on many biomes: warming in northern ecosystems and rainfall anomalies in tropical biomes. Our results do not support previously published accounts of dominant global-scale effects of CO₂ fertilization. Most models largely underestimate vegetation browning, especially in the tropical rainforests. The leaf area loss in these productive ecosystems could be an early indicator of a slow-down in the terrestrial carbon sink. Models need to better account for this effect to realize plausible Earth system projections of the 21st century.


Geophysics, 2020, Vol. 85 (1), pp. R29-R39
Author(s): Michael Gineste, Jo Eidsvik, York Zheng

Seismic waveform inversion is a nontrivial optimization task, which is often complicated by the nonlinear relationship between the elastic attributes of interest and the large amount of data obtained in seismic experiments. Quantifying the solution uncertainty can be even more challenging, and it requires considering the problem in a probabilistic setting. Consequently, the seismic inverse problem is placed in a Bayesian framework, using a sequential filtering approach to invert for the elastic parameters. The method uses an iterative ensemble smoother to estimate the subsurface parameters, and from the ensemble, a notion of estimation uncertainty is readily available. The ensemble implicitly linearizes the relation between the parameters and the observed waveform data; hence, it requires no tangent linear model. The approach is based on sequential conditioning on partitions of the whole data record (1) to regularize the inversion path and effectively drive the estimation process in a top-down manner and (2) to circumvent a consequence of the ensemble's reduced-rank approximation. The method is exemplified on a synthetic case, inverting for elastic parameters in a 1D medium using a seismic shot record. Our results indicate that the iterative ensemble method is applicable to seismic waveform inversion and that the ensemble representation indeed conveys estimation uncertainty.
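The core update of an ensemble smoother can be written in a few lines. The sketch below shows a single ES-MDA-style iteration for a generic forward model, using ensemble cross-covariances instead of a tangent linear model; it is a simplified, generic illustration (diagonal data-error covariance, no localisation, no partitioning of the data record), not the specific scheme of the paper.

```python
# One ensemble-smoother update (ES-MDA flavour) for a generic forward model g; illustrative only.
import numpy as np

def es_update(m_ens, g, d_obs, sd_obs, alpha=1.0, seed=0):
    """m_ens: (n_param, n_ens) parameter ensemble; d_obs, sd_obs: (n_data,) data and std. dev."""
    rng = np.random.default_rng(seed)
    d_ens = np.column_stack([g(m) for m in m_ens.T])      # predicted data for each member
    dm = m_ens - m_ens.mean(axis=1, keepdims=True)
    dd = d_ens - d_ens.mean(axis=1, keepdims=True)
    n_ens = m_ens.shape[1]
    C_md = dm @ dd.T / (n_ens - 1)                        # parameter-data cross-covariance
    C_dd = dd @ dd.T / (n_ens - 1)                        # data covariance (reduced rank)
    C_e = alpha * np.diag(sd_obs**2)                      # inflated observation-error covariance
    K = C_md @ np.linalg.pinv(C_dd + C_e)                 # Kalman-type gain via pseudo-inverse
    d_pert = d_obs[:, None] + np.sqrt(alpha) * sd_obs[:, None] * rng.standard_normal(d_ens.shape)
    return m_ens + K @ (d_pert - d_ens)                   # updated ensemble

# Synthetic usage example with a linear "forward model" (hypothetical, for illustration).
g = lambda m: np.array([2.0 * m[0] + m[1], m[0] - m[1]])
m_ens = np.random.default_rng(1).normal(size=(2, 50))
m_new = es_update(m_ens, g, d_obs=np.array([3.0, 0.5]), sd_obs=np.array([0.1, 0.1]))
print(m_new.mean(axis=1))                                 # posterior mean estimate
```

The spread of the updated ensemble is what provides the notion of estimation uncertainty referred to in the abstract.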


2020, Vol. 18 (1), pp. 1635-1644
Author(s): Yongjie Han, Hanyue Xiao, Guanggui Chen

In this paper, we define the entropy number in the probabilistic setting and determine the exact order of the entropy number of a finite-dimensional space in the probabilistic setting. Moreover, we also estimate the sharp order of the entropy number of the univariate Sobolev space in the probabilistic setting by the discretization method.
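For context, entropy numbers are covering quantities, and a common way to carry them into the probabilistic setting is to allow an exceptional set of small measure to be discarded before covering. The block below records the classical definition together with an assumed form of the probabilistic variant; it is a hedged paraphrase of the standard approach, not necessarily the authors' exact definition.

```latex
% Classical entropy numbers of a set W in a normed space Y: the n-th entropy number is the
% smallest radius at which W can be covered by 2^{n-1} balls of Y.
e_n(W, Y) = \inf\Bigl\{ \varepsilon > 0 : W \subseteq \bigcup_{j=1}^{2^{n-1}} (y_j + \varepsilon B_Y),\; y_j \in Y \Bigr\}

% Assumed probabilistic variant: equip W with a probability measure \mu and remove an
% exceptional set G of measure at most \delta before covering.
e_{n,\delta}(W, \mu, Y) = \inf_{\mu(G) \le \delta} e_n(W \setminus G,\, Y)
```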


Author(s): Hoang Nga Nguyen, Abdur Rakib

Resource-bounded alternating-time temporal logic (RB-ATL), an extension of Coalition Logic (CL) and Alternating-time Temporal Logic (ATL), allows reasoning about resource requirements of coalitions in concurrent systems. However, many real-world systems are inherently probabilistic as well as resource-bounded, and there is no straightforward way of reasoning about their unpredictable behaviours. In this paper, we propose a logic for reasoning about coalitional power under resource constraints in the probabilistic setting. We extend RB-ATL with probabilistic reasoning and provide a standard algorithm for the model-checking problem of the resulting logic Probabilistic Resource-Bounded ATL (pRB-ATL).

