Snell Envelope with Small Probability Criteria

2012 ◽  
Vol 66 (3) ◽  
pp. 309-330 ◽  
Author(s):  
Pierre Del Moral ◽  
Peng Hu ◽  
Nadia Oudjane


1988 ◽  
Vol 53 (5) ◽  
pp. 889-902
Author(s):  
Josef Šebek

It is shown that the formation of the so-called rotator phase of alkanes (one of the high-temperature crystalline phases) might be connected with a partial increase of the conformational flexibility of chains. Conformations with a higher number of kinks per chain, which have until now been neglected, are shown to contribute effectively to the conformational partition function. The small probability of these states, given by the Boltzmann exponent, is compensated by the large number of ways in which the kinks can be distributed along the chain. The deduced features of the rotator phase seem to be in agreement with the experimentally observed properties.
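The compensation argument can be made concrete with a small calculation: the weight of all conformations with k kinks is the combinatorial multiplicity times the Boltzmann factor. The chain length and per-kink energy below are illustrative assumptions, not values from the paper:

```python
import math

def kink_state_weight(n_bonds, k_kinks, kink_energy_kT):
    """Total weight of conformations with k kinks on a chain of n bonds:
    combinatorial multiplicity times the per-kink Boltzmann factor."""
    multiplicity = math.comb(n_bonds, k_kinks)       # ways to place the kinks
    boltzmann = math.exp(-k_kinks * kink_energy_kT)  # small per-kink probability
    return multiplicity * boltzmann

# Illustrative values: ~30 bonds per chain, kink cost ~2 kT.
weights = [kink_state_weight(30, k, 2.0) for k in range(6)]
# Although each extra kink costs a factor exp(-2) ~ 0.135 in probability,
# the number of placements grows fast enough that multi-kink states
# outweigh the low-kink states in the partition function.
```

With these numbers the weights rise up to k = 3 before falling off, illustrating how the multiplicity can dominate the Boltzmann penalty.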


2014 ◽  
Vol 644-650 ◽  
pp. 4023-4026
Author(s):  
Yang Ju ◽  
Xin Yong Wang

A vector time series model for simulating underwater target radiated noise is developed in this paper. Experimental results show that the true value falling outside the confidence interval is a small-probability event.
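The confidence-interval claim can be checked with a minimal simulation. This sketch uses i.i.d. Gaussian noise rather than the paper's vector time series model: for a 95% interval on the mean, the true value should escape the interval only about 5% of the time.

```python
import random
import statistics

def ci_misses_true_mean(n=200, mu=0.0, sigma=1.0, z=1.96):
    """Simulate one noise series, build a 95% confidence interval for
    its mean, and report whether the true mean lies outside it."""
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    m = statistics.fmean(xs)
    half = z * statistics.stdev(xs) / n ** 0.5
    return not (m - half <= mu <= m + half)

random.seed(1)
miss_rate = sum(ci_misses_true_mean() for _ in range(2000)) / 2000
# miss_rate comes out near 0.05: the true value escaping the
# interval is indeed a small-probability event.
```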


2021 ◽  
Author(s):  
Mircea-Adrian Digulescu

It has long been known that cryptographic schemes offering provably unbreakable security exist, namely the One Time Pad (OTP). The OTP, however, comes at the cost of a very long secret key, as long as the plain-text itself. In this paper we propose an encryption scheme which we (boldly) claim offers the same level of security as the OTP, while allowing for much shorter keys, of size polylogarithmic in the computing power available to the adversary. The scheme requires a large sequence of truly random words, of length polynomial in both the plain-text size and the logarithm of the computing power the adversary has. We claim that it ensures such an attacker cannot discern the cipher output from random data, except with small probability. We also show how it can be adapted to allow for several plain-texts to be encrypted in the same cipher output, with almost independent keys. Finally, we describe how it can be used in lieu of a One Way Function.
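For contrast, the classical OTP baseline that the paper aims to improve on can be sketched in a few lines; note the key must be exactly as long as the message, which is the cost the proposed scheme tries to avoid:

```python
import secrets

def otp_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Classical One Time Pad: XOR the plaintext with a truly random
    key of exactly the same length. Decryption is the same operation."""
    assert len(key) == len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))   # key as long as the message
cipher = otp_encrypt(msg, key)
assert otp_encrypt(cipher, key) == msg  # XOR is its own inverse
```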


2018 ◽  
Vol 24 (2) ◽  
pp. 101-115 ◽  
Author(s):  
Mohamed-Slim Alouini ◽  
Nadhir Ben Rached ◽  
Abla Kammoun ◽  
Raul Tempone

Abstract The sum of log-normal variates is encountered in many challenging applications such as performance analysis of wireless communication systems and financial engineering. Several approximation methods have been reported in the literature. However, these methods are not accurate in the tail regions. These regions are of prime interest, as small probability values have to be evaluated with high precision. Variance reduction techniques are known to yield accurate, yet efficient, estimates of small probability values. Most of the existing approaches have focused on estimating the right-tail of the sum of log-normal random variables (RVs). Here, we instead consider the left-tail of the sum of correlated log-normal variates with Gaussian copula, under a mild assumption on the covariance matrix. We propose an estimator combining an existing mean-shifting importance sampling approach with a control variate technique. This estimator has an asymptotically vanishing relative error, which represents a major finding in the context of the left-tail simulation of the sum of log-normal RVs. Finally, we perform simulations to evaluate the performance of the proposed estimator in comparison with existing ones.
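The mean-shifting importance sampling step can be illustrated in isolation (the paper's full estimator also adds a control variate and handles correlation through a Gaussian copula, neither of which is sketched here). All parameters below, two i.i.d. standard log-normal summands, the threshold, and the shift, are illustrative assumptions:

```python
import math
import random

def left_tail_is(gamma, n_samples, shift, seed=0):
    """Estimate P(exp(Z1) + exp(Z2) < gamma) with Zi iid N(0,1) by
    mean-shifting importance sampling: draw Zi ~ N(-shift, 1) so the
    left-tail region is sampled often, then reweight each hit by the
    likelihood ratio f(z)/g(z) = exp(shift*z + shift**2/2) per coordinate."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        z = [rng.gauss(-shift, 1.0) for _ in range(2)]
        if math.exp(z[0]) + math.exp(z[1]) < gamma:
            total += math.exp(sum(shift * zi + shift * shift / 2 for zi in z))
    return total / n_samples

# A left-tail probability on the order of 1e-2: under the shifted
# density the event is hit frequently, so far fewer samples are
# needed than with naive Monte Carlo.
est = left_tail_is(gamma=0.5, n_samples=50_000, shift=1.5)
```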


2019 ◽  
Vol 31 (5) ◽  
pp. 998-1014 ◽  
Author(s):  
Heiko Hoffmann

It is still unknown how associative biological memories operate. Hopfield networks are popular models of associative memory, but they suffer from spurious memories and low efficiency. Here, we present a new model of an associative memory that overcomes these deficiencies. We call this model sparse associative memory (SAM) because it is based on sparse projections from neural patterns to pattern-specific neurons. These sparse projections have been shown to be sufficient to uniquely encode a neural pattern. Based on this principle, we investigate theoretically and in simulation our SAM model, which turns out to have high memory efficiency and a vanishingly small probability of spurious memories. This model may serve as a basic building block of brain functions involving associative memory.
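The sparse-projection principle can be illustrated with a toy sketch (this is an assumption-laden simplification, not the paper's exact SAM model): each stored binary pattern is assigned a pattern neuron wired to a small random subset of its active units, and an unrelated pattern activates all of those units only with small probability.

```python
import random

class SparseAssociativeMemory:
    """Toy sketch of sparse projections: one pattern neuron per stored
    pattern, each connected to k randomly sampled active units of its
    pattern. On recall, a neuron fires only if all k of its sampled
    units are active in the query."""
    def __init__(self, k=8, seed=0):
        self.k = k
        self.rng = random.Random(seed)
        self.projections = []   # one sparse index set per stored pattern

    def store(self, pattern):
        active = [i for i, v in enumerate(pattern) if v]
        self.projections.append(self.rng.sample(active, self.k))

    def recall(self, pattern):
        """Indices of stored patterns whose sampled units are all active."""
        return [j for j, idx in enumerate(self.projections)
                if all(pattern[i] for i in idx)]

rng = random.Random(1)
patterns = [[1 if rng.random() < 0.3 else 0 for _ in range(200)]
            for _ in range(20)]
mem = SparseAssociativeMemory(k=8)
for p in patterns:
    mem.store(p)

# A stored pattern always reactivates its own neuron; an unrelated
# random pattern triggers a given neuron with probability ~0.3**8.
assert 0 in mem.recall(patterns[0])
```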


Author(s):  
Jan O. de Kat ◽  
Dirk-Jan Pinkster ◽  
Kevin A. McTaggart

The objective of this paper is to apply a methodology aimed at the probabilistic capsize assessment of two naval ships: a frigate and a corvette. Use is made of combined knowledge of the wave and wind climate a ship will be exposed to during its lifetime and of the physical behavior of that ship in the various sea states it is likely to encounter. This includes the behavior in extreme wave conditions that have a small probability of occurrence, but which may be critical to the safe operation of a ship. Time domain simulations provide the basis for deriving short-term and long-term statistics for extreme roll angles. The numerical model is capable of predicting the 6 DOF behavior of a steered vessel in wind and waves, including conditions that may lead to broaching and capsizing.
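A standard way to turn short-term statistics into a long-term figure (not necessarily the paper's exact procedure) is to weight the per-sea-state exceedance probabilities by the lifetime occurrence frequency of each sea state. All numbers below are hypothetical placeholders for values that would come from time-domain simulation and a wave/wind scatter diagram:

```python
def long_term_exceedance(sea_state_probs, short_term_exceedance):
    """Long-term exceedance probability as a mixture over sea states:
    P_long = sum_i p(sea state i) * P(exceed critical roll | sea state i)."""
    return sum(p * q for p, q in zip(sea_state_probs, short_term_exceedance))

# Hypothetical scatter: mild seas are common, extreme seas rare but dangerous.
occurrence = [0.70, 0.25, 0.045, 0.005]    # sea-state frequencies (sum to 1)
exceed_40deg = [1e-9, 1e-6, 1e-4, 2e-2]    # short-term exceedance per state
p_lifetime = long_term_exceedance(occurrence, exceed_40deg)
# The rarest sea state dominates the total despite its small
# probability of occurrence, which is why extreme conditions
# must be simulated even though they are seldom encountered.
```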


2020 ◽  
Vol 8 (1) ◽  
pp. 120-147 ◽  
Author(s):  
Arkadiusz Wiśniowski ◽  
Joseph W Sakshaug ◽  
Diego Andres Perez Ruiz ◽  
Annelies G Blom

Abstract Survey data collection costs have risen to a point where many survey researchers and polling companies are abandoning large, expensive probability-based samples in favor of less expensive nonprobability samples. The empirical literature suggests this strategy may be suboptimal for multiple reasons, among them that probability samples tend to outperform nonprobability samples on accuracy when assessed against population benchmarks. However, nonprobability samples are often preferred due to convenience and costs. Instead of forgoing probability sampling entirely, we propose a method of combining both probability and nonprobability samples in a way that exploits their strengths to overcome their weaknesses within a Bayesian inferential framework. By using simulated data, we evaluate supplementing inferences based on small probability samples with prior distributions derived from nonprobability data. We demonstrate that informative priors based on nonprobability data can lead to reductions in variances and mean squared errors for linear model coefficients. The method is also illustrated with actual probability and nonprobability survey data. A discussion of these findings, their implications for survey practice, and possible research extensions are provided in conclusion.
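The core idea, an informative prior derived from nonprobability data shrinking the variance of a probability-sample estimate, can be sketched with a conjugate normal update for a single linear model coefficient. The numbers are illustrative, not taken from the paper:

```python
def combine_normal(prior_mean, prior_var, est, est_var):
    """Conjugate normal update: treat the nonprobability-sample estimate
    as an informative prior and the probability-sample estimate as the
    likelihood. Posterior precision is the sum of the two precisions,
    so posterior variance is always below the probability-sample
    variance alone."""
    post_prec = 1 / prior_var + 1 / est_var
    post_mean = (prior_mean / prior_var + est / est_var) / post_prec
    return post_mean, 1 / post_prec

# Illustrative numbers: a coefficient estimated at 2.0 (variance 0.50)
# from a small probability sample, combined with a nonprobability
# prior centered at 2.4 (variance 0.25, cheap but possibly biased).
mean, var = combine_normal(2.4, 0.25, 2.0, 0.50)
# The posterior mean is pulled toward the prior, and the posterior
# variance drops below both input variances.
```

Note the trade-off the abstract flags: if the nonprobability prior is badly biased, the same shrinkage that reduces variance can increase mean squared error.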

