Rejection Sampling
Recently Published Documents

TOTAL DOCUMENTS: 107 (five years: 30)

H-INDEX: 15 (five years: 2)

Sensors, 2022, Vol. 22 (2), pp. 414
Author(s): Dominique Albert-Weiss, Ahmad Osman

A pivotal topic in agriculture and food monitoring is the assessment of the quality and ripeness of agricultural products using non-destructive testing techniques. Acoustic testing offers a rapid in situ analysis of the state of an agricultural good, yielding global information about its interior. While deep learning (DL) methods have outperformed state-of-the-art benchmarks in various applications, the limited adoption of DL algorithms such as convolutional neural networks (CNNs) can be traced back to their high data inefficiency and the absence of annotated data. Active learning is a framework that has been used heavily in machine learning when labelled instances are scarce or cumbersome to obtain. It is of particular interest when the DL algorithm is highly uncertain about the label of an instance. By allowing a human in the loop to provide guidance, the DL algorithm can be improved continuously in a sample-efficient manner. This paper studies the applicability of active learning for grading 'Galia' muskmelons based on their shelf life. We propose k-Determinantal Point Processes (k-DPP), a purely diversity-based method that allows the exploration of the feature space to be steered via the chosen subset size k. While matching uncertainty-based approaches when k is large, we simultaneously obtain better exploration of the data distribution. While an implementation based on eigendecomposition has a runtime of O(n³), this can be reduced to O(n·poly(k)) using rejection sampling. We suggest using diversity-based acquisition when only a few labelled samples are available, allowing for better exploration while counteracting the tendency of greedy uncertainty-based methods to miss the training objective.
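As a hedged illustration of the diversity-based selection the abstract describes, the sketch below greedily approximates a k-DPP by maximizing the log-determinant of an RBF-kernel submatrix. The kernel choice, the greedy MAP-style approximation, and all names are our assumptions, not the authors' implementation (which uses exact eigendecomposition or rejection sampling).

```python
import numpy as np

def greedy_kdpp_selection(features, k, gamma=1.0):
    """Greedily pick k diverse points under an RBF-kernel DPP objective.

    A MAP-style approximation: at each step, add the point that most
    increases log det of the kernel submatrix of the selected set.
    """
    n = features.shape[0]
    # RBF similarity kernel L (hypothetical kernel choice)
    sq = np.sum(features**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * features @ features.T
    L = np.exp(-gamma * d2)

    selected = []
    for _ in range(k):
        best, best_gain = None, -np.inf
        for i in range(n):
            if i in selected:
                continue
            idx = selected + [i]
            _, logdet = np.linalg.slogdet(L[np.ix_(idx, idx)])
            if logdet > best_gain:
                best, best_gain = i, logdet
        selected.append(best)
    return selected
```

Larger k drives the selection toward covering the feature space, which mirrors the exploration behaviour the abstract attributes to diversity-based acquisition.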


2022
Author(s): Jake Carson, Alice Ledda, Luca Ferretti, Matt Keeling, Xavier Didelot

The coalescent model represents how individuals sampled from a population may have originated from a last common ancestor. The bounded coalescent model is obtained by conditioning the coalescent model such that the last common ancestor must have existed after a certain date. This conditioned model arises in a variety of applications, such as speciation, horizontal gene transfer or transmission analysis, and yet the bounded coalescent model has not been previously analysed in detail. Here we describe a new algorithm to simulate from this model directly, without resorting to rejection sampling. We show that this direct simulation algorithm is more computationally efficient than the rejection sampling approach. We also show how to calculate the probability of the last common ancestor occurring after a given date, which is required to compute the probability of realisations under the bounded coalescent model. Our results are applicable in both the isochronous (when all samples have the same date) and heterochronous (where samples can have different dates) settings. We explore the effect of setting a bound on the date of the last common ancestor, and show that it affects a number of properties of the resulting phylogenies. All our methods are implemented in a new R package called BoundedCoalescent which is freely available online.
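The rejection-sampling baseline that the paper improves upon can be sketched as follows: repeatedly simulate a standard coalescent and discard realisations whose last common ancestor falls outside the bound. This is a minimal sketch assuming an isochronous sample and a constant population size Ne; it is not the BoundedCoalescent package's direct algorithm.

```python
import numpy as np

def coalescent_times(n, Ne=1.0, rng=None):
    """Draw coalescence times for n isochronous samples."""
    rng = rng or np.random.default_rng()
    times, t = [], 0.0
    for k in range(n, 1, -1):
        rate = k * (k - 1) / (2.0 * Ne)   # coalescence rate with k lineages
        t += rng.exponential(1.0 / rate)
        times.append(t)
    return times  # times[-1] is the time of the last common ancestor

def bounded_coalescent_rejection(n, bound, Ne=1.0, max_tries=100_000):
    """Rejection sampler: redraw until the last common ancestor is within the bound."""
    rng = np.random.default_rng()
    for _ in range(max_tries):
        times = coalescent_times(n, Ne, rng)
        if times[-1] <= bound:   # accept only if the MRCA is recent enough
            return times
    raise RuntimeError("acceptance rate too low; direct simulation preferred")
```

When the bound is tight relative to the expected tree height, almost every draw is rejected, which is exactly the inefficiency the paper's direct simulation algorithm avoids.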


Metrologia, 2021
Author(s): Manuel Marschall, Gerd Wuebbeler, Clemens Elster

Supplement 1 to the GUM (GUM-S1) extends the GUM uncertainty framework to nonlinear functions and non-Gaussian distributions. For this purpose, it employs a Monte Carlo method that yields a probability density function for the measurand. This Monte Carlo method has been successfully applied in numerous applications throughout metrology. However, considerable criticism has been raised against the type A uncertainty evaluation of GUM-S1. Most of the criticism could be addressed by including prior information about the measurand, which, however, is beyond the scope of GUM-S1. We propose an alternative Monte Carlo method that allows prior information about the measurand to be included. The proposed method is based on a Bayesian uncertainty evaluation and applies a simple rejection sampling approach using the Monte Carlo techniques of GUM-S1. The range of applicability of the approach is explored theoretically and in terms of examples. The results are promising, leading us to conclude that many metrological applications could benefit from this approach. Software support is provided to ease its implementation.
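A minimal sketch of the accept/reject filter the abstract describes: GUM-S1 Monte Carlo draws of the measurand are thinned with acceptance probability proportional to a prior density. The function names and the bounded-prior assumption are ours, not the paper's.

```python
import numpy as np

def prior_informed_mc(draw_gum_s1, prior_pdf, prior_max, n_accept=10_000):
    """Filter GUM-S1 Monte Carlo draws of the measurand through a prior.

    draw_gum_s1: callable returning one GUM-S1 Monte Carlo draw y
    prior_pdf:   prior density for the measurand (assumed bounded)
    prior_max:   an upper bound on prior_pdf, needed for the accept step
    """
    rng = np.random.default_rng()
    accepted = []
    while len(accepted) < n_accept:
        y = draw_gum_s1()
        # accept y with probability prior_pdf(y) / prior_max
        if rng.uniform() < prior_pdf(y) / prior_max:
            accepted.append(y)
    return np.asarray(accepted)
```

The accepted draws are distributed proportionally to the prior times the GUM-S1 distribution, which is the Bayesian combination the abstract refers to; an uninformative (flat) prior reproduces plain GUM-S1 output.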


2021
Author(s): Meng Tang, Yimin Liu, Louis J. Durlofsky

The use of deep-learning-based procedures for geological parameterization and fast surrogate flow modeling may enable the application of rigorous history matching algorithms that were previously considered impractical. In this study we incorporate such methods – specifically a geological parameterization that entails principal component analysis combined with a convolutional neural network (CNN-PCA) and a flow surrogate that uses a recurrent residual-U-Net procedure – into three different history matching procedures. The history matching algorithms considered are rejection sampling (RS), randomized maximum likelihood with mesh adaptive direct search optimization (MADS-RML), and ensemble smoother with multiple data assimilation (ES-MDA). RS is a rigorous sampler used here to provide reference results (though it can become intractable in cases with large amounts of observed data). History matching is performed for a channelized geomodel defined on a grid containing 128,000 cells. The CNN-PCA representation of geological realizations involves 400 parameters, and these are the variables determined through history matching. All flow evaluations (after training) are performed using the recurrent residual-U-Net surrogate model. Two cases, involving different amounts of historical data, are considered. We show that both MADS-RML and ES-MDA provide history matching results in general agreement with those from RS. MADS-RML is more accurate, however, while ES-MDA can display significant error in some quantities; on the other hand, ES-MDA requires many fewer function evaluations than MADS-RML, so there is a tradeoff between computational demand and accuracy. The framework developed here could be used to evaluate and tune a range of history matching procedures beyond those considered in this work.
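For concreteness, the sketch below shows one standard ES-MDA assimilation step of the kind the study applies to the CNN-PCA parameters. Array shapes and names are assumptions, and the paper's actual implementation details (e.g., any localization or inflation schedule) are not reproduced here.

```python
import numpy as np

def es_mda_step(m, d_pred, d_obs, C_d, alpha):
    """One ES-MDA assimilation step (ensemble arrays: columns = members).

    m:      (n_param, n_ens) ensemble of CNN-PCA-style parameters
    d_pred: (n_data, n_ens) surrogate-predicted data for each member
    d_obs:  (n_data,) observed history data
    C_d:    (n_data, n_data) observation error covariance
    alpha:  inflation factor; the alphas must satisfy sum(1/alpha_i) = 1
    """
    rng = np.random.default_rng()
    n_ens = m.shape[1]
    dm = m - m.mean(axis=1, keepdims=True)
    dd = d_pred - d_pred.mean(axis=1, keepdims=True)
    C_md = dm @ dd.T / (n_ens - 1)                 # cross-covariance
    C_dd = dd @ dd.T / (n_ens - 1)                 # data covariance
    K = C_md @ np.linalg.inv(C_dd + alpha * C_d)   # Kalman-like gain
    # perturb observations with inflated noise, then update each member
    noise = rng.multivariate_normal(np.zeros(len(d_obs)), alpha * C_d, n_ens).T
    return m + K @ (d_obs[:, None] + noise - d_pred)
```

Running this update several times with inflated noise (rather than once with the raw covariance) is what distinguishes ES-MDA from a single ensemble smoother pass, and each step needs only one surrogate evaluation per ensemble member, which is why it uses far fewer function evaluations than MADS-RML.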


2021, Vol. 2021, pp. 1-17
Author(s): Yongli Tang, Feifei Xia, Qing Ye, Mengyao Wang, Ruijie Mu, ...

Although most existing linkable ring signature schemes on lattices can effectively resist quantum attacks, they still suffer from excessive time and storage overhead. This paper constructs an identity-based linkable ring signature (LRS) scheme over an NTRU lattice by employing trapdoor generation and rejection sampling techniques. The security of the scheme relies on the small integer solution (SIS) problem on NTRU lattices. We prove that the scheme has unconditional anonymity, unforgeability, and linkability in the random oracle model (ROM). Performance analysis shows that the scheme has shorter public/private keys, and when the number of ring members is small (such as N ≤ 8), it has a shorter signature size than other recent lattice-based LRS schemes. The computational efficiency of signing is also further improved, since it involves only multiplication in the polynomial ring and modular operations on small integers. Finally, we implemented our scheme and other similar schemes, showing that the times for signature generation and verification decrease by roughly 44.951% and 33.503%, respectively.
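The rejection sampling step used in such lattice signatures can be sketched as the classic Lyubashevsky accept/reject test, shown below with a continuous-Gaussian approximation of the discrete Gaussian for readability. This is illustrative only, not the paper's NTRU-specific implementation, and all names are assumptions.

```python
import numpy as np

def rejection_sample(z, cs, sigma, M):
    """Lyubashevsky-style accept/reject for a candidate signature vector z.

    z = y + cs, where y is a Gaussian masking vector and cs = challenge * secret.
    Accept with probability D_sigma(z) / (M * D_{sigma,cs}(z)), which makes
    the distribution of accepted z independent of the secret key.
    """
    rng = np.random.default_rng()
    # log of the ratio of the centered to the shifted Gaussian density at z
    log_ratio = (-np.dot(z, z) + np.dot(z - cs, z - cs)) / (2 * sigma**2)
    accept_prob = min(1.0, np.exp(log_ratio) / M)
    return rng.uniform() < accept_prob
```

On rejection the signer restarts with a fresh masking vector; the expected number of attempts is about M, so tuning sigma against M is what drives the signing-time figures reported in the abstract.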


2021, Vol. 2021, pp. 1-12
Author(s): Zhongxiang Zheng, Anyu Wang, Lingyue Qin

Rejection sampling is a core tool in the design of lattice-based signatures with the 'Fiat–Shamir with Aborts' structure, and it affects signing efficiency, signature size, and security. In the rejection sampling theorem proposed by Lyubashevsky, the masking vector is chosen from a discrete Gaussian distribution. In practical designs, however, the masking vector is more often chosen from a bounded uniform distribution, for better efficiency and simpler implementation. As one of the third-round candidate signatures in the NIST post-quantum cryptography standardization process, the third-round version of CRYSTALS-Dilithium proposed a new method to decrease the rejection probability, achieving better efficiency and a smaller signature size by decreasing the number of nonzero coefficients of the challenge polynomial according to the security level. However, the small entropies in this new method may lead to a higher risk of forgery attack compared with the schemes proposed in its second-round version. In this paper, we first analyze the complexity of forgery attacks for small entropies and then introduce a new method to decrease the rejection probability without loss of security, including security against forgery attacks. The method rests on a new rejection sampling theorem with a tighter bound, obtained via Rényi divergence, in which the masking vector follows a uniform distribution. Observing large gaps between the security claims and the actual security bounds of CRYSTALS-Dilithium, we propose two sets of adapted parameters for it. The first set improves the efficiency of the signing process by 61.7% and 41.7%, depending on the security level, while ensuring security against known attacks, including forgery attacks. The second set reduces the signature size by 14.09% with small improvements in efficiency at the same security level.
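A minimal sketch of one 'Fiat–Shamir with Aborts' signing attempt with a uniform masking vector, in the style of Dilithium: the parameter names (gamma1, beta) follow the Dilithium convention, but the dimensions and interface are our assumptions, not the specification.

```python
import numpy as np

def sign_attempt(c_times_s, gamma1, beta, dim=256):
    """One signing attempt with a bounded uniform masking vector.

    y is drawn uniformly from [-(gamma1 - 1), gamma1 - 1]; the candidate
    z = y + c*s is rejected unless every coefficient satisfies
    |z_i| < gamma1 - beta, so that accepted z leaks nothing about the secret.
    """
    rng = np.random.default_rng()
    y = rng.integers(-(gamma1 - 1), gamma1, size=dim)   # uniform mask
    z = y + c_times_s
    if np.max(np.abs(z)) >= gamma1 - beta:
        return None   # abort: restart signing with a fresh y
    return z
```

The rejection probability grows with beta (a bound on the coefficients of c*s), so reducing the number of nonzero challenge coefficients, as in the third-round Dilithium change the abstract discusses, directly lowers the abort rate.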


2021, Vol. 13 (1)
Author(s): Rizky Arden, Neno Ruseno, Yuda Arif Hidayat

Cargo plays a very important role in the aviation industry as a supporting revenue stream. In Airline X, cargo contributes 4-6% of total revenue. There are opportunities to optimize the cargo compartment in Airline X by analyzing every agent involved, in order to determine the optimum cargo load for the compartment, using agent-based modelling. The methods used in this research are Monte Carlo rejection sampling and agent-based modelling. In addition, distribution fitting is used to determine which type of distribution represents each agent's behavior. The final results show that, with the predetermined number of iterations (300), the optimal value was obtained based on the convergence of the results. The distributions of passengers and baggage are described by a Gaussian distribution function, while the distribution of EBT is described by a negative exponential distribution function. These distributions represent the agents' behavior.
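A minimal sketch of the Monte Carlo rejection sampling the abstract describes, drawing from a fitted agent-behaviour distribution through a uniform proposal; the Gaussian parameters in the usage lines are illustrative, not the paper's fitted values.

```python
import numpy as np
from scipy.stats import norm

def rejection_sample_pdf(target_pdf, lo, hi, M, n, rng=None):
    """Draw n samples from target_pdf on [lo, hi] via rejection sampling.

    Proposal is Uniform(lo, hi); M must bound target_pdf / uniform_pdf
    over [lo, hi] so the acceptance probability stays below 1.
    """
    rng = rng or np.random.default_rng()
    uniform_pdf = 1.0 / (hi - lo)
    out = []
    while len(out) < n:
        x = rng.uniform(lo, hi)
        if rng.uniform() < target_pdf(x) / (M * uniform_pdf):
            out.append(x)
    return np.array(out)

# e.g. sampling passenger weights from a Gaussian agent-behaviour model
# (mean 75 kg and std 10 kg are hypothetical, not the paper's fit)
weights = rejection_sample_pdf(norm(75, 10).pdf, 40, 120, M=4.0, n=300)
```

Each accepted draw feeds one iteration of the agent-based model; running the loop for the paper's 300 iterations is what produces the convergent optimum it reports.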


Author(s): Russell S. Crosbie, Praveen Kumar Rachakonda

Regional-scale estimates of groundwater recharge are inherently uncertain, but this uncertainty is rarely quantified. Quantifying this uncertainty provides an understanding of the limitations of the estimates, and being able to reduce the uncertainty makes the recharge estimates more useful for water resources management. This paper describes the development of a method to constrain the uncertainty in upscaled recharge estimates using a rejection sampling procedure for baseflow and remotely sensed evapotranspiration data to constrain the lower and upper end of the recharge distribution, respectively. The recharge estimates come from probabilistic chloride mass-balance estimates from 3,575 points upscaled using regression kriging with rainfall, soils and vegetation as covariates. The method is successfully demonstrated for the 570,000-km2 Cambrian Limestone Aquifer in northern Australia. The method developed here is able to reduce the uncertainty in the upscaled chloride mass-balance estimates of recharge by nearly a third using data that are readily available. The difference between the 5th and 95th percentiles of unconstrained recharge across the aquifer was 31 mm/yr (range 5–36 mm/yr) which was reduced to 22 mm/yr for the constrained case (9–31 mm/yr). The spatial distribution of recharge was dominated by the spatial distribution of rainfall but was comparatively reduced in areas with denser vegetation or finer textured soils. Recharge was highest in the north-west in the Daly River catchment with a catchment average of 101 (61–192) mm/yr and lowest in the south-east Georgina River catchment with 6 (4–12) mm/yr.
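Schematically, the constraining step can be sketched as an accept/reject filter over upscaled recharge realisations, keeping only those consistent with a baseflow-derived lower bound and an evapotranspiration-derived upper bound. The interface below is an assumption; the paper applies the constraints with catchment-specific data rather than a single pair of bounds.

```python
import numpy as np

def constrain_recharge(realisations, lower_bound, upper_bound):
    """Keep only recharge realisations consistent with independent data.

    realisations: (n_real, n_cells) upscaled recharge fields (mm/yr)
    lower_bound:  baseflow-derived minimum catchment-average recharge
    upper_bound:  remotely-sensed-ET-derived maximum catchment-average recharge
    """
    means = realisations.mean(axis=1)      # catchment-average recharge
    keep = (means >= lower_bound) & (means <= upper_bound)
    return realisations[keep]
```

Realisations that survive the filter form the constrained posterior ensemble, whose narrower 5th-95th percentile spread corresponds to the roughly one-third uncertainty reduction reported in the abstract.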

