A statistical approach to identifying closed object boundaries in images

1994 ◽  
Vol 26 (4) ◽  
pp. 831-854 ◽  
Author(s):  
Jeffrey D. Helterbrand ◽  
Noel Cressie ◽  
Jennifer L. Davidson

In this research, we present a statistical theory and an algorithm to identify one-pixel-wide closed object boundaries in gray-scale images. Closed-boundary identification is an important problem because object boundaries are major features in images. In spite of this, most statistical approaches to image restoration and texture identification place inappropriate stationarity assumptions on the image domain. One way to characterize the structural components present in images is to identify one-pixel-wide closed boundaries that delineate objects. By defining a prior probability model on the space of one-pixel-wide closed boundary configurations and appropriately specifying transition probability functions on this space, a Markov chain Monte Carlo algorithm is constructed that theoretically converges to a statistically optimal closed-boundary estimate. Moreover, this approach ensures that any approximation to the statistically optimal boundary estimate retains the necessary property of closure.
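The closure-preserving sampling idea can be sketched as a Metropolis step that only ever visits valid closed boundaries. This is a minimal illustration, not the authors' algorithm: the 8-neighbour closure test, the energy function, and the proposal mechanism are all assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def is_closed(boundary):
    # A one-pixel-wide boundary, stored as an ordered list of (row, col)
    # pixels, is closed when every consecutive pair -- including the
    # wrap-around from last pixel back to first -- are 8-neighbours.
    b = np.asarray(boundary)
    diffs = np.abs(b - np.roll(b, -1, axis=0))
    return bool(np.all(diffs.max(axis=1) <= 1))

def metropolis_step(boundary, energy, propose, temperature=1.0):
    # Propose a new configuration; reject it outright if it breaks
    # closure, so every state the chain visits is a closed boundary.
    cand = propose(boundary)
    if not is_closed(cand):
        return boundary
    delta = energy(cand) - energy(boundary)
    if delta <= 0 or rng.random() < np.exp(-delta / temperature):
        return cand
    return boundary
```

Constraining the chain this way is what guarantees that stopping the sampler early still yields a closed (if suboptimal) boundary estimate.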




2014 ◽  
Vol 25 (5) ◽  
pp. 563-584 ◽  
Author(s):  
Partha Sarathi Mandal ◽  
Anil K. Ghosh

Location verification in wireless sensor networks (WSNs) is challenging in the presence of malicious sensor nodes, called attackers, which try to break the verification protocol by reporting incorrect locations during the verification stage. Most existing location verification methods in the WSN literature use a set of trusted verifiers, which are themselves vulnerable to attack by malicious nodes; these methods also rely on distance estimation techniques that are inaccurate in noisy channels. In this article, we adopt a statistical approach to secure location verification that overcomes these limitations. Our proposed method does not rely on any trusted entities, and it accounts for the limited precision of distance estimation by using a suitable probability model for the noise. The resulting verification scheme detects and filters out all malicious nodes from the network with very high probability, even in a noisy channel.
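A toy version of such a noise-aware consistency check can be written as follows. The function name, the Gaussian-noise assumption, and the simple k-sigma acceptance rule are all illustrative stand-ins for the article's probability model, not its actual test.

```python
import numpy as np

def verify_claim(claimed_pos, verifier_pos, measured_dists, sigma, k=3.0):
    """Accept a node's claimed position only if every measured distance
    lies within k*sigma of the distance implied by the claim -- a crude
    stand-in for a statistical test under Gaussian ranging noise."""
    claimed_pos = np.asarray(claimed_pos, dtype=float)
    verifier_pos = np.asarray(verifier_pos, dtype=float)
    # Distances each verifier *should* observe if the claim were true.
    implied = np.linalg.norm(verifier_pos - claimed_pos, axis=1)
    residuals = np.abs(np.asarray(measured_dists, dtype=float) - implied)
    return bool(np.all(residuals <= k * sigma))
```

For example, a node actually located at (3, 4) that claims to be at (8, 8) is rejected, because no amount of plausible ranging noise reconciles the measured distances with the claimed position.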



2021 ◽  
Vol 17 (1) ◽  
pp. e1008598
Author(s):  
Samuel Planton ◽  
Timo van Kerkoerle ◽  
Leïla Abbih ◽  
Maxime Maheu ◽  
Florent Meyniel ◽  
...  

Working memory capacity can be improved by recoding the memorized information in a condensed form. Here, we tested the theory that human adults encode binary sequences of stimuli in memory using an abstract internal language and a recursive compression algorithm. The theory predicts that the psychological complexity of a given sequence should be proportional to the length of its shortest description in the proposed language, which can capture any nested pattern of repetitions and alternations using a limited number of instructions. Five experiments examine the capacity of the theory to predict human adults’ memory for a variety of auditory and visual sequences. We probed memory using a sequence violation paradigm in which participants attempted to detect occasional violations in an otherwise fixed sequence. Both subjective complexity ratings and objective violation detection performance were well predicted by our theoretical measure of complexity, which simply reflects a weighted sum of the number of elementary instructions and digits in the shortest formula that captures the sequence in our language. A simpler transition probability model, tested as a single predictor in the statistical analyses, accounted for significant variance in the data; however, goodness-of-fit improved significantly once the language-based complexity measure was added to the statistical model, and the variance explained by the transition probability model largely decreased. Model comparison also showed that shortest description length in a recursive language provides a better fit than six alternative previously proposed models of sequence encoding. The data support the hypothesis that, beyond the extraction of statistical knowledge, human sequence coding relies on an internal compression using language-like nested structures.
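The baseline transition probability model can be illustrated as a running surprise computation over a binary sequence. This is a minimal sketch with Laplace smoothing, not the authors' implementation; a highly regular sequence accumulates far less surprise than an irregular one of the same length.

```python
import math

def transition_surprise(seq):
    """Total surprise (-log2 probability) of a binary sequence under a
    first-order transition model whose probabilities are estimated
    online with Laplace (add-one) smoothing."""
    counts = {(a, b): 1 for a in (0, 1) for b in (0, 1)}  # add-one prior
    total = 0.0
    for prev, nxt in zip(seq, seq[1:]):
        row = counts[(prev, 0)] + counts[(prev, 1)]
        p = counts[(prev, nxt)] / row       # P(next | prev) so far
        total += -math.log2(p)              # surprise of this transition
        counts[(prev, nxt)] += 1            # update the estimate
    return total
```

A pure alternation such as 0101… quickly becomes predictable under this model, which is exactly why it fails to capture the extra compression that nested, language-like descriptions provide for more structured sequences.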



Author(s):  
Mohammad Amin Hariri-Ardebili

Risk analysis of concrete dams and quantification of their failure probability are important tasks in dam safety assessment. The conditional probability of demand exceeding capacity is usually estimated by numerical simulation with Monte Carlo techniques. However, the estimated failure probability (or reliability index) is dam-dependent, which limits its application to particular case studies. This article proposes an analytical failure model for generic gravity dam classes, optimized on the basis of a large number of nonlinear finite element analyses. A hybrid parametric–probabilistic–statistical approach is used to estimate the failure probability as a function of dam size, material distributional models, and the external hydrological hazard. The proposed model can be used for preliminary design and evaluation of two-dimensional gravity dam models.
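The per-dam Monte Carlo estimation that the article's analytical model is meant to replace can be sketched in a few lines. The lognormal demand and normal capacity distributions below are purely illustrative assumptions, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(42)

def failure_probability(demand_sampler, capacity_sampler, n=200_000):
    """Crude Monte Carlo estimate of P(demand > capacity): draw paired
    samples and count the fraction of trials where demand wins."""
    d = demand_sampler(n)
    c = capacity_sampler(n)
    return float(np.mean(d > c))

# Hypothetical distributions for illustration only.
pf = failure_probability(
    lambda n: rng.lognormal(mean=1.0, sigma=0.4, size=n),   # demand
    lambda n: rng.normal(loc=6.0, scale=1.0, size=n),       # capacity
)
```

The dam-dependence the article highlights is visible here: every new geometry or material model requires re-running the simulation, which motivates fitting an analytical surrogate once over a parameterized dam class.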



2020 ◽  
Vol 12 (5) ◽  
pp. 753
Author(s):  
Quanhua Zhao ◽  
Hongyun Zhang ◽  
Guanghui Wang ◽  
Yu Li

This paper presents a regionalized segmentation method for synthetic aperture radar (SAR) intensity images based on tessellation with irregular polygons. In the proposed method, the image domain is partitioned into a collection of irregular polygons, which are constructed using sets of nodes and are used to fit homogeneous regions with arbitrary shapes. Each partitioned polygon is taken as the basic processing unit. Assuming the intensities of the pixels in the polygon follow an independent and identical gamma distribution, the likelihood of the image intensities is modeled. After defining the prior distributions of the tessellation and the parameters for the likelihood model, a posterior probability model can be built based on the Bayes theorem as a segmentation model. To obtain optimal segmentation, a reversible-jump Markov chain Monte Carlo (RJMCMC) algorithm is designed to simulate from the segmentation model, where the move operations include updating the gamma distribution parameter, updating labels, moving nodes, merging polygons, splitting polygons, adding nodes, and deleting nodes. Experiments were carried out on synthetic and real SAR intensity images using the proposed method, with regular and Voronoi tessellation-based methods also applied for comparison. Our results show the proposed method overcomes some intrinsic limitations of current segmentation methods and is able to generate good results for homogeneous regions with different shapes.
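The building block of the segmentation posterior is the i.i.d. gamma likelihood of the pixel intensities within one polygon, which can be sketched directly from the gamma density. The shape/scale parameterization below is an assumption for illustration.

```python
import math

def gamma_loglik(intensities, shape, scale):
    """Log-likelihood of SAR intensities in one polygon under an i.i.d.
    Gamma(shape, scale) model: sum over pixels of
    -lgamma(k) - k*log(theta) + (k-1)*log(x) - x/theta."""
    k, theta = shape, scale
    const = -math.lgamma(k) - k * math.log(theta)
    return sum(const + (k - 1) * math.log(x) - x / theta
               for x in intensities)
```

In an RJMCMC sweep, terms like this are re-evaluated for the polygons affected by each move (node shift, merge, split, …), so the acceptance ratio only needs the local change in log-likelihood rather than a full-image recomputation.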



2013 ◽  
Vol 336-338 ◽  
pp. 471-474
Author(s):  
Shi Guang ◽  
Hai Jing Yang ◽  
Qi Wei Wang ◽  
Yan Jin

To address the line state transitions that may arise in adverse weather, a new probability calculation method is proposed. As a statistical starting point, this article first assumes that the failure count of the first line follows a Poisson distribution. Second, the power-flow transfer distribution after the first line fault is derived using the Flow Transferring Relativity Factor (FTRF) method and combined with the protection probability to build a probability model relating line load rate to protection action. The severity of the line load rate is then defined. Finally, a line state transition probability model is constructed that accounts for both direct and indirect adverse-weather factors. The effectiveness and correctness of the proposed method are verified by simulation on the IEEE 39-node system.
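The Poisson assumption for the first line fault gives a closed-form probability of at least one fault over an exposure interval. This is a generic sketch of that assumption with hypothetical parameter values, not the paper's calibrated model.

```python
import math

def line_failure_probability(rate_per_hour, hours):
    """P(at least one fault) when the fault count over the interval is
    Poisson with mean lambda = rate * hours: 1 - P(zero faults)."""
    lam = rate_per_hour * hours
    return 1.0 - math.exp(-lam)
```

For instance, a (hypothetical) adverse-weather fault rate of 0.01 faults/hour over a 10-hour storm gives a failure probability of 1 - e^{-0.1}, roughly 9.5%, and the probability is monotone in the exposure time as expected.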


