Maximizing Likelihood Function for Parameter Estimation in Point Clouds via Groebner Basis

Author(s):  
Joseph Awange ◽  
Béla Paláncz ◽  
Robert Lewis
Author(s):  
W. Nguatem ◽  
M. Drauschke ◽  
H. Mayer

In this paper, we present a fully automatic approach to localizing the outlines of facade objects (windows and doors) in 3D point clouds of facades. We introduce an approach that searches for the main facade wall and locates the facade objects within a probabilistic framework. Our search routine is based on Monte Carlo simulation (MC-simulation). Templates containing the control points of curves are used to approximate the possible shapes of windows and doors; these are interpolated using parametric B-spline curves. The templates are scored in a sliding-window fashion over the entire facade using a likelihood function in a probabilistic matching procedure. This produces many competing results, to which a two-layered model selection based on the Bayes factor is applied. A major thrust of our work is the introduction of a 2D shape-space of similar shapes under affine transformation for this architectural scene. It transforms the initial parametric B-spline curves representing the outlines of objects into curves of affine similarity in a strongly reduced dimensionality, thus facilitating the generation of competing hypotheses within the search space. A further computational speedup is achieved by clustering the search space into disjoint regions, enabling a parallel implementation. We obtain state-of-the-art results on self-acquired data sets. The robustness of our algorithm is evaluated on 3D point clouds from image matching and LiDAR data of diverse quality.
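The template-scoring idea can be sketched in a few lines; the following is a minimal illustration, not the authors' implementation, assuming a hypothetical rectangular window template and a simple Gaussian point-to-curve error model:

```python
import numpy as np
from scipy.interpolate import BSpline

# Hypothetical control points of a closed rectangular window outline (metres)
ctrl = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.5], [0.0, 1.5], [0.0, 0.0]])

def bspline_outline(ctrl, n=100, k=3):
    """Sample a parametric cubic B-spline through the template control points."""
    c = np.vstack([ctrl, ctrl[1:k + 1]])         # wrap points for a closed curve
    t = np.arange(len(c) + k + 1, dtype=float)   # uniform knot vector
    u = np.linspace(t[k], t[len(c)], n)
    return BSpline(t, c, k)(u)

def log_likelihood(outline, points, sigma=0.05):
    """Gaussian log-likelihood of facade points given a candidate outline."""
    d = np.min(np.linalg.norm(points[:, None, :] - outline[None, :, :], axis=2), axis=1)
    return float(np.sum(-0.5 * (d / sigma) ** 2))
```

In the paper the templates are additionally mapped into a low-dimensional affine shape-space before being scored in a sliding window; the scoring step itself then reduces to evaluating such a likelihood for each hypothesis.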


Author(s):  
Tim Loossens ◽  
Kristof Meers ◽  
Niels Vanhasbroeck ◽  
Nil Anarat ◽  
Stijn Verdonck ◽  
...  

Abstract. Computational modeling plays an important role in a gamut of research fields. In affect research, continuous-time stochastic models are becoming increasingly popular. Recently, a non-linear, continuous-time, stochastic model of affect dynamics has been introduced, called the Affective Ising Model (AIM). The drawback of non-linear models like the AIM is that they generally pose serious computational challenges for parameter estimation and related statistical analyses. The likelihood function of the AIM has no closed-form expression, so simulation-based or numerical methods must be used to evaluate it. Additionally, the likelihood function can have multiple local minima, so a global optimization heuristic is required, and such heuristics generally require a large number of likelihood function evaluations. In this paper, a Julia software package dedicated to fitting the AIM is introduced. The package includes an implementation of a numerical algorithm for fast computation of the likelihood function, which can run on both graphics processing units (GPUs) and central processing units (CPUs). The numerical method introduced in this paper is compared to the more traditional Euler-Maruyama method for solving stochastic differential equations. Furthermore, the estimation software is tested by means of a recovery study, and estimation times are reported for benchmarks run on several computing devices (two different GPUs and three different CPUs). According to these results, a single parameter estimation can be obtained in less than thirty seconds using a mainstream NVIDIA GPU.
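The Euler-Maruyama baseline against which the package's numerical method is compared can be sketched generically; here is a minimal example using a mean-reverting Ornstein-Uhlenbeck process as a stand-in for the (non-linear) AIM dynamics, with all rate and noise values illustrative:

```python
import numpy as np

def euler_maruyama(drift, sigma, x0, T, n, rng):
    """Simulate dX_t = drift(X_t) dt + sigma dW_t on [0, T] with n steps."""
    dt = T / n
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        x[i + 1] = x[i] + drift(x[i]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

# Ornstein-Uhlenbeck stand-in: affect reverts to a neutral baseline at rate 0.8
rng = np.random.default_rng(0)
path = euler_maruyama(lambda x: -0.8 * x, 0.5, 2.0, 10.0, 1000, rng)
```

Simulation-based likelihood evaluation repeats such paths many times per candidate parameter vector, which is what makes GPU parallelization attractive.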


2012 ◽  
Vol 4 (1) ◽  
pp. 185
Author(s):  
Irfan Wahyudi ◽  
Purhadi Purhadi ◽  
Sutikno Sutikno ◽  
Irhamah Irhamah

Multivariate Cox proportional hazard models have the ratio property: the ratio of the hazard functions for two individuals with covariate vectors z1 and z2 is constant (time-independent). In this study we discuss parameter estimation for the multivariate Cox model using the Maximum Partial Likelihood Estimation (MPLE) method. To determine the estimators that maximize the ln-partial likelihood function, a score vector and a Hessian matrix are derived, after which numerical iteration methods are applied; here we use the Newton-Raphson method. This numerical method is needed because the system obtained by setting the score vector equal to the zero vector has no closed-form solution. Studies of the multivariate Cox model, including its parameter estimation methods, are limited, yet such methods are urgently needed in related fields such as economics, engineering, and the medical sciences. For these reasons, this study aims to extend parameter estimation methods from the univariate to the multivariate case.
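For the univariate case with a single covariate and no tied event times, the Newton-Raphson iteration on the log partial likelihood can be sketched as follows; this is an illustrative reduction of the idea, not the multivariate estimator of the paper:

```python
import numpy as np

def cox_newton(times, events, z, beta0=0.0, tol=1e-8, max_iter=50):
    """Newton-Raphson for the univariate Cox partial likelihood (no ties)."""
    order = np.argsort(times)
    z = np.asarray(z, float)[order]
    events = np.asarray(events, bool)[order]
    beta = beta0
    for _ in range(max_iter):
        score, hess = 0.0, 0.0
        for i in range(len(z)):
            if not events[i]:
                continue
            risk = z[i:]                          # risk set: still under observation
            w = np.exp(beta * risk)
            zbar = np.sum(w * risk) / np.sum(w)   # weighted mean covariate
            score += z[i] - zbar                  # score contribution
            hess -= np.sum(w * risk ** 2) / np.sum(w) - zbar ** 2  # -variance
        step = score / hess
        beta -= step                              # Newton update: beta - l'/l''
        if abs(step) < tol:
            break
    return beta
```

The multivariate extension replaces the scalar score and Hessian with the score vector and Hessian matrix of the ln-partial likelihood, but the iteration has the same structure.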


2018 ◽  
Vol 2018 ◽  
pp. 1-9
Author(s):  
Xinting Zhai ◽  
Jixin Wang ◽  
Jinshi Chen

Due to the harsh working environment of construction machinery, a simple distribution cannot approximate the shape of the rainflow matrix. In this paper, the Weibull-normal (W-n) mixture distribution is used. The lowest Akaike information criterion (AIC) value is employed to determine the number of components in the mixture. A parameter estimation method based on optimization is proposed: it estimates the parameters of the mixture by maximizing the log-likelihood function (LLF) with an intelligent optimization algorithm (IOA), the genetic algorithm (GA). To verify the performance of the proposed method, one of the existing methods is applied in a simulation study and a practical case study, and the fits of the resulting distributions are compared using the AIC and chi-square (χ²) values. It can be concluded that the proposed method is feasible and effective for parameter estimation of the mixture distribution.
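The optimization-based fitting idea can be sketched with SciPy's differential evolution, an evolutionary algorithm standing in here for the GA of the paper, maximizing the LLF of a two-component W-n mixture on synthetic data; all parameter values and bounds are illustrative:

```python
import numpy as np
from scipy.optimize import differential_evolution
from scipy.stats import norm, weibull_min

def neg_llf(theta, x):
    """Negative log-likelihood of a Weibull-normal two-component mixture."""
    w, k, lam, mu, sig = theta
    pdf = w * weibull_min.pdf(x, k, scale=lam) + (1 - w) * norm.pdf(x, mu, sig)
    return -np.sum(np.log(pdf + 1e-300))          # guard against log(0)

# Synthetic sample: 60% Weibull(shape 2, scale 1) + 40% Normal(4, 0.5)
rng = np.random.default_rng(1)
x = np.concatenate([weibull_min.rvs(2.0, scale=1.0, size=300, random_state=rng),
                    norm.rvs(4.0, 0.5, size=200, random_state=rng)])
bounds = [(0.05, 0.95), (0.5, 5.0), (0.2, 5.0), (2.0, 6.0), (0.1, 2.0)]
res = differential_evolution(neg_llf, bounds, args=(x,), seed=0)
```

Minimizing the negative LLF over box-constrained parameters is exactly the optimization problem the paper hands to the GA; the AIC then compares fits with different component counts.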


2019 ◽  
Vol 2019 ◽  
pp. 1-8 ◽  
Author(s):  
Fan Yang ◽  
Hu Ren ◽  
Zhili Hu

Maximum likelihood estimation is a widely used approach to parameter estimation. However, conventional algorithms make the estimation procedure for the three-parameter Weibull distribution difficult. This paper therefore proposes an evolutionary strategy to explore good solutions based on the maximum likelihood method. The maximization of the likelihood function is converted into an optimization problem, and an evolutionary algorithm is employed to obtain the optimal parameters of the likelihood function. Examples are presented to demonstrate the proposed method. The results show that the proposed method is suitable for parameter estimation of the three-parameter Weibull distribution.
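The conversion of the three-parameter Weibull MLE into an optimization problem can be sketched as below, again with differential evolution as an illustrative evolutionary algorithm; the true parameters, sample size, and bounds are assumptions for the demo. The location parameter is constrained to lie below the smallest observation, which is what makes conventional derivative-based algorithms fragile here:

```python
import numpy as np
from scipy.optimize import differential_evolution
from scipy.stats import weibull_min

def neg_llf(theta, x):
    """Negative log-likelihood of the three-parameter Weibull distribution."""
    k, lam, gamma = theta
    if gamma >= x.min():                  # location must lie below every sample
        return 1e12
    return -np.sum(weibull_min.logpdf(x, k, loc=gamma, scale=lam))

# Synthetic sample with shape 1.8, scale 2.0, location 5.0
x = weibull_min.rvs(1.8, loc=5.0, scale=2.0, size=400,
                    random_state=np.random.default_rng(3))
bounds = [(0.2, 10.0), (0.1, 10.0), (0.0, float(x.min()))]
res = differential_evolution(neg_llf, bounds, args=(x,), seed=1)
```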


2014 ◽  
Vol 556-562 ◽  
pp. 4146-4150
Author(s):  
Shu Meng ◽  
Gui Xiang Shen ◽  
Ying Zhi Zhang ◽  
Shu Guang Sun ◽  
Qi Song

In this paper, the parameter estimation problem for mutually independent products whose lifetimes follow a two-parameter Weibull distribution is discussed for fixed-time censoring experiments. The ranks of the failure data are corrected by the average rank time method when censoring occurs. It is found that, when the model is optimized by the correlation index method, this approach not only achieves the same effect as likelihood function theory, but also offers high precision and a simple procedure with no programmed computation. Finally, field test data from a machine tool are used as an example to illustrate the specific application of this method and to verify its effectiveness and practical applicability.
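An average-rank correction for suspended (censored) units, followed by a median-rank regression fit of the two-parameter Weibull, can be sketched as below. This is the standard Johnson/Bernard recipe, offered as an illustration of the rank-correction idea rather than the paper's exact procedure:

```python
import numpy as np

def weibull_rank_fit(times, failed):
    """Two-parameter Weibull fit: average-rank correction for censored units,
    Bernard's median ranks, then least-squares regression on the Weibull plot."""
    order = np.argsort(times)
    t = np.asarray(times, float)[order]
    f = np.asarray(failed, bool)[order]
    n = len(t)
    prev = 0.0
    ts, ranks = [], []
    for i in range(n):
        if f[i]:
            prev += (n + 1 - prev) / (n + 1 - i)   # average-rank increment
            ts.append(t[i])
            ranks.append(prev)
    F = (np.array(ranks) - 0.3) / (n + 0.4)        # Bernard's median ranks
    x, y = np.log(ts), np.log(-np.log(1 - F))      # Weibull probability plot
    k, b = np.polyfit(x, y, 1)                     # y = k ln t - k ln(scale)
    return k, np.exp(-b / k)
```

With no censored units the corrected ranks reduce to the ordinary ranks 1..n, so the method degrades gracefully to the complete-sample case.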


2020 ◽  
Vol 9 (4) ◽  
pp. 255
Author(s):  
Hua Liu ◽  
Xiaoming Zhang ◽  
Yuancheng Xu ◽  
Xiaoyong Chen

The degree of automation and efficiency are among the most important factors that influence the usability of Terrestrial light detection and ranging (LiDAR) Scanning (TLS) registration algorithms. This paper proposes an Ortho Projected Feature Images (OPFI) based 4 Degrees of Freedom (DOF) coarse registration method, fully automated and highly efficient, for TLS point clouds acquired using leveled or inclination-compensated LiDAR scanners. The proposed 4DOF registration algorithm decomposes the parameter estimation into two parts: (1) estimation of the horizontal translation vector and the azimuth angle; and (2) estimation of the vertical translation vector. The horizontal translation vector and the azimuth angle are estimated by ortho projecting the TLS point clouds into feature images and registering these images using Scale Invariant Feature Transform (SIFT) key points and descriptors. The vertical translation vector is estimated from the height differences of source and target points in the overlapping regions after horizontal alignment. Three real TLS datasets captured by the Riegl VZ-400 and the Trimble SX10 and one simulated dataset were used to validate the proposed method, which was compared with four state-of-the-art 4DOF registration methods. The experimental results showed that: (1) the accuracy of the proposed coarse registration method ranges from 0.02 m to 0.07 m in the horizontal and 0.01 m to 0.02 m in elevation, which is at centimeter level and sufficient for fine registration; and (2) as many as 120 million points can be registered in less than 50 s, which is much faster than the compared methods.
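The two-stage 4DOF decomposition can be illustrated on matched points: a closed-form rotation-plus-translation fit in the horizontal plane (here from already-matched 2D keypoints, standing in for the SIFT matching on the ortho projected feature images) and a robust height offset for the vertical component:

```python
import numpy as np

def estimate_4dof(src_xy, tgt_xy, src_z, tgt_z):
    """4DOF estimate: azimuth + horizontal translation by 2D Procrustes on
    matched keypoints, vertical shift from height differences in the overlap."""
    cs, ct = src_xy.mean(0), tgt_xy.mean(0)
    H = (src_xy - cs).T @ (tgt_xy - ct)               # cross-covariance matrix
    theta = np.arctan2(H[0, 1] - H[1, 0], H[0, 0] + H[1, 1])  # optimal azimuth
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    t = ct - R @ cs                                   # horizontal translation
    dz = np.median(tgt_z - src_z)                     # robust vertical offset
    return theta, t, dz
```

The closed-form azimuth is the least-squares rotation between the centered match sets; with leveled scanners, roll and pitch are fixed by the instrument, which is what reduces the problem to these four parameters.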


2008 ◽  
Vol 8 (2) ◽  
pp. 335-347 ◽  
Author(s):  
E. Tanir ◽  
K. Felsenstein ◽  
M. Yalcinkaya

Abstract. In order to investigate the deformations of an area or an object, geodetic observations are repeated at different time epochs, and the observations of each period are adjusted independently. From the coordinate differences between the epochs, the input parameters of a deformation model are estimated. The decision about the deformation is made by appropriate models using the parameter estimation results from each observation period. We therefore have to be sure that we use accurately taken observations (assessing the quality of the observations) and an appropriate mathematical model both for the adjustment of the period measurements and for the deformation modelling (Caspary, 2000). All inaccuracies of the model, especially systematic and gross errors in the observations, as well as incorrectly evaluated a priori variances, will contaminate the results and lead to apparent deformations. Therefore, it is of prime importance to employ all known methods that can contribute to the development of a realistic model. In Albertella et al. (2005), a new testing procedure for deformation analysis from the Bayesian point of view was developed by taking into consideration prior information about the displacements in the case that the estimated displacements are small w.r.t. (with respect to) the measurement precision. Within our study, we introduce an additional parameter estimation from the Bayesian point of view for a deformation monitoring network constructed for landslide monitoring in Macka in the province of Trabzon in north-eastern Turkey. We used least-squares (LSQ) parameter estimation results to set up the prior information for this additional parameter estimation procedure. Bayesian inference allows the probability of an event to be evaluated from available prior evidence and collected observations. Bayes' theorem states that the observations modify, through the likelihood function, the prior knowledge of the parameters, thus leading to the posterior density function of the parameters themselves.
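The prior-to-posterior update that Bayes' theorem describes has a closed form in the conjugate normal case; here is a minimal sketch for a single displacement parameter, with all numbers illustrative rather than taken from the Macka network:

```python
import numpy as np

def normal_posterior(prior_mean, prior_var, obs, obs_var):
    """Conjugate normal-normal update: the likelihood of the observations
    modifies the prior into the posterior (mean and variance)."""
    n = len(obs)
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mean = post_var * (prior_mean / prior_var + np.sum(obs) / obs_var)
    return post_mean, post_var

# Prior centred at zero displacement (mm); observations suggest roughly 3 mm
mean, var = normal_posterior(0.0, 4.0, np.array([2.8, 3.1, 3.3]), 1.0)
```

The posterior mean is pulled from the prior toward the data in proportion to their precisions, which is exactly the mechanism used when LSQ results seed the prior for the additional Bayesian estimation.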


2019 ◽  
Vol 62 (4) ◽  
pp. 941-949
Author(s):  
Junwei Tan ◽  
Qingyun Duan

Abstract. The Generalized Likelihood Uncertainty Estimation (GLUE) method is one of the most popular methods for parameter estimation and uncertainty analysis, although numerous studies have criticized it for certain drawbacks. In this study, we performed an uncertainty analysis of the ORYZA_V3 model using the GLUE method integrated with Latin hypercube sampling (LHS). Different likelihood measures were examined to understand the differences in the derived posterior parameter distributions and the uncertainty estimates of the model predictions, based on a variety of observations from field experiments. The results indicated that the posterior parameter distributions and the 95% confidence intervals (95CI) of the model outputs were very sensitive to the choice of likelihood measure, as well as to the weights assigned to observations at different dates and to different observation types within a likelihood measure. The likelihood measure using a proper likelihood function based on a normal distribution of model errors, combined across observations by multiplication, performed best with respect to reducing the uncertainties of parameter values and model predictions. Moreover, the means and standard deviations of the observation replicates were sufficient to construct an effective likelihood function in the GLUE method. This study highlights the importance of using appropriate likelihood measures integrated with multiple observation types in the GLUE method. Keywords: GLUE, Likelihood measures, Model uncertainty, Crop model.
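A minimal GLUE loop with LHS sampling can be sketched as follows; the toy linear "model", Gaussian-error likelihood, and behavioural threshold are illustrative assumptions, not the ORYZA_V3 setup:

```python
import numpy as np
from scipy.stats import qmc

def glue(model, bounds, obs, n=2000, sigma=1.0, cutoff=0.05, seed=0):
    """Minimal GLUE: LHS parameter sampling, Gaussian-error likelihood,
    behavioural cut-off at a fraction of the best likelihood, and
    likelihood-weighted 95% prediction bounds for each output."""
    lo, hi = np.array(bounds, float).T
    theta = qmc.scale(qmc.LatinHypercube(d=len(bounds), seed=seed).random(n), lo, hi)
    sims = np.array([model(t) for t in theta])
    L = np.exp(-0.5 * np.sum((sims - obs) ** 2, axis=1) / sigma ** 2)
    keep = L >= cutoff * L.max()                  # behavioural parameter sets
    w = L[keep] / L[keep].sum()
    ci = []
    for j in range(sims.shape[1]):                # weighted 2.5/97.5% quantiles
        order = np.argsort(sims[keep][:, j])
        cdf = np.cumsum(w[order])
        col = sims[keep][order, j]
        ci.append((col[np.searchsorted(cdf, 0.025)],
                   col[np.searchsorted(cdf, 0.975)]))
    return theta[keep], np.array(ci)

# Toy model: straight line evaluated at four "observation dates"
xgrid = np.array([0.0, 1.0, 2.0, 3.0])
obs = 1.5 * xgrid + 0.5                           # observations from (1.5, 0.5)
post, ci = glue(lambda t: t[0] * xgrid + t[1], [(0.0, 3.0), (-1.0, 2.0)], obs)
```

Swapping the Gaussian-error likelihood for another measure, or changing how per-date likelihoods are combined, changes `L` and therefore both the behavioural set and the 95CI bands, which is the sensitivity the study examines.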

