Modeling in Forestry Using Mixture Models Fitted to Grouped and Ungrouped Data

Forests ◽  
2021 ◽  
Vol 12 (9) ◽  
pp. 1196
Author(s):  
Eric K. Zenner ◽  
Mahdi Teimouri

The creation and maintenance of complex forest structures has become an important forestry objective. Complex forest structures, often expressed in multimodal shapes of tree size/diameter (DBH) distributions, are challenging to model. Mixture probability density functions of two- or three-component gamma, log-normal, and Weibull mixture models offer a solution and can additionally provide insights into forest dynamics. Model parameters can be efficiently estimated with the maximum likelihood (ML) approach using iterative methods such as the Newton-Raphson (NR) algorithm. However, the NR algorithm is sensitive to the choice of initial values and does not always converge. As an alternative, we explored the use of the iterative expectation-maximization (EM) algorithm for estimating parameters of the aforementioned mixture models, because each of its iterations is guaranteed not to decrease the likelihood and it therefore converges reliably to ML estimates. Since forestry data frequently occur both in grouped (classified) and ungrouped (raw) forms, the EM algorithm was applied to explore the goodness-of-fit of the gamma, log-normal, and Weibull mixture distributions in three sample plots that exhibited irregular, multimodal, highly skewed, and heavy-tailed DBH distributions in which some size classes were empty. The EM-based goodness-of-fit was further compared against a nonparametric kernel-based density estimation (NK) model and the recently popularized gamma-shaped mixture (GSM) models using the ungrouped data. In this example application, the EM algorithm provided well-fitting two- or three-component mixture models for all three model families. The number of components of the best-fitting models differed among the three sample plots (but not among model families), and the mixture models of the log-normal and gamma families provided a better fit than the Weibull distribution for grouped and ungrouped data. For ungrouped data, both log-normal and gamma mixture distributions outperformed the GSM model and, with the exception of the multimodal diameter distribution, also the NK model. The EM algorithm appears to be a promising tool for modeling complex forest structures.
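To make the estimation step concrete, the following sketch (not the authors' code; the function name, starting values, and the restriction to ungrouped data are illustrative assumptions) fits a two-component log-normal mixture to raw DBH values with EM. Because a log-normal mixture on the original scale is a Gaussian mixture on the log scale, both EM steps have closed forms; grouped (classified) data would additionally require integrating the component densities over the diameter-class limits.

```python
# Minimal sketch: EM for a two-component log-normal mixture fitted to
# ungrouped DBH values (illustrative only, not the paper's implementation).
import numpy as np

def em_lognormal_mixture(dbh, n_components=2, n_iter=200, tol=1e-8, seed=0):
    rng = np.random.default_rng(seed)
    x = np.log(np.asarray(dbh, dtype=float))      # work on the log scale
    n, k = x.size, n_components
    # crude initial values: random draws for the means, pooled spread
    w = np.full(k, 1.0 / k)
    mu = rng.choice(x, k, replace=False)
    sigma = np.full(k, x.std())
    prev_ll = -np.inf
    for _ in range(n_iter):
        # E-step: responsibilities under the current Gaussian components (log scale)
        dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        joint = w * dens
        ll = np.log(joint.sum(axis=1)).sum()
        resp = joint / joint.sum(axis=1, keepdims=True)
        # M-step: weighted means and standard deviations
        nk = resp.sum(axis=0)
        w = nk / n
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
        if ll - prev_ll < tol:
            break
        prev_ll = ll
    return w, mu, sigma, ll

# The fitted DBH density is sum_j w[j] * lognormal pdf with log-mean mu[j] and
# log-sd sigma[j], e.g. scipy.stats.lognorm(s=sigma[j], scale=np.exp(mu[j])).
```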

2021 ◽  
Author(s):  
Samyajoy Pal ◽  
Christian Heumann

A generalized way of building mixture models using different distributions is explored in this article. The EM algorithm is used, with some modifications, to accommodate different distributions within the same model. The model uses any point estimate available for the respective distributions to estimate the mixture components and model parameters. The study focuses on the application of mixture models to unsupervised learning problems, especially cluster analysis. The convenience of building mixture models with the generalized approach is further emphasised through appropriate examples that exploit the well-known maximum likelihood and Bayesian estimates of the parameters of the parent distributions.
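As a rough illustration of the idea (not the authors' implementation; the choice of a normal plus an exponential component, the starting values, and the assumption of non-negative data are all illustrative), the sketch below runs EM for a two-component mixture whose components come from different families, plugging the closed-form weighted maximum-likelihood point estimates of each family into the M-step.

```python
# Sketch: EM for a heterogeneous two-component mixture (normal + exponential).
# Assumes non-negative data so the exponential component is well defined.
import numpy as np
from scipy.stats import norm, expon

def em_normal_exponential(x, n_iter=200, tol=1e-8):
    x = np.asarray(x, dtype=float)
    # initial values (assumed, for illustration)
    w = np.array([0.5, 0.5])
    mu, sd = x.mean(), x.std()
    rate = 1.0 / x.mean()
    prev_ll = -np.inf
    for _ in range(n_iter):
        # E-step: responsibilities computed from the two different densities
        dens = np.column_stack([norm.pdf(x, mu, sd), expon.pdf(x, scale=1.0 / rate)])
        joint = w * dens
        ll = np.log(joint.sum(axis=1)).sum()
        resp = joint / joint.sum(axis=1, keepdims=True)
        # M-step: family-specific weighted ML point estimates
        nk = resp.sum(axis=0)
        w = nk / x.size
        mu = (resp[:, 0] * x).sum() / nk[0]
        sd = np.sqrt((resp[:, 0] * (x - mu) ** 2).sum() / nk[0])
        rate = nk[1] / (resp[:, 1] * x).sum()
        if ll - prev_ll < tol:
            break
        prev_ll = ll
    return w, (mu, sd), rate, ll
```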


2012 ◽  
Vol 532-533 ◽  
pp. 1445-1449
Author(s):  
Ting Ting Tong ◽  
Zhen Hua Wu

The EM algorithm is a common method for estimating mixture model parameters in the statistical classification of remote sensing images. This paper presents a fuzzified EM algorithm in which each training sample is represented by a fuzzy set. Through membership-degree weighting, samples contribute differently during the iterations, which reduces the impact of noise on parameter learning and increases the convergence rate of the algorithm. As a result, image data can be classified more reliably and accurately.
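A minimal sketch of the weighting idea, under the assumption of a one-dimensional Gaussian mixture with externally supplied membership degrees (the paper's exact fuzzification is not reproduced here): each sample carries a fuzzy weight that scales its contribution to the M-step sums, so noisy samples influence the parameter estimates less.

```python
# Sketch: EM for a 1-D Gaussian mixture with per-sample fuzzy weights u_i in (0, 1].
import numpy as np

def weighted_em_gmm(x, u, k=2, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x, float)
    u = np.asarray(u, float)              # fuzzy membership weight of each sample
    w = np.full(k, 1.0 / k)
    mu = rng.choice(x, k, replace=False)
    var = np.full(k, x.var())
    for _ in range(n_iter):
        # E-step: usual Gaussian responsibilities
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = w * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: every sum is additionally weighted by the fuzzy memberships u
        wr = u[:, None] * resp
        nk = wr.sum(axis=0)
        w = nk / nk.sum()
        mu = (wr * x[:, None]).sum(axis=0) / nk
        var = (wr * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var
```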


2018 ◽  
Vol 15 (03) ◽  
pp. 1850012 ◽  
Author(s):  
Andrzej Polanski ◽  
Michal Marczyk ◽  
Monika Pietrowska ◽  
Piotr Widlak ◽  
Joanna Polanska

Setting initial values of parameters of mixture distributions estimated by the recursive EM algorithm is very important to the overall quality of estimation. None of the existing methods are suitable for heteroscedastic mixtures with a large number of components. We present a novel methodology for estimating the initial values of parameters of univariate, heteroscedastic Gaussian mixtures, based on dynamic programming partitioning of the range of observations into bins. We evaluate variants of the dynamic programming method corresponding to different scoring functions for partitioning. We demonstrate the superior efficiency of the proposed method compared to existing techniques on both simulated and real datasets.
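The sketch below illustrates one possible version of such an initialization (the scoring function, within-bin sum of squares, is an assumption rather than the paper's exact criterion): dynamic programming splits the sorted observations into k contiguous bins, and each bin supplies the initial weight, mean, and standard deviation of one heteroscedastic Gaussian component.

```python
# Sketch: DP partitioning of sorted 1-D data into k bins (O(k * n^2), fine for
# moderate n) used to produce initial values for a Gaussian mixture.
import numpy as np

def dp_initial_values(x, k):
    x = np.sort(np.asarray(x, float))
    n = x.size
    s1 = np.concatenate([[0.0], np.cumsum(x)])        # prefix sums
    s2 = np.concatenate([[0.0], np.cumsum(x ** 2)])

    def sse(i, j):                                     # within-bin SSE of x[i..j]
        m = j - i + 1
        s = s1[j + 1] - s1[i]
        return (s2[j + 1] - s2[i]) - s * s / m

    cost = np.full((k, n), np.inf)                     # cost[c, j]: best score, c+1 bins over x[0..j]
    cut = np.zeros((k, n), dtype=int)
    for j in range(n):
        cost[0, j] = sse(0, j)
    for c in range(1, k):
        for j in range(c, n):
            for i in range(c, j + 1):                  # last bin covers x[i..j]
                val = cost[c - 1, i - 1] + sse(i, j)
                if val < cost[c, j]:
                    cost[c, j], cut[c, j] = val, i
    # backtrack the bin boundaries and turn each bin into one mixture component
    bounds, j = [], n - 1
    for c in range(k - 1, 0, -1):
        i = cut[c, j]
        bounds.append(i)
        j = i - 1
    bounds = [0] + bounds[::-1] + [n]
    w, mu, sd = [], [], []
    for b0, b1 in zip(bounds[:-1], bounds[1:]):
        seg = x[b0:b1]
        w.append(seg.size / n)
        mu.append(seg.mean())
        # fall back to a pooled spread if a bin contains a single observation
        sd.append(seg.std() if seg.size > 1 else x.std() / k)
    return np.array(w), np.array(mu), np.array(sd)
```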


Sensors ◽  
2021 ◽  
Vol 21 (16) ◽  
pp. 5549
Author(s):  
Ossi Kaltiokallio ◽  
Roland Hostettler ◽  
Hüseyin Yiğitler ◽  
Mikko Valkama

Received signal strength (RSS) changes of static wireless nodes can be used for device-free localization and tracking (DFLT). Most RSS-based DFLT systems require access to calibration data, either RSS measurements from a time period when the area was not occupied by people, or measurements while a person stands in known locations. Such calibration periods can be very expensive in terms of time and effort, making system deployment and maintenance challenging. This paper develops an Expectation-Maximization (EM) algorithm based on Gaussian smoothing for estimating the unknown RSS model parameters, liberating the system from supervised training and calibration periods. To fully use the EM algorithm’s potential, a novel localization-and-tracking system is presented to estimate a target’s arbitrary trajectory. To demonstrate the effectiveness of the proposed approach, it is shown that: (i) the system requires no calibration period; (ii) the EM algorithm improves the accuracy of existing DFLT methods; (iii) it is computationally very efficient; and (iv) the system outperforms a state-of-the-art adaptive DFLT system in terms of tracking accuracy.
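As a greatly simplified illustration of EM combined with Gaussian smoothing (the scalar random-walk model below merely stands in for the paper's RSS measurement model and is an assumption): the E-step runs a Kalman filter followed by a Rauch-Tung-Striebel smoother, and the M-step updates the process and measurement noise variances in closed form from the smoothed moments.

```python
# Sketch: EM for the scalar linear-Gaussian model x_t = x_{t-1} + w_t, y_t = x_t + v_t,
# estimating the noise variances q = Var(w_t) and r = Var(v_t).
import numpy as np

def em_random_walk(y, q=1.0, r=1.0, n_iter=50, m0=0.0, p0=10.0):
    y = np.asarray(y, float)
    T = y.size
    for _ in range(n_iter):
        # E-step, forward pass: Kalman filter
        mf, pf = np.zeros(T), np.zeros(T)             # filtered mean / variance
        mp, pp = np.zeros(T), np.zeros(T)             # predicted mean / variance
        m, p = m0, p0
        for t in range(T):
            mp[t], pp[t] = m, p + q                   # predict
            g = pp[t] / (pp[t] + r)                   # Kalman gain
            m = mp[t] + g * (y[t] - mp[t])            # update
            p = (1.0 - g) * pp[t]
            mf[t], pf[t] = m, p
        # E-step, backward pass: RTS smoother
        ms, ps = mf.copy(), pf.copy()
        cs = np.zeros(T)                              # lag-one smoothed cross-covariance
        for t in range(T - 2, -1, -1):
            a = pf[t] / pp[t + 1]                     # smoother gain
            ms[t] = mf[t] + a * (ms[t + 1] - mp[t + 1])
            ps[t] = pf[t] + a ** 2 * (ps[t + 1] - pp[t + 1])
            cs[t + 1] = a * ps[t + 1]                 # Cov(x_{t+1}, x_t | y_{1:T})
        # M-step: closed-form noise variance updates from smoothed moments
        q = np.mean(ps[1:] + ms[1:] ** 2 + ps[:-1] + ms[:-1] ** 2
                    - 2.0 * (cs[1:] + ms[1:] * ms[:-1]))
        r = np.mean((y - ms) ** 2 + ps)
    return q, r, ms
```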


2000 ◽  
Vol 21 (8) ◽  
pp. 759-769 ◽  
Author(s):  
Aleix M. Martínez ◽  
Jordi Vitrià

2016 ◽  
Vol 12 (1) ◽  
pp. 65-77
Author(s):  
Michael D. Regier ◽  
Erica E. M. Moodie

We propose an extension of the EM algorithm that exploits the common assumption of unique parameterization, corrects for biases due to missing data and measurement error, converges for the specified model when a standard implementation of the EM algorithm has a low probability of convergence, and reduces a potentially complex algorithm into a sequence of smaller, simpler, self-contained EM algorithms. We use the theory surrounding the EM algorithm to derive the theoretical results of our proposal, showing that an optimal solution over the parameter space is obtained. A simulation study is used to explore the finite-sample properties of the proposed extension when there are missing data and measurement error. We observe that partitioning the EM algorithm into simpler steps may provide better bias reduction in the estimation of model parameters. The ability to break a complicated problem down into a series of simpler, more accessible problems will permit a broader implementation of the EM algorithm, allow the use of software packages that already implement and/or automate it, and make the EM algorithm accessible to a wider and more general audience.


2019 ◽  
Vol 2019 ◽  
pp. 1-10 ◽  
Author(s):  
Yupeng Li ◽  
Jianhua Zhang ◽  
Ruisi He ◽  
Lei Tian ◽  
Hewen Wei

In this paper, the Gaussian mixture model (GMM) is introduced for channel multipath clustering. In the GMM framework, the expectation-maximization (EM) algorithm is usually utilized to estimate the model parameters; however, EM frequently converges to a local optimum. To address this issue, a hybrid differential evolution and EM (DE-EM) algorithm is proposed in this paper. Specifically, DE is employed to initialize the GMM parameters, which are then estimated with the EM algorithm. Thanks to the global searching ability of DE, the proposed hybrid DE-EM algorithm is more likely to reach the global optimum. Simulations demonstrate that the proposed DE-EM clustering algorithm can significantly improve clustering performance.
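A rough sketch of the two-stage scheme for a one-dimensional, two-component Gaussian mixture (the parameter bounds, settings, and the use of SciPy's differential_evolution are illustrative assumptions; the paper's model is a multivariate GMM for multipath clusters): differential evolution searches globally for high-likelihood starting values, and standard EM then refines them.

```python
# Sketch: DE-initialized EM for a 1-D two-component Gaussian mixture.
import numpy as np
from scipy.optimize import differential_evolution

def neg_log_likelihood(theta, x):
    w, m1, m2, s1, s2 = theta
    pdf = lambda m, s: np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    mix = w * pdf(m1, s1) + (1.0 - w) * pdf(m2, s2)
    return -np.log(mix + 1e-300).sum()

def de_em_gmm(x, n_em_iter=100):
    x = np.asarray(x, float)
    lo, hi, s = x.min(), x.max(), x.std()
    bounds = [(0.05, 0.95), (lo, hi), (lo, hi), (0.05 * s, 2 * s), (0.05 * s, 2 * s)]
    # Stage 1: differential evolution supplies the initial parameters
    res = differential_evolution(neg_log_likelihood, bounds, args=(x,), seed=0)
    w1, m1, m2, s1, s2 = res.x
    w, mu, sd = np.array([w1, 1.0 - w1]), np.array([m1, m2]), np.array([s1, s2])
    # Stage 2: EM refinement starting from the DE solution
    for _ in range(n_em_iter):
        dens = np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
        resp = w * dens
        resp /= resp.sum(axis=1, keepdims=True)
        nk = resp.sum(axis=0)
        w = nk / x.size
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sd
```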

