A Comparison of Two Logit Models in the Analysis of Qualitative Marketing Data

1979 ◽  
Vol 16 (4) ◽  
pp. 533-538 ◽  
Author(s):  
David Flath ◽  
E. W. Leonard

The authors compare the application of two logit models for the analysis of qualitative marketing data. A weighted least squares logit model is compared with a maximum likelihood logit model different from that mentioned by Green et al. Empirical applications are used to compare the models. Suggestions are presented for interpreting and reporting the results of logit-type models, with special attention to interaction effects.
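
As a rough illustration of the contrast between the two fitting strategies (a sketch on made-up grouped data, not the authors' models), the following code fits the same binary logit both by Berkson-style weighted least squares on empirical logits and by direct maximum likelihood:

```python
# Minimal sketch: WLS on empirical logits vs. ML for grouped binary data.
# All data and settings are hypothetical, for illustration only.
import numpy as np
from scipy.optimize import minimize

# Hypothetical grouped data: x = stimulus level, n = trials, y = successes.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
n = np.array([50, 50, 50, 50, 50])
y = np.array([8, 15, 26, 35, 44])

X = np.column_stack([np.ones_like(x), x])

# --- Weighted least squares on empirical logits (Berkson) ---
p = y / n
logit = np.log(p / (1 - p))
w = n * p * (1 - p)                  # inverse variance of an empirical logit
W = np.diag(w)
beta_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ logit)

# --- Maximum likelihood (binomial log-likelihood) ---
def negloglik(beta):
    mu = 1 / (1 + np.exp(-(X @ beta)))
    return -np.sum(y * np.log(mu) + (n - y) * np.log(1 - mu))

beta_ml = minimize(negloglik, beta_wls, method="BFGS").x

print("WLS logit:", beta_wls)
print("ML  logit:", beta_ml)
```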

2012 ◽  
Vol 04 (03) ◽  
pp. 1250019 ◽  
Author(s):  
STAN LIPOVETSKY

This work considers maximum likelihood objectives for estimating the probability of each multivariate observation's assignment to one particular cluster or to one or more clusters. Combining both objectives yields a maximization of the total probability odds of belonging to one or another cluster. The gradient of the total odds objective reduces to multinomial-logit probabilities, leading to a convenient Newton–Raphson clustering procedure implemented via an iteratively re-weighted least squares technique. Besides the total odds, several other new objectives are also considered, and numerical examples are discussed.
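
A minimal sketch of the flavor of such a procedure, not Lipovetsky's exact objective: cluster memberships take a multinomial-logit (softmax) form in the negative squared distances to the centers, and the centers are re-estimated as probability-weighted means in the spirit of iteratively re-weighted least squares. All data and settings are illustrative.

```python
# Soft clustering with multinomial-logit memberships and weighted-LS
# center updates; an illustrative sketch, not the paper's procedure.
import numpy as np

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 1, (100, 2))])

K, beta = 2, 1.0                        # number of clusters, logit sharpness
centers = X[rng.choice(len(X), K, replace=False)]

for _ in range(50):
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)  # squared dists
    logits = -beta * d2
    logits -= logits.max(axis=1, keepdims=True)                # stabilize
    P = np.exp(logits)
    P /= P.sum(axis=1, keepdims=True)          # multinomial-logit memberships
    centers = (P.T @ X) / P.sum(axis=0)[:, None]  # weighted-LS center update

print("estimated centers:\n", centers)
```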


2009 ◽  
Vol 12 (03) ◽  
pp. 297-317 ◽  
Author(s):  
ANOUAR BEN MABROUK ◽  
HEDI KORTAS ◽  
SAMIR BEN AMMOU

In this paper, fractionally integrated dynamics in the return and volatility series of stock market indices are investigated. The investigation uses wavelet ordinary least squares, wavelet weighted least squares, and an approximate maximum likelihood estimator. It is shown that the long memory property in stock returns is associated mainly with emerging markets rather than developed ones, while strong evidence of long-range dependence is found for all volatility series. The relevance of the wavelet-based estimators, especially the approximate maximum likelihood and weighted least squares techniques, is demonstrated in terms of stability and estimation accuracy.
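
As a hedged illustration of the wavelet regression idea (not the authors' estimators), the sketch below assumes the PyWavelets package and recovers the long-memory parameter d from the slope of a log2 wavelet-variance versus scale regression, by OLS and by WLS with weights proportional to the number of coefficients at each scale.

```python
# Wavelet log-variance regression for the long-memory parameter d.
# Illustrative sketch on a white-noise toy series (true d = 0).
import numpy as np
import pywt

rng = np.random.default_rng(1)
x = rng.normal(size=4096)              # toy stationary series; use returns here

coeffs = pywt.wavedec(x, "db4", level=7)
details = coeffs[1:][::-1]             # detail coefficients, finest scale first
j = np.arange(1, len(details) + 1)     # scale index
logvar = np.array([np.log2(np.mean(d ** 2)) for d in details])
nj = np.array([len(d) for d in details])

A = np.column_stack([np.ones_like(j, float), j])

# OLS slope
slope_ols = np.linalg.lstsq(A, logvar, rcond=None)[0][1]
# WLS slope, weights proportional to the number of coefficients per scale
W = np.diag(nj.astype(float))
slope_wls = np.linalg.solve(A.T @ W @ A, A.T @ W @ logvar)[1]

# Abry-Veitch style relation: for a stationary long-memory series the wavelet
# variance grows like 2^(2dj), so d-hat = slope / 2 (about 0 for white noise).
print("OLS d-hat:", slope_ols / 2, " WLS d-hat:", slope_wls / 2)
```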


2022 ◽  
Vol 7 (2) ◽  
pp. 2820-2839
Author(s):  
Saurabh L. Raikar ◽  
Dr. Rajesh S. Prabhu Gaonkar
The Jaya algorithm is a highly effective recent metaheuristic technique. This article presents a simple, precise, and fast method to estimate stress-strength reliability for a two-parameter Weibull distribution with a common scale parameter but different shape parameters. The three most widely used estimation methods, namely maximum likelihood estimation, least squares, and weighted least squares, are used, and a comparative analysis of their reliability estimates is presented. Simulation studies are carried out with different parameters and sample sizes to validate the proposed methodology, and the technique is also applied to real-life data to demonstrate its implementation. The results show that the proposed methodology's reliability estimates are close to the actual values and approach them more closely as the sample size increases, for all estimation methods. The Jaya algorithm with maximum likelihood estimation outperforms the other methods in terms of bias and mean squared error.
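
A minimal sketch of the approach described, on simulated data and assuming the standard Jaya update rule (move each candidate toward the best solution and away from the worst): the Weibull log-likelihood is maximized by Jaya, and the estimates are plugged into R = P(X > Y) by numerical integration. All settings are illustrative, not the article's code.

```python
# Jaya-driven maximum likelihood for Weibull stress-strength reliability.
import numpy as np
from scipy.integrate import quad

rng = np.random.default_rng(2)
a1, a2, lam = 2.5, 1.5, 1.0                  # true shapes and common scale
x = lam * rng.weibull(a1, 80)                # strength sample
y = lam * rng.weibull(a2, 80)                # stress sample

def negloglik(theta):
    s1, s2, l = theta
    if min(s1, s2, l) <= 0:
        return np.inf
    def nll(z, s):                           # Weibull(shape s, scale l)
        return -np.sum(np.log(s / l) + (s - 1) * np.log(z / l) - (z / l) ** s)
    return nll(x, s1) + nll(y, s2)

# Jaya: candidates drift toward the best member and away from the worst.
pop = rng.uniform(0.2, 4.0, (30, 3))
for _ in range(300):
    f = np.array([negloglik(p) for p in pop])
    best, worst = pop[f.argmin()], pop[f.argmax()]
    r1, r2 = rng.random((2, 30, 3))
    cand = pop + r1 * (best - np.abs(pop)) - r2 * (worst - np.abs(pop))
    fc = np.array([negloglik(c) for c in cand])
    keep = fc < f
    pop[keep] = cand[keep]

s1, s2, l = pop[np.array([negloglik(p) for p in pop]).argmin()]

# Stress-strength reliability R = P(X > Y) = integral of f_Y(t) * S_X(t).
fY = lambda t: (s2 / l) * (t / l) ** (s2 - 1) * np.exp(-(t / l) ** s2)
SX = lambda t: np.exp(-(t / l) ** s1)
R, _ = quad(lambda t: fY(t) * SX(t), 0, np.inf)
print("estimates:", s1, s2, l, " R-hat:", R)
```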


2018 ◽  
Vol 616 ◽  
pp. A95 ◽  
Author(s):  
Sebastian Espinosa ◽  
Jorge F. Silva ◽  
Rene A. Mendez ◽  
Rodrigo Lobos ◽  
Marcos Orchard

Context. Astrometry relies on the precise measurement of the positions and motions of celestial objects. Driven by the ever-increasing accuracy of astrometric measurements, it is important to critically assess the maximum precision that could be achieved with these observations. Aims. The problem of astrometry is revisited from the perspective of analyzing the attainability of well-known performance limits (the Cramér–Rao bound) for the estimation of the relative position of light-emitting (usually point-like) sources on a charge-coupled device (CCD)-like detector, using commonly adopted estimators such as weighted least squares and maximum likelihood. Methods. Novel technical results are presented to determine the performance of an estimator that corresponds to the solution of an optimization problem in the context of astrometry. Using these results we are able to place stringent bounds on the bias and the variance of the estimators in closed form as a function of the data. We confirm these results through comparisons with numerical simulations under a broad range of realistic observing conditions. Results. The maximum likelihood and the weighted least squares estimators are analyzed. We confirm the sub-optimality of the weighted least squares scheme at medium to high signal-to-noise ratios, as found in an earlier study for the (unweighted) least squares method. We find that the maximum likelihood estimator achieves optimal performance limits across a wide range of relevant observational conditions. Furthermore, from our results, we provide concrete insights for adopting an adaptive weighted least squares estimator that can be regarded as a computationally efficient alternative to the optimal maximum likelihood solution. Conclusions. We provide, for the first time, closed-form analytical expressions that bound the bias and the variance of the weighted least squares and maximum likelihood implicit estimators for astrometry using a Poisson-driven detector. These expressions can be used to formally assess the precision attainable by these estimators in comparison with the minimum variance bound.
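
The following toy sketch (an assumed one-dimensional model, not the paper's code) illustrates the comparison: a Gaussian-PSF source on a CCD-like pixel array with Poisson counts is located by Poisson maximum likelihood and by weighted least squares, and the empirical scatter of both estimators is compared with the Cramér–Rao bound.

```python
# 1-D astrometry toy model: ML vs. WLS position estimates under Poisson noise.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

pix = np.arange(32.0)                        # pixel centers
F, B, sigma, xc = 2000.0, 10.0, 1.5, 15.3    # flux, background, PSF width, truth

def lam(x):                                  # expected counts per pixel
    g = norm.cdf(pix + 0.5, x, sigma) - norm.cdf(pix - 0.5, x, sigma)
    return F * g + B

rng = np.random.default_rng(3)
est_ml, est_wls = [], []
for _ in range(500):
    counts = rng.poisson(lam(xc))
    nll = lambda x: np.sum(lam(x) - counts * np.log(lam(x)))   # Poisson ML
    w = 1.0 / np.maximum(counts, 1.0)        # fixed weights ~ 1 / variance
    wls = lambda x: np.sum(w * (counts - lam(x)) ** 2)
    est_ml.append(minimize_scalar(nll, bounds=(5, 27), method="bounded").x)
    est_wls.append(minimize_scalar(wls, bounds=(5, 27), method="bounded").x)

# Cramér-Rao bound for Poisson data: 1 / sum_i lam_i'(xc)^2 / lam_i(xc)
h = 1e-4
dlam = (lam(xc + h) - lam(xc - h)) / (2 * h)
crb = 1.0 / np.sum(dlam ** 2 / lam(xc))
print("CRB sigma:", np.sqrt(crb))
print("ML  sigma:", np.std(est_ml), " WLS sigma:", np.std(est_wls))
```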


2021 ◽  
Vol 16 (4) ◽  
pp. 251-260
Author(s):  
Marcos Vinicius de Oliveira Peres ◽  
Ricardo Puziol de Oliveira ◽  
Edson Zangiacomi Martinez ◽  
Jorge Alberto Achcar

In this paper, we evaluate via Monte Carlo simulations the sampling properties of estimators for the Sushila distribution, introduced by Shanker et al. (2013). We consider estimates obtained by six estimation methods: the well-known maximum likelihood, method of moments, and Bayesian approaches, and three less traditional methods: L-moments, ordinary least squares, and weighted least squares. As comparison criteria, biases and root mean-squared errors were computed across nine scenarios with sample sizes ranging from 30 to 300 (in steps of 30). In addition, we considered a simulation study and a real-data application to illustrate the applicability of the proposed estimators as well as the computation time needed to obtain the estimates; in this application, the Bayesian method was also considered. The aim of the study was to find an estimation method that could be considered a better alternative to, or at least interchangeable with, the traditional maximum likelihood method for small or large sample sizes and at low computational cost.
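
A condensed sketch of this kind of Monte Carlo comparison for three of the six methods (ML, OLS, WLS), taking as an assumption the scaled-Lindley form of the Sushila density f(x; α, θ) = θ²/(α(θ + 1)) (1 + x/α) e^(−θx/α); all settings are illustrative, not those of the paper.

```python
# Monte Carlo bias/RMSE comparison of ML, OLS, and WLS for the Sushila
# distribution (assumed scaled-Lindley form). Illustrative sketch only.
import numpy as np
from scipy.optimize import minimize

a0, t0, n, reps = 2.0, 1.5, 100, 200         # true (alpha, theta), size, reps
rng = np.random.default_rng(4)

def rsushila(size):          # X = alpha * Lindley(theta): exp/gamma mixture
    extra = rng.random(size) > t0 / (t0 + 1)
    return a0 * rng.gamma(1 + extra, 1 / t0)

def nll(p, x):               # negative log-likelihood
    a, t = p
    if a <= 0 or t <= 0:
        return np.inf
    return -np.sum(2*np.log(t) - np.log(a*(t+1)) + np.log1p(x/a) - t*x/a)

def cdf(x, a, t):
    return 1 - (1 + t*x/(a*(t+1))) * np.exp(-t*x/a)

def lsq(p, x, w):            # (weighted) least squares on the fitted CDF
    a, t = p
    if a <= 0 or t <= 0:
        return np.inf
    i = np.arange(1, len(x) + 1)
    return np.sum(w * (cdf(np.sort(x), a, t) - i/(len(x)+1))**2)

ranks = np.arange(1, n + 1)
w_wls = (n+1)**2 * (n+2) / (ranks * (n - ranks + 1))   # inverse-variance weights
res = {"ML": [], "OLS": [], "WLS": []}
for _ in range(reps):
    x = rsushila(n)
    res["ML"].append(minimize(nll, [1, 1], args=(x,), method="Nelder-Mead").x)
    res["OLS"].append(minimize(lsq, [1, 1], args=(x, 1.0), method="Nelder-Mead").x)
    res["WLS"].append(minimize(lsq, [1, 1], args=(x, w_wls), method="Nelder-Mead").x)

for k, v in res.items():
    v = np.array(v)
    print(k, "bias:", v.mean(0) - [a0, t0],
          "rmse:", np.sqrt(((v - [a0, t0])**2).mean(0)))
```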

