confidence regions
Recently Published Documents

TOTAL DOCUMENTS: 664 (last five years: 90)
H-INDEX: 42 (last five years: 2)

2022 ◽  
Vol 9 ◽  
Author(s):  
Xiuzhen Zhang ◽  
Riquan Zhang ◽  
Zhiping Lu

This article develops two new empirical likelihood methods for long-memory time series models, based on adjusted empirical likelihood and mean empirical likelihood. Applying the Whittle likelihood yields a score function that can be viewed as an estimating equation for the parameters of the long-memory time series model. The resulting empirical likelihood ratio is shown to be asymptotically chi-square distributed and can be used to construct confidence regions. By adding pseudo samples, we simultaneously remove the cases in which the original empirical likelihood is undefined and improve the coverage probability. Finite-sample properties of the empirical likelihood confidence regions are explored through Monte Carlo simulation, and some real-data applications are presented.
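
A minimal sketch of the adjusted empirical likelihood idea, assuming the simplest univariate estimating equation (the mean) rather than the authors' Whittle-likelihood score: one pseudo-sample is appended so the likelihood ratio is always defined, and the region is calibrated by its chi-square limit. Function names and the toy data are illustrative only.

import numpy as np
from scipy.optimize import brentq
from scipy.stats import chi2

def el_log_ratio(z):
    # -2 log empirical likelihood ratio for E[z_i] = 0, via the usual
    # Lagrange-multiplier profiling (Owen-style, univariate case).
    if z.min() >= 0 or z.max() <= 0:
        return np.inf  # zero outside the convex hull: EL undefined
    lo = (-1 + 1e-10) / z.max()
    hi = (-1 + 1e-10) / z.min()
    lam = brentq(lambda l: np.sum(z / (1 + l * z)), lo, hi)
    return 2 * np.sum(np.log(1 + lam * z))

def adjusted_el_log_ratio(z, a=None):
    # Adjusted EL: append the pseudo-sample -a * mean(z) so the ratio is
    # always defined; a ~ log(n)/2 is a common choice of adjustment level.
    a = a if a is not None else max(1.0, np.log(len(z)) / 2)
    return el_log_ratio(np.append(z, -a * z.mean()))

# Toy confidence region for a mean, calibrated by the chi-square(1) limit
rng = np.random.default_rng(0)
x = rng.standard_t(df=5, size=200)
grid = np.linspace(-0.5, 0.5, 201)
region = [t for t in grid if adjusted_el_log_ratio(x - t) <= chi2.ppf(0.95, 1)]
print("approximate 95% region for the mean:", region[0], "to", region[-1])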


Author(s):  
Mohammad Fayaz

Background: In functional data analysis (FDA), hybrid or mixed data combine scalar and functional variables. The semi-functional partial linear regression model (SFPLR) is one of the first semiparametric models for a scalar response with hybrid covariates. Various extensions of this model are explored and summarized here. Methods: The two foundational articles, "Semi-functional partial linear regression model" and "Partial functional linear regression", have more than 300 citations in Google Scholar. After applying the inclusion and exclusion criteria (including only 1) articles published in ISI journals, and excluding 2) non-English papers and 3) preprints, slides, and conference papers), 106 articles remained. We followed the PRISMA standard for the systematic review. Results: The articles are categorized into the following main topics: estimation procedures, confidence regions, time series and panel data, Bayesian methods, spatial models, robust methods, testing, quantile regression, varying-coefficient models, variable selection, single-index models, measurement error, multiple functions, missing values, rank methods, and others. They cover a range of applications and datasets, such as the Tecator dataset, air quality, electricity consumption, and neuroimaging. Conclusions: SFPLR is one of the best-known regression modelling approaches for hybrid data and has many extensions.


2021 ◽  
pp. 1-26
Author(s):  
Ulrich Hounyo

This paper introduces a novel wild bootstrap for dependent data (WBDD) as a means of calculating standard errors of estimators and constructing confidence regions for parameters based on dependent, heterogeneous data. The consistency of the bootstrap variance estimator for smooth functions of the sample mean is shown to be robust to heteroskedasticity and dependence of unknown form. The first-order asymptotic validity of the WBDD as a distributional approximation is established when the data satisfy a near-epoch-dependence condition, within the framework of the smooth function model. The WBDD offers a viable alternative to existing nonparametric bootstrap methods for dependent data. It preserves the second-order correctness of the blockwise bootstrap (provided the external random variables are chosen appropriately) for stationary time series and smooth functions of the mean; this desirable property is not known to hold for existing wild bootstrap methods for dependent data. Simulation studies illustrate the finite-sample performance of the WBDD.
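
The flavour of a wild bootstrap for dependent data can be conveyed with a short sketch. The code below implements the closely related dependent wild bootstrap (block-dependent external multipliers with a Bartlett-type covariance), not Hounyo's exact WBDD scheme, and applies it to the standard error of the sample mean of an AR(1) series; the block length and toy series are assumptions for illustration.

import numpy as np

def dependent_wild_bootstrap_se(x, block_len=10, n_boot=999, seed=0):
    # Bootstrap SE of the sample mean with l-dependent external multipliers:
    # each W_t is a normalized moving sum of iid normals, so E W_t = 0,
    # Var W_t = 1, and cov(W_t, W_s) decays linearly within a block.
    rng = np.random.default_rng(seed)
    n = len(x)
    xbar = x.mean()
    centered = x - xbar
    boot_means = np.empty(n_boot)
    for b in range(n_boot):
        eps = rng.standard_normal(n + block_len - 1)
        w = np.array([eps[t:t + block_len].sum() for t in range(n)])
        w /= np.sqrt(block_len)
        boot_means[b] = xbar + (centered * w).mean()
    return boot_means.std(ddof=1)

# Toy AR(1) series: compare with the naive iid standard error
rng = np.random.default_rng(1)
x = np.zeros(500)
for t in range(1, 500):
    x[t] = 0.6 * x[t - 1] + rng.standard_normal()
print("dependent wild bootstrap SE:", dependent_wild_bootstrap_se(x))
print("naive iid SE:              ", x.std(ddof=1) / np.sqrt(len(x)))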


2021 ◽  
Author(s):  
◽  
Xiaoyu Zhai

The Global Positioning System (GPS) is widely used in modern life, and most people use GPS to find locations, so the accuracy of these locations is very important. In this thesis, we use longitude and latitude from raw GPS data to estimate the location of a GPS receiver. To improve the accuracy of the estimate, we use two methods to remove outliers in longitude and latitude: the Euclidean distance method and the Mahalanobis distance method. We then use two methods to estimate the location: maximum likelihood and the bootstrap. The confidence ellipse and simultaneous confidence intervals are used to construct confidence regions for the bivariate data, and we compare the two approaches. We also run simulations to understand how sample size and variance in the linear regression model affect AIC and BIC, and use these two criteria to select the best multivariate linear regression model with latitude and longitude as response variables. This thesis forms part of a larger project to detect land movement, such as that seen in landslides, using low-cost GPS devices; we therefore consider methods for detecting changes in location over time. Using converted longitude, latitude, and altitude (in meters) from the same GPS data set, after removing outliers, we apply two methods to detect changes in location: the Hotelling's T² chart and the multivariate exponentially weighted moving average (MEWMA) chart.
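
As an illustration of two of the tools named in the abstract, the sketch below filters bivariate observations by their Mahalanobis distance and computes Hotelling's T² for a mean shift. It is a generic sketch, not the thesis code: the coordinates are simulated stand-ins for longitude/latitude, and the chi-square cutoff for outliers is an assumed choice.

import numpy as np
from scipy.stats import chi2, f

def mahalanobis_filter(X, alpha=0.01):
    # Drop rows whose squared Mahalanobis distance from the sample mean
    # exceeds the chi-square(1 - alpha) cutoff (2 d.f. for lon/lat).
    mu = X.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d2 = np.einsum('ij,jk,ik->i', X - mu, S_inv, X - mu)
    return X[d2 <= chi2.ppf(1 - alpha, df=X.shape[1])]

def hotelling_t2(X, mu0):
    # Hotelling's T^2 statistic and F-based p-value for H0: mean = mu0.
    n, p = X.shape
    diff = X.mean(axis=0) - mu0
    t2 = n * diff @ np.linalg.inv(np.cov(X, rowvar=False)) @ diff
    f_stat = (n - p) / (p * (n - 1)) * t2
    return t2, 1 - f.cdf(f_stat, p, n - p)

# Toy lon/lat cloud (hypothetical coordinates, not the thesis data)
rng = np.random.default_rng(2)
X = rng.multivariate_normal([174.77, -41.29], [[1e-8, 0], [0, 1e-8]], 300)
X_clean = mahalanobis_filter(X)
print(X_clean.shape, hotelling_t2(X_clean, mu0=[174.77, -41.29]))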


Symmetry ◽  
2021 ◽  
Vol 13 (11) ◽  
pp. 2173
Author(s):  
Isaac Akoto ◽  
João T. Mexia ◽  
Filipe J. Marques

In this work, we derived new asymptotic results for multinomial models. We started by studying limit distributions in models with a compact parameter space; this restriction holds here because the key parameter, whose components are the probabilities of the possible outcomes, has non-negative components that sum to 1. Based on these results, we obtained confidence ellipsoids and simultaneous confidence intervals for models with normal limit distributions. We then studied the covariance matrices of the limiting normal distributions for multinomial models. This provided the transition from the general results to inference for multinomial models, where we considered chi-square tests, confidence regions, and non-linear statistics, namely log-linear models, with two numerical applications. In particular, our approach avoids the hierarchical restrictions usually assumed when analysing multidimensional contingency tables.
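
To make the idea of simultaneous intervals concrete, here is a small sketch using one standard construction (Goodman-type score intervals calibrated by the chi-square quantile with k - 1 degrees of freedom). It illustrates the general approach, not the authors' specific confidence ellipsoids; the counts are made up.

import numpy as np
from scipy.stats import chi2

def simultaneous_multinomial_ci(counts, alpha=0.05):
    # Goodman-style simultaneous confidence intervals for multinomial
    # probabilities, from the chi-square limit of the score statistic.
    counts = np.asarray(counts, dtype=float)
    n, k = counts.sum(), len(counts)
    A = chi2.ppf(1 - alpha, df=k - 1)   # simultaneous chi-square quantile
    p_hat = counts / n
    half = np.sqrt(A * (A + 4 * counts * (n - counts) / n))
    lower = (A + 2 * counts - half) / (2 * (n + A))
    upper = (A + 2 * counts + half) / (2 * (n + A))
    return np.column_stack([p_hat, lower, upper])

# Toy 4-category example: estimate, lower and upper simultaneous limits
print(simultaneous_multinomial_ci([120, 60, 15, 5]))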


2021 ◽  
Author(s):  
◽  
Euan George Campbell Smith

Aspects of the standard least squares method of locating earthquakes, and its extensions, are discussed. It is shown that the statistical and deterministic properties of the least squares solution need to be carefully separated from the properties of the algorithm used to obtain it. Standard linear statistical analysis gives reasonable confidence regions for the hypocentre provided that the errors in the model travel times to pairs of stations are not correlated. The travel-time residuals from the overdetermined system are unreliable estimates of the model errors, as are the pooled residuals from groups of events, whether or not the data are homogeneous. The concepts of absolute and relative hypocentre determination are clarified, and the Homogeneous Station method is developed and demonstrated to be a good relative location method. Applying the method to a group of North Island, New Zealand subcrustal earthquakes chosen for homogeneity revealed that the earthquakes occurred in a thin, fairly flat-dipping zone that could be as thin as 9 km and is no thicker than 18 km, a significant refinement of previous estimates for New Zealand. The method of Joint Hypocentre Determination, first described by Douglas (1967), is then examined. Its advantage is that it estimates the error in the travel-time model while also allowing for, and estimating, the interaction of this error with the hypocentre parameters of the earthquakes. Applying this method to groups of North Island, New Zealand earthquakes allows very significant improvements to the travel-time model and confirms that there is a velocity contrast for both P and S of between six and ten percent between paths within and entirely outside the downgoing Pacific plate. Estimates of the velocities in the plate are 8.6 ± 0.1 km/s for P and 4.74 ± km/s for S. In addition, station terms are calculated that describe the average departure from the new model of travel times to the stations contributing data; these terms may be interpreted as arising from crustal structure local to each station that differs from the average crustal model used. The conclusion is that, besides providing better absolute hypocentre estimates, Joint Hypocentre Determination can yield worthwhile information about structure on the scale considered here.
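
The statistical side of the standard least squares location problem can be sketched in a few lines: linearize the travel-time residuals at the solution and turn the resulting covariance into an approximate confidence region. The station geometry, uniform velocity, and fixed origin time below are hypothetical simplifications; this is not the thesis's Joint Hypocentre Determination algorithm.

import numpy as np
from scipy.optimize import least_squares
from scipy.stats import chi2

# Hypothetical station coordinates (km) and an assumed uniform P-wave speed
stations = np.array([[0.0, 0.0], [30.0, 5.0], [10.0, 40.0], [-20.0, 25.0]])
v = 6.0  # km/s

def residuals(epi, t_obs, t0):
    # Travel-time residuals for a trial epicentre (origin time held fixed)
    d = np.linalg.norm(stations - epi, axis=1)
    return t_obs - (t0 + d / v)

# Simulate arrival times for a "true" epicentre with small timing errors
rng = np.random.default_rng(3)
true_epi, t0 = np.array([5.0, 12.0]), 0.0
t_obs = (t0 + np.linalg.norm(stations - true_epi, axis=1) / v
         + 0.05 * rng.standard_normal(len(stations)))

fit = least_squares(residuals, x0=[0.0, 0.0], args=(t_obs, t0))
J = fit.jac
dof = len(t_obs) - 2
s2 = np.sum(fit.fun ** 2) / dof       # residual variance estimate
cov = s2 * np.linalg.inv(J.T @ J)     # linearized parameter covariance
# Approximate (large-sample) 95% confidence ellipse:
# (x - x_hat)' cov^{-1} (x - x_hat) <= chi-square(0.95, 2 d.f.)
print("epicentre:", fit.x)
print("covariance:\n", cov)
print("chi-square radius:", chi2.ppf(0.95, df=2))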


2021 ◽  
Author(s):  
Amir Parnianifard ◽  
Shahid Mumtaz ◽  
Sushank Chaudhary ◽  
Muhammad Ali Imran ◽  
Lunchakorn Wuttisittikulkij

Transmitting antenna positioning, or transmitter placement, is a well-known NP-hard optimization problem in communication systems. It is of practical importance to place transmitters optimally while keeping the design insensitive to potential uncertainties; however, incorporating uncertainties into the optimization problem can greatly increase the computational expense. This paper develops a new reduced-cost algorithm for multi-objective robust transmitter placement under uncertainty. To this end, a hybrid surrogate-metaheuristic approach is constructed using the Grey Wolf Optimizer (GWO) and a Kriging surrogate within the mathematical framework of a robust dual-surface model. The proposed algorithm can also analyze the sensitivity of the optimal results, by obtaining bootstrapped confidence regions without extra simulation experiments. The paper investigates the performance of the proposed algorithm for robust optimal placement of two, three, and four transmitters under uncertainty in the transmitting antenna gain. The results demonstrate the utility and efficiency of the proposed method in producing robust optimal designs and analyzing the sensitivity of the transmitter placement problem with practically acceptable computational effort.
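
A stripped-down illustration of the surrogate-plus-bootstrap idea (not the paper's GWO-based, dual-surface robust algorithm): fit a Kriging surrogate to a handful of evaluations of a toy placement cost, resample the design points, and read off a bootstrapped interval for the optimal position without any new calls to the simulator. The 1-D cost function, kernel, and sample sizes are all assumptions for illustration.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def cost(x):
    # toy 1-D "placement cost" standing in for the expensive simulator
    return np.sin(3 * x) + 0.3 * (x - 1) ** 2

rng = np.random.default_rng(4)
X = rng.uniform(0, 3, 12).reshape(-1, 1)   # initial design points
y = cost(X).ravel()
grid = np.linspace(0, 3, 301).reshape(-1, 1)

# Bootstrap the design, refit the Kriging surrogate, record its minimizer
optima = []
for _ in range(200):
    idx = rng.integers(0, len(X), len(X))  # resample with replacement
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                                  alpha=1e-6,  # nugget: handles duplicated rows
                                  normalize_y=True).fit(X[idx], y[idx])
    optima.append(grid[np.argmin(gp.predict(grid)), 0])
lo, hi = np.percentile(optima, [2.5, 97.5])
print(f"bootstrapped 95% interval for the optimal position: [{lo:.2f}, {hi:.2f}]")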

