weighted likelihood
Recently Published Documents


TOTAL DOCUMENTS: 142 (five years: 35)

H-INDEX: 20 (five years: 2)

2022, Vol. 14 (2), pp. 403
Author(s):  
Chongdi Duan ◽  
Yu Li ◽  
Weiwei Wang ◽  
Jianguo Li

With the rapid development of cooperative detection technology, target fusion detection with LEO satellites can be realized by exploiting their diverse observation configurations. However, existing constant false alarm rate (CFAR) detection research rarely addresses space-based target fusion detection theory. In this paper, a novel multi-source fusion detection method based on LEO satellites is presented. First, a pre-compensation function is constructed from the range and Doppler history of the cell at which the antenna beam center points. As a result, both the Doppler band broadening caused by the high-speed motion of the satellite platform and the Doppler frequency rate (DFR) offset resulting from different observation configurations are alleviated simultaneously. Then, theoretical upper and lower limits on the DFR are designed to achieve effective clutter suppression and accurate target echo fusion. Finally, a CFAR detection threshold based on the exponential weighted likelihood ratio is derived; it effectively increases the contrast between the target cell and the background cells and thus provides an effective multi-source fusion detection method for LEO satellite constellations. Simulation results verify the effectiveness of the proposed algorithm.
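The abstract does not state the exponential weighted likelihood-ratio threshold in closed form, so the sketch below only illustrates the underlying CFAR idea it builds on: a cell-averaging detector in which reference cells are combined with exponentially decaying weights, so cells near the cell under test count most. The function name, parameters, and the single-channel setting are assumptions of this sketch, not the paper's method.

```python
import numpy as np

def weighted_ca_cfar(power, guard=2, ref=8, decay=0.9, pfa=1e-3):
    """Cell-averaging CFAR with exponentially decaying weights on the
    reference cells (cells nearest the cell under test count most).
    Returns a boolean detection mask over the range profile."""
    n = len(power)
    det = np.zeros(n, dtype=bool)
    w = decay ** np.arange(1, ref + 1)       # nearest reference cell first
    w = w / w.sum()
    # CA-CFAR-style scale factor for exponentially distributed clutter power
    alpha = ref * (pfa ** (-1.0 / ref) - 1.0)
    for i in range(guard + ref, n - guard - ref):
        left = power[i - guard - ref:i - guard][::-1]    # nearest first
        right = power[i + guard + 1:i + guard + 1 + ref]
        noise = 0.5 * (np.dot(w, left) + np.dot(w, right))
        det[i] = power[i] > alpha * noise
    return det

# Exponential clutter with one strong target injected at cell 1000
rng = np.random.default_rng(1)
profile = rng.exponential(1.0, size=2000)
profile[1000] = 100.0
hits = weighted_ca_cfar(profile)
```

The guard cells keep target energy out of the noise estimate; the decay factor trades locality of the clutter estimate against its variance.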


Author(s):  
Matteo Taroni ◽  
Jiancang Zhuang ◽  
Warner Marzocchi

Abstract Taroni et al. (2021; hereafter TZM21) proposed a method to perform spatial b-value mapping based on weighted-likelihood estimation and applied it to the Italian region as a tutorial example. In the accompanying comment, Gulia et al. (2021; hereafter GGW21) did not challenge TZM21's method, but they argued that the catalog used by TZM21 is contaminated by quarry blasts, introducing a bias that may affect any seismotectonic or hazard interpretation. Although the application to the Italian territory in TZM21 was only a tutorial example, and we purposely did not discuss the meaning of the results in seismotectonic or seismic-hazard terms in depth (that would have required many more analyses), we acknowledge the potential role of the quarry blasts, and we add some further analysis here. We thank GGW21 for giving us this opportunity. Removing the part of the catalog contaminated by quarry blasts and applying the same analysis as in TZM21, we obtain results that are very similar to those reported in TZM21; specifically, only one region, characterized by a low natural seismicity rate, shows a marked effect of the quarry blasts on the b-value.


2021, Vol. 2021, pp. 1-13
Author(s):  
Leili Tapak ◽  
Michael R. Kosorok ◽  
Majid Sadeghifar ◽  
Omid Hamidi ◽  
Saeid Afshar ◽  
...  

Variable selection and penalized regression models in high-dimensional settings have become an increasingly important topic in many disciplines. For instance, omics data generated in biomedical research may be associated with patient survival and can suggest insights into disease dynamics, helping to identify patients with a worse prognosis and to improve therapy. Analysis of high-dimensional time-to-event data in the presence of competing risks requires special modeling techniques. So far, some attempts have been made at variable selection in low- and high-dimensional competing-risks settings using partial-likelihood-based procedures. In this paper, a weighted-likelihood-based penalized approach is extended to direct variable selection under the subdistribution hazards model for high-dimensional competing-risks data. The proposed method considers a larger class of semiparametric regression models for the subdistribution and allows for time-varying effects, which is of particular importance because the proportional hazards assumption may not hold in general, especially in the high-dimensional setting. The model is also free of the constraint, inherent in the Fine and Gray approach, of simultaneously modeling multiple cumulative incidence functions. The performance of several penalties, including the minimax concave penalty (MCP), the adaptive LASSO, and the smoothly clipped absolute deviation (SCAD), as well as their L2 counterparts, was investigated through simulation studies in terms of sensitivity and specificity. The results revealed that the sensitivity of all penalties was comparable, but the MCP and MCP-L2 penalties outperformed the other methods in selecting fewer noninformative variables.
The practical use of the model was demonstrated through the analysis of genomic competing-risks data from patients with bladder cancer; six genes (CDC20, NCF2, SMARCAD1, RTN4, ETFDH, and SON) were identified by all the methods and were significantly correlated with the subdistribution.
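For reference, the MCP and SCAD penalties compared in the study can be sketched in their standard textbook forms (the default gamma values of 3.0 and 3.7 are common conventions, not taken from the paper; the L2 counterparts simply add a ridge term on top of these). Both behave like the LASSO near zero but flatten out, so large coefficients are not over-shrunk:

```python
import numpy as np

def mcp(t, lam, gamma=3.0):
    """Minimax concave penalty: tapers the LASSO penalty quadratically
    and is constant for |t| >= gamma * lam, leaving large coefficients
    unshrunk."""
    t = np.abs(np.asarray(t, float))
    return np.where(t <= gamma * lam,
                    lam * t - t**2 / (2.0 * gamma),
                    0.5 * gamma * lam**2)

def scad(t, lam, gamma=3.7):
    """Smoothly clipped absolute deviation penalty: LASSO near zero,
    quadratic transition, constant for |t| >= gamma * lam."""
    t = np.abs(np.asarray(t, float))
    mid = (2.0 * gamma * lam * t - t**2 - lam**2) / (2.0 * (gamma - 1.0))
    return np.where(t <= lam, lam * t,
                    np.where(t <= gamma * lam, mid,
                             0.5 * lam**2 * (gamma + 1.0)))
```

The constant tail is what reduces the bias of large coefficient estimates relative to the LASSO, at the price of a nonconvex objective.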


Author(s):  
Carmen Volk ◽  
Stephanie Rosenstiel ◽  
Yolanda Demetriou ◽  
Gorden Sudeck ◽  
Ansgar Thiel ◽  
...  

Abstract Fostering health-related fitness knowledge is a common goal across physical education curricula. However, carefully developed knowledge tests that satisfy the psychometric criteria of educational assessment are lacking. Therefore, two studies were conducted to evaluate a newly developed health-related fitness knowledge test within the frameworks of classical test theory and item response theory with regard to item quality, test reliability, construct validity, and dimensionality. Overall, 794 ninth graders (Mage = 14.3 years, 50.6% girls) took part in Study 1. They differed in the type of physical education class (minor or major subject) and school (lower or higher educational level) they attended. Study 2 comprised 834 ninth graders at the same educational level (Mage = 14.2 years, 52.5% girls). Item–test correlations, test reliability, and validity were examined. In addition, item and test quality were investigated using unidimensional two-parameter logistic item response models. In Study 1, pupils at the same educational level with physical education as a major achieved higher knowledge scores than pupils with physical education as a minor (t = −5.99, p < 0.001; d = 0.58), supporting the test’s construct validity. In Study 2, the weighted likelihood estimate reliability of the final 27 items was 0.65, and the test–retest reliability reached rtt = 0.70. The items satisfied the assumption of local independence. The final test fulfilled the psychometric criteria of reliability and construct validity for assessing health-related fitness knowledge in cross-sectional and interventional studies. This test extends the possibilities of research on health-related fitness knowledge in physical education.
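The item analyses above rest on the two-parameter logistic (2PL) model; a minimal sketch of its response probability and item information (standard IRT formulas, not code from the study; the function names are my own):

```python
import math

def p_2pl(theta, a, b):
    """2PL IRT model: probability of a correct response, given
    ability theta, item discrimination a, and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information an item contributes at ability theta:
    a^2 * P * (1 - P). It peaks where theta equals the difficulty b."""
    p = p_2pl(theta, a, b)
    return a * a * p * (1.0 - p)
```

Summing item information over all items gives the test information used, for example, in weighted likelihood estimate reliability.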


Forecasting, 2021, Vol. 3 (3), pp. 561-569
Author(s):  
Matteo Taroni ◽  
Giorgio Vocalelli ◽  
Andrea De Polis

We introduce a novel approach to estimate the temporal variation of the b-value parameter of the Gutenberg–Richter law, based on the weighted likelihood approach. This methodology allows the b-value to be estimated from the full history of the available data in a data-driven setting. We test this methodology against the classical “rolling window” approach using a high-definition Italian seismic catalogue as well as a global catalogue of large magnitudes. The weighted likelihood approach outperforms competing methods and measures the optimal amount of past information relevant to the estimation.
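A minimal sketch of how a weighted-likelihood b-value estimate can work, assuming the weighted analogue of Aki's (1965) maximum-likelihood estimator in which the sample mean magnitude is replaced by a weighted mean; the exponential forgetting scheme below is illustrative, not the authors' data-driven weighting:

```python
import numpy as np

def weighted_b_value(mags, weights, mc, dm=0.1):
    """Weighted analogue of Aki's (1965) maximum-likelihood b-value
    estimator: the sample mean magnitude is replaced by a weighted
    mean, so the weights control how much each event contributes."""
    mags = np.asarray(mags, float)
    w = np.asarray(weights, float)
    w = w / w.sum()
    mean_mag = np.sum(w * mags)
    # dm is the magnitude binning width (0 for continuous magnitudes)
    return 1.0 / (np.log(10.0) * (mean_mag - (mc - dm / 2.0)))

# Synthetic catalogue with true b = 1 above completeness Mc = 4.0,
# weighted so that the most recent events dominate the estimate
rng = np.random.default_rng(0)
true_b = 1.0
mags = 4.0 + rng.exponential(1.0 / (true_b * np.log(10.0)), size=5000)
weights = 0.999 ** np.arange(5000)[::-1]   # exponential forgetting
b_hat = weighted_b_value(mags, weights, mc=4.0, dm=0.0)
```

With uniform weights this reduces to the classical estimator; the decay rate plays the role of the window length in the "rolling window" approach.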


2021, Vol. 12
Author(s):  
Xuemei Xue ◽  
Jing Lu ◽  
Jiwei Zhang

In this paper, a new item-weighted scheme is proposed to assess examinees’ growth in longitudinal analyses. A multidimensional Rasch model for measuring learning and change (MRMLC) and its polytomous extension are used to fit the longitudinal item response data. The new item-weighted likelihood estimation method is not only suitable for complex longitudinal IRT models but can also be used to estimate unidimensional IRT models, for example, a combination of the two-parameter logistic (2PL) model and the partial credit model (PCM; Masters, 1982) with a varying number of categories. Two simulation studies are carried out to illustrate the advantages of the item-weighted likelihood estimation method over the traditional maximum a posteriori (MAP) estimation method, the maximum likelihood estimation (MLE) method, Warm’s (1989) weighted likelihood estimation (WLE) method, and the type-weighted maximum likelihood estimation (TWLE) method. Simulation results indicate that the item-weighted likelihood estimation method recovers examinees’ true ability levels better than the existing likelihood-based (MLE, WLE, and TWLE) and MAP estimation methods for both complex longitudinal and unidimensional IRT models, with smaller bias, root-mean-square error, and root-mean-square difference, especially at low and high ability levels.
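For contrast with the item-weighted scheme, Warm's (1989) WLE used as a baseline above can be sketched for the 2PL case: it maximizes the log-likelihood plus half the log of the test information, which reduces the bias of the MLE and keeps the estimate finite for extreme response patterns. The grid search below is a simplification (operational IRT software uses Newton iterations), and all names are my own:

```python
import numpy as np

def wle_theta(resp, a, b, grid=np.linspace(-4.0, 4.0, 2001)):
    """Warm's (1989) weighted likelihood estimate of ability under the
    2PL model, found by maximising log L(theta) + 0.5 * log I(theta)
    over a grid, where I is the test information."""
    a = np.asarray(a, float)
    b = np.asarray(b, float)
    x = np.asarray(resp, float)
    th = grid[:, None]
    p = 1.0 / (1.0 + np.exp(-a * (th - b)))             # 2PL response curves
    loglik = np.sum(x * np.log(p) + (1 - x) * np.log(1 - p), axis=1)
    info = np.sum(a**2 * p * (1 - p), axis=1)           # test information
    return grid[np.argmax(loglik + 0.5 * np.log(info))]

# Symmetric pattern: one success on an easy item, one failure on a hard
# one, so the estimate should sit near the middle
theta_mid = wle_theta([1, 0], [1.0, 1.0], [-1.0, 1.0])
# All-correct pattern: MLE would diverge, WLE stays finite
theta_all = wle_theta([1, 1, 1, 1, 1], [1.0] * 5, [0.0] * 5)
```

The information penalty pulls the all-correct estimate back from the boundary, which is exactly why WLE is preferred over MLE for extreme scores.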


METRON, 2021
Author(s):  
Giovanni Saraceno ◽  
Claudio Agostinelli ◽  
Luca Greco

Abstract A weighted likelihood technique for robust estimation of multivariate wrapped distributions of data points scattered on a p-dimensional torus is proposed. The occurrence of outliers in the sample at hand can badly compromise inference based on standard techniques such as the maximum likelihood method. There is therefore a need to handle such model inadequacies in the fitting process through a robust technique and an effective downweighting of observations that do not follow the assumed model. Furthermore, the use of a robust method can help in situations with hidden and unexpected substructures in the data. Here, it is suggested to build a set of data-dependent weights based on the Pearson residuals and to solve the corresponding weighted likelihood estimating equations. In particular, robust estimation is carried out using a classification EM algorithm whose M-step is enhanced by the computation of weights based on the current parameter values. The finite-sample behavior of the proposed method is investigated through a Monte Carlo numerical study and real-data examples.
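A minimal sketch of Pearson-residual-based weights in one dimension, assuming a plain normal model and the Hellinger residual adjustment function; the multivariate wrapped case of the paper is considerably more involved, and the function name and bandwidth here are illustrative:

```python
import numpy as np

def pearson_weights(x, mu, sigma, bw=0.5):
    """Data-dependent weights from Pearson residuals: compare a kernel
    density estimate of the sample with the kernel-smoothed model
    density. Observations the model explains get weight close to 1;
    gross outliers are strongly downweighted."""
    x = np.asarray(x, float)
    d = x[:, None] - x[None, :]
    # Gaussian kernel density estimate evaluated at each observation
    fhat = np.mean(np.exp(-0.5 * (d / bw) ** 2), axis=1) / (bw * np.sqrt(2 * np.pi))
    # model density smoothed with the same kernel: N(mu, sigma^2 + bw^2)
    s = np.sqrt(sigma**2 + bw**2)
    m = np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))
    delta = fhat / m - 1.0                     # Pearson residuals
    A = 2.0 * (np.sqrt(delta + 1.0) - 1.0)     # Hellinger residual adjustment
    return np.clip((A + 1.0) / (delta + 1.0), 0.0, 1.0)

# A clean N(0, 1) sample plus one gross outlier at x = 10
rng = np.random.default_rng(2)
sample = np.append(rng.normal(0.0, 1.0, 100), 10.0)
w = pearson_weights(sample, mu=0.0, sigma=1.0)
```

In a weighted-likelihood fit these weights multiply the score contributions, so the estimating equations are solved as if the outlier were (almost) absent; in the paper this weight computation sits inside the M-step of the classification EM algorithm.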


2021, Vol. 213, pp. 104285
Author(s):  
Abdul Wahid ◽  
Dost Muhammad Khan ◽  
Sajjad Ahmad Khan ◽  
Ijaz Hussain ◽  
Zardad Khan
