frequentist method
Recently Published Documents


TOTAL DOCUMENTS: 16 (FIVE YEARS: 8)

H-INDEX: 4 (FIVE YEARS: 1)

2021 ◽  
Vol 19 (3) ◽  
pp. e0802
Author(s):  
Antonio Martinez-Ruiz ◽  
Irineo L. López-Cruz ◽  
Agustín Ruiz-García ◽  
Joel Pineda-Pineda ◽  
Prometeo Sánchez-García ◽  
...  

Aim of study: The objective was to perform an uncertainty analysis (UA) of the dynamic HORTSYST model applied to a greenhouse-grown hydroponic tomato crop. A frequentist method based on Monte Carlo simulation and the Generalized Likelihood Uncertainty Estimation (GLUE) procedure were used.
Area of study: Two tomato cultivation experiments were carried out, during the autumn-winter and spring-summer crop seasons, in a research greenhouse located at the University of Chapingo, Chapingo, Mexico.
Material and methods: The uncertainties of the HORTSYST model predictions PTI, LAI, DMP, ETc, and Nup, Pup, Kup, Caup, and Mgup uptake were calculated by specifying the uncertainty of the model parameters as 10% and 20% around their nominal values. Uniform PDFs were specified for all model parameters and LHS sampling was applied. The Monte Carlo and GLUE methods used 10,000 and 2,000 simulations, respectively. The frequentist method included the statistical measures minimum, maximum, average, CV, skewness, and kurtosis, whilst GLUE used CIs, RMSE, and scatter plots.
Main results: When parameters were varied by 10%, the CVs for all outputs were lower than 15%. The smallest values were for LAI (10.75%) and DMP (11.14%) and the largest was for ETc (14.47%). For Caup (12.15%) and Pup (12.27%), the CV was lower than for Nup and Kup. Kurtosis and skewness values were close to those expected for a normal distribution. According to GLUE, crop density was the most relevant parameter, given that it yielded the lowest RMSE value between the simulated and measured values.
Research highlights: Acceptable fitting of HORTSYST was achieved, since its predictions were inside the 95% CI obtained with the GLUE procedure.
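As a rough illustration of the sampling scheme described in this abstract, the sketch below draws Latin hypercube samples from uniform PDFs spanning ±10% of nominal parameter values and summarizes a model output with the same statistics (CV, skewness, kurtosis). The crop_output function and the nominal values are hypothetical stand-ins; the HORTSYST model itself is not reproduced here.

```python
# Minimal sketch of an LHS-based Monte Carlo uncertainty analysis, with a
# hypothetical surrogate output function in place of the full HORTSYST model.
import numpy as np
from scipy.stats import qmc, skew, kurtosis

def crop_output(params):
    # Placeholder for a HORTSYST output (e.g., DMP); not the real model.
    return params[:, 0] * np.sqrt(params[:, 1]) / params[:, 2]

nominal = np.array([2.5, 1.2, 0.8])        # hypothetical nominal parameter values
spread = 0.10                              # +/-10% uncertainty band

sampler = qmc.LatinHypercube(d=nominal.size, seed=42)
u = sampler.random(n=10_000)               # 10,000 LHS samples in [0, 1)^d
lo, hi = nominal * (1 - spread), nominal * (1 + spread)
params = qmc.scale(u, lo, hi)              # uniform PDFs around the nominal values

y = crop_output(params)
cv = 100 * y.std(ddof=1) / y.mean()
# scipy's kurtosis is the excess kurtosis (0 for a normal distribution).
print(f"min={y.min():.3f} max={y.max():.3f} mean={y.mean():.3f}")
print(f"CV={cv:.2f}%  skewness={skew(y):.3f}  excess kurtosis={kurtosis(y):.3f}")
```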


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Yunpeng Zhao ◽  
Yongqiang Wang ◽  
Lei Shan ◽  
Chuanliang Peng ◽  
Wenhao Zhang ◽  
...  

Abstract. The optimal treatment for resectable esophageal squamous cell carcinoma (ESCC) is still a matter of debate; however, randomized trials of strategies including neoadjuvant or adjuvant chemotherapy (CT), radiotherapy, or chemoradiotherapy (CRT) are not always available. This network meta-analysis aimed to identify an effective approach through indirect comparisons. An extensive literature search comparing multimodality treatment and surgery was performed, and a network meta-analysis was conducted with the frequentist method. Twenty-three trials including a total of 3636 ESCC patients were included. Neoadjuvant CRT and neoadjuvant CT, which are recommended by most guidelines for esophageal cancer, were associated with an overall survival advantage compared with surgery alone (HR = 0.43, 95% CI 0.26–0.73; HR = 0.71, 95% CI 0.32–1.59). A statistically significant survival benefit of neoadjuvant CRT over neoadjuvant CT could not be demonstrated in our study (HR = 0.61, 95% CI 0.32–1.17, P = 0.08). Our network meta-analysis showed that both neoadjuvant CRT and neoadjuvant CT were effective in improving the survival of patients with ESCC. Individual clinical decisions need further study in the future.
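The indirect comparison at the heart of such a network can be illustrated with a Bucher-style calculation on the log hazard-ratio scale, using the point estimates and 95% CIs quoted above. This is only a sketch: the point estimate reproduces the quoted HR of about 0.61 for CRT versus CT, but the interval differs from the published one because the full network analysis pools evidence from all 23 trials.

```python
# Sketch of a Bucher-style indirect comparison on the log-HR scale, using the
# point estimates and 95% CIs quoted in the abstract (not the full network model).
import math

def log_hr_and_se(hr, ci_low, ci_high):
    # Recover log(HR) and its standard error from a 95% confidence interval.
    return math.log(hr), (math.log(ci_high) - math.log(ci_low)) / (2 * 1.96)

# Neoadjuvant CRT vs surgery and neoadjuvant CT vs surgery (from the abstract).
crt, se_crt = log_hr_and_se(0.43, 0.26, 0.73)
ct, se_ct = log_hr_and_se(0.71, 0.32, 1.59)

# Indirect comparison of CRT vs CT through the common surgery-alone comparator.
diff = crt - ct
se = math.sqrt(se_crt**2 + se_ct**2)
hr = math.exp(diff)
ci = (math.exp(diff - 1.96 * se), math.exp(diff + 1.96 * se))
print(f"HR(CRT vs CT) = {hr:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
```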


2021 ◽  
Author(s):  
Janelle L. Lennie ◽  
John T. Mondick ◽  
Marc R. Gastonguay

Abstract. Rare disease clinical trials are constrained to small sample sizes and may lack placebo control, leading to challenges in drug development. This paper proposes a Bayesian model-based framework for early go/no-go decision making in rare disease drug development, using Duchenne muscular dystrophy (DMD) as an example. Early go/no-go decisions were based on projections of long-term functional outcomes from a Bayesian model-based analysis of short-term trial data, informed by prior knowledge derived from 6MWT natural history literature data in DMD patients. Frequentist hypothesis tests were also applied as a reference analysis method. A number of combinations of hypothetical trial designs, drug effects, and cohort comparison methods were assessed. The proposed Bayesian model-based framework was superior to the frequentist method for making go/no-go decisions across all trial designs and cohort comparison methods in DMD. The average decision accuracy rates across all trial designs for the Bayesian and frequentist analysis methods were 45.8% and 8.98%, respectively. A decision accuracy rate of at least 50% was achieved for 42% and 7% of the trial designs under the Bayesian and frequentist analysis methods, respectively. The frequentist method was limited to the short-term trial data only, while the Bayesian methods were informed by both the short-term data and prior information. The specific results of the DMD case study were limited due to incomplete specification of individual-specific covariates in the natural history literature data and should be reevaluated using a full natural history dataset. These limitations aside, the framework presented provides a proof of concept for the utility of Bayesian model-based methods for decision making in rare disease trials.
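A minimal sketch of a Bayesian go/no-go rule of the general kind described is given below, assuming a conjugate normal model for a short-term treatment effect with a prior standing in for natural-history information. The effect scale, prior, and decision thresholds are hypothetical and are not taken from the paper.

```python
# Minimal sketch of a Bayesian go/no-go rule with a conjugate normal model.
# Prior, trial summary, and decision thresholds are hypothetical, not the paper's.
import math
from scipy.stats import norm

# Hypothetical prior on the treatment effect (meters of 6MWT change vs natural history).
prior_mean, prior_sd = 0.0, 30.0

# Hypothetical short-term trial summary: observed mean difference and its SE.
obs_mean, obs_se = 25.0, 12.0

# Conjugate normal update: precision-weighted combination of prior and data.
post_prec = 1 / prior_sd**2 + 1 / obs_se**2
post_var = 1 / post_prec
post_mean = post_var * (prior_mean / prior_sd**2 + obs_mean / obs_se**2)
post_sd = math.sqrt(post_var)

# "Go" if the posterior probability of a clinically relevant effect (>10 m) exceeds 0.8.
p_relevant = 1 - norm.cdf(10.0, loc=post_mean, scale=post_sd)
decision = "go" if p_relevant > 0.8 else "no-go"
print(f"P(effect > 10 m | data) = {p_relevant:.2f} -> {decision}")
```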


2020 ◽  
Author(s):  
Marta Martinengo ◽  
Daniel Zugliani ◽  
Giorgio Rosatti

Abstract. Rainfall thresholds, namely rainfall intensity-duration conditions beyond which the probability of debris-flow occurrence is considered significant, can be used as a forecasting tool in debris-flow early warning systems. Many uncertainties may affect the threshold calibration and, in turn, the reliability and effectiveness of this tool. The purpose of this study is to assess the uncertainty in the determination of the rainfall threshold for stony debris flows based on the Back Dynamical Approach (BDA) (Rosatti et al., 2019), an innovative method to estimate the rainfall duration and averaged intensity strictly related to a measured debris flow. The uncertainty analysis was carried out by performing two cascading Monte Carlo simulations: (i) to assess the variability in the estimate of the rainfall conditions due to the uncertainty of some of the BDA parameters, and (ii) to quantify the impact of this variability on the threshold parameters, obtained by using the frequentist method. Then, the deviation between the outcomes of these analyses and the values obtained in Rosatti et al. (2019) was examined. The results highlight that the variability in the rainfall condition estimate is strongly related to the debris-flow characteristics and the hyetograph shape. Depending on these features, the spread of the obtained distributions can take both low and high values. The threshold parameters, instead, are characterised by a low statistical spread. Finally, the consistency between the outcomes of this study and the results obtained in Rosatti et al. (2019) is demonstrated, and the critical issues related to the rainfall condition estimation are discussed.
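For context, the frequentist threshold estimation referred to above can be sketched as a power-law fit in log-log space whose intercept is shifted to a low percentile of the residuals. The sketch below uses synthetic duration-intensity pairs, not the BDA-derived dataset, and omits the Monte Carlo propagation of the BDA parameter uncertainty.

```python
# Sketch of a frequentist-style intensity-duration threshold fit: a power law
# I = a * D**b is fitted in log-log space and the intercept is shifted to a low
# percentile of the residuals. Data here are synthetic, not the BDA dataset.
import numpy as np

rng = np.random.default_rng(0)
D = rng.uniform(1, 48, size=200)                        # durations (h), synthetic
I = 12.0 * D**-0.6 * np.exp(rng.normal(0, 0.3, 200))    # intensities (mm/h), synthetic

# Ordinary least squares in log space: log10(I) = log10(a) + b * log10(D).
X = np.column_stack([np.ones_like(D), np.log10(D)])
coef, *_ = np.linalg.lstsq(X, np.log10(I), rcond=None)
log_a, b = coef

# Shift the intercept to the 5th percentile of the residuals -> 5% exceedance threshold.
resid = np.log10(I) - X @ coef
a5 = 10 ** (log_a + np.percentile(resid, 5))
print(f"best fit:     I = {10**log_a:.1f} * D^{b:.2f}")
print(f"5% threshold: I = {a5:.1f} * D^{b:.2f}")
```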


Complexity ◽  
2020 ◽  
Vol 2020 ◽  
pp. 1-16
Author(s):  
Woraphon Yamaka ◽  
Songsak Sriboonchitta

This paper introduces an entropy-based belief function to the forecasting problem. While the likelihood-based belief function needs to know the distribution of the objective function for the prediction, the entropy-based belief function does not. This matters because the observed data likelihood is often complex in practice. We therefore replace the likelihood function with the entropy; that is, we propose an approach in which a belief function is built from the entropy function. As an illustration, the proposed method is compared to the likelihood-based belief function in simulation and empirical studies. According to the results, our approach performs well under a wide array of simulated data models and distributions. There is evidence that the frequentist method produces a much narrower prediction interval, while our entropy-based method produces the widest. However, our entropy-based belief function still gives an acceptable range for the prediction interval, as the true value always lies within it.


Author(s):  
Céline Cunen ◽  
Nils Lid Hjort ◽  
Tore Schweder

The recent article 'Satellite conjunction analysis and the false confidence theorem' (Balch et al. 2019, Proc. R. Soc. A 475, 20180565) points to certain difficulties with Bayesian analysis when used in models for satellite conjunction and the ensuing operative decisions. Here, we supplement these previous analyses and findings with further insights, uncovering what we perceive as the crucial points, explained in a prototype set-up where exact analysis is attainable. We also show that a different, frequentist method, involving confidence distributions, is free of the false confidence syndrome.
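As a minimal illustration of a confidence distribution, consider the one-dimensional prototype of a normal observation with known variance; the construction below is purely illustrative and is not the satellite-conjunction model analysed in the article.

```python
# Minimal sketch of a confidence distribution in a one-dimensional normal
# prototype (X ~ N(theta, sigma^2), sigma known); purely illustrative, not the
# satellite-conjunction set-up of the article.
from scipy.stats import norm

sigma = 1.0
x_obs = 0.8          # hypothetical observation

def confidence_distribution(theta):
    # C(theta; x) = P(X >= x | theta) = Phi((theta - x) / sigma): confidence
    # accumulates for larger values of theta.
    return norm.cdf((theta - x_obs) / sigma)

# Equal-tailed 90% confidence interval read off the confidence distribution.
lo = x_obs + sigma * norm.ppf(0.05)
hi = x_obs + sigma * norm.ppf(0.95)
print(f"C(0) = {confidence_distribution(0.0):.3f}")
print(f"90% CI from the confidence distribution: ({lo:.2f}, {hi:.2f})")
```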


Mathematics ◽  
2020 ◽  
Vol 8 (1) ◽  
pp. 62 ◽  
Author(s):  
Autcha Araveeporn

This paper compares two frequentist methods, the least-squares method and the maximum likelihood method, for estimating an unknown parameter of the Random Coefficient Autoregressive (RCA) model. Frequentist methods draw conclusions from observed data by emphasizing the frequency or proportion of the data. The method of least squares estimates the parameter by minimizing the sum of squared residuals, found by setting the gradient to zero. The maximum likelihood method uses the observed data to estimate the parameter of a probability distribution by maximizing the likelihood function under the statistical model; this estimator is obtained by differentiating the likelihood function with respect to the parameter. The efficiency of the two methods is assessed by the average mean square error for simulated data and by the mean square error for actual data. For the simulation study, data are generated only from the first-order RCA model. The results show that the least-squares method performs better than maximum likelihood: its average mean square error attains the minimum values in all cases. Finally, the methods are applied to actual data. The series of monthly averages of the Stock Exchange of Thailand (SET) index and the daily volume of the Baht/Dollar exchange rate are used for estimation and forecasting based on the RCA model. The results show that the least-squares method outperforms the maximum likelihood method.
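The comparison can be sketched on simulated data from a first-order RCA model, y_t = (beta + b_t) y_{t-1} + e_t, using the closed-form least-squares estimator and a numerically maximized conditional Gaussian likelihood. The parameter values below are illustrative and are not those used in the paper.

```python
# Sketch of least-squares vs maximum-likelihood estimation for the RCA(1) model
#   y_t = (beta + b_t) * y_{t-1} + e_t,  b_t ~ N(0, s_b^2), e_t ~ N(0, s_e^2),
# on simulated data; parameter values are illustrative, not those of the paper.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
beta, s_b, s_e, n = 0.4, 0.3, 1.0, 500

y = np.zeros(n)
for t in range(1, n):
    y[t] = (beta + rng.normal(0, s_b)) * y[t - 1] + rng.normal(0, s_e)

y_lag, y_cur = y[:-1], y[1:]

# Least squares: minimise sum (y_t - beta * y_{t-1})^2, closed-form solution.
beta_ls = np.sum(y_cur * y_lag) / np.sum(y_lag**2)

# Conditional ML: y_t | y_{t-1} ~ N(beta * y_{t-1}, s_e^2 + s_b^2 * y_{t-1}^2).
def neg_loglik(params):
    b, log_sb, log_se = params
    var = np.exp(2 * log_se) + np.exp(2 * log_sb) * y_lag**2
    return 0.5 * np.sum(np.log(var) + (y_cur - b * y_lag) ** 2 / var)

res = minimize(neg_loglik, x0=[beta_ls, np.log(0.5), np.log(1.0)], method="Nelder-Mead")
beta_ml = res.x[0]
print(f"true beta = {beta}, LS = {beta_ls:.3f}, ML = {beta_ml:.3f}")
```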


2018 ◽  
Vol 18 (3) ◽  
pp. 765-780 ◽  
Author(s):  
Zhao Shi ◽  
Fangqiang Wei ◽  
Venkatachalam Chandrasekar

Abstract. Both the Ms 8.0 Wenchuan earthquake on 12 May 2008 and the Ms 7.0 Lushan earthquake on 20 April 2013 occurred in the province of Sichuan, China. In the earthquake-affected mountainous area, a large amount of loose material caused a high occurrence of debris flows during the rainy season. In order to evaluate the rainfall intensity–duration (I–D) threshold of the debris flows in the earthquake-affected area, and to fill the observational gaps caused by the relatively scarce and low-altitude deployment of rain gauges in this area, raw data from two S-band China New Generation Doppler Weather Radars (CINRAD) were captured for six rainfall events that triggered 519 debris flows between 2012 and 2014. Because of the challenges of radar quantitative precipitation estimation (QPE) over mountainous areas, a series of improvement measures are considered: a hybrid scan mode, a vertical reflectivity profile (VPR) correction, a mosaic of reflectivity, a merged rainfall–reflectivity (R–Z) relationship for convective and stratiform rainfall, and rainfall bias adjustment with a Kalman filter (KF). For validating rainfall accumulation over complex terrain, the study areas are divided into two kinds of regions by a height threshold of 1.5 km above the ground. Three kinds of radar rainfall estimates are compared with rain gauge measurements. It is observed that the normalized mean bias (NMB) is decreased by 39 % and the fitted linear ratio between radar and rain gauge observations reaches 0.98. Furthermore, the radar-based I–D threshold derived by the frequentist method is I = 10.1D^(−0.52), which is underestimated when uncorrected raw radar data are used. In order to verify the impacts on observations due to spatial variation, I–D thresholds are identified from the nearest rain gauge observations and from radar observations at the rain gauge locations. It is found that both kinds of observations have similar I–D thresholds and likewise underestimate the I–D threshold due to undershooting at the core of convective rainfall. This indicates that improving the spatial resolution and measuring accuracy of radar observations will improve the identification of debris-flow occurrence, especially for events triggered by strong small-scale rainfall processes in the study area.
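To illustrate how a reflectivity-rainfall relationship feeds into such an I–D threshold, the sketch below converts reflectivity to rain rate with separate Z–R power laws for convective and stratiform echoes and compares a duration-averaged intensity with the quoted threshold I = 10.1D^(−0.52). The Z–R coefficients are common textbook values, not the merged relationship calibrated in the paper, and the reflectivity sequence is hypothetical.

```python
# Sketch: convert radar reflectivity to rain rate with separate Z-R power laws
# for convective and stratiform echoes, then check a duration-averaged intensity
# against the I-D threshold I = 10.1 * D^-0.52 quoted above. The Z-R coefficients
# are standard textbook values, not the merged relation calibrated in the paper.
import numpy as np

def rain_rate(dbz, convective):
    # Z = a * R^b  =>  R = (Z / a)^(1 / b), with Z in mm^6 m^-3 (from dBZ).
    z = 10.0 ** (dbz / 10.0)
    a, b = (300.0, 1.4) if convective else (200.0, 1.6)
    return (z / a) ** (1.0 / b)

# Hypothetical hourly reflectivity sequence over a catchment (dBZ) and echo type.
dbz = np.array([32, 38, 45, 41, 35])
conv = np.array([False, True, True, True, False])

rates = np.array([rain_rate(z, c) for z, c in zip(dbz, conv)])   # mm/h
duration = len(rates)                                            # h
intensity = rates.mean()                                         # mm/h

threshold = 10.1 * duration ** -0.52
print(f"mean intensity = {intensity:.1f} mm/h over {duration} h, "
      f"threshold = {threshold:.1f} mm/h -> "
      f"{'exceeded' if intensity > threshold else 'not exceeded'}")
```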


Author(s):  
Zhao Shi ◽  
Fangqiang Wei ◽  
Chandrasekar Venkatachalam

Abstract. Both the Ms 8.0 Wenchuan earthquake on May 12, 2008 and the Ms 7.0 Lushan earthquake on April 20, 2013 occurred in Sichuan Province, China. In the earthquake-affected mountainous area, a large amount of loose material caused a high occurrence of debris flows during the rainy season. In order to evaluate the rainfall intensity–duration (I–D) threshold of the debris flows in the earthquake-affected area, and to fill the observational gaps caused by the relatively scarce and low-altitude deployment of rain gauges in this area, raw data from two S-band China New Generation Doppler weather radars (CINRAD) were captured for six rainfall events which triggered 519 debris flows between 2012 and 2014. Due to the challenges of radar quantitative precipitation estimation (QPE) over mountainous areas, a series of improvement measures are considered, including a hybrid scan mode, a vertical reflectivity profile (VPR) correction, a mosaic of reflectivity, a merged rainfall–reflectivity (R–Z) relationship for convective and stratiform rainfall, and rainfall bias adjustment with a Kalman filter (KF). For validating rainfall accumulation over complex terrain, the study areas are divided into two kinds of regions by a height threshold of 1.5 km above the ground. Three kinds of radar rainfall estimates are compared with rain gauge measurements. It is observed that the normalized mean bias (NMB) is decreased by 39 % and the fitted linear ratio between radar and rain gauge observations reaches 0.98. Furthermore, the radar-based I–D threshold derived by the frequentist method is I = 10.1D^(−0.52), and it is also found that the I–D threshold is underestimated by uncorrected raw radar data. In order to verify the impacts on observations due to spatial variation, I–D thresholds are identified from the nearest rain gauge observations and from radar observations at the rain gauge locations. It is found that both kinds of observations have similar I–D thresholds and likewise underestimate I–D thresholds owing to undershooting at the core of convective rainfall. It is indicated that improvement of the spatial resolution and measuring accuracy of radar observations will lead to improved identification of debris-flow occurrence, especially for events triggered by strong small-scale rainfall processes in the study area.

