Testing the elicitation procedure of the Minimum Acceptable Probability

Author(s):  
Maria Polipciuc
Author(s):  
Pedram Sendi ◽  
Arta Ramadani ◽  
Michael M. Bornstein

Background: The number of contingent valuation (CV) studies in dental medicine using willingness-to-pay (WTP) methodology has substantially increased in recent years. Missing values due to absent information (i.e., missingness) or false information (i.e., protest zeros) are a common problem in WTP studies. The objective of this study is to evaluate the prevalence of missing values in CV studies in dental medicine, to assess how these have been dealt with, and to suggest recommendations for future research. Methods: We systematically searched electronic databases (MEDLINE, Web of Science, Cochrane Library, PROSPERO) on 8 June 2021, and hand-searched the references of selected reviews. CV studies in clinical dentistry using WTP for valuing a good or service were included. Results: We included 49 WTP studies in our review. Of these, 19 (38.8%) reported missing values due to absent information, and 28 (57.1%) reported zero values (i.e., WTP valued at zero). Zero values were further classified into true zeros (i.e., representing the underlying preference of the respondent) or protest zeros (i.e., false information as a protest behavior) in only 9 studies. Most studies used a complete case analysis to address missingness, while only one study used multiple imputation. Conclusions: There is uncertainty in the dental literature on how to address missing values and zero values in CV studies. Zero values need to be classified as true zeros versus protest zeros with follow-up questions after the WTP elicitation procedure, and the two then need to be handled differently. Advanced statistical methods are available to address missing values due both to missingness and to protest zeros, but these are currently underused in dental medicine. Failing to appropriately address missing values in CV studies may lead to biased WTP estimates of dental interventions.
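The handling choices the review describes can be sketched in a few lines. The survey numbers and the protest-zero flag below are hypothetical; the snippet only contrasts a complete case analysis (dropping missing answers) with a simple single imputation, in both cases keeping true zeros and excluding protest zeros identified by a follow-up question (the multiple imputation the review recommends would repeat the imputation step with random draws across several datasets):

```python
import numpy as np

# Hypothetical WTP survey: NaN = missing answer, 0 = zero bid.
wtp = np.array([30.0, np.nan, 0.0, 55.0, 0.0, 40.0, np.nan, 25.0])
# A follow-up question distinguishes protest zeros from true zeros.
is_protest = np.array([False, False, True, False, False, False, False, False])

# Complete-case analysis: drop missing answers AND protest zeros.
keep = ~np.isnan(wtp) & ~is_protest
complete_case_mean = wtp[keep].mean()

# Simple (single) imputation: replace missing answers with the mean of
# valid responses; protest zeros stay excluded, true zeros stay included.
imputed = np.where(np.isnan(wtp), wtp[keep].mean(), wtp)
imputed_mean = imputed[~is_protest].mean()

print(complete_case_mean, imputed_mean)
```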


Author(s):  
Philippe Cambos ◽  
Guy Parmentier

During a ship's life, operating conditions may change: a tanker may be converted into an FPSO, and flag requirements may be modified. Generally these modifications have little impact on existing structures, since flag requirements are only rarely applied retroactively. Nevertheless, in some cases a change of operating conditions may have considerable consequences, in the worst cases making any re-engineering impossible. For example, converting a conventional tanker built with plain grade A steel into an offshore floating unit able to operate in cold regions may require a change of material grade to grade B. It is obviously unreasonable to replace all the material simply on the basis of the material certificates. Steels used by shipyards have to fulfil classification society requirements on mechanical strength, and shipbuilding generally represents only a small part of a steelmaker's production. For this reason steelmakers are reluctant to produce steels whose mechanical properties correspond exactly to the required minima; they generally deliver steels already in stock, with higher mechanical characteristics than required. Advantage can be taken of this common practice. In order to demonstrate that the material fulfils the requirements of grade B, it was decided to adopt a statistical approach. At this stage there are two main issues: the first is to provide evidence that the actual Charpy V characteristics of the material fulfil the requirements of grade B; the second is to provide this evidence with a minimum of testing. To assess this assumption, a random check was carried out, and different probabilistic models were tested in order to compare common approaches with probabilistic models based on physical considerations.
The paper recalls the main assumptions underlying the probabilistic models used to estimate the minimum Charpy value, examines the behavior of the empirical sample, and fits the parameters of candidate probability laws to the empirical distribution. Since the parameters of a probability law cannot be determined exactly from a finite number of specimens, the resulting uncertainty is taken into account through confidence limits. Depending on the selected probabilistic model, the minimum value either corresponds to an acceptable probability of failure at the target confidence level, or is independent of any acceptable probability of failure and is defined at the same confidence level. It is concluded that a random check, with data treatment assuming Charpy V test results distributed according to a Weibull probability law of the minimum, provides evidence, with a sufficient confidence level, that the steel used in the considered structure fulfils the requirements of the new operating conditions.
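As a rough illustration of the data treatment described above, the sketch below fits a three-parameter Weibull law (the distribution of minima) to a hypothetical Charpy V-notch sample and reads off the minimum value associated with an acceptable probability of failure. All numbers are invented, and the confidence limits discussed in the paper would in practice be added on top of the point estimates, e.g. by bootstrapping the fit:

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(1)
# Hypothetical Charpy V-notch energies (J) from the random check.
charpy = weibull_min.rvs(c=2.5, loc=20.0, scale=30.0, size=60, random_state=rng)

# Fit a three-parameter Weibull law to the empirical sample.
shape, loc, scale = weibull_min.fit(charpy)

# Minimum value associated with an acceptable probability of failure,
# here 1%: the 1st percentile of the fitted distribution.
p_fail = 0.01
min_charpy = weibull_min.ppf(p_fail, shape, loc=loc, scale=scale)
print(f"estimated 1% lower bound: {min_charpy:.1f} J")
```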


2011 ◽  
Vol 261-263 ◽  
pp. 380-384
Author(s):  
Hai Tao Wang ◽  
Jin Qing Jia

Evaluating the correct stability factor of a tunnel is a critical element in the design and construction phases of a tunnel excavated in difficult geotechnical conditions. This article describes an innovative, well-tested procedure for optimizing construction-phase management. The starting point of the procedure is the verification of the results of analytical methods against numerical methods. In the first step, the results obtained through the analytical method are verified by means of a numerical method in order to evaluate the practical consequences in terms of the development of deformations and plastic zones. In this manner, the assumed design risk is evaluated for the different methods, and the solution that corresponds best with the numerical simulation is selected. Finally, residual uncertainties and parametric variations are incorporated into the analysis: Monte Carlo simulation is used to compute the statistical distribution of the face-stabilizing pressure, and the design value is selected on the basis of an acceptable probability of failure.
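The final Monte Carlo step can be sketched as follows. The limit-state formula and all input distributions here are illustrative stand-ins, not the authors' model; the point is only how a design value is read off the simulated distribution of the face-stabilizing pressure at an acceptable probability of failure:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical input uncertainties for a face-stability calculation:
cohesion = rng.normal(30.0, 5.0, n)        # kPa
friction_deg = rng.normal(25.0, 3.0, n)    # degrees
overburden = rng.normal(400.0, 40.0, n)    # kPa

# Toy limit-state model: required stabilizing pressure (kPa).
# (A stand-in for the analytical face-stability formula.)
required_p = overburden * 0.25 - 2.0 * cohesion * np.tan(np.radians(friction_deg))

# Design value: a pressure exceeded by the demand only with the
# acceptable probability of failure, here 1%.
p_fail = 0.01
design_pressure = np.quantile(required_p, 1.0 - p_fail)
print(f"design face pressure: {design_pressure:.1f} kPa")
```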


Blood ◽  
2011 ◽  
Vol 118 (21) ◽  
pp. 1552-1552
Author(s):  
Jack M. Lionberger ◽  
Kathleen Shannon Dorcy ◽  
Carol Dean ◽  
Nathan Holm ◽  
Bart Lee Scott ◽  
...  

Abstract 1552 Background: Novel drugs or drug combinations are conventionally tested first in Phase I studies (in which therapeutic decisions are based solely on toxicity), with Phase II (efficacy) evaluation following as a separate trial. This process not only slows new drug development, it is also challenging for patients during the informed consent process, because they usually enter trials not merely in hope of "no toxicity" but in hope of response. Response rates in Phase I at doses below the maximum tolerated dose (MTD) may be irrelevant to efficacy, but this common assumption remains unproven. An equally plausible alternative is that efficacy failure at these lower doses augurs failure at the MTD in Phase II. This hypothesis prompted development of a Phase I-II Bayesian design that uses both efficacy and toxicity to find a clinically relevant dose (Biometrics 2004;60:684-93). In the current study, we apply this Bayesian approach to the design of a Phase I-II trial of bendamustine + idarubicin in older patients (>50 yo) with newly diagnosed AML or high-risk MDS (>10% marrow blasts). We then compare and contrast our trial's operation with that of the standard 3+3 Phase I design. Methods: The design specifies anticipated probabilities ("priors") of response (CR or no CR) and toxicity (grade 3-4 or not) at each of 4 doses of bendamustine (45, 60, 75, and 90 mg/m2 daily × 5, together with idarubicin 12 mg/m2 daily on days 1 and 2). Patients are entered in groups of 3, beginning at the 45 mg/m2 dose. As response/toxicity data become available for each cohort, Bayes' theorem is used to update the priors and derive current probabilities ("posteriors") of response/toxicity at each dose. The priors are set to be relatively non-informative, allowing the posteriors to be primarily influenced by the data from the trial. The posteriors are referred to a minimum acceptable probability of response (here 40%) and a maximum acceptable probability of toxicity (30%).
If the posteriors indicate that it is highly unlikely (<2% chance) that any dose is associated with both of these probabilities, the trial stops; otherwise the next cohort of patients is treated at a dose that is so associated. This process is repeated iteratively to a maximum sample size of 48 patients. The parameters noted above were chosen to give desirable probabilities of selecting, for future study, doses meeting the minimum acceptable response and maximum acceptable toxicity rates. Results: Table 1 compares the operation of this trial with a standard 3+3 Phase I trial. Given that 2/3 patients had toxicity at the 75 mg/m2 dose, a Phase I 3+3 design would have declared 60 mg/m2 the MTD; an "expansion cohort" would subsequently have been treated at this dose as a Phase II trial, without any possibility of revisiting the 75 mg/m2 dose. This conclusion flies in the face of basic notions of statistical reliability and ignores the possibility that the patients experiencing toxicity may have been particularly old, had significant comorbidities, or had variable functional reserve for undefined reasons. In contrast, the Phase I-II design allows the trial to continue and potentially revisit higher doses of therapy depending on the collective outcome of a greater number of patients. Based on our actual data, this trial continued to treat patients at the 60 mg/m2 dose level, and in the next three patients there was no toxicity. In this case response data become the determining factor, which improves the efficiency of the trial. If 0/3 patients had responded, the trial would have returned to 75 mg/m2; however, because 2/3 patients had a response, the trial continues to accrue at 60 mg/m2, with the statistical force of twice the number of patients. Conclusion: Accounting for response during dose finding appears to permit more sophisticated and flexible decisions about dosing, in addition to improving efficiency. Disclosures: Shannon Dorcy: Cephalon: Consultancy, Honoraria, Speakers Bureau.
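A minimal sketch of the kind of Bayesian update involved, using independent per-dose beta-binomial posteriors rather than the joint dose-outcome model of the actual design; the priors, the independence of the two criteria, and the illustrative toxicity count are all assumptions, while the 40%/30% thresholds come from the abstract:

```python
from scipy.stats import beta

# Vague Jeffreys-style priors (illustrative, not the trial's priors).
prior_a, prior_b = 0.5, 0.5

def posterior_probs(responses, n_resp, toxicities, n_tox):
    """P(response rate > 40%) and P(toxicity rate < 30%) after a
    conjugate beta-binomial update at a single dose."""
    p_eff = 1 - beta.cdf(0.40, prior_a + responses, prior_b + n_resp - responses)
    p_safe = beta.cdf(0.30, prior_a + toxicities, prior_b + n_tox - toxicities)
    return p_eff, p_safe

# From the abstract: 2/3 responses at 60 mg/m2; suppose 0 grade 3-4
# toxicities among 6 patients at that dose (illustrative count).
p_eff, p_safe = posterior_probs(2, 3, 0, 6)

# Treating the criteria as independent (a simplification), the chance
# the dose meets BOTH; the design stops if this is <2% at every dose.
acceptable = p_eff * p_safe
print(p_eff, p_safe, acceptable)
```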


2014 ◽  
Vol 30 (2) ◽  
pp. 111-127 ◽  
Author(s):  
Ronald P. Leow ◽  
Sarah Grey ◽  
Silvia Marijuan ◽  
Colleen Moorman

Given the current methodological interest in eliciting direct data on the cognitive processes L2 learners employ as they interact with L2 data during the early stages of the learning process, this article takes a critical and comparative look at three concurrent data elicitation procedures currently employed in the SLA literature: Think aloud (TA) protocols, eye-tracking (ET), and reaction time (RT). The section on each data elicitation procedure begins with a brief historical and descriptive account of its usage and application in the SLA literature to address cognitive processes as they occur during the early stages of the L2 learning process, followed by its strengths and some methodological issues that should be considered. Suggestions are provided for their usage in future studies investigating concurrent cognitive processes in L2 learning at these early stages of the L2 learning process.


2005 ◽  
Author(s):  
Ronald L. Boring ◽  
David Gertman ◽  
Jeffrey Joe ◽  
Julie Marble ◽  
William Galyean ◽  
...  

2013 ◽  
Vol 3 (2) ◽  
pp. 79-97 ◽  
Author(s):  
C. Andrade

ABSTRACT: Estimates of the service life of concrete structures are rapidly moving from the laboratory into standards and into the specifications of tenders for large infrastructure projects: service lives of 100 years or more have been required for the Oresund bridge and the new Panama Canal. However, such durability is often specified only summarily, without defining how it is to be demonstrated and, in some cases, without even mentioning the tests and limit values to be used. This communication describes the most important aspects to be specified in chloride-ingress prediction models, which, in addition to the diffusion coefficients, are the surface concentration, the aging factor, the chloride threshold, and the probability of corrosion considered unacceptable. Keywords: concrete; chlorides; resistivity; diffusion.
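A minimal sketch of the kind of probabilistic chloride-ingress calculation discussed here, using the error-function solution of Fick's second law with an aging diffusion coefficient and Monte Carlo sampling of the inputs; all input distributions and limit values below are illustrative assumptions, not values from the communication:

```python
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(7)
n = 50_000

# Hypothetical input distributions:
cover = rng.normal(45.0, 6.0, n)             # concrete cover, mm
Cs = rng.normal(3.0, 0.5, n)                 # surface chloride, % binder wt
D28 = rng.lognormal(np.log(30.0), 0.3, n)    # diffusion coeff. at 28 d, mm^2/yr
alpha = 0.3                                  # aging factor
C_crit = 0.6                                 # chloride threshold, % binder wt
t = 100.0                                    # target service life, years

# Aging-corrected apparent diffusion coefficient at time t.
t28 = 28.0 / 365.0
D_app = D28 * (t28 / t) ** alpha

# Error-function solution of Fick's second law: chloride at rebar depth.
C_rebar = Cs * (1.0 - erf(cover / (2.0 * np.sqrt(D_app * t))))

# Probability of corrosion initiation, to be compared with the
# acceptable probability of corrosion set in the specification.
p_corr = float(np.mean(C_rebar > C_crit))
print(f"P(corrosion initiation at {t:.0f} yr) = {p_corr:.3f}")
```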


1990 ◽  
Vol 4 (2) ◽  
pp. 201-211 ◽  
Author(s):  
Amos Tversky ◽  
Richard H Thaler

The preference reversal phenomenon has been established in numerous studies during the last two decades, but its causes have only recently been uncovered. This phenomenon, or cluster of phenomena, challenges the traditional assumption that the decisionmaker has a fixed preference order that is captured accurately by any reliable elicitation procedure. If option A is priced higher than option B, we cannot always assume that A is preferred to B in a direct comparison. The evidence shows that different methods of elicitation could change the relative weighting of the attributes and give rise to different orderings.

