Estimating Causal Effects of New Treatments Despite Self-Selection: The Case of Experimental Medical Treatments

2019
Vol 7 (1)
Author(s): Chad Hazlett

Abstract: Providing terminally ill patients with access to experimental treatments, as allowed by recent “right to try” laws and “expanded access” programs, poses a variety of ethical questions. While practitioners and investigators may assume it is impossible to learn the effects of these treatments without randomized trials, this paper describes a simple tool to estimate the effects of these experimental treatments on those who take them, despite the problem of selection into treatment, and without assumptions about the selection process. The key assumption is that the average outcome, such as survival, would remain stable over time in the absence of the new treatment. Such an assumption is unprovable, but can often be credibly judged by reference to historical data and by experts familiar with the disease and its treatment. Further, where this assumption may be violated, the estimate can be adjusted to account for a hypothesized change in the non-treatment outcome, or subjected to a sensitivity analysis. The method is simple to understand and implement, requiring just four numbers to form a point estimate. Such an approach can be used not only to learn which experimental treatments are promising, but also to warn us when treatments are actually harmful, especially when they might otherwise appear to be beneficial, as illustrated by example here. While this note focuses on experimental medical treatments as a motivating case, the approach can be employed more generally wherever a new treatment becomes available or sees a large increase in uptake, selection bias is a concern, and an assumption about the change in the average non-treatment outcome over time can credibly be imposed.
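
One way to read the “four numbers” point estimate (a sketch only; the paper's exact formula is not given in the abstract and may differ): if the population mean outcome would have stayed at its pre-treatment level absent the new treatment, the counterfactual mean for those who took it can be backed out from the pre-period mean, the post-period mean among non-takers, and the treated share. All figures below are hypothetical.

```python
def att_under_stability(y_pre_mean, y_post_treated_mean, y_post_untreated_mean, p_treated):
    """Illustrative effect-on-the-treated estimate under the stability assumption.

    Assumes the post-period population mean *non-treatment* outcome equals the
    observed pre-period mean:
        y_pre_mean = p * E[Y(0) | treated] + (1 - p) * y_post_untreated_mean
    Solving for E[Y(0) | treated] and subtracting it from the treated group's
    observed mean gives the point estimate.
    """
    counterfactual_treated = (y_pre_mean - (1 - p_treated) * y_post_untreated_mean) / p_treated
    return y_post_treated_mean - counterfactual_treated

# Hypothetical numbers: 1-year survival was 0.30 before the treatment existed;
# afterwards 20% of patients took it, with survival 0.45 among takers and 0.28
# among non-takers.
print(att_under_stability(0.30, 0.45, 0.28, 0.20))  # 0.45 - 0.38 = 0.07
```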

2015
Vol 26 (3)
pp. 1295-1307
Author(s): Huang Li-Ching, Wen Miin-Jye, Cheung Siu Hung, Kwong Koon Shing

The increasing popularity of noninferiority trials reflects ongoing efforts to replace existing treatments (reference treatments) with new treatments (experimental treatments) that retain a substantial fraction of the reference treatments' effect. The adoption of any new treatment has to be justified by a demonstration of benefits that outweigh a possible clinically insignificant reduction in efficacy relative to the reference treatment. Statistical methods have been developed to analyze data collected from noninferiority trials, but these methods focus on cases with only one reference treatment. In this paper, we provide statistical inferential procedures for situations with multiple reference treatments. We show how to compute the corresponding critical values for simultaneous testing of the noninferiority of several new treatments to multiple reference treatments in the presence of a placebo. Furthermore, for a prespecified level of test power, we derive a technique to determine the optimal sample size before the onset of a noninferiority trial. A clinical example is given to illustrate the proposed procedure.
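
As background for the multi-reference procedures described above, the sketch below shows only the familiar single-reference non-inferiority z-test that such procedures generalize; the simultaneous critical values derived in the paper are not reproduced here, and all numbers are hypothetical.

```python
from math import sqrt
from scipy.stats import norm

def noninferiority_z(mean_new, mean_ref, sd, n_new, n_ref, margin):
    """One-sided test of H0: mu_new - mu_ref <= -margin against
    H1: mu_new - mu_ref > -margin (non-inferiority). Larger outcomes are
    assumed better; sd is a common (pooled/known) standard deviation."""
    se = sd * sqrt(1.0 / n_new + 1.0 / n_ref)
    z = (mean_new - mean_ref + margin) / se
    return z, 1.0 - norm.cdf(z)  # test statistic and one-sided p-value

# Hypothetical trial: new treatment mean 9.8 vs reference 10.0, SD 3,
# 150 patients per arm, non-inferiority margin 1.0.
print(noninferiority_z(9.8, 10.0, 3.0, 150, 150, 1.0))
```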


2017
Vol 27 (11)
pp. 3255-3270
Author(s): Wenfu Xu, Feifang Hu, Siu Hung Cheung

The increasing popularity of non-inferiority clinical trials reflects the growing need to find substitutes for some reference (standard) treatments. A new treatment is preferred to the standard treatment if the benefits of adopting it outweigh a possible clinically insignificant reduction in treatment efficacy (the non-inferiority margin). Statistical procedures have recently been developed for treatment comparisons in non-inferiority clinical trials with multiple experimental (new) treatments. An ethical concern in such trials is that some patients undergo the less effective treatments; the problem is more serious when multiple experimental treatments are included in a balanced trial, in which the sample sizes are the same for all experimental treatments. With the aim of exposing fewer patients to the inferior treatments, we propose a response-adaptive treatment allocation scheme based on the doubly adaptive biased coin design. The proposed adaptive design is also shown to be superior to the balanced design in terms of testing power.
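
To make the allocation idea concrete, here is a two-arm sketch of a doubly adaptive biased coin design step, using a commonly cited allocation function and an assumed success-rate-proportional target; the paper's scheme handles several experimental treatments and is not reproduced here.

```python
import random

def dbcd_prob(current_prop, target_prop, gamma=2.0):
    """Probability of assigning the next patient to arm A. The rule pulls the
    realized allocation proportion toward the (estimated) target proportion;
    larger gamma pulls harder."""
    x, rho = current_prop, target_prop
    if x <= 0.0:
        return 1.0
    if x >= 1.0:
        return 0.0
    num = rho * (rho / x) ** gamma
    den = num + (1 - rho) * ((1 - rho) / (1 - x)) ** gamma
    return num / den

# Hypothetical trial state: estimated success rates 0.6 (arm A) and 0.4 (arm B),
# target allocation proportional to success rates, 45 of the first 100 patients
# currently on arm A.
target = 0.6 / (0.6 + 0.4)
p_next_a = dbcd_prob(current_prop=0.45, target_prop=target)
print(p_next_a, "A" if random.random() < p_next_a else "B")
```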


Author(s): Mette Eilstrup-Sangiovanni

Abstract: Many observers worry that growing numbers of international institutions with overlapping functions undermine governance effectiveness via duplication, inconsistency and conflict. Such pessimistic assessments may undervalue the mechanisms available to states and other political agents to reduce conflictual overlap and enhance inter-institutional synergy. Drawing on historical data, I examine how states can mitigate conflict within Global Governance Complexes (GGCs) by dissolving or merging existing institutions or by re-configuring their mandates. I further explore how “order in complexity” can emerge through bottom-up processes of adaptation in lieu of state-led reform. My analysis supports three theoretical claims: (1) states frequently refashion governance complexes “top-down” in order to reduce conflictual overlap; (2) “top-down” restructuring and “bottom-up” adaptation present alternative mechanisms for ordering relations among the component institutions of GGCs; (3) these twin mechanisms ensure that GGCs tend to (re)produce elements of order over time, albeit often temporarily. Rather than evolving towards ever-greater fragmentation and disorder, complex governance systems thus tend to fluctuate between greater and lesser integration and (dis)order.


2019
Vol 122 (1)
pp. 681-699
Author(s): E. Tattershall, G. Nenadic, R. D. Stevens

Abstract: Research topics rise and fall in popularity over time, some more swiftly than others. The fastest-rising topics are typically called bursts; examples include “deep learning”, “internet of things” and “big data”. Being able to automatically detect and track bursty terms in the literature could give insight into how scientific thought evolves over time. In this paper, we take a trend detection algorithm from stock market analysis and apply it to over 30 years of computer science research abstracts, treating the prevalence of each term in the dataset like the price of a stock. Unlike previous work in this domain, we use the free text of abstracts and titles, resulting in a finer-grained analysis. We report a list of bursty terms, and then use historical data to build a classifier to predict whether they will rise or fall in popularity in the future, obtaining accuracy in the region of 80%. The proposed methodology can be applied to any time-ordered collection of text to yield past and present bursty terms and to predict their probable fate.
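
The abstract does not name the indicator used, so the sketch below stands in with an MACD-style moving-average crossover, a representative stock-market trend-detection tool, applied to a made-up term-prevalence series; the paper's exact algorithm and parameters may differ.

```python
import pandas as pd

def macd_burst_score(prevalence, fast=3, slow=6, signal=2):
    """prevalence: pandas Series indexed by year (e.g. fraction of abstracts
    containing the term). Returns the MACD histogram; large positive values
    suggest the term is currently bursting upward."""
    s = pd.Series(prevalence, dtype=float)
    ema_fast = s.ewm(span=fast, adjust=False).mean()
    ema_slow = s.ewm(span=slow, adjust=False).mean()
    macd = ema_fast - ema_slow
    return macd - macd.ewm(span=signal, adjust=False).mean()

# Hypothetical prevalence of "deep learning" in abstracts, 2010-2017.
prev = pd.Series([0.001, 0.001, 0.002, 0.004, 0.009, 0.020, 0.045, 0.080],
                 index=range(2010, 2018))
print(macd_burst_score(prev).round(4))
```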


1989
Vol 5 (3)
pp. 459-472
Author(s): Richard J. Lilford

This article develops arguments for the use of decision theory, rather than intuition, to determine the size of trials. It is wrong to expect doctors to ignore personal preferences in favor of clinical experiments unless the trial is capable of showing differences in treatment effect that would influence clinical practice substantially. It follows from our analysis that if delta (the treatment effect that the trial is designed to detect) is sufficient to alter clinical practice, then the alpha and beta errors of a trial should be equal. This applies even if a new treatment is to be compared with conventional therapy, or if a treatment with high “costs” is compared with a less invasive or less expensive method.
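
A worked illustration of what equal alpha and beta imply for trial size, using the standard normal-approximation formula for comparing two means (a textbook formula, not one taken from the article); the numbers are hypothetical.

```python
from scipy.stats import norm

def n_per_arm(delta, sigma, alpha, beta, two_sided=False):
    """Patients per arm needed to detect a true difference `delta` with type I
    error `alpha` and type II error `beta`, assuming outcome SD `sigma`."""
    z_a = norm.ppf(1 - alpha / 2) if two_sided else norm.ppf(1 - alpha)
    z_b = norm.ppf(1 - beta)
    return 2 * (sigma * (z_a + z_b) / delta) ** 2

# Hypothetical: delta = 5 points is enough to change practice, SD = 10,
# and the two error rates are set equal at 5%.
print(round(n_per_arm(delta=5.0, sigma=10.0, alpha=0.05, beta=0.05)))  # ~87 per arm
```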


2002
Vol 46 (2)
pp. 385-391
Author(s): Bharat D. Damle, Vanaja Mummaneni, Sanjeev Kaul, Catherine Knupp

Abstract: A didanosine formulation that contains a buffer to prevent acid-mediated degradation can significantly decrease the oral absorption of certain coadministered drugs because of interactions with the antacid component. An enteric formulation of didanosine is unlikely to cause such drug interactions because it lacks antacids. This study was undertaken to determine whether the enteric bead formulation of didanosine (Videx EC) influences the bioavailability of indinavir, ketoconazole, and ciprofloxacin, three drugs representative of a broader class of drugs affected by interaction with antacids. Healthy subjects of either gender were enrolled in three separate open-label, single-dose, two-way crossover studies. Subjects were randomized to treatment A (800 mg of indinavir, 200 mg of ketoconazole, or 750 mg of ciprofloxacin) or treatment B (the same dose of indinavir, ketoconazole, or ciprofloxacin, but with 400 mg of didanosine as an encapsulated enteric bead formulation). A lack of interaction was concluded if the 90% confidence interval (CI) of the ratio of the geometric means of log-transformed Cmax and AUC0-∞ (area under the concentration-time curve from time zero to infinity) values of indinavir, ketoconazole, and ciprofloxacin was contained entirely between 0.75 and 1.33. For indinavir (n = 23), the point estimates (90% CI) of the ratios of Cmax and AUC0-∞ values were 0.99 (0.91, 1.06) and 0.96 (0.91, 1.02), respectively. In the ketoconazole study, 3 of 24 subjects showed anomalous absorption of ketoconazole under the reference treatment (i.e., an approximately 8-fold-lower AUC compared to historical data). A post hoc analysis performed after these three subjects were excluded indicated that the point estimates (90% CI) of the ratios of Cmax and AUC0-∞ values were 0.99 (0.86, 1.14) and 0.97 (0.85, 1.10), respectively. For ciprofloxacin (n = 16), the point estimates (90% CI) of the ratios of Cmax and AUC0-∞ values were 0.92 (0.79, 1.07) and 0.91 (0.76, 1.08), respectively. All three studies clearly indicated a lack of interaction. Tmax and t1/2 for indinavir, ketoconazole, and ciprofloxacin were similar between treatments. The lack of interaction between the encapsulated enteric bead formulation of didanosine and indinavir, ketoconazole, or ciprofloxacin indicates that this enteric formulation can be administered concomitantly with drugs whose bioavailability is known to be reduced by interaction with antacids.
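
A sketch of the lack-of-interaction criterion used above: exponentiate a 90% CI for the mean difference of log-transformed Cmax or AUC (test versus reference) and check that it lies entirely within 0.75-1.33. A simple paired analysis on log values stands in here for the full crossover model, and the data are invented.

```python
import numpy as np
from scipy import stats

def gmr_90ci(log_test, log_ref):
    """Geometric mean ratio (test/reference) with a 90% CI, from per-subject
    natural-log PK values under each treatment."""
    diff = np.asarray(log_test) - np.asarray(log_ref)
    mean, se = diff.mean(), diff.std(ddof=1) / np.sqrt(len(diff))
    t = stats.t.ppf(0.95, df=len(diff) - 1)   # two-sided 90% CI
    lo, hi = np.exp(mean - t * se), np.exp(mean + t * se)
    return np.exp(mean), (lo, hi), bool(0.75 < lo and hi < 1.33)

# Hypothetical AUC values for 8 subjects, drug alone vs. with didanosine EC.
auc_alone = np.array([210., 180., 250., 195., 230., 205., 175., 240.])
auc_with_ddi = np.array([200., 185., 240., 200., 220., 210., 170., 245.])
print(gmr_90ci(np.log(auc_with_ddi), np.log(auc_alone)))
```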


2018
Vol 40 (11)
pp. 1613-1629
Author(s): Philippe Accard

Self-organizing systems are social systems that are immanently and constantly recreated by agents. In a self-organizing system, agents make changes while preserving stability; if they do not preserve stability, they push the system toward chaos and cannot recreate it. How changes preserve stability is thus a fundamental issue. In current works, changes preserve stability because agents’ ability to make changes is limited by interaction rules and power. However, how agents diffuse changes throughout the system while preserving its stability has not been addressed in these works. We address this issue by borrowing from a complex system theory neglected thus far in organization theories: self-organized criticality theory. We suggest that self-organizing systems are in critical states: agents have equivalent ability to make changes, and none are able to foresee or control how their changes diffuse throughout the system. Changes then diffuse unpredictably: they may diffuse to small or large parts of the system or not at all, and it is this unpredictable diffusion that preserves stability in the system over time. We call our theoretical framework self-organizing criticality theory. It presents a new treatment of change and stability and improves the understanding of self-organizing systems.
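
For readers unfamiliar with self-organized criticality, a toy Bak-Tang-Wiesenfeld sandpile (the canonical SOC model, not anything from the article itself) illustrates the property the argument leans on: identical small perturbations trigger avalanches of unpredictable, heavy-tailed size.

```python
import random

def drop_grain(grid, size=20):
    """Add one grain at a random site and topple until stable.
    Returns the avalanche size (number of topplings)."""
    i, j = random.randrange(size), random.randrange(size)
    grid[i][j] += 1
    avalanche, unstable = 0, ([(i, j)] if grid[i][j] >= 4 else [])
    while unstable:
        x, y = unstable.pop()
        if grid[x][y] < 4:
            continue
        grid[x][y] -= 4
        avalanche += 1
        if grid[x][y] >= 4:            # may still be over threshold
            unstable.append((x, y))
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < size and 0 <= ny < size:
                grid[nx][ny] += 1
                if grid[nx][ny] >= 4:
                    unstable.append((nx, ny))
    return avalanche  # grains toppled off the edge are simply lost

grid = [[random.randint(0, 3) for _ in range(20)] for _ in range(20)]
sizes = [drop_grain(grid) for _ in range(10000)]
print(max(sizes), sum(sizes) / len(sizes))  # rare huge avalanches, small mean
```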


2019
pp. 106-138
Author(s): Christopher Peacocke

This chapter presents a metaphysics-first treatment of subjects and the first-person way of representing subjects. It develops a new explanation of the metaphysical principle that it is in the nature of mental events that they have subjects. It advocates the view that the identity of a subject over time involves the identity of a subpersonal integration apparatus, and contrasts the resulting position with Johnston’s conception of personites. A new treatment of the first person is developed that gives a greater role for agency than in previous accounts. Only by doing so can we explain how the first person brings a subject, rather than something else, into the contents of the states and events in which it is involved. Some of the consequences of the resulting agency-involving account of the first person are traced out.


2020
Vol 31 (11)
pp. 2631-2641
Author(s): Marcello Tonelli, Natasha Wiebe, Matthew T. James, Scott W. Klarenbach, Braden J. Manns, ...

Background: Few new treatments have been developed for kidney failure or CKD in recent years, leading to perceptions that outcomes associated with CKD or kidney failure have improved more slowly than those of other major noncommunicable diseases. Methods: Our retrospective cohort study included 548,609 people in Alberta, Canada, with an incident noncommunicable disease between 2004 and 2015, including cardiovascular diseases, diabetes, various cancers, and severe CKD or kidney failure treated with renal replacement (KF-RRT). For each disease, we assessed the presence or absence of 8 comorbidities; we also compared secular trends in the relative (compared with a referent year of 2004) and absolute risks of mortality and in mean annual days in the hospital associated with each disease after 1 year and 5 years. Results: Comorbidities increased significantly in number over time for all noncommunicable diseases except diabetes, and increased most rapidly for CKD and KF-RRT. Significant but relatively small reductions over time in the risk ratio of mortality at 1 year occurred for nearly all noncommunicable diseases. Secular trends in the absolute risk of mortality were similar; CKD and KF-RRT had a relatively favorable ranking at 1 year. Breast cancer, KF-RRT, diabetes, and colorectal cancer displayed the largest relative reductions in the number of hospital days at 1 year. Significant absolute reductions in the number of hospital days were observed for both KF-RRT and CKD; the former had the largest absolute reduction among all noncommunicable diseases. Results were similar at 5 years. Conclusions: We observed secular reductions in mortality and annual hospital days at 1 year and 5 years among incident patients with KF-RRT and severe CKD, as well as several other common noncommunicable diseases.


2020
Vol 16 (7)
pp. 20200096
Author(s): James M. Smoliga

Gut capacity and plasticity have been examined across multiple species, but are not typically explored in the context of extreme human performance. Here, I estimate the theoretical maximal active consumption rate (ACR) in humans, using 39 years of historical data from the annual Nathan's Famous Hot Dog Eating Contest. Through nonlinear modelling and generalized extreme value analysis, I show that humans are theoretically capable of achieving an ACR of approximately 832 g min⁻¹ of fresh matter over a 10 min duration. Modelling individual performances across 5 years reveals that maximal ACR increases significantly over time in ‘elite’ competitive eaters, likely owing to training effects. This extreme digestive plasticity suggests that eating-competition records are biologically quite impressive, especially in the context of carnivorous species and other human athletic competitions.
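
A sketch of the generalized extreme value step described above, fitted with SciPy to placeholder annual winning totals (not the contest's actual results); the paper also models the secular trend, which is ignored here.

```python
import numpy as np
from scipy.stats import genextreme

# Placeholder annual winning hot-dog counts (10-minute totals), not real data.
annual_max = np.array([25, 24, 26, 28, 30, 33, 38, 44, 50, 54,
                       59, 62, 66, 68, 69, 70, 71, 72, 74, 75], dtype=float)

shape, loc, scale = genextreme.fit(annual_max)
# Upper tail implied by the fitted GEV, e.g. a 1-in-100-year performance.
print(genextreme.ppf(0.99, shape, loc=loc, scale=scale))
```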

