Composite endpoint for ALS clinical trials based on patient preference: Patient-Ranked Order of Function (PROOF)

Journal of Neurology, Neurosurgery & Psychiatry ◽  
2021 ◽  
pp. jnnp-2021-328194
Author(s):  
Ruben P A van Eijk ◽  
L H van den Berg ◽  
Ying Lu

Background: Patients with amyotrophic lateral sclerosis (ALS) show considerable variation in symptoms. Treatments targeting an overall improvement in symptomatology may not address what the majority of patients consider most important. Here, we propose a composite endpoint for ALS clinical trials that weights the improvement in symptoms according to what the patient population actually wants. Methods: An online questionnaire was sent to a population-based registry in The Netherlands. Patients with ALS were asked to score functional domains with a validated self-reported questionnaire and to rank the domains in order of importance. This information was used to estimate variability in patient preferences and to develop the Patient-Ranked Order of Function (PROOF) endpoint. Results: There was extensive variability in patient preferences among the 433 responders. The majority of patients (62.1%) preferred to prioritise certain symptoms over others when evaluating treatments. The PROOF endpoint was established by comparing each patient in the treatment arm with each patient in the placebo arm, based on their preferred order of functional domains. PROOF averages all pairwise comparisons and reflects the probability that a patient receiving treatment has a better outcome on the domains that are most important to them, compared with a patient receiving placebo. By means of simulation, we illustrate how incorporating patient preference may upgrade or downgrade trial results. Conclusions: The PROOF endpoint provides a balanced, patient-focused analysis of the improvement in function and may help to refine the risk–benefit assessment of new treatments for ALS.
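
The pairwise construction lends itself to a direct implementation. Below is a minimal Python sketch of the idea, not the authors' published algorithm: the data layout, the rule of resolving each pair on the first domain (in the treated patient's own ranking) on which the two patients differ, and the tie handling are all assumptions for illustration.

```python
from itertools import product

def proof_estimate(treated, placebo):
    """Sketch of a PROOF-style statistic. Each patient is a dict with
    'scores' (domain -> functional score, higher is better) and
    'ranking' (list of domains, most important first). Every treated
    patient is compared with every placebo patient; each pair is decided
    on the first domain, in the treated patient's own ranking, on which
    the two patients differ."""
    wins = ties = 0
    for t, p in product(treated, placebo):
        for domain in t['ranking']:
            diff = t['scores'][domain] - p['scores'][domain]
            if diff != 0:
                wins += diff > 0
                break
        else:  # identical on every ranked domain
            ties += 1
    # Probability that a treated patient does better on the domains that
    # matter most to them; ties are split evenly between the arms.
    return (wins + 0.5 * ties) / (len(treated) * len(placebo))

# Toy data: two patients per arm, two functional domains.
treated = [
    {'scores': {'speech': 3, 'walking': 2}, 'ranking': ['speech', 'walking']},
    {'scores': {'speech': 1, 'walking': 4}, 'ranking': ['walking', 'speech']},
]
placebo = [
    {'scores': {'speech': 2, 'walking': 2}, 'ranking': ['speech', 'walking']},
    {'scores': {'speech': 1, 'walking': 3}, 'ranking': ['walking', 'speech']},
]
print(proof_estimate(treated, placebo))  # 1.0 here; 0.5 would mean no effect
```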

Blood ◽  
2009 ◽  
Vol 114 (22) ◽  
pp. 899-899
Author(s):  
Dianne Pulte ◽  
Adam Gondos ◽  
Teresa Redaniel ◽  
Hermann Brenner

Abstract 899

Background: The gold standard for determining the superiority of new treatments is the clinical trial. However, patients in clinical trials tend to be "ideal" patients: otherwise healthy apart from the condition under study, with good performance status, and often younger than the average patient with the disease. Results from clinical trials may therefore not pertain to all patients in the general population. Five-year survival for patients with chronic myelocytic leukemia (CML) in recent clinical trials is as high as 95%, yet population-based studies of CML show much lower five-year survival rates. In this study, we compare survival in clinical trials with survival of patients identified in the SEER database as diagnosed with CML during the same years in which the relevant trial was recruiting.

Methods: We examined survival of patients in randomized controlled trials of CML treatment between 1980 and 2005 and compared it with survival of CML patients in the general population of the United States, using the Surveillance, Epidemiology, and End Results (SEER) database for the same years as each trial's recruitment. Because age may be a factor in survival, we also calculated age-adapted survival for patients in the SEER database for each trial, computing survival for patients in the same age range as reported for the trial, or within a fifty-year interval centered on the trial's median patient age.

Results: 27 trials were identified for data extraction. Median age in the clinical trials varied from 37 to 60 years, whereas the median age of CML patients in the SEER database was 62. The majority of trials recruited patients in chronic phase only; two trials of patients in accelerated phase were identified. Five-year survival in the clinical trials ranged from 30–40% in the earliest trials to 89% for the first trials of imatinib. Five-year survival for patients in the general population over the same period ranged from 22.2% in 1980–87 to 42.7% in 2000–01, and age-adapted survival ranged from 26% to 60%. Overall five-year survival calculated from the SEER database was uniformly lower than survival in clinical trials from the corresponding time period, although age-adapted survival overlapped with clinical-trial survival in some cases (see figure). In general, survival from the SEER database was much lower than survival in the corresponding trial for trials of hematopoietic stem cell transplant or interferon, and relatively close to clinical-trial survival after age adaptation for other treatment types.

Discussion: Survival in clinical trials of CML treatment is higher than survival of patients with CML in the general population. The difference can be attributed to access to newer medications, a bias toward selecting younger, healthier patients for clinical trials, the requirement in most trials that patients be in the chronic phase of the disease, and the time needed for treatments identified as superior in trials to be adopted by practitioners. In particular, the difference in survival was larger for trials of more difficult-to-tolerate treatments such as interferon or stem cell transplantation. This finding underscores the need for population-based studies to give a more realistic picture of survival for patients with a given malignancy in the general population. The inclusion of a more diverse patient population in clinical trials, including older and less fit patients, may reduce the disparity.

Figure legend: Five-year survival for patients in clinical trials (squares) and age-adapted survival for patients in the SEER database (triangles). In trials in which survival differed between treatment types, bold squares represent the higher survival value and lighter squares the lower. The date on the x-axis represents the middle year of recruitment for the trial; when the middle year of recruitment was the same for more than one trial, the values are staggered for clarity.

Disclosures: No relevant conflicts of interest to declare.
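
As a rough illustration of the age-adaptation step, the Python sketch below restricts a registry cohort to a trial's recruitment years and to a fifty-year age window centred on the trial's median age, then computes a crude complete-case five-year survival proportion. The record field names and the simple estimator are assumptions for illustration; actual SEER analyses use actuarial (life-table) or relative-survival methods.

```python
def age_adapted_5yr_survival(records, recruit_years, trial_median_age,
                             window=50):
    """Crude five-year survival for registry patients matched to a
    trial's recruitment years and age range. `records` is an iterable
    of dicts with 'age_at_diagnosis', 'diagnosis_year', 'dead' (bool)
    and 'followup_months' (hypothetical field names)."""
    lo = trial_median_age - window / 2
    hi = trial_median_age + window / 2
    # Restrict the registry cohort to the trial's recruitment years and
    # to a `window`-year age interval centred on the trial's median age.
    cohort = [r for r in records
              if r['diagnosis_year'] in recruit_years
              and lo <= r['age_at_diagnosis'] <= hi]
    # Complete-case estimate: keep patients who died or who have at
    # least 60 months of follow-up, then count those alive at 60 months.
    eligible = [r for r in cohort
                if r['dead'] or r['followup_months'] >= 60]
    survivors = [r for r in eligible
                 if not (r['dead'] and r['followup_months'] < 60)]
    return len(survivors) / len(eligible) if eligible else float('nan')

# Example: a trial that recruited 1998-2001 with a median patient age of
# 50 would be matched to registry patients aged 25-75 from those years:
# rate = age_adapted_5yr_survival(seer_records, range(1998, 2002), 50)
```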


Nutrients ◽  
2021 ◽  
Vol 13 (7) ◽  
pp. 2154
Author(s):  
Maria Luz Fernandez ◽  
Sarah A. Blomquist ◽  
Brian Hallmark ◽  
Floyd H. Chilton

Omega-3 (n-3) polyunsaturated fatty acids (PUFAs) and their metabolites have long been recognized to protect against inflammation-related diseases, including heart disease. Recent reports present conflicting evidence on the effects of n-3 PUFAs on major cardiovascular events, including death. While some studies document that n-3 PUFA supplementation reduces the risk of heart disease, others report no beneficial effects on heart disease composite primary outcomes. Much of this heterogeneity may be related to genetic variation across individuals and populations that alters their capacity to synthesize biologically active n-3 and omega-6 (n-6) PUFAs and metabolites from their 18-carbon dietary precursors, linoleic acid (LA, 18:2 n-6) and alpha-linolenic acid (ALA, 18:3 n-3). Here, we discuss a FADS gene-by-dietary PUFA interaction model that takes into consideration dietary exposure, including the intake of LA and ALA and of the n-3 PUFAs eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA), in determining the efficacy of n-3 PUFA supplementation. We also review recent clinical trials of n-3 PUFA supplementation and coronary heart disease in the context of what is known about fatty acid desaturase (FADS) gene-by-dietary PUFA interactions. Given the dramatic differences in the frequencies of FADS variants that affect the efficiency of n-3 and n-6 PUFA biosynthesis, and of their downstream signaling products, among global and admixed populations, we conclude that large clinical trials using "one size fits all" n-3 PUFA supplementation approaches are unlikely to show effectiveness. However, the evidence discussed in this review suggests that n-3 PUFA supplementation may represent an important opportunity for precision interventions focused on the populations that will benefit most.
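
The gene-by-diet argument can be made concrete with a toy simulation. The Python sketch below (effect sizes, variable names, and data all invented for illustration, not drawn from the review) generates a cohort in which supplementation lowers event risk only in carriers of a "slow" FADS haplotype, then fits a marginal model alongside one with an interaction term; the "one size fits all" marginal analysis dilutes the subgroup effect that the interaction model recovers.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000

# Synthetic cohort (all values invented for illustration):
# fads_fast = 1 for carriers of an efficient-desaturase FADS haplotype,
# n3_supp   = 1 for randomisation to n-3 PUFA supplementation.
df = pd.DataFrame({
    'fads_fast': rng.integers(0, 2, n),
    'n3_supp': rng.integers(0, 2, n),
})

# Assumed truth: supplementation helps only the slow-desaturase group,
# i.e. a pure gene-by-treatment interaction.
log_odds = -1.0 - 0.8 * df['n3_supp'] * (1 - df['fads_fast'])
df['chd_event'] = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))

# "One size fits all" marginal analysis vs. the interaction model.
marginal = smf.logit('chd_event ~ n3_supp', data=df).fit(disp=0)
interaction = smf.logit('chd_event ~ n3_supp * fads_fast', data=df).fit(disp=0)
print(marginal.params['n3_supp'])  # diluted average treatment effect
print(interaction.params)          # recovers the genotype-specific effect
```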


The British Journal of Psychiatry ◽  
1998 ◽  
Vol 173 (4) ◽  
pp. 345-350 ◽  
Author(s):  
Kenneth S. Kendler ◽  
Carol A. Prescott

Background: Although cocaine use among women has increased substantially over the past half-century, we understand little about the aetiology of cocaine use and abuse in women, and know almost nothing about the role of genetic factors. Method: We obtained by telephone interview a history of lifetime cocaine use, abuse and dependence from 1934 individual twins from female–female pairs ascertained through a population-based registry, including both members of 485 monozygotic (MZ) and 335 dizygotic (DZ) pairs. Results: The prevalences of lifetime cocaine use, abuse and dependence were 14.0%, 3.3% and 2.3%, respectively. Probandwise concordance rates in MZ and DZ twins, respectively, were: cocaine use, 54% and 42%; cocaine abuse, 47% and 8%; and cocaine dependence, 35% and 0%. In MZ and DZ twins, odds ratios were: cocaine use, 14.2 and 6.7; cocaine abuse, 40.8 and 2.7. Biometrical model-fitting suggested that twin resemblance for liability to cocaine use was due to both genetic and familial–environmental factors, while twin resemblance for cocaine abuse and symptoms of dependence was due solely to genetic factors. Estimated heritabilities were: cocaine use, 0.39; cocaine abuse, 0.79; and symptoms of dependence, 0.65. Conclusions: Vulnerability to cocaine use, and particularly to cocaine abuse and dependence, in women is substantially influenced by genetic factors.
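
The probandwise concordance rates quoted above come from a standard calculation, sketched below in Python. The pair counts in the example are invented to reproduce the reported 54% MZ figure and are not the study's raw data.

```python
def probandwise_concordance(concordant_pairs, discordant_pairs):
    """Probandwise concordance: the probability that the co-twin of an
    affected twin is also affected. Both members of a concordant pair
    count as probands (assumes complete ascertainment)."""
    return (2 * concordant_pairs) / (2 * concordant_pairs + discordant_pairs)

# Invented pair counts: 14 concordant MZ pairs (28 probands) plus 24
# discordant pairs gives 28 / 52, approximately the 54% reported above
# for cocaine use in MZ twins.
print(round(probandwise_concordance(14, 24), 2))  # 0.54
```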


2006 ◽  
Vol 95 (3) ◽  
pp. 393-397 ◽  
Author(s):  
I Soerjomataram ◽  
W J Louwman ◽  
M J C van der Sangen ◽  
R M H Roumen ◽  
J W W Coebergh

2016 ◽  
Vol 214 (3) ◽  
pp. 378.e1-378.e10 ◽  
Author(s):  
Erica Ginström Ernstad ◽  
Christina Bergh ◽  
Ali Khatibi ◽  
Karin B.M. Källén ◽  
Göran Westlander ◽  
...  
