Platelet-Rich Plasma: Does It Decrease Meniscus Repair Failure Risk?

2019 ◽  
Vol 7 (7_suppl5) ◽  
pp. 2325967119S0024
Author(s):  
Joshua Scott Everhart ◽  
David C. Flanigan ◽  
Robert A. Magnussen ◽  
Christopher C. Kaeding

Objectives: (1) To determine whether intraoperative PRP affects meniscus repair failure risk. (2) To determine whether the effect of PRP on meniscus failure risk is influenced by ACL reconstruction status or by PRP preparation. Methods: 550 patients (mean age 28.8 years, SD 11.3) who underwent meniscus repair surgery with PRP (n=203 total, n=148 prepared with GPS III system, n=55 Angel system) or without PRP (n=347) and with (n=399) or without (n=151) concurrent ACL reconstruction were assessed for meniscus repair failure within 3 years. The independent effect of PRP on meniscus repair failure risk was determined by multivariate Cox proportional hazards modeling with adjustment for age, sex, body mass index (BMI), ACL status, tear pattern, tear vascularity, repair technique, side (medial or lateral), and number of sutures or implants utilized. Results: Failures within 3 years occurred in 17.0% of patients without PRP and 14.7% of patients with PRP (p=0.52) (Angel PRP: 14.6%; GPS III PRP: 12.0%; p=0.59). Increased patient age was protective against meniscus failure regardless of ACL or PRP status (per 5-year increase in age: adjusted Hazard Ratio [aHR] 0.90, 95% confidence interval [CI] 0.81, 1.0; p=0.047). The effect of PRP on meniscus failure risk was dependent upon concomitant ACL injury status (Figure). Among isolated meniscus repairs (20.3% failures at 3 years), PRP was independently associated with lower risk of failure (aHR 0.18, 95% CI 0.03, 0.59; p=0.002), with no difference between PRP vendors (p=0.84). Among meniscus repairs with concomitant ACLR (14.1% failures at 3 years), PRP was not independently associated with risk of failure (aHR 1.39, 95% CI 0.81, 2.36; p=0.23), with no difference between PRP vendors (p=0.78). Conclusion: Both PRP preparations utilized in the current study had a substantial protective effect on isolated meniscus repair failure risk over 3 years. In the setting of concomitant ACL reconstruction, intraoperative PRP does not reduce meniscus repair failure risk. [Figure: see text]

2019 ◽  
Vol 47 (8) ◽  
pp. 1789-1796 ◽  
Author(s):  
Joshua S. Everhart ◽  
Parker A. Cavendish ◽  
Alex Eikenberry ◽  
Robert A. Magnussen ◽  
Christopher C. Kaeding ◽  
...  

Background: The effect of platelet-rich plasma (PRP) on the risk of meniscal repair failure is unclear. Current evidence is limited to small studies without comparison between isolated repairs and meniscal repairs with concomitant anterior cruciate ligament (ACL) reconstruction. It is also unclear whether the efficacy of PRP differs between preparation systems in the setting of meniscal repair. Purpose: (1) To determine whether intraoperative PRP affects the risk of meniscal repair failure. (2) To determine whether the effect of PRP on meniscal failure risk is influenced by ACL reconstruction status or by PRP preparation system. Study Design: Cohort study; Level of evidence, 3. Methods: The study entailed 550 patients (mean ± SD age, 28.8 ± 11.2 years) who underwent meniscal repair surgery with PRP (n = 203 total; n = 148 prepared with GPS III system, n = 55 prepared with Angel system) or without PRP (n = 347) and with (n = 399) or without (n = 151) concurrent ACL reconstruction. The patients were assessed for meniscal repair failure within 3 years. The independent effect of PRP on the risk of meniscal repair failure was determined by multivariate Cox proportional hazards modeling with adjustment for age, sex, body mass index, ACL status, tear pattern, tear vascularity, repair technique, side (medial or lateral), and number of sutures or implants used. Results: Failures within 3 years occurred in 17.0% of patients without PRP and 14.6% of patients with PRP (P = .60) (Angel PRP, 15.9%; GPS III PRP, 14.2%; P = .58). Increased patient age was protective against meniscal failure regardless of ACL or PRP status (per 5-year increase in age: adjusted hazard ratio [aHR], 0.90; 95% CI, 0.81-1.0; P = .047). The effect of PRP on meniscal failure risk was dependent on concomitant ACL injury status. Among isolated meniscal repairs (20.3% failures at 3 years), PRP was independently associated with lower risk of failure (aHR, 0.18; 95% CI, 0.03-0.59; P = .002) with no difference between PRP preparation systems (P = .84). Among meniscal repairs with concomitant ACL reconstruction (14.1% failures at 3 years), PRP was not independently associated with risk of failure (aHR, 1.39; 95% CI, 0.81-2.36; P = .23) with no difference between PRP preparation systems (P = .78). Conclusion: Both PRP preparations used in the current study had a substantial protective effect in terms of the risk of isolated meniscal repair failure over 3 years. In the setting of concomitant ACL reconstruction, PRP does not reduce the risk of meniscal repair failure.
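As an illustration of the modeling approach described in the Methods, the sketch below fits a multivariable Cox proportional hazards model with the lifelines Python library. The file name and column names (prp, failed, months_to_failure_or_censor, and so on) are hypothetical placeholders, not the authors' actual dataset fields.

```python
# Minimal sketch of an adjusted Cox model for meniscal repair failure.
# All file and column names are hypothetical; "prp" is assumed to be a 0/1 indicator.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("meniscus_repairs.csv")  # hypothetical dataset

covariates = ["prp", "age", "sex", "bmi", "acl_reconstruction",
              "tear_pattern", "tear_vascularity", "repair_technique",
              "side", "n_sutures_or_implants"]

# One-hot encode categorical covariates so each level gets its own coefficient.
model_df = pd.get_dummies(
    df[["months_to_failure_or_censor", "failed"] + covariates],
    drop_first=True, dtype=float)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="months_to_failure_or_censor", event_col="failed")
cph.print_summary()                 # adjusted HRs, 95% CIs, and p-values
print(cph.hazard_ratios_["prp"])    # independent (adjusted) effect of PRP
```

Refitting the same model on the subset of isolated repairs, and again on repairs with concomitant ACL reconstruction, mirrors the subgroup structure reported in the Results.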


2021 ◽  
pp. 000486742110096
Author(s):  
Oleguer Plana-Ripoll ◽  
Patsy Di Prinzio ◽  
John J McGrath ◽  
Preben B Mortensen ◽  
Vera A Morgan

Introduction: An association between schizophrenia and urbanicity has long been observed, with studies in many countries, including several from Denmark, reporting that individuals born/raised in densely populated urban settings have an increased risk of developing schizophrenia compared to those born/raised in rural settings. However, these findings have not been replicated in all studies. In particular, a Western Australian study showed a gradient in the opposite direction which disappeared after adjustment for covariates. Given the different findings for Denmark and Western Australia, our aim was to investigate the relationship between schizophrenia and urbanicity in these two regions to determine which factors may be influencing the relationship. Methods: We used population-based cohorts of children born alive between 1980 and 2001 in Western Australia (N = 428,784) and Denmark (N = 1,357,874). Children were categorised according to the level of urbanicity of their mother’s residence at time of birth and followed up through 30 June 2015. Linkage to State-based registers provided information on schizophrenia diagnosis and a range of covariates. Rates of being diagnosed with schizophrenia for each category of urbanicity were estimated using Cox proportional hazards models adjusted for covariates. Results: During follow-up, 1618 (0.4%) children in Western Australia and 11,875 (0.9%) children in Denmark were diagnosed with schizophrenia. In Western Australia, those born in the most remote areas did not experience lower rates of schizophrenia than those born in the most urban areas (hazard ratio = 1.02 [95% confidence interval: 0.81, 1.29]), unlike their Danish counterparts (hazard ratio = 0.62 [95% confidence interval: 0.58, 0.66]). However, when the Western Australian cohort was restricted to children of non-Aboriginal Indigenous status, results were consistent with the Danish findings (hazard ratio = 0.46 [95% confidence interval: 0.29, 0.72]). Discussion: Our study highlights the potential for disadvantaged subgroups to mask the contribution of urban-related risk factors to the risk of schizophrenia, and the importance of stratified analysis in such cases.
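A minimal sketch of this cohort design (categorical urbanicity exposure, administrative censoring at 30 June 2015, and a restricted subgroup re-fit) is shown below using the lifelines Python library. All file and column names, including the aboriginal indicator, are hypothetical placeholders.

```python
# Sketch of a birth-cohort Cox analysis with a categorical urbanicity exposure.
# Column names (birth_date, diagnosis_date, urbanicity, sex, aboriginal) are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

cohort = pd.read_csv("birth_cohort.csv", parse_dates=["birth_date", "diagnosis_date"])
end_of_followup = pd.Timestamp("2015-06-30")

# Follow-up runs from birth to diagnosis or administrative censoring, whichever comes first.
diagnosed = cohort["diagnosis_date"].notna() & (cohort["diagnosis_date"] <= end_of_followup)
stop = cohort["diagnosis_date"].where(diagnosed, end_of_followup)
cohort["years_at_risk"] = (stop - cohort["birth_date"]).dt.days / 365.25
cohort["schizophrenia"] = diagnosed.astype(int)

def fit_urbanicity_model(df):
    # One-hot encode urbanicity so the most urban level serves as the reference.
    design = pd.get_dummies(
        df[["years_at_risk", "schizophrenia", "urbanicity", "sex"]],
        columns=["urbanicity", "sex"], drop_first=True, dtype=float)
    return CoxPHFitter().fit(design, duration_col="years_at_risk",
                             event_col="schizophrenia")

fit_urbanicity_model(cohort).print_summary()
# Restricted subgroup, mirroring the stratified analysis described above:
fit_urbanicity_model(cohort[~cohort["aboriginal"]]).print_summary()
```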


2021 ◽  
Vol 7 (1) ◽  
Author(s):  
Raquel Araujo-Gutierrez ◽  
Kalyan R. Chitturi ◽  
Jiaqiong Xu ◽  
Yuanchen Wang ◽  
Elizabeth Kinder ◽  
...  

Abstract Background Cancer therapy-related cardiac dysfunction (CTRD) is a major source of morbidity and mortality in long-term cancer survivors. Decreased global longitudinal strain (GLS) predicts decreased left ventricular ejection fraction (LVEF) in patients receiving anthracyclines, but knowledge regarding the clinical utility of baseline GLS in patients at low risk of CTRD is limited. Objectives The purpose of this study was to investigate whether baseline echocardiographic assessment of GLS before treatment with anthracyclines is predictive of CTRD in a broad cohort of patients with normal baseline LVEF. Methods Study participants comprised 188 patients at a single institution who underwent baseline 2-dimensional (2D) speckle-tracking echocardiography before treatment with anthracyclines and at least one follow-up echocardiogram 3 months after chemotherapy initiation. Patients with a baseline LVEF <55% were excluded from the analysis. The primary endpoint, CTRD, was defined as an absolute decline in LVEF >10% from baseline and an overall reduced LVEF <50%. Potential and known risk factors were evaluated using univariable and multivariable Cox proportional hazards regression analysis. Results Twenty-three patients (12.23%) developed CTRD. Among patients with CTRD, the mean GLS was -17.51% ± 2.77%. The optimal cutoff point for CTRD was -18.05%, with a sensitivity of 0.70 and a specificity of 0.70; the area under the ROC curve was 0.70. After adjustment for cardiovascular and cancer therapy-related risk factors, GLS, or a decreased baseline GLS ≥-18%, was predictive of CTRD (adjusted hazard ratio 1.17, 95% confidence interval 1.00, 1.36; p = 0.044 for GLS, or hazard ratio 3.54, 95% confidence interval 1.34, 9.35; p = 0.011 for decreased GLS), along with history of tobacco use, pre-chemotherapy systolic blood pressure, and cumulative anthracycline dose. Conclusions Baseline GLS, or a decreased baseline GLS, measured before anthracycline treatment was predictive of CTRD in a cohort of cancer patients with a normal baseline LVEF. These data support the implementation of strain-protocol echocardiography in cardio-oncology practice for identifying and monitoring patients at elevated risk of CTRD.
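The -18% cutoff reported above is the kind of threshold that falls out of a ROC analysis. The sketch below derives an optimal cutoff via Youden's J with scikit-learn on simulated, purely illustrative GLS values, not the study data.

```python
# Illustrative ROC-based cutoff selection for baseline GLS predicting CTRD.
# The GLS values and outcomes below are simulated, not the study's data.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
# Hypothetical baseline GLS (%) for 165 patients without and 23 with CTRD.
gls = np.concatenate([rng.normal(-20.5, 2.5, 165), rng.normal(-17.5, 2.8, 23)])
ctrd = np.concatenate([np.zeros(165), np.ones(23)])

# Less-negative GLS (closer to zero) indicates worse strain, so higher GLS
# values are treated as predicting CTRD.
fpr, tpr, thresholds = roc_curve(ctrd, gls)
j = tpr - fpr                          # Youden's J at each candidate threshold
best = np.argmax(j)
print(f"AUC = {roc_auc_score(ctrd, gls):.2f}, optimal GLS cutoff = {thresholds[best]:.2f}%")
print(f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")
```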


2021 ◽  
Vol 23 (Supplement_6) ◽  
pp. vi51-vi51
Author(s):  
Kristen Batich ◽  
Duane Mitchell ◽  
Patrick Healy ◽  
James Herndon ◽  
Gloria Broadwater ◽  
...  

Abstract INTRODUCTION Vaccination with dendritic cells (DCs) fares poorly in primary and recurrent glioblastoma (GBM). Moreover, GBM vaccine trials are often underpowered due to limited sample size. METHODS To address these limitations, we conducted three sequential clinical trials utilizing Cytomegalovirus (CMV)-specific DC vaccines in patients with primary GBM. Autologous DCs were generated and electroporated with mRNA encoding the CMV protein pp65. Serial vaccination was given throughout adjuvant temozolomide cycles, and 111Indium radiolabeling was implemented to assess migration efficiency of DC vaccines. Patients were followed for overall survival (OS), including median OS (mOS). RESULTS Our initial study was the phase II ATTAC study (NCT00639639; total n=12), with 6 patients randomized to vaccine site preconditioning with tetanus-diphtheria (Td) toxoid. This led to an expanded cohort trial (ATTAC-GM; NCT00639639) of 11 patients receiving CMV DC vaccines containing granulocyte-macrophage colony-stimulating factor (GM-CSF). Follow-up data from ATTAC and ATTAC-GM revealed 5-year OS rates of 33.3% (mOS 38.3 months; CI95 17.5-undefined) and 36.4% (mOS 37.7 months; CI95 18.2-109.1), respectively. ATTAC additionally revealed a significant increase in DC migration to draining lymph nodes following Td preconditioning (P=0.049). Increased DC migration was associated with OS (Cox proportional hazards model, HR=0.820, P=0.023). Td-mediated increased migration was recapitulated in our larger confirmatory trial ELEVATE (NCT02366728) of 43 patients randomized to preconditioning (Wilcoxon rank sum, Td n=24, unpulsed DC n=19; 24h, P=0.031 and 48h, P=0.0195). In ELEVATE, a median follow-up of 42.2 months revealed significantly longer OS in patients randomized to Td (P=0.026). The 3-year OS for Td-treated patients in ELEVATE was 34% (CI95 19-63%), compared to 6% for patients given unpulsed DCs (CI95 1-42%). CONCLUSION We report reproducibility of our findings across three sequential clinical trials using CMV pp65 DCs. Despite their small sample sizes, these successive trials demonstrate consistent survival outcomes, thus supporting the efficacy of CMV DC vaccine therapy in GBM.
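The survival figures quoted above (for example, 3-year OS by preconditioning arm) come from Kaplan-Meier estimation. A minimal sketch with the lifelines Python library is shown below, using small placeholder arrays rather than trial data.

```python
# Illustrative Kaplan-Meier estimates and a log-rank comparison of two arms.
# The follow-up times and event indicators below are placeholders, not trial data.
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Months of follow-up and death indicators (1 = died) for each hypothetical arm.
td_months, td_died = np.array([12.0, 20, 37, 42, 55, 60]), np.array([1, 1, 0, 1, 0, 0])
up_months, up_died = np.array([8.0, 14, 19, 24, 30, 36]), np.array([1, 1, 1, 1, 1, 0])

km_td = KaplanMeierFitter().fit(td_months, td_died, label="Td preconditioning")
km_up = KaplanMeierFitter().fit(up_months, up_died, label="unpulsed DC")

print(km_td.survival_function_at_times(36))   # 3-year OS estimate, Td arm
print(km_up.survival_function_at_times(36))   # 3-year OS estimate, unpulsed arm
print(logrank_test(td_months, up_months, td_died, up_died).p_value)
```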


Neurosurgery ◽  
2015 ◽  
Vol 77 (6) ◽  
pp. 880-887 ◽  
Author(s):  
Eric J. Heyer ◽  
Joanna L. Mergeche ◽  
Shuang Wang ◽  
John G. Gaudet ◽  
E. Sander Connolly

BACKGROUND: Early cognitive dysfunction (eCD) is a subtle form of neurological injury observed in ∼25% of carotid endarterectomy (CEA) patients. Statin use is associated with a lower incidence of eCD in asymptomatic patients having CEA. OBJECTIVE: To determine whether eCD status is associated with worse long-term survival in patients taking and not taking statins. METHODS: This is a post hoc analysis of a prospective observational study of 585 CEA patients. Patients were evaluated with a battery of neuropsychometric tests before and after surgery. Survival was compared for patients with and without eCD, stratifying by statin use. At enrollment, 366 patients were on statins and 219 were not. Survival was assessed using Kaplan-Meier methods and multivariable Cox proportional hazards models. RESULTS: Age ≥75 years (P = .003), diabetes mellitus (P < .001), cardiac disease (P = .02), and statin use (P = .014) were univariately associated with survival (P < .05) by the log-rank test. In Cox proportional hazards models of eCD status and survival, adjusted for these univariate factors and fit within the statin and nonstatin groups, eCD was associated with significantly worse survival among patients not taking statins (hazard ratio, 1.61; 95% confidence interval, 1.09–2.40; P = .018), with no significant effect of eCD on survival among patients taking statins (hazard ratio, 0.98; 95% confidence interval, 0.59–1.66; P = .95). CONCLUSION: eCD is associated with shorter survival in patients not taking statins. This finding validates eCD as an important neurological outcome and suggests that eCD is a surrogate measure for overall health, comorbidity, and vulnerability to neurological insult.
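The stratified analysis described here, a Cox model of survival on eCD status adjusted for the univariately significant factors and fit separately within statin users and non-users, can be sketched as follows with the lifelines Python library; all column names are hypothetical.

```python
# Sketch of a Cox model for eCD and survival fit separately within statin strata.
# Column names (years_followup, died, ecd, statin, ...) are hypothetical 0/1 or numeric fields.
import pandas as pd
from lifelines import CoxPHFitter

cea = pd.read_csv("cea_cohort.csv")  # hypothetical dataset

adjusters = ["age_ge_75", "diabetes", "cardiac_disease"]
for on_statin, group in cea.groupby("statin"):
    cph = CoxPHFitter().fit(
        group[["years_followup", "died", "ecd"] + adjusters],
        duration_col="years_followup", event_col="died")
    print(f"statin={on_statin}: adjusted HR for eCD = {cph.hazard_ratios_['ecd']:.2f}")
    cph.print_summary()   # 95% CIs and p-values for each covariate
```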


2019 ◽  
Vol 15 (1) ◽  
pp. 101-108 ◽  
Author(s):  
Guofen Yan ◽  
Jenny I. Shen ◽  
Rubette Harford ◽  
Wei Yu ◽  
Robert Nee ◽  
...  

Background and objectives In the United States, mortality rates for patients treated with dialysis differ by racial and/or ethnic (racial/ethnic) group. Mortality outcomes for patients undergoing maintenance dialysis in the United States territories may differ from patients in the United States 50 states. Design, setting, participants, & measurements This retrospective cohort study using US Renal Data System data included 1,547,438 adults with no prior transplantation and first dialysis treatment between April 1, 1995 and September 28, 2012. Cox proportional hazards regression was used to calculate hazard ratios (HRs) of death for the territories versus 50 states for each racial/ethnic group using the whole cohort and covariate-matched samples. Covariates included demographics, year of dialysis initiation, cause of kidney failure, comorbid conditions, dialysis modality, and many others. Results Of 22,828 patients treated in the territories (American Samoa, Guam, Puerto Rico, Virgin Islands), 321 were white, 666 were black, 20,299 were Hispanic, and 1542 were Asian. Of 1,524,610 patients in the 50 states, 838,736 were white, 444,066 were black, 182,994 were Hispanic, and 58,814 were Asian. The crude mortality rate (deaths per 100 patient-years) was lower for whites in the territories than in the 50 states (14 and 29, respectively), similar for blacks (18 and 17, respectively), higher for Hispanics (27 and 16, respectively), and higher for Asians (22 and 15, respectively). In matched analyses, greater risks of death remained for Hispanics (HR, 1.65; 95% confidence interval, 1.60 to 1.70; P<0.001) and Asians (HR, 2.01; 95% confidence interval, 1.78 to 2.27; P<0.001) living in the territories versus their matched 50 states counterparts. There were no significant differences in mortality among white or black patients in the territories versus the 50 states. Conclusions Mortality rates for patients undergoing dialysis in the United States territories differ substantially by race/ethnicity compared with the 50 states. After matching for comparable age and risk factors, mortality risk no longer differed for whites or blacks but remained much greater for territory-dwelling Hispanics and Asians.
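The crude mortality rates quoted above are deaths per 100 patient-years. A small pandas sketch of that calculation, with hypothetical column names, is shown below.

```python
# Crude mortality rate (deaths per 100 patient-years) by racial/ethnic group
# and territory residence. File and column names are hypothetical placeholders.
import pandas as pd

usrds = pd.read_csv("dialysis_cohort.csv")  # hypothetical extract

rates = (usrds.groupby(["race_ethnicity", "in_territory"])
              .apply(lambda g: 100 * g["died"].sum() / g["patient_years"].sum())
              .unstack("in_territory"))
print(rates)   # one row per group, columns for territories vs 50 states
```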


2019 ◽  
Vol 14 (6) ◽  
pp. 854-861 ◽  
Author(s):  
Mark E. Molitch ◽  
Xiaoyu Gao ◽  
Ionut Bebu ◽  
Ian H. de Boer ◽  
John Lachin ◽  
...  

Background and objectives Glomerular hyperfiltration has been considered to be a contributing factor to the development of diabetic kidney disease (DKD). To address this issue, we analyzed GFR follow-up data on participants with type 1 diabetes undergoing 125I-iothalamate clearance on entry into the Diabetes Control and Complications Trial (DCCT)/Epidemiology of Diabetes Interventions and Complications study. Design, setting, participants, & measurements This was a cohort study of DCCT participants with type 1 diabetes who underwent an 125I-iothalamate clearance (iGFR) at DCCT baseline. Presence of hyperfiltration was defined as iGFR levels ≥140 ml/min per 1.73 m2, with secondary thresholds of 130 or 150 ml/min per 1.73 m2. Cox proportional hazards models assessed the association between the baseline hyperfiltration status and the subsequent risk of reaching an eGFR <60 ml/min per 1.73 m2. Results Of the 446 participants, 106 (24%) had hyperfiltration (iGFR levels ≥140 ml/min per 1.73 m2) at baseline. Over a median follow-up of 28 (interquartile range, 23, 33) years, 53 developed an eGFR <60 ml/min per 1.73 m2. The cumulative incidence of eGFR <60 ml/min per 1.73 m2 at 28 years of follow-up was 11.0% among participants with hyperfiltration at baseline, compared with 12.8% among participants with baseline GFR <140 ml/min per 1.73 m2. Hyperfiltration was not significantly associated with subsequent risk of developing an eGFR <60 ml/min per 1.73 m2 in an unadjusted Cox proportional hazards model (hazard ratio, 0.83; 95% confidence interval, 0.43 to 1.62) nor in an adjusted model (hazard ratio, 0.77; 95% confidence interval, 0.38 to 1.54). Application of alternate thresholds to define hyperfiltration (130 or 150 ml/min per 1.73 m2) showed similar findings. Conclusions Early hyperfiltration in patients with type 1 diabetes was not associated with a higher long-term risk of decreased GFR. Although glomerular hypertension may be a mechanism of kidney injury in DKD, higher total GFR does not appear to be a risk factor for advanced DKD.
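A sketch of the exposure definition and unadjusted Cox model, including the sensitivity analysis with the 130 and 150 ml/min per 1.73 m2 thresholds, is shown below using the lifelines Python library; the dataset and column names are hypothetical.

```python
# Sketch of threshold-defined hyperfiltration as a binary exposure in a Cox model,
# looped over the primary and alternate thresholds. Column names are hypothetical.
import pandas as pd
from lifelines import CoxPHFitter

dcct = pd.read_csv("dcct_igfr.csv")  # hypothetical dataset

for threshold in (130, 140, 150):    # ml/min per 1.73 m^2
    dcct["hyperfiltration"] = (dcct["baseline_igfr"] >= threshold).astype(int)
    cph = CoxPHFitter().fit(
        dcct[["years_to_egfr_lt60_or_censor", "reached_egfr_lt60", "hyperfiltration"]],
        duration_col="years_to_egfr_lt60_or_censor",
        event_col="reached_egfr_lt60")
    print(threshold, cph.hazard_ratios_["hyperfiltration"])  # unadjusted HR per threshold
```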


Neurology ◽  
2019 ◽  
Vol 93 (24) ◽  
pp. e2247-e2256 ◽  
Author(s):  
Miguel Arce Rentería ◽  
Jet M.J. Vonk ◽  
Gloria Felix ◽  
Justina F. Avila ◽  
Laura B. Zahodne ◽  
...  

Objective To investigate whether illiteracy was associated with greater risk of prevalent and incident dementia and more rapid cognitive decline among older adults with low education. Methods Analyses included 983 adults (≥65 years old, ≤4 years of schooling) who participated in a longitudinal community aging study. Literacy was self-reported (“Did you ever learn to read or write?”). Neuropsychological measures of memory, language, and visuospatial abilities were administered at baseline and at follow-ups (median [range] 3.49 years [0–23]). At each visit, functional, cognitive, and medical data were reviewed and a dementia diagnosis was made using standard criteria. Logistic regression and Cox proportional hazards models evaluated the association of literacy with prevalent and incident dementia, respectively, while latent growth curve models evaluated the effect of literacy on cognitive trajectories, adjusting for relevant demographic and medical covariates. Results Illiterate participants were almost 3 times as likely to have dementia at baseline compared to literate participants. Among those who did not have dementia at baseline, illiterate participants were twice as likely to develop dementia. While illiterate participants showed worse memory, language, and visuospatial functioning at baseline than literate participants, literacy was not associated with rate of cognitive decline. Conclusion We found that illiteracy was independently associated with higher risk of prevalent and incident dementia, but not with a more rapid rate of cognitive decline. The independent effect of illiteracy on dementia risk may be through a lower range of cognitive function, which is closer to diagnostic thresholds for dementia than the range of literate participants.
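The two-model approach described in the Methods (logistic regression for prevalent dementia, Cox regression for incident dementia among those dementia-free at baseline) might look like the following sketch using statsmodels and lifelines; column names are hypothetical.

```python
# Sketch of paired prevalence and incidence models for literacy and dementia.
# Column names (dementia_baseline, incident_dementia, illiterate, ...) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from lifelines import CoxPHFitter

d = pd.read_csv("aging_cohort.csv")  # hypothetical dataset
covars = ["age", "female", "years_school", "illiterate"]  # numeric / 0-1 coded

# Prevalent dementia at baseline: logistic regression, odds ratio for illiteracy.
logit = sm.Logit(d["dementia_baseline"], sm.add_constant(d[covars])).fit(disp=0)
print(np.exp(logit.params["illiterate"]))   # OR, cf. "almost 3 times as likely"

# Incident dementia: Cox model restricted to participants dementia-free at baseline.
at_risk = d[d["dementia_baseline"] == 0]
cph = CoxPHFitter().fit(
    at_risk[["years_followup", "incident_dementia"] + covars],
    duration_col="years_followup", event_col="incident_dementia")
print(cph.hazard_ratios_["illiterate"])     # HR, cf. "twice as likely"
```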


2019 ◽  
Vol 26 (14) ◽  
pp. 1510-1518 ◽  
Author(s):  
Claudia T Lissåker ◽  
Fredrika Norlund ◽  
John Wallert ◽  
Claes Held ◽  
Erik MG Olsson

Background Patients with symptoms of depression and/or anxiety – emotional distress – after a myocardial infarction (MI) have been shown to have worse prognosis and increased healthcare costs. However, whether specific subgroups of patients with emotional distress are more vulnerable is less well established. The purpose of this study was to identify the association of different patterns of emotional distress over time with late cardiovascular and non-cardiovascular mortality among first-MI patients aged <75 years in Sweden. Methods We utilized data on 57,602 consecutive patients with a first-time MI from the national SWEDEHEART registers. Emotional distress was assessed using the anxiety/depression dimension of the European Quality of Life Five Dimensions questionnaire 2 and 12 months after the MI, combined into persistent (emotional distress at both time-points), remittent (emotional distress at the first follow-up only), new (emotional distress at the second follow-up only) or no distress. Data on cardiovascular and non-cardiovascular mortality were obtained until the end of the study period. We used multiple imputation to create complete datasets and adjusted Cox proportional hazards models to estimate hazard ratios. Results Patients with persistent emotional distress were more likely to die from cardiovascular (hazard ratio: 1.46, 95% confidence interval: 1.16, 1.84) and non-cardiovascular causes (hazard ratio: 1.54, 95% confidence interval: 1.30, 1.82) than those with no distress. Those with remittent emotional distress were not statistically significantly more likely to die from any cause than those without emotional distress. Discussion Among patients who survived 12 months, persistent, but not remittent, emotional distress was associated with increased cardiovascular and non-cardiovascular mortality. This indicates a need to identify subgroups of individuals with emotional distress who may benefit from further assessment and specific treatment.
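The four distress patterns are a simple cross-classification of the 2- and 12-month assessments. A pandas sketch of that derivation, with hypothetical column names, is shown below.

```python
# Sketch of deriving the persistent / remittent / new / no-distress categories
# from two follow-up assessments. Column names are hypothetical placeholders.
import numpy as np
import pandas as pd

mi = pd.read_csv("swedeheart_followup.csv")  # hypothetical extract
d2 = mi["distress_2m"].astype(bool)          # distress present at 2 months
d12 = mi["distress_12m"].astype(bool)        # distress present at 12 months

mi["distress_pattern"] = np.select(
    [d2 & d12, d2 & ~d12, ~d2 & d12],
    ["persistent", "remittent", "new"],
    default="none")

print(mi["distress_pattern"].value_counts())
# The pattern would then enter adjusted Cox models (one-hot encoded, "none" as
# the reference) separately for cardiovascular and non-cardiovascular mortality.
```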


2016 ◽  
Vol 23 (4) ◽  
pp. 362 ◽  
Author(s):  
M. Giuliani ◽  
L.R. Sampson ◽  
O. Wong ◽  
J. Gay ◽  
L.W. Le ◽  
...  

Purpose In the present study, we determined the association of pretreatment circulating neutrophils, monocytes, and lymphocytes with clinical outcomes after lung stereotactic body radiotherapy (SBRT). Methods All patients with primary lung cancer and with a complete blood count within 3 months of lung SBRT from 2005 to 2012 were included. Overall survival (OS) was calculated using the Kaplan–Meier method. Factors associated with OS were investigated using univariable and multivariable Cox proportional hazards regression. Fine–Gray competing risk regression was performed to test the association of the neutrophil:lymphocyte (NLR) and monocyte:lymphocyte (MLR) ratios with two types of failure: disease-related failure and death, and death unrelated to disease. Results Of the 299 SBRT patients identified, 122 were eligible for analysis. The median and range of the NLR and MLR were 3.0 (0.3–22.0) and 0.4 (0.1–1.9) respectively. On multivariable analysis, sex (p = 0.02), T stage (p = 0.04), and NLR (p < 0.01) were associated with OS. On multivariable analysis, T stage (p < 0.01) and MLR (p < 0.01) were associated with disease-related failure; MLR (p = 0.03), NLR (p < 0.01), and SBRT dose of 48 Gy in 4 fractions (p = 0.03) and 54 Gy or 60 Gy in 3 fractions (p = 0.02) were associated with disease-unrelated death. Median survival was 4.3 years in the NLR≤3 group (95% confidence interval: 3.5 to not reached) and 2.5 years in the NLR>3 group (95% confidence interval: 1.7 to 4.8; p < 0.01). Conclusions In lung SBRT patients, NLR and MLR are independently associated with OS and disease-unrelated death. If validated, NLR and MLR could help to identify patients who would benefit most from SBRT.
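Deriving the NLR and MLR from the pretreatment complete blood count and comparing survival across NLR groups can be sketched as follows with pandas and lifelines; file and column names are hypothetical, and the Fine–Gray competing-risk models reported above are not reproduced here.

```python
# Sketch of computing NLR and MLR from the CBC and comparing OS by NLR group.
# File and column names are hypothetical placeholders.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

sbrt = pd.read_csv("lung_sbrt_cbc.csv")  # hypothetical dataset
sbrt["nlr"] = sbrt["neutrophils"] / sbrt["lymphocytes"]
sbrt["mlr"] = sbrt["monocytes"] / sbrt["lymphocytes"]

low, high = sbrt[sbrt["nlr"] <= 3], sbrt[sbrt["nlr"] > 3]
for name, grp in [("NLR<=3", low), ("NLR>3", high)]:
    km = KaplanMeierFitter().fit(grp["years_os"], grp["died"], label=name)
    print(name, km.median_survival_time_)   # cf. 4.3 vs 2.5 years reported above

print(logrank_test(low["years_os"], high["years_os"],
                   low["died"], high["died"]).p_value)
```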

