Varicella zoster virus infections increase the risk of disease flares in patients with SLE: a matched cohort study

2019
Vol 6 (1)
pp. e000339
Author(s):  
Fangfang Sun ◽  
Yi Chen ◽  
Wanlong Wu ◽  
Li Guo ◽  
Wenwen Xu ◽  
...  

Objective: To explore whether varicella zoster virus (VZV) infection could increase the risk of disease flares in patients with SLE. Methods: Patients who had VZV reactivations between January 2013 and April 2018 were included from the SLE database (n=1901) of Shanghai Ren Ji Hospital, South Campus. Matched patients with SLE were selected as background controls at a 3:1 ratio. Patients with SLE with symptomatic bacterial infections of the lower urinary tract (UTI) were identified as infection controls. The baseline period and index period were defined as the 3 months before and after the infection event, respectively. The control period was the 3 months following the index period. Flare was defined by the SELENA-SLEDAI Flare Index. Kaplan-Meier analysis, Cox regression modelling and propensity score weighting were applied. Results: Patients with VZV infections (n=47), UTI controls (n=28) and matched SLE background controls (n=141) were included. Sixteen flares (34%) were observed in the VZV group within the index period, as opposed to only 7.1% in UTI controls and 9.9% in background controls. The Kaplan-Meier curve revealed that patients with a VZV infection had a much lower flare-free survival within the index period compared with the controls (p=0.0003). Furthermore, after adjusting for relevant confounders including baseline disease activity and intensity of immunosuppressive therapy, Cox regression analysis and propensity score weighting confirmed that VZV infection within the previous 3 months was an independent risk factor for SLE flares (HR 3.70 and HR 4.16, respectively). Conclusions: In patients with SLE, a recent VZV infection within 3 months was associated with an increased risk of disease flares.
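For readers who want to reproduce this kind of analysis, the sketch below shows, in Python with the lifelines library, how a flare-free-survival comparison and a propensity-score-weighted Cox model of the sort described above could be set up. It is only an illustration: the file name, column names (time_to_flare, flare, vzv, baseline_sledai, immunosuppression, ps_weight) and the covariate list are assumptions, not the study's actual data or code.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("sle_cohort.csv")                    # hypothetical analysis file
vzv, ctrl = df[df["vzv"] == 1], df[df["vzv"] == 0]

# Kaplan-Meier flare-free survival within the 3-month index period
kmf = KaplanMeierFitter()
kmf.fit(vzv["time_to_flare"], event_observed=vzv["flare"], label="VZV infection")

# Log-rank comparison of VZV patients against controls
lr = logrank_test(vzv["time_to_flare"], ctrl["time_to_flare"],
                  event_observed_A=vzv["flare"], event_observed_B=ctrl["flare"])
print(f"log-rank p = {lr.p_value:.4f}")

# Cox model with propensity-score weights (robust variance for weighted data),
# adjusted for baseline disease activity and immunosuppressive intensity
cols = ["time_to_flare", "flare", "vzv", "baseline_sledai", "immunosuppression", "ps_weight"]
cph = CoxPHFitter()
cph.fit(df[cols], duration_col="time_to_flare", event_col="flare",
        weights_col="ps_weight", robust=True)
print(cph.hazard_ratios_["vzv"])                      # HR for a recent VZV infection
```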

2021
Vol 10 (8)
pp. 1680
Author(s):  
Urban Berg ◽  
Annette W-Dahl ◽  
Anna Nilsdotter ◽  
Emma Nauclér ◽  
Martin Sundberg ◽  
...  

Purpose: We aimed to study the influence of fast-track care programs for total hip and total knee replacements (THR and TKR) at Swedish hospitals on the risk of revision and mortality within 2 years after the operation. Methods: Data were collected from the Swedish Hip and Knee Arthroplasty Registers (SHAR and SKAR), comprising 67,913 THR and 59,268 TKR operations from 2011 to 2015 on patients with osteoarthritis. Revision and mortality in the fast-track group were compared with the non-fast-track group using Kaplan–Meier survival analysis and adjusted Cox regression analysis. Results: The hazard ratio (HR) for revision within 2 years after THR with fast-track was 1.19 (CI: 1.03–1.39), indicating increased risk, whereas no increased risk was found for TKR (HR 0.91; CI: 0.79–1.06). The risk of death within 2 years was estimated with an HR of 0.85 (CI: 0.74–0.97) for TKR and 0.96 (CI: 0.85–1.09) for THR in fast-track hospitals compared with non-fast-track hospitals. Conclusions: Fast-track programs at Swedish hospitals were associated with an increased risk of revision after THR but not after TKR, while mortality was lower (TKR) or similar (THR) compared with non-fast-track.


2021
Vol 10 (17)
pp. 4032
Author(s):  
Chun-Hao Kao ◽  
Chi-Hsiang Chung ◽  
Wu-Chien Chien ◽  
Daniel Hueng-Yuan Shen ◽  
Li-Fan Lin ◽  
...  

(1) Background: This study aimed to investigate the association between radioactive iodine (RAI) and long-term cardiovascular disease (CVD) morbidity/mortality in thyroid cancer. (2) Methods: The study was conducted using data from the Taiwan National Health Insurance Database during 2000–2015. Thyroid cancer patients aged ≥20 years were categorized into RAI (thyroidectomy with RAI) and non-RAI (thyroidectomy only) groups. The Cox proportional hazards regression model and Kaplan–Meier method were used for analysis. (3) Results: A total of 13,310 patients were included. Kaplan–Meier analysis demonstrated that the two groups had similar cumulative risks of CVD (log-rank p = 0.72) and CVD-specific mortality (log-rank p = 0.62). On Cox regression analysis of different RAI doses, the risk of CVD was higher at cumulative dosages >3.7 GBq (hazard ratio = 1.69, 95% confidence interval = 1.24–2.40, p < 0.001). (4) Conclusions: RAI was not associated with an increased risk of CVD in thyroid cancer. However, CVD surveillance is indicated in patients receiving a cumulative RAI dosage above 3.7 GBq.


2020
Vol 52 (1)
pp. 86-92
Author(s):  
Jiliang Chen ◽  
Zhiping Xie ◽  
Zou Bin

Objective: Cardiovascular diseases (CVDs) are important complications in patients with rheumatoid arthritis (RA). The study aimed to explore whether serum leptin is associated with an increased risk of cardiovascular (CV) events in patients with RA. Methods: Two hundred twenty-three patients with RA were followed for a mean of 40 (range = 8-42) months. Serum leptin levels were measured at baseline. Cox regression analysis was performed to assess the association between leptin levels and the risk of CV events. Results: The univariate analysis showed that patients with RA with higher serum leptin levels had higher rates of CV events and CV mortality, respectively (P < .001). The logistic regression model showed that leptin was independently related to CVD history (odds ratio = 1.603; 95% confidence interval [CI], 1.329–2.195; P = .005) after adjusting for confounding factors in patients with RA at baseline. The multivariate Cox proportional hazards model suggested that leptin was an independent prognostic factor for CV events in patients with RA after adjustment for clinical confounding factors (hazard ratio = 2.467; 95% CI, 2.019–4.495; P < .001). The Kaplan-Meier analysis showed that, compared with patients with RA with leptin levels at or below the median value (≤15.4 μg/L), patients with leptin above the median value (>15.4 μg/L) had a higher rate of CV events (P < .001). Conclusion: Leptin was significantly associated with CV events in patients with RA. Elevated serum leptin levels may be a reliable prognostic factor for predicting CV complications in patients with RA.
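The odds ratio for CVD history quoted above comes from a logistic regression adjusted for confounders. A minimal sketch of such a model in Python (statsmodels) is shown below; the file name, covariate list and column names are hypothetical, chosen only to illustrate how an adjusted odds ratio with its 95% CI is obtained.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("ra_cohort.csv")                     # hypothetical analysis file
covariates = ["leptin", "age", "disease_duration", "crp"]   # assumed confounders

X = sm.add_constant(df[covariates])
model = sm.Logit(df["cvd_history"], X).fit(disp=0)

odds_ratios = np.exp(model.params)                    # ORs per unit of each covariate
conf_int = np.exp(model.conf_int())                   # 95% CIs on the OR scale
print(odds_ratios["leptin"], conf_int.loc["leptin"].tolist())
```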


2020
Vol 38 (4_suppl)
pp. 3-3
Author(s):  
Grace Lee ◽  
Daniel W. Kim ◽  
Vinayak Muralidhar ◽  
Devarati Mitra ◽  
Nora Horick ◽  
...  

Background: While treatment-related lymphopenia (TRL) is common and associated with poorer survival in multiple solid malignancies, little data exist for anal cancer. We evaluated TRL and its association with survival in anal cancer patients treated with chemoradiation (CRT). Methods: A retrospective analysis of 140 patients with non-metastatic anal squamous cell carcinoma (SCC) treated with definitive CRT was performed. Total lymphocyte counts (TLC) at baseline and at monthly intervals up to 12 months after initiating CRT were analyzed. Multivariable Cox regression analysis was performed to evaluate the association between overall survival (OS) and TRL, dichotomized by G4 TRL (<0.2 k/μl) two months after initiating CRT. Kaplan-Meier and log-rank tests were used to compare OS between patients with versus without G4 TRL. Results: Median follow-up was 55 months. Prior to CRT, 95% of patients had a normal TLC (>1 k/μl). Two months after initiating CRT, there was a median 71% reduction in TLC from baseline and 84% of patients had TRL: 11% G1, 31% G2, 34% G3, and 8% G4. On the multivariable Cox model, G4 TRL at two months was associated with a 3.7-fold increased risk of death (p = 0.013). On the log-rank test, the 5-year OS rate was lower in the cohort with versus without G4 TRL at two months (32% vs. 86%, p < 0.001). Conclusions: TRL is common and may be another prognostic marker of OS in anal cancer patients treated with CRT. The association between TRL and OS supports the hypothesis that host immunity plays an important role in survival among patients with anal cancer. These results support ongoing randomized trials evaluating the potential role of immunotherapy in localized anal cancer.
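The lymphopenia grades reported above follow the usual CTCAE-style banding of the total lymphocyte count; the abstract itself states only the G4 cut-off (<0.2 k/μl) and the normal threshold (>1 k/μl) explicitly, so the intermediate bands in the sketch below are assumptions based on that convention. A small helper for grading TLC values might look like this:

```python
import pandas as pd

def ctcae_lymphopenia_grade(tlc_k_per_ul: float) -> int:
    """Grade lymphopenia from the total lymphocyte count (TLC, in k/µl).
    Bands follow the usual CTCAE-style convention; only the G4 cut-off
    (<0.2 k/µl) and the normal threshold (>1 k/µl) come from the abstract."""
    bins = [-float("inf"), 0.2, 0.5, 0.8, 1.0, float("inf")]
    grades = [4, 3, 2, 1, 0]        # 0 = no lymphopenia (TLC >= 1 k/µl)
    return int(pd.cut([tlc_k_per_ul], bins=bins, labels=grades, right=False)[0])

print(ctcae_lymphopenia_grade(0.18))   # -> 4, i.e. grade 4 treatment-related lymphopenia
```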


2020
Author(s):  
Gang Wang ◽  
Ling Wen Wang ◽  
Jie Hai Jin ◽  
Min Hong Dong ◽  
Wei Wei Chen ◽  
...  

Background: To evaluate the impact of primary tumor radiotherapy on survival in patients with unresectable metastatic rectal or rectosigmoid cancer. Methods: From September 2008 to September 2017, 350 patients with unresectable metastatic rectal or rectosigmoid cancer were retrospectively reviewed at our center. All patients received at least 4 cycles of chemotherapy and were divided into two groups according to whether they received primary tumor radiotherapy. 163 patients received primary tumor radiotherapy, with a median radiation dose of 56.69 Gy (range 50.4–60). Survival curves were estimated with the Kaplan–Meier procedure to broadly compare survival between the two groups. Subsequently, 18-month survival was used as the outcome variable for this study. This study mainly evaluated the impact of primary tumor radiotherapy on survival through a series of multivariate Cox regression analyses after propensity score matching (PSM). Results: The median follow-up time was 21 months. All 350 patients received a median of 7 cycles of chemotherapy (range 4-12), and 163 (46.67%) patients received primary tumor radiotherapy for local symptoms. The Kaplan–Meier survival curves showed a significant overall survival (OS) advantage for the primary tumor radiotherapy group over the group without radiotherapy (20.07 vs 17.33 months; P=0.002). Multivariate Cox regression analysis with adjusted covariates, multivariate Cox regression analysis after PSM, inverse probability of treatment weighting (IPTW) analysis and propensity score (PS)-adjusted model analysis consistently showed that primary tumor radiotherapy effectively reduced the risk of death at 18 months (HR: 0.62, 95% CI 0.40-0.98; HR: 0.79, 95% CI: 0.93-1.45; HR: 0.70, 95% CI 0.55-0.99; and HR: 0.74, 95% CI: 0.59-0.94, respectively). Conclusion: Compared with patients with stage IV rectal or rectosigmoid cancer who did not receive primary tumor radiotherapy, those who received primary tumor radiotherapy had a reduced risk of death. Radical doses of radiation (59.4 Gy/33 fractions or 60 Gy/30 fractions) to the primary tumor might be considered for unresectable metastatic rectal or rectosigmoid cancer, not just for symptom relief. Keywords: stage IV rectal cancer, primary tumor radiotherapy, propensity score matching.
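Propensity score matching and inverse probability of treatment weighting (IPTW), as used above, both start from a model of the probability of receiving the treatment given the measured confounders. The sketch below shows one common way to build unstabilized IPTW weights in Python; the file name, confounder list and column names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("rectal_cohort.csv")                 # hypothetical analysis file
confounders = ["age", "cea", "metastatic_sites", "chemo_cycles"]   # assumed list

# 1. Propensity score: modelled probability of receiving primary tumor radiotherapy
ps_model = LogisticRegression(max_iter=1000).fit(df[confounders], df["radiotherapy"])
df["ps"] = ps_model.predict_proba(df[confounders])[:, 1]

# 2. Unstabilized inverse probability of treatment weights
df["iptw"] = np.where(df["radiotherapy"] == 1, 1.0 / df["ps"], 1.0 / (1.0 - df["ps"]))

# 3. The weighted data can then be passed to a Cox model (e.g. lifelines'
#    CoxPHFitter with weights_col="iptw") to obtain the IPTW-adjusted HR.
```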


2021
Vol 11
Author(s):  
Xinzhu Qiu ◽  
Hongbo He ◽  
Hao Zeng ◽  
Xiaopeng Tong ◽  
Qing Liu

Background: Soft tissue sarcoma of the extremities with regional lymph node metastasis (STSE-RLNM) is a devastating situation. Optimizing therapeutic approaches is vital but hampered by a shortage of randomized trials. We used a population-level database to evaluate radiotherapy's impact on sarcoma-specific survival (SSS) and overall survival (OS) when added to surgery for STSE-RLNM. Methods: We retrospectively screened data from the SEER database (2004–2015), and 265 patients with STSE-RLNM who received surgery, with (134) or without (131) radiotherapy, were enrolled in this study. A propensity score analysis with inverse probability of treatment weighting (IPTW) was performed and IPTW-adjusted Kaplan–Meier curves were created. The log-rank test and Cox regression analysis were performed to compare SSS and OS between patients with and without radiotherapy. A further analysis of radiotherapy timing was conducted with Kaplan–Meier curves and the log-rank test. Landmark analysis was introduced to attenuate immortal time bias. Results: In the original unadjusted cohort, the radiotherapy + surgery group was associated with improved SSS [hazard ratio (HR), 0.66; 95% CI, 0.47–0.91; p = 0.011] and OS (HR, 0.64; 95% CI, 0.47–0.88; p = 0.006). This significant treatment effect was also noted in IPTW-adjusted Cox regression, both for SSS (HR, 0.65; 95% CI, 0.45–0.93; p = 0.020) and for OS (HR, 0.64; 95% CI, 0.46–0.91; p = 0.013). The Kaplan–Meier curves and log-rank tests showed that the timing of radiotherapy (pre- vs postoperative) was not related to SSS (p = 0.980) or OS (p = 0.890). Conclusion: Radiotherapy combined with surgery has a significant benefit on the prognosis of patients with STSE-RLNM compared with surgery alone. These findings should be considered when making treatment decisions for these patients.
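Landmark analysis, mentioned above as a guard against immortal time bias, simply restricts the comparison to patients still at risk at a chosen landmark time and restarts the survival clock there. A minimal sketch follows, with an assumed 6-month landmark and hypothetical column names, not the study's actual code:

```python
import pandas as pd
from lifelines.statistics import logrank_test

df = pd.read_csv("seer_stse.csv")       # hypothetical extract; times in months
LANDMARK = 6                            # assumed landmark time, e.g. 6 months

# Keep only patients still event-free and under follow-up at the landmark, then
# restart the clock there so early deaths cannot be credited to later treatment.
lm = df[df["os_months"] > LANDMARK].copy()
lm["os_from_landmark"] = lm["os_months"] - LANDMARK

rt = lm[lm["radiotherapy"] == 1]
no_rt = lm[lm["radiotherapy"] == 0]
res = logrank_test(rt["os_from_landmark"], no_rt["os_from_landmark"],
                   event_observed_A=rt["death"], event_observed_B=no_rt["death"])
print(f"landmark log-rank p = {res.p_value:.3f}")
```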


Circulation
2015
Vol 132 (suppl_3)
Author(s):  
Larisa H Cavallari ◽  
Oyunbileg Magvanjav ◽  
R. David Anderson ◽  
Yan Gong ◽  
Aniwaa Owusu-Obeng ◽  
...  

Introduction: Clopidogrel is bioactivated by CYP2C19, and the CYP2C19 loss-of-function (LOF) genotype leads to reduced clopidogrel effectiveness after percutaneous coronary intervention (PCI). We examined whether clinical implementation of CYP2C19 genotype-guided antiplatelet therapy (APT) reduces the risk of cardiovascular events after PCI. Methods: CYP2C19 genotyping post-PCI was implemented at UF Health Shands Hospital in July 2012, with alternative APT recommended for LOF allele carriers. Major adverse cardiovascular events (MACE, comprising cardiovascular death, MI, stroke and stent thrombosis) over the 6 months after PCI were determined via medical record review. MACE was compared between LOF allele carriers switched to alternative APT (LOF-alternative) and both LOF allele carriers who remained on clopidogrel (LOF-clopidogrel) and non-LOF carriers (non-LOF) using the Kaplan-Meier method, with additional multivariable Cox regression analysis and propensity score adjustment in the LOF groups. Results: Of 412 patients (80% with ACS) who underwent PCI and genotyping, 126 (31%) had a LOF allele, and 68 (54%) of these received alternative APT (prasugrel n=57, ticagrelor n=8, triple-dose clopidogrel n=3). On Kaplan-Meier analysis (Figure), there was a lower incidence of MACE in the LOF-alternative vs. LOF-clopidogrel group and no significant difference between the LOF-alternative and non-LOF groups. On multivariable Cox regression analysis with propensity score adjustment, there was a reduced risk of MACE in LOF-alternative vs. LOF-clopidogrel patients (HR 0.09, 95% CI 0.01-0.84, p=0.035). In the LOF-clopidogrel group, the majority of events (83%) occurred within 30 days; all were in patients who presented with an ACS. Conclusion: Changing from clopidogrel to alternative APT after PCI in patients with the CYP2C19 LOF genotype reduces the risk of MACE. These data support CYP2C19 genotyping in patients undergoing PCI, especially in those with ACS.


2014
Vol 170 (6)
pp. 821-828
Author(s):  
Débora Rodrigues Siqueira ◽  
Lucieli Ceolin ◽  
Carla Vaz Ferreira ◽  
Mírian Romitti ◽  
Silvana Cavalcante Maia ◽  
...  

Background: RET polymorphisms have been implicated in the clinical presentation and prognosis of multiple endocrine neoplasia type 2 (MEN2)-associated medullary thyroid carcinoma. Objective: To investigate the effect of RET variants on the penetrance of pheochromocytoma (PHEO) in MEN2 patients. Methods: The RET variants L769L, S836S, and G691S/S904S were evaluated in a cohort of 153 MEN2 patients attending a tertiary teaching hospital. RET variant frequencies were compared between patients with and without PHEO. Kaplan–Meier curves and Cox regression analysis were used to estimate the effect of RET variants on age-dependent penetrance. Results: A total of 48 (31.4%) patients presented with MEN2-associated PHEOs. The mean age at diagnosis was 35.5±13.4 years, 60.4% of patients were women, and 92.8% had RET mutations at codon 634. The frequencies of the RET polymorphisms were as follows: 20.1% L769L, 4.75% S836S, and 17.3% S904S/G691S. We did not observe any association between the frequencies of the L769L, S836S, or S904S/G691S variants and PHEO development (all P>0.05). However, individuals carrying two RET polymorphic alleles had an increased estimated risk of PHEO (2.63; 95% CI, 1.4–5.0; P=0.004) and were younger at diagnosis when compared with those with one or no polymorphism (29.6±6.3 and 39.3±14.4 years, respectively; P=0.006). Accordingly, additional analysis using Cox proportional hazards models demonstrated that the presence of two RET variants was associated with an increased risk of early PHEO development (hazard ratio, 5.99 (95% CI, 2.24–16.03); P<0.001). Conclusions: RET polymorphic alleles have an additive effect on the estimated risk of age-related PHEO penetrance in MEN2 patients.


2021
Author(s):  
Ghasem Fakhraie ◽  
Zakieh Vahedian ◽  
Reza Zarei ◽  
Yadollah Eslami ◽  
Seyed Mehdi Tabatabaei ◽  
...  

Purpose: To evaluate the intraocular pressure (IOP) trend and risk factors for IOP rise after myopic photorefractive keratectomy (PRK). Patients and Methods: One eye of each patient who had undergone PRK for myopia was randomly assigned to this study. All eyes underwent tonometry with the CorVis Scheimpflug Technology (CST) tonometer (Oculus Optikgeräte GmbH, Wetzlar, Germany) 1 week, 2 weeks, 1 month, 2 months, 3 months and 4 months after surgery. Eyes with an IOP rise of more than 5 mmHg and the associated risk factors were evaluated by Kaplan-Meier analysis and multiple Cox regression analysis. Results: 348 eyes of 348 patients were enrolled in this study. Forty-three eyes (12.35%) experienced an IOP rise of more than 5 mmHg. Eyes with IOP rise had a higher baseline IOP (median 19 mmHg (IQR 18–22) versus median 15 mmHg (IQR 14–16); p < 0.001). Baseline central corneal thickness (CCT) was lower in eyes with IOP rise (median 520 µm (IQR 509–541) versus median 535 µm (IQR 518–547); p = 0.009). In multivariate Cox regression analysis, higher baseline IOP was a risk factor for IOP rise (hazard ratio (HR) 1.59 (95% CI 1.43–1.77); p < 0.001), while higher baseline CCT was protective (HR 0.97 (95% CI 0.95–0.98); p < 0.001). Conclusion: Eyes with higher baseline IOP and lower baseline CCT are at increased risk of IOP rise after PRK and should be monitored more frequently.


2021
Vol 8
Author(s):  
Zhongxing Cai ◽  
Haoyu Wang ◽  
Sheng Yuan ◽  
Dong Yin ◽  
Weihua Song ◽  
...  

Background: Coronary artery ectasia (CAE) is found in about 1% of coronary angiograms and is associated with poor clinical outcomes. The prognostic value of plasma big endothelin-1 (ET-1) in CAE remains unknown. Methods: Patients with angiographically confirmed CAE from 2009 to 2015 who had big ET-1 data available were included. The primary outcome was 5-year major adverse cardiovascular events (MACE), defined as a composite of cardiovascular death and non-fatal myocardial infarction (MI). Patients were divided into high and low big ET-1 groups using a cut-off value of 0.58 pmol/L, derived from the receiver operating characteristic curve. The Kaplan-Meier method, propensity score methods, and Cox regression were used to assess clinical outcomes in the two groups. Results: A total of 992 patients were included, with 260 in the high big ET-1 group and 732 in the low big ET-1 group. At 5-year follow-up, 57 MACEs were observed. Kaplan-Meier analysis and univariable Cox regression showed that patients with high big ET-1 levels were at increased risk of MACE (9.87 vs. 4.50%; HR 2.23, 95% CI 1.32–3.78, P = 0.003), cardiovascular death (4.01 vs. 1.69%; HR 2.37, 95% CI 1.02–5.48, P = 0.044), and non-fatal MI (6.09 vs. 2.84%; HR 2.17, 95% CI 1.11–4.24, P = 0.023). The higher risk of MACE in the high big ET-1 group was consistent in the propensity score matched cohort and in the propensity score weighted analysis. In multivariable analysis, a high plasma big ET-1 level remained an independent predictor of MACE (HR 1.82, 95% CI 1.02–3.25, P = 0.043). A combination of high plasma big ET-1 concentration and diffuse dilation, when used to predict 5-year MACE risk, yielded a C-statistic of 0.67 (95% CI 0.59–0.74). Conclusion: Among patients with CAE, a high plasma big ET-1 level was associated with an increased risk of MACE, a finding that could improve risk stratification.
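The 0.58 pmol/L threshold and the C-statistic reported above correspond to standard ROC-based steps: picking a cut-off from the receiver operating characteristic curve and quantifying discrimination of the combined predictor. The sketch below illustrates both steps in Python; the file and column names are assumptions, the Youden index is only one common cut-off criterion (the abstract does not say which the authors used), and the simple additive score of the two binary markers is used purely to illustrate a C-statistic calculation.

```python
import numpy as np
import pandas as pd
from sklearn.metrics import roc_curve, roc_auc_score

df = pd.read_csv("cae_cohort.csv")      # hypothetical; column names are assumptions

# Cut-off for big ET-1 against 5-year MACE via the Youden index (one common choice)
fpr, tpr, thresholds = roc_curve(df["mace_5y"], df["big_et1"])
cutoff = thresholds[np.argmax(tpr - fpr)]             # abstract reports 0.58 pmol/L
df["high_et1"] = (df["big_et1"] > cutoff).astype(int)

# Discrimination of a simple combined score (high big ET-1 + diffuse dilation)
combined = df["high_et1"] + df["diffuse_dilation"]
print(f"C-statistic = {roc_auc_score(df['mace_5y'], combined):.2f}")
```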

