Trends and factors associated with modification or discontinuation of the initial antiretroviral regimen during the first year of treatment in the Turkish HIV-TR Cohort, 2011–2017

2021 · Vol 18 (1)
Author(s): Volkan Korten, Deniz Gökengin, Gülhan Eren, Taner Yıldırmak, ...

Abstract
Background: There is limited evidence on the modification or discontinuation of antiretroviral therapy (ART) regimens, including those containing novel antiretroviral drugs. The aim of this study was to evaluate the discontinuation of the first ART regimen before and after the availability of better-tolerated and less complex regimens by comparing the frequency of and reasons for discontinuation, and their associations with patient characteristics.
Methods: A total of 3019 ART-naive patients registered in the HIV-TR cohort who started ART between Jan 2011 and Feb 2017 were studied. Only the first modification within the first year of treatment for each patient was included in the analyses. Reasons were classified as listed in the coded form in the web-based database. Cumulative incidences were analysed using a competing risks function, and factors associated with discontinuation of the ART regimen were examined using Cox proportional hazards models and Fine-Gray competing risk regression models.
Results: The initial ART regimen was discontinued in 351 of 3019 eligible patients (11.6%) within the first year. The main reason for discontinuation was intolerance/toxicity (45.0%), followed by treatment simplification (9.7%), patient willingness (7.4%), poor compliance (7.1%), prevention of future toxicities (6.0%), virologic failure (5.4%), and provider preference (5.4%). Non-nucleoside reverse transcriptase inhibitor (NNRTI)-based (aHR = 4.4, 95% CI 3.0–6.4; p < 0.0001) or protease inhibitor (PI)-based regimens (aHR = 4.3, 95% CI 3.1–6.0; p < 0.0001), relative to integrase strand transfer inhibitor (InSTI)-based regimens, were significantly associated with ART discontinuation. ART initiated in the later period (2015–Feb 2017) was less likely to be discontinued (aHR = 0.6, 95% CI 0.4–0.9; p < 0.0001). A lower rate of treatment discontinuation for intolerance/toxicity was observed with InSTI-based regimens (2.0%) than with NNRTI-based (6.6%) and PI-based regimens (7.5%) (p < 0.001). The percentage of patients who achieved HIV RNA < 200 copies/mL within 12 months of ART initiation was 91% in the discontinued group vs. 94% in the continued group (p > 0.05).
Conclusion: ART discontinuation due to intolerance/toxicity and virologic failure decreased over time. InSTI-based regimens were less likely to be discontinued than PI- and NNRTI-based ART.
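The cumulative-incidence step of a competing-risks analysis like the one described above can be sketched as follows. This is a minimal illustration in Python with the lifelines library, estimating the cumulative incidence of regimen discontinuation while treating death or loss to follow-up as a competing event; the column names and toy values are hypothetical, not the HIV-TR data.

```python
# Cumulative incidence of first-regimen discontinuation with a competing
# event, via the Aalen-Johansen estimator (lifelines). Minimal sketch;
# columns and values below are hypothetical, not the HIV-TR data.
import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.DataFrame({
    "months": [3, 12, 7, 12, 5, 9],   # follow-up within the first year
    "event":  [1, 0, 2, 0, 1, 2],     # 0 = censored, 1 = discontinued,
})                                    # 2 = competing event (death/LTFU)

ajf = AalenJohansenFitter()
ajf.fit(df["months"], df["event"], event_of_interest=1)
print(ajf.cumulative_density_)        # cumulative incidence of discontinuation
```

The Fine-Gray regression reported in the abstract then models covariate effects on this cumulative incidence directly (subdistribution hazards), in contrast to the cause-specific hazards estimated by a standard Cox model.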

2019 · Vol 69 (12) · pp. 2145-2152
Author(s): Christie Joya, Seung Hyun Won, Christina Schofield, Tahaniyat Lalani, Ryan C Maves, ...

Abstract
Background: Whether persistent low-level viremia (pLLV) predicts virologic failure (VF) is unclear. We used data from the US Military HIV Natural History Study (NHS) to examine the association between pLLV and VF.
Methods: NHS subjects who initiated combination antiretroviral therapy (ART) after 1996 were included if they had 2 or more VLs measured with a lower limit of detection of ≤50 copies/mL. VF was defined as a confirmed VL ≥200 copies/mL or any VL >1000 copies/mL. Participants were categorized into mutually exclusive virologic categories: intermittent LLV (iLLV; VL of 50–199 copies/mL on <25% of measurements), pLLV (VL of 50–199 copies/mL on ≥25% of measurements), high-level viremia (hLV; VL of 200–1000 copies/mL), and continuous suppression (all VLs <50 copies/mL). Cox proportional hazards models were used to evaluate the association between VF and LLV; hazard ratios and 95% confidence intervals (CIs) are presented.
Results: Two thousand six subjects (median age 29.2 years, 93% male, 41% black) were included; 383 subjects (19%) experienced VF. After adjusting for demographics, VL, CD4 counts, ART regimen, prior use of mono or dual antiretrovirals, and time to ART start, pLLV (3.46 [2.42–4.93]) and hLV (2.29 [1.78–2.96]) were associated with VF. Other factors associated with VF included black ethnicity (1.33 [1.06–1.68]) and antiretroviral use prior to ART (1.79 [1.34–2.38]). Older age at ART initiation (0.71 [0.61–0.82]) and non-nucleoside reverse transcriptase inhibitor (0.68 [0.51–0.90]) or integrase strand transfer inhibitor use (0.26 [0.13–0.53]) were protective.
Conclusion: Our data add to the body of evidence suggesting that persistent LLV is associated with deleterious virologic consequences.
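A sketch of how the mutually exclusive viremia categories and the Cox model might be constructed. The cut-offs are taken from the abstract; every column name, DataFrame, and helper function below is hypothetical, not the NHS analysis code.

```python
# Assign each subject one viremia category from their VL history, then fit
# a Cox model for time to virologic failure. Sketch only; column names are
# hypothetical, cut-offs are from the abstract.
import pandas as pd
from lifelines import CoxPHFitter

def viremia_category(vls: pd.Series) -> str:
    if (vls < 50).all():
        return "suppressed"                       # continuous suppression
    if vls.between(200, 1000).any():
        return "hLV"                              # high-level viremia
    frac_llv = vls.between(50, 199).mean()        # share of VLs in 50-199
    return "pLLV" if frac_llv >= 0.25 else "iLLV"

# `vl_history` has one row per VL measurement; `subjects` has one row each
subjects["category"] = (vl_history.groupby("subject_id")["vl"]
                        .apply(viremia_category).values)

cph = CoxPHFitter()
cph.fit(subjects, duration_col="years_to_vf_or_censor", event_col="vf",
        formula="C(category) + age_at_art + race + prior_arv_use")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```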


2019
Author(s): B.N. Harding, B.M. Whitney, R.M. Nance, H.M. Crane, G. Burkholder, ...

Abstract
Objectives: Anemia is common among people living with HIV (PLWH) and has been associated with certain, often older, antiretroviral medications. Information on current antiretroviral therapy (ART) and anemia is limited. The objective was to compare associations of anemia incidence and hemoglobin change with core ART classes in the current ART era.
Design: Retrospective cohort study.
Setting: U.S.-based prospective clinical cohort of PLWH aged 18 and above receiving care at 8 sites between 1/2010 and 3/2018.
Participants: 16,505 PLWH were included in this study.
Main outcome measures: Anemia risk and hemoglobin change were measured for person-time on a protease inhibitor (PI) or an integrase strand transfer inhibitor (INSTI), relative to a non-nucleoside reverse transcriptase inhibitor (NNRTI) reference. We also examined PLWH on multiple core classes. Cox proportional hazards regression analyses were conducted to measure associations between time-updated ART classes and incident anemia or severe anemia. Linear mixed effects models were used to examine relationships between ART classes and hemoglobin change.
Results: During a median of 4.9 years of follow-up, 1,040 participants developed anemia and 488 developed severe anemia. Compared to NNRTI use, INSTI-based regimens were associated with an increased risk of anemia (adjusted hazard ratio [aHR] 1.17, 95% confidence interval [CI] 0.94-1.47) and severe anemia (aHR 1.55, 95% CI 1.11-2.17), and with a decrease in hemoglobin level. Time on multiple core classes was also associated with increased anemia risk (aHR 1.30, 95% CI 1.06-1.60) and severe anemia risk (aHR 1.35, 95% CI 0.99-1.85), while no associations were found for PI use.
Conclusion: These findings suggest INSTI use may increase the risk of anemia. If confirmed, screening for anemia development in users of INSTIs may be beneficial. Further research into underlying mechanisms is warranted.
Strengths and limitations of this study:
- This study utilized a large and geographically diverse population of PLWH in care across the U.S.
- It leveraged comprehensive clinical data, including diagnoses, medication use, laboratory test results, demographic information, and medical history.
- It investigated associations between specific types of ART core regimens and anemia risk.
- This observational study is subject to residual confounding.
- Anemia was assessed from hemoglobin lab values taken at regular medical care visits, without excluding participants with conditions strongly associated with hemoglobin level through non-traditional HIV mechanisms.
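A time-updated exposure like the ART class here is usually analysed in counting-process (start/stop) format, with one row per interval of constant exposure. A minimal sketch with lifelines' time-varying Cox fitter; every column and value below is hypothetical.

```python
# Cox regression with a time-updated ART class, in start/stop (counting
# process) format. Sketch only; columns and data are hypothetical.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

long_df = pd.DataFrame({
    "id":     [1, 1, 2],
    "start":  [0.0, 1.5, 0.0],   # years since ART initiation
    "stop":   [1.5, 4.0, 3.2],
    "insti":  [0, 1, 0],         # current core class indicators
    "pi":     [0, 0, 1],         # (NNRTI is the implicit reference)
    "anemia": [0, 1, 0],         # event closing the interval
})

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", start_col="start", stop_col="stop",
        event_col="anemia")
ctv.print_summary()
```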


BMJ Open · 2020 · Vol 10 (3) · pp. e031487
Author(s): Barbara N Harding, Bridget M Whitney, Robin M Nance, Heidi M Crane, Greer Burkholder, ...

Objective: Anaemia is common among people living with HIV (PLWH) and has been associated with certain, often older, antiretroviral medications. Information on current antiretroviral therapy (ART) and anaemia is limited. The objective was to compare the associations of anaemia incidence and haemoglobin change with core ART classes in the current ART era.
Design: Retrospective cohort study.
Setting: USA-based prospective clinical cohort of PLWH aged 18 and above receiving care at eight sites between January 2010 and March 2018.
Participants: 16 505 PLWH were included in this study.
Main outcome measures: Anaemia risk and haemoglobin change were estimated among PLWH for person-time on a protease inhibitor (PI)- or an integrase strand transfer inhibitor (INSTI)-based regimen, relative to a non-nucleoside reverse transcriptase inhibitor (NNRTI)-based reference. We also examined PLWH on regimens containing multiple core classes. Cox proportional hazards regression analyses were conducted to measure the associations between time-updated ART classes and incident anaemia or severe anaemia. Linear mixed effects models were used to examine the relationships between ART classes and haemoglobin change.
Results: During a median of 4.9 years of follow-up, 1040 participants developed anaemia and 488 developed severe anaemia. Compared with NNRTI use, INSTI-based regimens were associated with an increased risk of anaemia (adjusted HR (aHR) 1.26, 95% CI 1.00 to 1.58) and severe anaemia (aHR 1.51, 95% CI 1.07 to 2.11), and with a decrease in haemoglobin level. Time on multiple core classes was also associated with increased anaemia risk (aHR 1.39, 95% CI 1.13 to 1.70), while no associations were found for PI use.
Conclusion: These findings suggest INSTI use may increase the risk of anaemia. If confirmed, screening for anaemia development in users of INSTIs may be beneficial. Further research into the underlying mechanisms is warranted.
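The haemoglobin-change analysis pairs naturally with a mixed model over repeated measurements. A sketch with statsmodels, assuming a long table with one row per haemoglobin measurement; the DataFrame `visits` and every variable name are hypothetical.

```python
# Linear mixed-effects model for haemoglobin as a function of the current
# core ART class, with a random intercept per patient to handle repeated
# measures. Sketch; `visits` and its columns are hypothetical.
import statsmodels.formula.api as smf

model = smf.mixedlm(
    "hgb ~ C(art_class, Treatment('NNRTI')) + years_on_art + age + sex",
    data=visits,                     # one row per haemoglobin measurement
    groups=visits["patient_id"],     # random intercept per patient
)
result = model.fit()
print(result.summary())
```

The random intercept absorbs between-patient differences in baseline haemoglobin, so the fixed-effect coefficients for ART class estimate within-era shifts in haemoglobin level.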


2002 · Vol 36 (3) · pp. 278-284
Author(s): Maria F Guerreiro, Ligia RS Kerr-Pontes, Rosa S Mota, Marcondes C França Jr., Fábio F Távora, ...

OBJECTIVE: To evaluate the influence of sociodemographic, clinical, and epidemiological factors on the survival of AIDS patients in a reference hospital.
METHODS: A sample of 502 adult AIDS patients, out of 1,494 AIDS cases registered in a hospital in Fortaleza, Brazil, was investigated between 1986 and 1998. Sixteen cases were excluded due to death at the moment of the AIDS diagnosis, and 486 were analyzed in the study. The variables studied were socioeconomic and clinical-epidemiological. Statistical analysis was conducted using Kaplan-Meier survival analysis and the Cox proportional hazards model.
RESULTS: Three hundred and sixty-two of the 486 patients studied took at least one antiretroviral drug, and their survival was about ten times longer than that of those who did not take any drug (746 and 79 days, respectively; p<0.001). Patients who took two nucleoside reverse transcriptase inhibitors (NRTI) plus a protease inhibitor were found to have higher survival rates (p<0.001). The risk of dying in the first year was significantly lower for patients who took NRTIs and a protease inhibitor compared to those who took only NRTIs, and this risk was much lower from the second year on (0.10; 95%CI: 0.42-0.23). The risk of dying in the first year was significantly higher for less educated patients (15.58; 95%CI: 6.64-36.58) and those who had two or more systemic diseases (3.03; 95%CI: 1.74-5.25). After the first year post-diagnosis, there was no risk difference for these factors.
CONCLUSIONS: Higher education exerted a significant influence on first-year survival. Antiretroviral drugs had a greater impact on survival from the second year on. More aggressive antiretroviral therapy started earlier could benefit these patients.
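The Kaplan-Meier comparison described in the Methods might look like the following lifelines sketch, contrasting patients who received any antiretroviral drug with those who received none. The DataFrame `pts` and its columns are hypothetical.

```python
# Kaplan-Meier estimates and a log-rank comparison of patients on any
# antiretroviral drug vs. none. Sketch; `pts` and its columns are
# hypothetical.
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

on_art = pts[pts["any_arv"] == 1]
no_art = pts[pts["any_arv"] == 0]

for label, grp in (("ART", on_art), ("no ART", no_art)):
    kmf = KaplanMeierFitter().fit(grp["days"], grp["died"], label=label)
    print(label, kmf.median_survival_time_)   # median survival in days

res = logrank_test(on_art["days"], no_art["days"],
                   event_observed_A=on_art["died"],
                   event_observed_B=no_art["died"])
print(res.p_value)                            # log-rank test of the curves
```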


2021 · Vol 8 (2) · pp. 27-33
Author(s): Jiping Zeng, Ken Batai, Benjamin Lee

In this study, we aimed to evaluate the impact of surgical wait time (SWT) on outcomes of patients with renal cell carcinoma (RCC) and to investigate risk factors associated with prolonged SWT. Using the National Cancer Database, we retrospectively reviewed the records of patients with pT3 RCC treated with radical or partial nephrectomy between 2004 and 2014. The cohort was divided based on SWT. The primary outcome was 5-year overall survival (OS). Logistic regression analysis was used to investigate the risk factors associated with delayed surgery. Cox proportional hazards models were fitted to assess relations between SWT and 5-year OS after adjusting for confounding factors. A total of 22,653 patients were included in the analysis. Patients with SWT > 10 weeks had a higher occurrence of upstaging. Using logistic regression, we found that female patients, patients of African-American or Spanish origin, treatment at an academic or integrated network cancer center, lack of insurance, median household income of <$38,000, and a Charlson–Deyo score of ≥1 were more likely to be associated with prolonged SWT. SWT > 10 weeks was associated with decreased 5-year OS (hazard ratio [HR], 1.24; 95% confidence interval [CI], 1.15–1.33). This risk was not markedly attenuated after adjusting for confounding variables, including age, gender, race, insurance status, Charlson–Deyo score, tumor size, and surgical margin status (adjusted HR, 1.13; 95% CI, 1.04–1.24). In conclusion, the vast majority of patients underwent surgery within 10 weeks. There was a statistically significant trend of increasing SWT over the study period. SWT > 10 weeks was associated with decreased 5-year OS.
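The delayed-surgery risk-factor analysis is a standard logistic regression on a binary indicator. A sketch with statsmodels; the DataFrame `ncdb` and every variable name are hypothetical stand-ins for the registry fields.

```python
# Logistic regression for risk factors associated with delayed surgery
# (SWT > 10 weeks). Sketch; `ncdb` and its columns are hypothetical.
import numpy as np
import statsmodels.formula.api as smf

ncdb["delayed"] = (ncdb["swt_weeks"] > 10).astype(int)

logit = smf.logit(
    "delayed ~ C(sex) + C(race) + C(facility_type) + C(insurance) "
    "+ C(income_category) + C(charlson_deyo)",
    data=ncdb,
).fit()
print(np.exp(logit.params))        # odds ratios for prolonged SWT
print(np.exp(logit.conf_int()))    # 95% CIs on the OR scale
```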


Neurology · 2018 · Vol 91 (17) · pp. e1611-e1618
Author(s): Paola Gilsanz, Kathleen Albers, Michal Schnaider Beeri, Andrew J. Karter, Charles P. Quesenberry, ...

ObjectiveTo examine the association between traumatic brain injury (TBI) and dementia risk among a cohort of middle-aged and elderly individuals with type 1 diabetes (T1D).MethodsWe evaluated 4,049 members of an integrated health care system with T1D ≥50 years old between January 1, 1996, and September 30, 2015. Dementia and TBI diagnoses throughout the study period were abstracted from medical records. Cox proportional hazards models estimated associations between time-dependent TBI and dementia adjusting for demographics, HbA1c, nephropathy, neuropathy, stroke, peripheral artery disease, depression, and dysglycemic events. Fine and Gray regression models evaluated the association between baseline TBI and dementia risk accounting for competing risk of death.ResultsA total of 178 individuals (4.4%) experienced a TBI and 212 (5.2%) developed dementia. In fully adjusted models, TBI was associated with 3.6 times the dementia risk (hazard ratio [HR] 3.64; 95% confidence interval [CI] 2.34, 5.68). When accounting for the competing risk of death, TBI was associated with almost 3 times the risk of dementia (HR 2.91; 95% CI 1.29, 5.68).ConclusionThis study demonstrates a marked increase in risk of dementia associated with TBI among middle-aged and elderly people with T1D. Given the complexity of self-care for individuals with T1D, and the comorbidities that predispose them to trauma and falls, future work is needed on interventions protecting brain health in this vulnerable population, which is now living to old age.
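A time-dependent exposure like TBI is typically handled by splitting each person's follow-up at the exposure date, so the indicator is 0 before the TBI and 1 after. A pandas sketch of that preprocessing step; the DataFrame `cohort` and all columns are hypothetical.

```python
# Expand each person's follow-up into pre-/post-TBI intervals so TBI can
# enter the Cox model as a time-dependent covariate. Sketch; `cohort` and
# its columns are hypothetical.
import pandas as pd

def split_at_tbi(row) -> list:
    # No TBI observed before end of follow-up: a single unexposed interval
    if pd.isna(row["tbi_time"]) or row["tbi_time"] >= row["end"]:
        return [{"id": row["id"], "start": 0.0, "stop": row["end"],
                 "tbi": 0, "dementia": row["dementia"]}]
    # Otherwise: unexposed interval up to the TBI, exposed interval after
    return [
        {"id": row["id"], "start": 0.0, "stop": row["tbi_time"],
         "tbi": 0, "dementia": 0},
        {"id": row["id"], "start": row["tbi_time"], "stop": row["end"],
         "tbi": 1, "dementia": row["dementia"]},
    ]

long_df = pd.DataFrame([seg for _, r in cohort.iterrows()
                        for seg in split_at_tbi(r)])
```

The resulting start/stop table can then be fed to a counting-process Cox fitter such as lifelines' CoxTimeVaryingFitter.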


RMD Open · 2019 · Vol 5 (2) · pp. e001015
Author(s): Fernando Pérez Ruiz, Pascal Richette, Austin G Stack, Ravichandra Karra Gurunath, Ma Jesus García de Yébenes, ...

Objective: To determine the impact of achieving a serum uric acid (sUA) level of <0.36 mmol/L on overall and cardiovascular (CV) mortality in patients with gout.
Methods: Prospective cohort of patients with gout recruited from 1992 to 2017. Exposure was defined as the average sUA recorded during the first year of follow-up, dichotomised as ≤ or >0.36 mmol/L. Bivariate and multivariate Cox proportional hazards models were used to determine mortality risks, expressed as HRs with 95% CIs.
Results: Of 1193 patients, 92% were men, with a mean age of 60 years, 6.8 years' disease duration, an average of three to four flares in the previous year, a mean sUA of 9.1 mg/dL at baseline, and a mean follow-up of 48 months; 158 patients died. Crude mortality rates were significantly higher for an sUA of ≥0.36 mmol/L (80.9 per 1000 patient-years, 95% CI 59.4 to 110.3) than for an sUA of <0.36 mmol/L (25.7 per 1000 patient-years, 95% CI 21.3 to 30.9). After adjustment for age, sex, CV risk factors, previous CV events, observation period, and baseline sUA concentration, an sUA of ≥0.36 mmol/L was associated with elevated overall mortality (HR=2.33, 95% CI 1.60 to 3.41) and CV mortality (HR=2.05, 95% CI 1.21 to 3.45).
Conclusions: Failure to reach a target sUA level of <0.36 mmol/L in patients with hyperuricaemia of gout is an independent predictor of overall and CV-related mortality. Targeting sUA levels of <0.36 mmol/L should be a principal goal in these high-risk patients in order to reduce CV events and to extend patient survival.
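Crude mortality rates per 1000 patient-years with an exact Poisson confidence interval, as reported above, are a short calculation. A scipy sketch; the death count and person-years below are illustrative numbers, not the cohort's actual data.

```python
# Crude mortality rate per 1000 patient-years with an exact (Garwood)
# Poisson CI. Sketch; `deaths` and `person_years` are illustrative values.
from scipy import stats

deaths, person_years = 40, 1600.0
rate = 1000 * deaths / person_years

# Exact Poisson CI via the chi-square relationship
lo = stats.chi2.ppf(0.025, 2 * deaths) / 2 / person_years * 1000
hi = stats.chi2.ppf(0.975, 2 * (deaths + 1)) / 2 / person_years * 1000
print(f"{rate:.1f} per 1000 PY (95% CI {lo:.1f} to {hi:.1f})")
```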


2016 · Vol 10 (9-10) · pp. 321
Author(s): R. Christopher Doiron, Melanie Jaeger, Christopher M. Booth, Xuejiao Wei, D. Robert Siemens

Introduction: Thoracic epidural analgesia (TEA) is commonly used to manage postoperative pain and facilitate early mobilization after major intra-abdominal surgery. Evidence also suggests that regional anesthesia/analgesia may be associated with improved survival after cancer surgery. Here, we describe factors associated with TEA at the time of radical cystectomy (RC) for bladder cancer and its association with both short- and long-term outcomes in routine clinical practice.
Methods: All patients undergoing RC in the province of Ontario between 2004 and 2008 were identified using the Ontario Cancer Registry (OCR). Modified Poisson regression was used to describe factors associated with epidural use, while a Cox proportional hazards model was used to describe associations between survival and TEA use.
Results: Over the five-year study period, 1628 patients were identified as receiving RC, 54% (n=887) of whom received TEA. Greater anesthesiologist volume (lowest-volume providers: relative risk [RR] 0.85, 95% confidence interval [CI] 0.75-0.96) and male sex (female sex: RR 0.89, 95% CI 0.79-0.99) were independently associated with greater use of TEA. TEA use was not associated with improved short-term outcomes. In multivariable analysis, TEA was not associated with cancer-specific survival (hazard ratio [HR] 1.02, 95% CI 0.87-1.19; p=0.804) or overall survival (HR 0.91, 95% CI 0.80-1.03; p=0.136).
Conclusions: In routine clinical practice, 54% of RC patients received TEA, and its use was associated with anesthesiologist provider volume. After controlling for patient, disease, and provider variables, we were unable to demonstrate any effect on either short- or long-term outcomes at the time of RC.
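"Modified Poisson" regression, used above to estimate relative risks for the binary TEA outcome, is a Poisson GLM with a robust (sandwich) variance. A statsmodels sketch, with all variable names hypothetical:

```python
# Modified Poisson regression: Poisson GLM with robust standard errors to
# estimate relative risks for a binary outcome (receipt of TEA). Sketch;
# `ocr` and its columns are hypothetical.
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

model = smf.glm(
    "tea ~ C(sex) + C(anesthesiologist_volume) + C(age_group)",
    data=ocr,
    family=sm.families.Poisson(),
).fit(cov_type="HC1")          # robust (sandwich) variance, per Zou (2004)
print(np.exp(model.params))    # relative risks
```

The robust variance corrects the Poisson model's overstated errors when applied to binary data, which is why this approach yields valid CIs for the RRs.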


2012 · Vol 30 (4_suppl) · pp. 618-618
Author(s): Chi Lin, Christopher K Brown, Charles Arthur Enke, Fausto R. Loberiza

Background: Gastrointestinal melanoma (GIM) is a rare disease. The objective of this study was to compare the overall survival (OS), cancer-specific survival (CSS), and prognostic factors of GIM with those of skin melanoma (SKM) using the Surveillance, Epidemiology, and End Results (SEER) registry.
Methods: Patients diagnosed with invasive GIM (n=406) and SKM (n=173,622) between 1973 and 2008 were identified from the SEER database. Factors analyzed included age (18-40/41-60/61-100), gender, race (white/nonwhite), marital status, stage (localized/regional/distant), year of diagnosis (1973-87/1988-97/1998-2008), and type of treatment (radiotherapy [RT]/surgery). OS and CSS were evaluated using the Kaplan-Meier method. Cox proportional hazards regression analysis was used to identify factors prognostic of survival.
Results: The median age was 69 years for patients with GIM and 57 years for patients with SKM. The GIM group was older, with more advanced-stage cancer, than the SKM group. Surgery was performed on 85% and 95%, while RT was received by 18% and 2%, of GIM and SKM patients, respectively. The GIM group had a median OS and CSS of 15 and 16 months, respectively, while the SKM group had a median OS of 283 months and did not reach a median CSS. Cox analysis showed that SKM had a significantly lower risk of total and cancer-specific mortality compared to GIM (hazard ratio [HR] 0.40, p<0.0001, and HR 0.34, p<0.0001, respectively). Factors associated with improved OS and CSS in SKM included age ≤60, female gender, non-white race, early stage, being married, more recent diagnosis, undergoing surgery, and not receiving RT. Factors associated with improved OS and CSS in GIM included age ≤60, early stage, non-white race, and undergoing surgery. Subgroup analysis of patients who underwent surgery showed that lymph node status was the only prognostic factor for GIM, while all of the previously identified prognostic factors except race were associated with OS and CSS for SKM.
Conclusions: Outcomes of patients with GIM are inferior to those of patients with SKM. Melanomas at these two sites also have different prognostic factors. Future studies should explore the reasons behind these differences to improve treatment outcomes.
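The headline GIM-vs-SKM contrast reduces to a Cox model on a site indicator. A lifelines sketch; the DataFrame `seer` and its columns are hypothetical stand-ins for the registry extract.

```python
# Cox model contrasting SKM with GIM on overall survival; exp(coef) for
# `is_skm` is the HR for SKM relative to GIM. Sketch; `seer` and its
# columns are hypothetical.
from lifelines import CoxPHFitter

cph = CoxPHFitter()
cph.fit(seer[["months", "died", "is_skm"]],
        duration_col="months", event_col="died")
print(cph.hazard_ratios_)   # e.g. HR < 1 => lower mortality for SKM
```

As a side note, a Kaplan-Meier median that is never reached, like SKM's median CSS above, appears as infinity in lifelines' median_survival_time_ attribute.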


2019 · Vol 37 (15_suppl) · pp. e18250-e18250
Author(s): Jifang Zhou, Karen Sweiss, Pritesh Rajni Patel, Edith Nutescu, Naomi Ko, ...

Background: Adjuvant intravenous bisphosphonates (IV BP) reduce the risk of skeletal-related events (SRE) in patients with multiple myeloma (MM). We examined the effects of bisphosphonate utilization patterns (adherence, cumulative dose, and frequency) on the risk of SRE.
Methods: Patients aged 65 years or older and diagnosed with a first primary MM between 2001 and 2011 were identified using the Surveillance, Epidemiology and End Results (SEER)-Medicare linked database. Patients receiving at least one dose of IV BP after MM diagnosis were identified, and 5-year SRE-free survival was estimated using the Kaplan-Meier method, stratified by demographic groups and compared with the log-rank test. Cox proportional hazards models were fit to determine the association between IV BP utilization patterns and SRE after propensity score matching. We investigated the outcome of multiple recurrent SREs using the approach of Andersen and Gill, and estimated subdistribution hazard ratios (SHRs) and 95% confidence intervals for the risk of first SRE, accounting for death as a competing risk.
Results: The final cohort included 9176 MM patients with a median age of 76 years. The adjusted 5-year competing-risk SRE model showed a 48% reduction in the risk of SRE (95% CI 0.49-0.55) with use of IV BP. In multivariable analyses taking competing risks into account, greater adherence to IV BP, higher cumulative IV BP dose, and more frequent administration were all associated with a statistically significant reduction in SRE risk (see Table).
Conclusions: Use of IV BP in patients with MM was associated with a significant reduction in SRE risk over the 5-year period after MM diagnosis. The effectiveness of IV BP therapy was greater with greater adherence to, higher cumulative dose of, and more frequent administration of IV BP. [Table: see text]
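The propensity-score matching step described in the Methods might be sketched as below: each IV BP user is paired with the non-user whose estimated score is closest. This is a rough scikit-learn illustration; all covariate names are hypothetical, and refinements the study may have used (calipers, matching without replacement, balance diagnostics) are omitted.

```python
# 1:1 nearest-neighbour propensity-score matching of IV BP users to
# non-users before the survival models. Rough sketch; `df` and its
# covariates are hypothetical; calipers and matching without replacement
# are omitted.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

X = df[["age", "stage", "comorbidity_score"]].to_numpy()
treated = df["iv_bp"].to_numpy().astype(bool)

# Propensity score: probability of receiving IV BP given covariates
ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

# For each treated subject, find the control with the closest score
nn = NearestNeighbors(n_neighbors=1).fit(ps[~treated].reshape(-1, 1))
_, pos = nn.kneighbors(ps[treated].reshape(-1, 1))
control_idx = df.index[~treated][pos.ravel()]

matched = df.loc[df.index[treated].append(control_idx)]
```

The Andersen-Gill recurrent-event model mentioned above is then fit on the matched cohort in counting-process (start/stop) form, with one interval per at-risk period between SREs; a full analysis would also cluster the variance on subject.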

