Rates and Correlates of Short Term Virologic Response among Treatment-Naïve HIV-Infected Children Initiating Antiretroviral Therapy in Ethiopia: A Multi-Center Prospective Cohort Study

Pathogens ◽  
2019 ◽  
Vol 8 (4) ◽  
pp. 161
Author(s):  
Birkneh Tilahun Tadesse ◽  
Adugna Chala ◽  
Jackson Mukonzo ◽  
Tolossa Eticha Chaka ◽  
Sintayehu Tadesse ◽  
...  

There are limited data on virologic outcomes and their correlates among HIV-infected children in resource-limited settings. We investigated the rate and correlates of virologic outcome among treatment-naïve HIV-infected Ethiopian children initiating cART, who were followed prospectively at baseline and at 8, 12, 24, and 48 weeks using plasma viral load, clinical examination, laboratory tests, and pretreatment HIV drug resistance (PDR) screening. Virologic outcome was assessed using two endpoints: virological suppression, defined as an "undetectable" plasma viral load of <150 RNA copies/mL, and rebound, defined as a viral load ≥150 copies/mL after suppression had been achieved. Cox proportional hazards regression was employed to assess correlates of outcome. At the end of follow-up, virologic outcome was measured for 110 participants. Overall, 94 (85.5%) achieved virological suppression, of whom 36 (38.3%) subsequently experienced virologic rebound. At 48 weeks, 9 (8.2%) children developed WHO-defined virological treatment failure. Taking a tenofovir-containing regimen (hazard ratio (HR) 3.1, 95% confidence interval (95% CI) 1.0–9.6, p = 0.049) and absence of pretreatment HIV drug resistance (HR 11.7, 95% CI 1.3–104.2, p = 0.028) were independently associated with earlier virologic suppression. In conclusion, PDR and cART regimen type correlated with the rate of virologic suppression, an effect most prominent during the first year after cART initiation. However, the impact of viral rebound in 38.3% of the children needs evaluation.
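
As a minimal sketch of the kind of time-to-suppression analysis described above, the snippet below fits a Cox proportional hazards model with the two covariates the study found significant, using the lifelines library. The data, column names, and effect sizes are synthetic placeholders, not the study's dataset or variable names.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
tenofovir = rng.integers(0, 2, n)   # 1 = tenofovir-containing cART (hypothetical coding)
no_pdr = rng.integers(0, 2, n)      # 1 = no pretreatment drug resistance
# children on tenofovir / without PDR suppress sooner (illustrative effect sizes)
time = rng.exponential(30, n) / (1 + 1.5 * tenofovir + 2.0 * no_pdr)
event = (time <= 48).astype(int)    # suppression observed within follow-up
time = np.minimum(time, 48)         # administrative censoring at week 48

df = pd.DataFrame({"weeks": time, "suppressed": event,
                   "tenofovir": tenofovir, "no_pdr": no_pdr})
cph = CoxPHFitter()
cph.fit(df, duration_col="weeks", event_col="suppressed")
cph.print_summary()  # HR > 1 means earlier suppression, since the event here is suppression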

2016 ◽  
Vol 19 (2) ◽  
pp. 333-351 ◽  
Author(s):  
Tara Matsuda ◽  
Jeffrey S. McCombs ◽  
Ivy Tonnu-Mihara ◽  
Justin McGinnis ◽  
D. Steven Fox

Abstract Background: The high cost of new hepatitis C virus (HCV) treatments has resulted in "watchful waiting" strategies being developed to safely delay treatment, which will in turn delay viral load suppression (VLS). Objective: To document whether delayed VLS adversely impacted patients' risk of adverse events and death. Methods: 187,860 patients were selected from the Veterans Administration's (VA) clinical case registry (CCR), a longitudinal compilation of electronic medical record (EMR) data for 1999–2010. Inclusion criteria required at least 6 months of CCR/EMR data prior to the HCV diagnosis and sufficient data post-diagnosis to calculate one or more FIB-4 scores. Primary outcome measures were time-to-death and time to a composite of liver-related clinical events. Cox proportional hazards models were estimated separately using three critical FIB-4 levels to define early and late viral response. Results: Achieving an undetectable viral load before the patient's FIB-4 level exceeded pre-specified critical values (1.00, 1.45, and 3.25) reduced the risk of adverse clinical events by 33–35% and of death by 21–26%. However, achieving VLS after FIB-4 exceeded 3.25 significantly reduced the benefit of viral response. Conclusions: Delaying VLS until FIB-4 >3.25 reduces the benefits of VLS in reducing patient risk.
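
The FIB-4 index behind the critical thresholds above is computed from age, AST, ALT, and platelet count. Below is a small helper implementing the standard published formula together with the study's three cut-points; the function names and example values are ours, not the authors'.

```python
import math

def fib4(age_years: float, ast_u_l: float, alt_u_l: float,
         platelets_10e9_l: float) -> float:
    """FIB-4 = (age [y] * AST [U/L]) / (platelets [10^9/L] * sqrt(ALT [U/L]))."""
    return (age_years * ast_u_l) / (platelets_10e9_l * math.sqrt(alt_u_l))

def fib4_stratum(score: float) -> str:
    """Classify against the study's pre-specified critical values."""
    if score <= 1.00:
        return "<=1.00"
    if score <= 1.45:
        return "1.01-1.45"
    if score <= 3.25:
        return "1.46-3.25"
    return ">3.25"

score = fib4(age_years=55, ast_u_l=80, alt_u_l=60, platelets_10e9_l=140)
print(f"FIB-4 = {score:.2f}, stratum {fib4_stratum(score)}")
```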


Crisis ◽  
2018 ◽  
Vol 39 (1) ◽  
pp. 27-36 ◽  
Author(s):  
Kuan-Ying Lee ◽  
Chung-Yi Li ◽  
Kun-Chia Chang ◽  
Tsung-Hsueh Lu ◽  
Ying-Yeh Chen

Abstract. Background: We investigated the age at exposure to parental suicide and the risk of subsequent suicide completion in young people. The impact of parental and offspring sex was also examined. Method: Using a cohort study design, we linked Taiwan's Birth Registry (1978–1997) with Taiwan's Death Registry (1985–2009) and identified 40,249 children who had experienced maternal suicide (n = 14,431), paternal suicide (n = 26,887), or the suicide of both parents (n = 281). Each exposed child was matched to 10 children of the same sex and birth year whose parents were still alive. This yielded a total of 398,081 children for our non-exposed cohort. A Cox proportional hazards model was used to compare the suicide risk of the exposed and non-exposed groups. Results: Compared with the non-exposed group, offspring who were exposed to parental suicide were 3.91 times (95% confidence interval [CI] = 3.10–4.92) more likely to die by suicide after adjusting for baseline characteristics. The risk of suicide seemed to be lower in older male offspring (HR = 3.94, 95% CI = 2.57–6.06) but higher in older female offspring (HR = 5.30, 95% CI = 3.05–9.22). Stratified analyses based on parental sex revealed similar patterns to the combined analysis. Limitations: As only register-based data were used, we were not able to explore the impact of variables not contained in the data set, such as the role of mental illness. Conclusion: Our findings suggest a prominent elevation in the risk of suicide among offspring who lost their parents to suicide. The risk elevation differed according to the sex of the afflicted offspring as well as their age at exposure.
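
A rough sketch of the 1:10 matching step described above (each exposed child matched to 10 unexposed children of the same sex and birth year) might look as follows with pandas. The column names, sampling details, and demo records are illustrative assumptions, not the authors' code.

```python
import pandas as pd

def match_controls(exposed: pd.DataFrame, pool: pd.DataFrame,
                   ratio: int = 10, seed: int = 0) -> pd.DataFrame:
    """For each exposed child, sample `ratio` unexposed children of the
    same sex and birth year, without reusing controls across cases."""
    used, matched = set(), []
    for _, case in exposed.iterrows():
        candidates = pool[(pool["sex"] == case["sex"])
                          & (pool["birth_year"] == case["birth_year"])
                          & (~pool.index.isin(used))]
        take = candidates.sample(n=min(ratio, len(candidates)), random_state=seed)
        used.update(take.index)
        matched.append(take)
    return pd.concat(matched) if matched else pool.iloc[0:0]

# tiny demo with made-up records: one exposed girl born in 1990
exposed = pd.DataFrame({"sex": ["F"], "birth_year": [1990]})
pool = pd.DataFrame({"sex": ["F"] * 12 + ["M"] * 3,
                     "birth_year": [1990] * 15})
print(len(match_controls(exposed, pool)))  # 10
```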


2019 ◽  
Vol 17 (4) ◽  
pp. 225-239 ◽  
Author(s):  
Lulu Zuo ◽  
Ke Peng ◽  
Yihong Hu ◽  
Qinggang Xu

AIDS is a global infectious disease. In 2014, UNAIDS launched the global "90-90-90" targets to end the HIV epidemic by 2030. The second and third 90s require that 90% of diagnosed HIV-1-infected individuals receive antiretroviral therapy (ART) and that 90% of those on ART achieve durable virological suppression. However, wide use of ART will greatly increase the emergence and spread of HIV drug resistance, and current HIV drug resistance testing (DRT) assays in China lag seriously behind, hindering the achievement of virological suppression. Therefore, recommending an appropriate HIV DRT method is critical for routine HIV surveillance and prevention in China. In this review, we summarize the existing HIV drug resistance genotypic testing methods used around the world and discuss their advantages and disadvantages.


2021 ◽  
pp. 1-9
Author(s):  
Leonard Naymagon ◽  
Douglas Tremblay ◽  
John Mascarenhas

Data supporting the use of etoposide-based therapy in hemophagocytic lymphohistiocytosis (HLH) arise largely from pediatric studies. There is a lack of comparable data among adult patients with secondary HLH. We conducted a retrospective study to assess the impact of etoposide-based therapy on outcomes in adult secondary HLH. The primary outcome was overall survival. The log-rank test was used to compare Kaplan-Meier distributions of time-to-event outcomes. Multivariable Cox proportional hazards modeling was used to estimate adjusted hazard ratios (HRs) with 95% confidence intervals (CIs). Ninety adults with secondary HLH seen between January 1, 2009, and January 6, 2020, were included. Forty-two patients (47%) received etoposide-based therapy, while 48 (53%) received treatment only for their inciting proinflammatory condition. Thirty-three patients in the etoposide group (72%) and 32 in the no-etoposide group (67%) died during follow-up. Median survival in the etoposide and no-etoposide groups was 1.04 and 1.39 months, respectively. There was no significant difference in survival between the etoposide and no-etoposide groups (log-rank p = 0.4146). On multivariable analysis, there was no association between treatment with etoposide and survival (HR for death with etoposide = 1.067, 95% CI: 0.633–1.799, p = 0.8084). Use of etoposide-based therapy was not associated with improvement in outcomes in this large cohort of adult secondary HLH patients.
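
A minimal reconstruction of the survival comparison described above, using Kaplan-Meier estimation and a log-rank test from lifelines. All numbers are synthetic stand-ins loosely matched to the reported group sizes and event rates, not the study data.

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
t_etop = rng.exponential(1.5, 42)   # survival in months, etoposide group (n = 42)
t_none = rng.exponential(2.0, 48)   # no-etoposide group (n = 48)
e_etop = rng.random(42) < 0.72      # ~72% deaths observed during follow-up
e_none = rng.random(48) < 0.67      # ~67% deaths observed

km = KaplanMeierFitter().fit(t_etop, event_observed=e_etop, label="etoposide")
print("median OS (etoposide):", km.median_survival_time_)

res = logrank_test(t_etop, t_none,
                   event_observed_A=e_etop, event_observed_B=e_none)
print(f"log-rank p = {res.p_value:.4f}")
```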


Cancers ◽  
2021 ◽  
Vol 13 (6) ◽  
pp. 1453
Author(s):  
Chiara Fabbroni ◽  
Giovanni Fucà ◽  
Francesca Ligorio ◽  
Elena Fumagalli ◽  
Marta Barisella ◽  
...  

Background: We previously showed that grading can prognosticate the outcome of retroperitoneal liposarcoma (LPS). In the present study, we aimed to explore the impact of pathological stratification using grading on the clinical outcomes of patients with advanced well-differentiated LPS (WDLPS) and dedifferentiated LPS (DDLPS) treated with trabectedin. Patients: We included patients with advanced WDLPS and DDLPS treated with trabectedin at the Fondazione IRCCS Istituto Nazionale dei Tumori between April 2003 and November 2019. Tumors were categorized as WDLPS, low-grade DDLPS, or high-grade DDLPS according to the 2020 WHO classification. Patients were divided into two cohorts: low-grade (WDLPS/low-grade DDLPS) and high-grade (high-grade DDLPS). Results: A total of 49 patients were included: 17 (35%) in the low-grade cohort and 32 (65%) in the high-grade cohort. The response rate was 47% in the low-grade cohort versus 9.4% in the high-grade cohort (logistic regression p = 0.006). Median progression-free survival (PFS) was 13.7 months in the low-grade cohort and 3.2 months in the high-grade cohort. Grading was confirmed as an independent predictor of PFS in the multivariable Cox proportional hazards regression model (adjusted hazard ratio, low-grade vs. high-grade: 0.45, 95% confidence interval: 0.22–0.94; adjusted p = 0.035). Conclusions: In this retrospective case series, sensitivity to trabectedin was higher in WDLPS/low-grade DDLPS than in high-grade DDLPS. If confirmed in larger series, grading could represent an effective tool to personalize treatment with trabectedin in patients with advanced LPS.
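
The reported response-rate comparison can be reproduced in spirit with a single-covariate logistic regression. The counts below (8/17 vs. 3/32) are back-calculated approximations of the reported 47% and 9.4%, and the sketch uses statsmodels rather than whatever software the authors used.

```python
import numpy as np
import statsmodels.api as sm

# responders: ~8 of 17 in the low-grade cohort, ~3 of 32 in the high-grade cohort
grade_high = np.array([0] * 17 + [1] * 32)
response = np.array([1] * 8 + [0] * 9 + [1] * 3 + [0] * 29)

fit = sm.Logit(response, sm.add_constant(grade_high)).fit(disp=False)
odds_ratio = np.exp(fit.params[1])   # odds of response, high- vs. low-grade
print(f"OR = {odds_ratio:.2f}, p = {fit.pvalues[1]:.3f}")
```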


2021 ◽  
Vol 39 (15_suppl) ◽  
pp. 4142-4142
Author(s):  
Lucy Xiaolu Ma ◽  
Gun Ho Jang ◽  
Amy Zhang ◽  
Robert Edward Denroche ◽  
Anna Dodd ◽  
...  

Background: KRAS mutations (KRASm) are present in over 90% of pancreatic adenocarcinomas (PDAC), with a predominance of G12 substitutions. KRAS wildtype (WT) PDAC relies on alternate oncogenic drivers, and the prognostic impact of these remains unknown. We evaluated alterations in WT PDAC and explored the impact of specific KRASm and WT status on survival. Methods: WGS and RNAseq were performed on 570 patients (pts) ascertained through our translational research program from 2012-2021, of whom 443 were included for overall survival (OS) analyses. This included 176 pts with resected and 267 pts with advanced PDAC enrolled on the COMPASS trial (NCT02750657). The latter cohort underwent biopsies prior to treatment with first-line gemcitabine-nab-paclitaxel or mFOLFIRINOX as per physician choice. The Kaplan-Meier and Cox proportional hazards methods were used to estimate OS. Results: KRAS WT PDAC (n = 52) represented 9% of pts, and these cases trended to be younger than pts with KRASm (median age 61 vs 65 years, p = 0.1). In resected cases, the most common alterations in WT PDAC (n = 23) included GNASm (n = 6) and BRAFm/fusions (n = 5). In advanced WT PDAC (n = 27), alterations in BRAF (n = 11) and ERBB2/3/4 (n = 6) were most prevalent. Oncogenic fusions (NTRK, NRG1, BRAF/RAF, ROS1, others) were identified in 9 pts. The BRAF in-frame deletion p.486_491del was the most common single variant in WT PDAC, with organoid profiling revealing sensitivity to both 3rd-generation BRAF inhibitors and MEK inhibition. In resected PDAC, multivariable analyses identified higher stage (p = 0.043), lack of adjuvant chemotherapy (p < 0.001), and the KRAS G12D variant (p = 0.004) as poor prognostic variables. In advanced disease, neither WT PDAC nor KRAS-specific alleles had an impact on prognosis (median OS: WT = 8.5 months, G12D = 8.2, G12V = 10.0, G12R = 12.0, others = 9.2; p = 0.73); the basal-like RNA subtype conferred inferior OS (p < 0.001). A targeted therapeutic approach following first-line chemotherapy was undertaken in 10% of pts with advanced PDAC: MMRd (n = 1), homologous recombination deficiency (HRD) (n = 19), KRAS G12C (n = 1), CDK4/6 amplification (n = 3), ERBB family alterations (n = 2), and BRAF variants (n = 2). OS in this group was superior (14.7 vs 8.8 months, p = 0.04), mainly driven by HRD-PDAC, where KRASm were present in 89%. Conclusions: In our dataset, KRAS G12D is associated with inferior OS in resected PDAC; however, KRAS mutational status was not prognostic in advanced disease. This suggests that improved OS in the WT PDAC population can only be achieved if there is accelerated access to targeted drugs for pts.
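
For the allele-level OS comparison (WT vs. G12D vs. G12V vs. G12R vs. others), a multi-group log-rank test is one standard approach. The sketch below uses lifelines on fully synthetic data; group labels follow the abstract, everything else is invented.

```python
import numpy as np
import pandas as pd
from lifelines.statistics import multivariate_logrank_test

rng = np.random.default_rng(2)
n = 300
df = pd.DataFrame({
    "os_months": rng.exponential(9.0, n),                        # synthetic OS
    "death": rng.integers(0, 2, n),                              # 1 = death observed
    "allele": rng.choice(["WT", "G12D", "G12V", "G12R", "other"], n),
})
res = multivariate_logrank_test(df["os_months"], df["allele"], df["death"])
print(f"multi-group log-rank p = {res.p_value:.2f}")
```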


2010 ◽  
Vol 2010 ◽  
pp. 1-11 ◽  
Author(s):  
Chizobam Ani ◽  
Deyu Pan ◽  
David Martins ◽  
Bruce Ovbiagele

Background. Literature regarding the influence of age/sex on mortality trends for acute myocardial infarction (AMI) hospitalizations is limited to hospitals participating in voluntary AMI registries. Objective. To evaluate the impact of age and sex on in-hospital AMI mortality using a nationally representative hospital sample. Methods. Secondary data analysis using AMI hospitalizations identified from the Nationwide Inpatient Sample (NIS). Descriptive and Cox proportional hazards analyses explored mortality trends by age and sex from 1997–2006 while adjusting for the influence of demographics, comorbidity, length of hospital stay, and hospital characteristics. Results. From 1997–2006, in-hospital AMI mortality rates decreased over time in all subgroups except males aged <55 years. The greatest decline was observed in females aged <55 years: compared with similarly aged males, their mortality outcomes were poorer in 1997-1998 (RR 1.47, 95% CI = 1.30–1.66) than in 2005-2006 (RR 1.03, 95% CI = 0.90–1.18), and the adjusted p value for trend demonstrated a statistically significant decline in the relative AMI mortality risk for females compared with males (p < 0.001). Conclusion. Over the last decade, in-hospital AMI mortality rates declined for every age/sex group except males <55 years. While the female-male AMI mortality disparity has narrowed, some room for improvement remains.
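
One common way to formalize a narrowing female-male gap over calendar time is a sex × period interaction term in the Cox model; a negative interaction coefficient corresponds to a declining relative risk for females. The sketch below (lifelines, fully synthetic data) illustrates the idea and is not the authors' exact specification.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 5000
female = rng.integers(0, 2, n)
period = rng.integers(0, 5, n)                  # 0 = 1997-98, ..., 4 = 2005-06
# excess female risk that shrinks across periods (illustrative coefficients)
log_hr = 0.4 * female - 0.08 * female * period
time = rng.exponential(30, n) / np.exp(log_hr)  # days to in-hospital death
event = (time <= 30).astype(int)
time = np.minimum(time, 30)                     # censor at 30 days

df = pd.DataFrame({"time": time, "death": event, "female": female,
                   "period": period, "female_x_period": female * period})
cph = CoxPHFitter().fit(df, duration_col="time", event_col="death")
print(cph.summary[["exp(coef)", "p"]])  # negative interaction = narrowing gap
```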


2021 ◽  
Vol 8 (2) ◽  
pp. 27-33
Author(s):  
Jiping Zeng ◽  
Ken Batai ◽  
Benjamin Lee

In this study, we aimed to evaluate the impact of surgical wait time (SWT) on outcomes of patients with renal cell carcinoma (RCC), and to investigate risk factors associated with prolonged SWT. Using the National Cancer Database, we retrospectively reviewed the records of patients with pT3 RCC treated with radical or partial nephrectomy between 2004 and 2014. The cohort was divided based on SWT. The primary outcome was 5-year overall survival (OS). Logistic regression analysis was used to investigate the risk factors associated with delayed surgery. Cox proportional hazards models were fitted to assess relations between SWT and 5-year OS after adjusting for confounding factors. A total of 22,653 patients were included in the analysis. Patients with SWT > 10 weeks had higher occurrence of upstaging. Using logistic regression, we found that female patients, African-American or Spanish origin patients, treatment in academic or integrated network cancer center, lack of insurance, median household income of <$38,000, and the Charlson–Deyo score of ≥1 were more likely to have prolonged SWT. SWT > 10 weeks was associated with decreased 5-year OS (hazard ratio [HR], 1.24; 95% confidence interval [CI], 1.15–1.33). This risk was not markedly attenuated after adjusting for confounding variables, including age, gender, race, insurance status, Charlson–Deyo score, tumor size, and surgical margin status (adjusted HR, 1.13; 95% CI, 1.04–1.24). In conclusion, the vast majority of patients underwent surgery within 10 weeks. There is a statistically significant trend of increasing SWT over the study period. SWT > 10 weeks is associated with decreased 5-year OS.
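
A schematic version of the logistic regression used to identify risk factors for prolonged SWT follows. The predictors mirror some of those named above, but the coding, data, and effect sizes are invented for illustration (statsmodels), not NCDB fields.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 1000
X = pd.DataFrame({
    "female": rng.integers(0, 2, n),
    "academic_center": rng.integers(0, 2, n),
    "uninsured": rng.integers(0, 2, n),
    "charlson_ge1": rng.integers(0, 2, n),
})
# simulate delayed surgery (SWT > 10 weeks) with two invented effects
logit = -2.0 + 0.3 * X["female"] + 0.4 * X["uninsured"]
delayed = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

fit = sm.Logit(delayed, sm.add_constant(X)).fit(disp=False)
print(np.exp(fit.params))  # odds ratios for prolonged wait time
```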


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Judy Tung ◽  
Musarrat Nahid ◽  
Mangala Rajan ◽  
Lia Logio

Abstract Background Academic medical centers invest considerably in faculty development efforts to support the career success and promotion of their faculty and to minimize faculty attrition. This study evaluated the impact of a faculty development program called the Leadership in Academic Medicine Program (LAMP) on participants' (1) self-ratings of efficacy, (2) promotion in academic rank, and (3) institutional retention. Method Participants from the 2013–2020 LAMP cohorts were surveyed before and after the program to assess their level of agreement with statements spanning domains of self-awareness, self-efficacy, and satisfaction with work and the work environment. Pre and post responses were compared using McNemar's tests. Changes in scores across gender were compared using Wilcoxon rank-sum/Mann-Whitney tests. LAMP participants were matched to nonparticipant controls by gender, rank, department, and time of hire to compare promotions in academic rank and departures from the organization. Kaplan-Meier curves and Cox proportional hazards models were used to examine differences. Results There were significant improvements in almost all self-ratings on program surveys (p < 0.05). The greatest improvements were seen in "understand the promotions process" (36% vs. 94%), "comfortable negotiating" (35% vs. 74%), and "time management" (55% vs. 92%). There were no statistically significant differences in improvement by gender; however, women faculty rated themselves lower than men on all pre-program items. There was a significant difference in time-to-next-promotion (p = 0.003) between LAMP participants and controls: Kaplan-Meier analysis demonstrated that LAMP faculty achieved their next promotion more often and faster than controls, and Cox proportional hazards analyses found that LAMP faculty were 61% more likely to be promoted than controls (hazard ratio [HR] 1.61, 95% confidence interval [CI] 1.16–2.23, p = 0.004). There was a significant difference in time-to-departure (p < 0.0001), with LAMP faculty retained more often and for longer periods: LAMP faculty were 77% less likely to leave compared with controls (HR 0.23, 95% CI 0.16–0.34, p < 0.0001). Conclusions LAMP is an effective faculty development program as measured subjectively by participant self-ratings and objectively through comparative improvements in academic promotions and institutional retention.
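
The pre/post comparisons above pair each participant with themselves, which is why McNemar's test is used rather than an ordinary chi-squared test: only the discordant pairs carry information. A minimal example with invented counts chosen to echo the reported 36% → 94% shift (n = 50):

```python
from statsmodels.stats.contingency_tables import mcnemar

# Paired agreement with "understand the promotions process".
#                 post agree  post disagree
table = [[18,              0],   # pre agree    (18/50 = 36% agreed before)
         [29,              3]]   # pre disagree (47/50 = 94% agreed after)
res = mcnemar(table, exact=True)
print(f"McNemar statistic = {res.statistic}, p = {res.pvalue:.4f}")
```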


2015 ◽  
Vol 53 (7) ◽  
pp. 2195-2202 ◽  
Author(s):  
Sylvie Larrat ◽  
Sophie Vallet ◽  
Sandra David-Tchouda ◽  
Alban Caporossi ◽  
Jennifer Margier ◽  
...  

The pretherapeutic presence of protease inhibitor (PI) resistance-associated variants (RAVs) has not been shown to be predictive of triple-therapy outcomes in treatment-naive patients. However, RAVs may influence the outcome in patients with less effective pegylated interferon (pegIFN)-ribavirin (RBV) backbones. Using hepatitis C virus (HCV) population sequence analysis, we retrospectively investigated the prevalence of baseline nonstructural 3 (NS3) RAVs in a multicenter cohort of poor IFN-RBV responders (i.e., prior null responders or patients with a viral load decrease of <1 log IU/ml during the pegIFN-RBV lead-in phase). The impact of the presence of these RAVs on the outcome of triple therapy was studied. Among 282 patients, the prevalences (95% confidence intervals) of baseline RAVs ranged from 5.7% (3.3% to 9.0%) to 22.0% (17.3% to 27.3%), depending on the algorithm used. Among mutations conferring a >3-fold shift in the 50% inhibitory concentration (IC50) for telaprevir or boceprevir, T54S was the most frequently detected mutation (3.9%), followed by A156T, R155K (0.7%), V36M, and V55A (0.35%). Mutations were more frequently found in patients infected with genotype 1a (7.5 to 23.6%) than genotype 1b (3.3 to 19.8%) (P = 0.03). No other sociodemographic or viroclinical characteristic was significantly associated with a higher prevalence of RAVs. No obvious effect of baseline RAVs on viral load was observed. In this cohort of poor responders to IFN-RBV, no link was found with a sustained virological response to triple therapy, regardless of the algorithm used for the detection of mutations. Based on a cross-study comparison, baseline RAVs are not more frequent in poor IFN-RBV responders than in treatment-naive patients, and even in these difficult-to-treat patients, this study demonstrates no impact on treatment outcome, arguing against resistance analysis prior to treatment.
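
At its core, population-sequence RAV detection reduces to comparing residues at known NS3 positions against a reference. The toy screen below hard-codes the mutations named in this abstract (V36M, T54S, V55A, R155K, A156T); the parsing and data structures are simplified illustrations, not any published detection algorithm.

```python
# reference residue and known resistance substitutions at each NS3 position
KNOWN_RAVS = {36: ("V", {"M"}), 54: ("T", {"S"}), 55: ("V", {"A"}),
              155: ("R", {"K"}), 156: ("A", {"T"})}

def screen_ns3(seq: str) -> list[str]:
    """Return RAVs found in a 1-indexed, reference-aligned NS3 protein sequence."""
    hits = []
    for pos, (ref, variants) in KNOWN_RAVS.items():
        residue = seq[pos - 1]
        if residue != ref and residue in variants:
            hits.append(f"{ref}{pos}{residue}")
    return hits

# demo: a reference-like sequence with an engineered R155K substitution
ref = ["X"] * 200
for pos, (ref_aa, _) in KNOWN_RAVS.items():
    ref[pos - 1] = ref_aa
ref[154] = "K"
print(screen_ns3("".join(ref)))  # ['R155K']
```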

