Effectiveness of Maintenance Immunosuppression Therapies in a Matched-Pair Analysis Cohort of 16 Years of Renal Transplant in the Brazilian National Health System

Author(s):  
Rosângela Maria Gomes ◽  
Wallace Breno Barbosa ◽  
Brian Godman ◽  
Juliana de Oliveira Costa ◽  
Nélio Gomes Ribeiro Junior ◽  
...  

The maintenance of patients with a renal transplant typically involves two or more drugs to prevent rejection and prolong graft survival. Calcineurin inhibitors (CNIs) are the most commonly recommended medicines, used in combination with other agents. While immunosuppressive treatment regimens are well established, there are insufficient long-term effectiveness data to guide future management decisions. This study analyzed the effectiveness of treatment regimens containing a CNI after renal transplantation during 16 years of follow-up, using real-world data from the Brazilian National Health System (SUS). This was a retrospective study of 2318 SUS patients after renal transplantation. Patients were propensity score-matched (1:1) by sex, age, and type and year of transplantation. Kaplan–Meier analysis was used to estimate the cumulative probabilities of survival, and a Cox proportional hazards model was used to evaluate factors associated with progression to graft loss. Multivariable analysis, adjusted for diabetes mellitus and race/color, showed a greater risk of graft loss for patients using tacrolimus plus mycophenolate than for patients treated with cyclosporine plus azathioprine. In conclusion, this Brazilian real-world study, with its long follow-up period, matching on relevant clinical features, and representative sample, demonstrated better long-term effectiveness for therapeutic regimens containing cyclosporine plus azathioprine. Consequently, we recommend that protocols and clinical guidelines for renal transplantation consider the cyclosporine plus azathioprine regimen as a potential first-line option, along with others.
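The analytic pipeline described above (1:1 propensity-score matching followed by Kaplan–Meier estimation and a Cox proportional hazards model) can be sketched as follows. This is a minimal illustration on synthetic data using scikit-learn and lifelines, not the authors' code; the column names, covariates, and matching details (greedy nearest-neighbour matching with replacement) are assumptions, and race/color is omitted only to keep the synthetic data simple.

```python
# Illustrative sketch (not the study code): 1:1 propensity-score matching,
# Kaplan-Meier curves, and a Cox model for graft loss. All data are synthetic.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(42)
n = 2318
df = pd.DataFrame({
    "tac_mpa": rng.integers(0, 2, n),      # 1 = tacrolimus + mycophenolate, 0 = cyclosporine + azathioprine
    "male": rng.integers(0, 2, n),
    "age": rng.normal(45, 13, n).round(),
    "deceased_donor": rng.integers(0, 2, n),
    "tx_year": rng.integers(2000, 2016, n),
    "diabetes": rng.integers(0, 2, n),
    "time_years": rng.exponential(8, n).clip(0.1, 16),
    "graft_loss": rng.integers(0, 2, n),
})

# 1) Propensity score for receiving tacrolimus + mycophenolate
covs = ["male", "age", "deceased_donor", "tx_year"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covs], df["tac_mpa"])
df["ps"] = ps_model.predict_proba(df[covs])[:, 1]

# 2) Greedy 1:1 nearest-neighbour matching on the propensity score (with replacement)
treated = df[df["tac_mpa"] == 1]
control = df[df["tac_mpa"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]]).reset_index(drop=True)

# 3) Kaplan-Meier graft-survival curves by regimen
km = KaplanMeierFitter()
for label, grp in matched.groupby("tac_mpa"):
    km.fit(grp["time_years"], grp["graft_loss"], label=f"regimen={label}")
    print(label, km.median_survival_time_)

# 4) Cox proportional hazards model adjusted for diabetes
cph = CoxPHFitter()
cph.fit(matched[["time_years", "graft_loss", "tac_mpa", "diabetes"]],
        duration_col="time_years", event_col="graft_loss")
cph.print_summary()
```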

2007 ◽  
Vol 8 (2) ◽  
pp. 61-69
Author(s):  
Nicola Giotta ◽  
Ercole Biamino ◽  
Mario Eandi

The main aim of this retrospective study was to perform a pharmacoeconomic analysis of the long-term use of darbepoetin-α (DARB) after a switch from erythropoietin-β (EPO-β) in treating chronic nephropathy-induced anemia in dialysed patients. The secondary objective was to assess the actual EPO-β-to-DARB dose conversion factor. We extracted data for 78 patients who had been treated with EPO-β for at least 6 months and then switched to DARB from the database of the dialysis centre of the Asti hospital (Piedmont, Italy). From these, we selected 47 patients (23 males and 24 females) who completed a 120-week follow-up treatment with DARB.
All patients were treated with a dose-adjustment schedule to keep haemoglobin levels in the range of 11-12 g/dl. Before the switch, EPO-β was administered three times a week, while DARB was administered once a week, both intravenously. The initial DARB dose was calculated on the basis of the theoretical 200:1 conversion factor. Actual cumulative EPO-β and DARB consumption was recorded for all patients. Drug costs were valued according to purchasing prices for the Italian National Health System (October 2006).
In the 24 pre-switch weeks, the average cost (±SD) per patient for EPO-β was €2,309.86 (±1,434.78). Over the 120 weeks of follow-up, the average cost (±SD) per patient for DARB per 24 weeks ranged from a minimum of €1,487.09 (±1,125.51) to a maximum of €2,125.73 (±1,546.85).
The switch of 47 patients to DARB produced an overall net saving for the dialysis centre estimated at €119,540.72 over 120 weeks, under the hypothesis that EPO-β costs per semester remained constant: the conversion from EPO-β to DARB can maintain good long-term haemoglobin control and yields significant savings for the National Health System.
However, the dosage should be adjusted on an individual basis in order to avoid excessive fluctuations in Hb concentrations. The actual conversion factor was, on average, higher than the theoretical factor, settling at 240-280:1.
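The dose-conversion arithmetic behind the theoretical 200:1 factor and the observed 240-280:1 range can be illustrated with a small, purely hypothetical calculation; the weekly doses below are invented for illustration and are not taken from the study.

```python
# Illustrative arithmetic only; the doses are hypothetical, not patient data.
# The EPO-beta-to-darbepoetin conversion factor is the ratio of the EPO-beta
# dose (IU) to the darbepoetin dose (ug) needed for the same haemoglobin target.
weekly_epo_iu = 9000            # hypothetical pre-switch EPO-beta dose, IU/week
theoretical_factor = 200        # 200 IU EPO-beta assumed equivalent to 1 ug DARB
initial_darb_ug = weekly_epo_iu / theoretical_factor     # 45 ug/week at the switch

# If the stabilised maintenance dose turns out lower (as reported on average),
# the actual factor is recomputed from observed consumption:
observed_darb_ug = 35           # hypothetical stabilised DARB dose, ug/week
actual_factor = weekly_epo_iu / observed_darb_ug          # ~257:1, within 240-280:1
print(initial_darb_ug, round(actual_factor))
```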



2017 ◽  
Vol 135 (4) ◽  
pp. 369-375 ◽  
Author(s):  
Bruna Camilo Turi ◽  
Jamile Sanches Codogno ◽  
Rômulo Araújo Fernandes ◽  
Kyle Robinson Lynch ◽  
Eduardo Kokubun ◽  
...  

ABSTRACT CONTEXT AND OBJECTIVE: In this longitudinal study, we aimed to describe time trends of physical activity (PA) in different domains from 2010 to 2014 among users of the Brazilian National Health System, taking into account the effects of sex, age and economic status (ES). DESIGN AND SETTING: Longitudinal study conducted in five primary care units in Bauru (SP), Brazil. METHODS: The sample was composed of 620 men and women who were interviewed in 2010, 2012 and 2014. The same group of researchers conducted the interviews, using the questionnaire developed by Baecke et al. Scores for occupational, exercise/sport, leisure-time/transportation and overall PA were considered in this longitudinal survey. Time trends of PA over the four years of follow-up were assessed according to sex, age and ES. RESULTS: We found that after four years of follow-up, the reduction in overall PA (-13.6%; 95% confidence interval, CI = -11.9 to -15.3) was statistically significant. Additionally, declines in the occupational domain and exercise/sports participation were affected by age, while the reduction in overall PA was affected by sex, age and ES. CONCLUSIONS: Overall PA decreased significantly from 2010 to 2014 among these outpatients of the Brazilian National Health System, and age and male sex were important determinants of PA in its different domains.


2020 ◽  
Vol 11 ◽  
Author(s):  
Isabel Hurtado ◽  
Anibal García-Sempere ◽  
Salvador Peiró ◽  
Asier Bengoetxea ◽  
Jesús Luis Prieto ◽  
...  

Author(s):  
Sabin Nsanzimana ◽  
Michael J Penkunas ◽  
Carol Y Liu ◽  
Dieudonne Sebuhoro ◽  
Alida Ngwije ◽  
...  

Abstract Background Direct-acting antivirals (DAAs) are becoming accessible in sub-Saharan Africa. This study examined the effectiveness of DAAs in patients treated through the Rwandan national health system and identified factors associated with treatment outcomes. Methods This retrospective study used data from the national hepatitis C virus (HCV) program for patients who initiated DAAs between November 2015 and March 2017. Sustained virological response at 12 weeks after treatment (SVR12) was the primary outcome. Logistic regression models were fit to estimate the relationship between patients' clinical and demographic characteristics and treatment outcome. Results In total, 894 patients started treatment during the study period; 590 completed treatment and had SVR12 results. Among the 304 patients without SVR12 results, 48 were lost to follow-up and 256 had no SVR12 results although clinical data indicated they likely completed treatment; these patients were classified as nonvirological failures because viral clearance could not be determined. In the per-protocol analysis of the 590 patients with SVR12 results, SVR12 was achieved in 540 (92%) and virological failure occurred in 50 (8%). Pretreatment HCV RNA above the median was associated with virological failure. Intention-to-treat analysis including all patients showed that SVR12 was achieved in 540 (60%), with nonvirological failure in 304 (34%) and virological failure in 50 (6%). Patients in Western Province were more likely to experience nonvirological failure than patients in Kigali, likely owing to the 5–7-hour travel required to access testing and treatment. Conclusions DAAs were effective when implemented through the Rwandan national health system. Decentralization and enhanced financing are under way in Rwanda, which could improve access to treatment and follow-up as the country prepares for HCV elimination.
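As a rough illustration of the per-protocol logistic model relating pretreatment characteristics to virological failure, the sketch below uses synthetic data and statsmodels; the covariates shown (age, sex, cirrhosis) and the dichotomisation of HCV RNA at the median are stand-ins and do not reproduce the authors' model specification.

```python
# Sketch of a logistic model for virological failure vs SVR12; all data are
# synthetic stand-ins, not the Rwandan cohort.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 590
d = pd.DataFrame({
    "failure": rng.binomial(1, 0.08, n),          # 1 = virological failure
    "log10_hcv_rna": rng.normal(5.8, 0.9, n),     # pretreatment viral load
    "age": rng.normal(55, 14, n),
    "female": rng.integers(0, 2, n),
    "cirrhosis": rng.binomial(1, 0.2, n),
})
# Dichotomise pretreatment HCV RNA at the sample median
d["high_rna"] = (d["log10_hcv_rna"] > d["log10_hcv_rna"].median()).astype(int)

model = smf.logit("failure ~ high_rna + age + female + cirrhosis", data=d).fit()
print(np.exp(model.params))        # odds ratios for virological failure
print(np.exp(model.conf_int()))    # 95% confidence intervals
```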


Author(s):  
Dominik Steubl ◽  
Anna Vogel ◽  
Stefan Hettwer ◽  
Susanne Tholen ◽  
Peter B. Luppa ◽  
...  

Abstract C-terminal agrin fragment (CAF), a cleavage product of agrin, has previously been correlated with kidney function in renal transplant patients. This article studies the predictive value of CAF for long-term outcomes in renal transplant recipients. In this observational cohort study, serum CAF, creatinine and blood urea nitrogen (BUN) concentrations and eGFR (CKD-EPI) were assessed 1–3 months after transplantation in 105 patients undergoing kidney transplantation. Cox regression models were used to analyse the predictive value of all parameters with respect to all-cause mortality (ACM), graft loss (GL), and doubling of creatinine/proteinuria at the end of follow-up. Median follow-up time was 3.1 years. The mean concentrations were 191.9±152.4 pM for CAF, 176±96.8 μmol/L for creatinine, 12.6±6.2 mmol/L for BUN and 44.9±21.2 mL/min for eGFR (CKD-EPI), respectively. In univariate analysis, CAF and BUN concentrations predicted ACM (CAF: HR=1.003, 1.1-fold risk, p=0.043; BUN: HR=1.037, 1.3-fold risk, p=0.006). Concerning GL, CAF (HR=1.006, 3.1-fold risk, p<0.001), creatinine (HR=2.396, 2.6-fold risk, p<0.001), BUN (HR=1.048, 1.7-fold risk, p=0.001) and eGFR (CKD-EPI) (HR=0.941, 0.45-fold risk reduction, p=0.006) showed statistically significant associations. CAF was the only parameter significantly associated with doubling of proteinuria (HR=1.005, 1.7-fold risk, p<0.001). In multiple regression analysis (CAF only), the association remained significant for GL and doubling of proteinuria but not for ACM. Early postoperative serum CAF appears to be a useful tool for assessing long-term outcomes in renal transplant recipients. Most importantly, it represents a promising predictor for the development of proteinuria.
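The abstract reports both per-unit hazard ratios (e.g. HR = 1.003 per pM of CAF) and fold-risk figures (e.g. 1.1-fold risk). In a Cox model, the hazard ratio for a Δ-unit increase in a continuous predictor is the per-unit HR raised to the power Δ; the increment used in the snippet below is an assumption chosen only to show the arithmetic, not the increment the authors used.

```python
# How a per-unit Cox hazard ratio scales to a larger increment of the predictor:
# HR for a delta-unit increase = exp(beta * delta) = (per-unit HR) ** delta.
# The 30 pM increment is an assumption for illustration only.
per_unit_hr_caf = 1.003            # per pM of C-terminal agrin fragment (ACM model)
delta = 30                         # hypothetical clinically relevant increase, pM
print(per_unit_hr_caf ** delta)    # ~1.09, i.e. roughly a 1.1-fold hazard
```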


2021 ◽  
Author(s):  
Nimrod Maimon ◽  
Michal Maimon ◽  
Lior Hassan ◽  
Itamar Grotto ◽  
Yasmeen Abu-Fraiha ◽  
...  

Abstract Background Outbreaks of coronavirus disease 2019 (COVID-19) in long-term care facilities (LTCFs) have resulted mainly from disease transmission by asymptomatic health care workers (HCWs). It is not known whether routine COVID-19 screening tests of HCWs would reduce mortality among LTCF residents. Since mid-July 2020, the Israeli national LTCF defense program, "Senior Shield", has used weekly COVID-19 PCR tests on all LTCF employees. Methods A nationwide, government-funded screening program tested all LTCF personnel for four months during the second COVID-19 wave. We evaluated differences between the two waves in the national LTCF system with regard to hospitalizations and mortality, and estimated the national health system outcomes expected in the absence of this weekly screening plan. Results COVID-19 tests were taken weekly in all 1,107 LTCFs, which include 62,159 HCWs and 100,046 residents. A median of 55,282 (range – 16,249) tests were performed each week. Turnaround time from sampling to result was less than 24 hours for 95% of tests. Compared to the first wave, in which 45.3% of national mortality was attributed to LTCFs, the second wave saw a 33.8% reduction in this mortality ratio. Estimation of national health system outcomes during the second wave showed that activation of the screening program reduced hospital load by 35% and prevented 30% of national mortality from COVID-19. Conclusions Routine weekly COVID-19 PCR testing of all LTCF employees may reduce national hospitalizations and mortality.

