This study aimed to determine the association between serum D-dimer levels and the risk of acute kidney injury (AKI) in patients undergoing living donor liver transplantation (LDLT). Clinical data from 675 patients who underwent LDLT were retrospectively analyzed. The exclusion criteria were a history of kidney dysfunction, emergency cases, and missing data. The final study population of 617 patients was divided into normal and high D-dimer groups (cutoff: 0.5 mg/L). After LDLT, 145 patients (23.5%) developed AKI. In the multivariate analysis, a high D-dimer level (>0.5 mg/L) was an independent predictor of postoperative AKI, alongside diabetes mellitus, platelet count, and hourly urine output. The incidence of AKI was significantly higher in the high D-dimer group than in the normal D-dimer group (odds ratio [OR], 2.792; 95% confidence interval [CI], 1.227–6.353). Patients with high D-dimer levels also exhibited a higher incidence of early allograft dysfunction, a longer intensive care unit stay, and a higher mortality rate. These results could improve risk stratification for postoperative AKI by encouraging the determination of preoperative D-dimer levels in patients undergoing LDLT.
We aimed to develop and validate a nomogram model for predicting chronic kidney disease (CKD) after orthotopic liver transplantation (OLT).
Retrospective data from 399 patients who underwent transplantation and were followed at our centre were collected. Patients were randomly assigned to a training set (n = 293) and a validation set (n = 106). Multivariable Cox regression analysis was performed in the training set to identify predictors of CKD, and a nomogram model was developed and validated based on the regression results. Recipient renal function was monitored, and long-term survival prognosis was assessed.
The incidence of CKD at 5 years after OLT was 25.6%. Cox regression analysis identified several predictors of post-OLT CKD: recipient age at surgery (HR 1.036, 95% CI 1.006–1.068; p = 0.018), female sex (HR 2.867, 95% CI 1.709–4.810; p < 0.001), preoperative hypertension (HR 1.670, 95% CI 0.962–2.898; p = 0.068), preoperative eGFR (HR 0.996, 95% CI 0.991–1.001; p = 0.143), uric acid at 3 months (HR 1.002, 95% CI 1.001–1.004; p = 0.028), haemoglobin at 3 months (HR 0.970, 95% CI 0.956–0.983; p < 0.001), and average cyclosporine A concentration at 3 months (HR 1.002, 95% CI 1.001–1.003; p < 0.001). Based on these parameters, a nomogram model for predicting CKD after OLT was constructed and validated. The C-indices were 0.75 and 0.80 in the training and validation sets, respectively. The calibration curves showed that the CKD probabilities predicted by the nomogram agreed with the observed probabilities at 1, 3, and 5 years after OLT (p > 0.05). Renal function declined slowly year by year, with significant differences between patients stratified by these predictors. Kaplan-Meier analysis showed that recipient survival decreased significantly with the progression of renal dysfunction.
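A nomogram is a graphical rescaling of a Cox model's linear predictor, where each covariate contributes beta_i * x_i with beta_i = ln(HR_i). The sketch below illustrates this calculation using the hazard ratios reported above; the variable names, units, and the example patient are illustrative assumptions, not data from the study.

```python
import math

# Hazard ratios reported in the abstract; the Cox coefficient for
# each covariate is beta = ln(HR). Units and encodings below are
# assumptions for illustration only.
HAZARD_RATIOS = {
    "age_years": 1.036,
    "female_sex": 2.867,          # 1 if female, 0 if male
    "preop_hypertension": 1.670,  # 1 if present, 0 otherwise
    "preop_egfr": 0.996,
    "uric_acid_3mo": 1.002,
    "haemoglobin_3mo": 0.970,
    "cyclosporine_a_3mo": 1.002,  # average concentration at 3 months
}

def linear_predictor(patient: dict) -> float:
    """Cox linear predictor: sum over covariates of ln(HR_i) * x_i."""
    return sum(math.log(hr) * patient[k] for k, hr in HAZARD_RATIOS.items())

# Hypothetical patient, used only to show the calculation.
example = {
    "age_years": 55,
    "female_sex": 1,
    "preop_hypertension": 0,
    "preop_egfr": 90.0,
    "uric_acid_3mo": 360.0,
    "haemoglobin_3mo": 120.0,
    "cyclosporine_a_3mo": 150.0,
}

lp = linear_predictor(example)
print(f"linear predictor: {lp:.3f}")
```

A higher linear predictor corresponds to a higher predicted CKD hazard; the published nomogram simply maps each beta_i * x_i contribution onto a points axis so the sum can be read off graphically.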
Given its good predictive ability, the nomogram may be a simple and reliable tool for identifying patients at high risk of CKD and poor long-term prognosis after OLT.
Recurrent or de novo non-alcoholic fatty liver disease (NAFLD)/non-alcoholic steatohepatitis (NASH) following liver transplantation (LT) is a frequent event that has been increasingly recognized over the last decade, but the influence of recurrent NASH on graft and patient outcomes is not yet established. Given the long-term survival of liver-transplanted patients and the long-term complications with associated morbidity and mortality, it is important to define and minimize risk factors for recurrent NAFLD/NASH. Metabolic syndrome, obesity, dyslipidemia, and diabetes mellitus are lifestyle-related risk factors that can potentially be modified by various interventions, thereby decreasing the risk of recurrent NAFLD/NASH. On the other hand, genetic factors such as recipient and/or donor PNPLA3, TM6SF2, GCKR, MBOAT7, or ADIPOQ gene polymorphisms have proved to be risk factors for recurrent NASH. Personalized interventions targeting the metabolic disorders that occur after LT, together with pre-LT genetic screening of donors and recipients, should be performed to enable diagnosis and treatment as early as possible.
Hepatocellular carcinoma (HCC) is the third highest cause of cancer-related mortality, and liver transplantation (LT) is the ideal treatment for this disease. The Milan criteria provided the opportunity for HCC patients to undergo LT with favorable outcomes and have been the international gold standard and benchmark. With the accumulation of data, however, the Milan criteria are now regarded as too restrictive. Since their implementation, many extended criteria have been proposed that relax the limits on morphological tumor burden and incorporate the tumor's biological behavior using surrogate markers. The paradigm of patient selection for LT appears to be shifting from morphologic criteria to a combination of biologic, histologic, and morphologic criteria, and toward the establishment of models for predicting post-transplant recurrence and outcomes. This review article aims to characterize the various patient selection criteria for LT, with reference to several surrogate markers of the biological behavior of HCC (e.g., AFP, PIVKA-II, NLR, 18F-FDG PET/CT, liquid biopsy) and the response to locoregional therapy. Furthermore, the allocation rules in each country and the present evidence on the role of down-staging large tumors are addressed.