Effect of a Mobile Web App on Kidney Transplant Candidates' Knowledge About Increased Risk Donor Kidneys

2017, Vol. 101 (6), pp. 1167-1176
Author(s): Elisa J. Gordon, Min-Woong Sohn, Chih-Hung Chang, Gwen McNatt, Karina Vera, ...
2014, Vol. 33 (3), pp. 45
Author(s): David Ward, James Hahn, Lori Mestre

This article presents a case study exploring the use of a student Coding Camp as a bottom-up mobile design process to generate library mobile apps. A code camp sources student programmer talent and ideas for designing software services and features. This case study reviews the process, outcomes, and next steps in mobile web app coding camps. It concludes by offering implications for service design beyond the local camp presented in this study. By understanding how patrons expect to integrate library services and resources into their use of mobile devices, librarians can better design the user experience for this environment.


Antioxidants, 2021, Vol. 10 (7), pp. 1102
Author(s): Angelica Rodriguez-Niño, Diego O. Pastene, Adrian Post, M. Yusof Said, Antonio W. Gomes-Neto, ...

Carnosine affords protection against oxidative and carbonyl stress, yet high concentrations of the carnosinase-1 enzyme may limit this. We recently reported that high urinary carnosinase-1 is associated with kidney function decline and albuminuria in patients with chronic kidney disease. We prospectively investigated whether urinary carnosinase-1 is associated with a high risk for development of late graft failure in kidney transplant recipients (KTRs). Carnosine and carnosinase-1 were measured in 24-h urine in a longitudinal cohort of 703 stable KTRs and 257 healthy controls. Cox regression was used to analyze the prospective data. Urinary carnosine excretions were significantly decreased in KTRs (26.5 [IQR 21.4–33.3] µmol/24 h versus 34.8 [IQR 25.6–46.8] µmol/24 h; p < 0.001). In KTRs, high urinary carnosinase-1 concentrations were associated with an increased risk of undetectable urinary carnosine (OR 1.24, 95% CI [1.06–1.45]; p = 0.007). During a median follow-up of 5.3 [4.5–6.0] years, 84 (12%) KTRs developed graft failure. In Cox regression analyses, high urinary carnosinase-1 excretions were associated with an increased risk of graft failure (HR 1.73, 95% CI [1.44–2.08]; p < 0.001) independent of potential confounders. Since urinary carnosine is depleted and urinary carnosinase-1 imparts a higher risk for graft failure in KTRs, future studies determining the potential of carnosine supplementation in these patients are warranted.
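As a rough illustration of the Cox analysis described above, the sketch below fits a multivariable Cox proportional hazards model with the Python `lifelines` package. The data frame, column names (`follow_up_years`, `graft_failure`, `log2_urinary_cnd1`, `age`, `egfr`), and synthetic values are hypothetical placeholders, not the study's data or code.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort mirroring the abstract's setup: follow-up time, a graft-failure
# indicator (~12% events), the exposure (log2 urinary carnosinase-1 excretion), and
# two illustrative confounders.
rng = np.random.default_rng(42)
n = 703
df = pd.DataFrame({
    "follow_up_years": rng.uniform(0.5, 6.0, n),
    "graft_failure": rng.binomial(1, 0.12, n),
    "log2_urinary_cnd1": rng.normal(0.0, 1.0, n),
    "age": rng.normal(53, 13, n),
    "egfr": rng.normal(50, 18, n),
})

# Fit the multivariable Cox model; exp(coef) for log2_urinary_cnd1 is the hazard ratio
# per doubling of urinary carnosinase-1, adjusted for the other covariates.
cph = CoxPHFitter()
cph.fit(df, duration_col="follow_up_years", event_col="graft_failure")
cph.print_summary()
```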


2021, Vol. 11 (1)
Author(s): Juhan Lee, Eun Jin Kim, Jae Geun Lee, Beom Seok Kim, Kyu Ha Huh, ...

Serum bilirubin, a potent endogenous antioxidant, has been associated with decreased risks of cardiovascular disease, diabetes, and kidney disease. However, the effects of serum bilirubin on kidney transplant outcomes remain undetermined. We analyzed 1628 patients who underwent kidney transplantations between 2003 and 2017. Patients were grouped into sex-specific quartiles according to mean serum bilirubin levels, 3–12 months post-transplantation. Median bilirubin levels were 0.66 mg/dL in males and 0.60 mg/dL in females. The intra-individual variability of serum bilirubin levels was low (9%). Serum bilirubin levels were inversely associated with graft loss, death-censored graft failure, and all-cause mortality, independent of renal function, donor status, and transplant characteristics. Multivariable analysis revealed that the lowest serum bilirubin quartile was associated with increased risk of graft loss (HR 2.64, 95% CI 1.67–4.18, P < 0.001), death-censored graft failure (HR 2.97, 95% CI 1.63–5.42, P < 0.001), and all-cause mortality (HR 2.07, 95% CI 1.01–4.22, P = 0.046). Patients with lower serum bilirubin were also at greater risk of rejection and exhibited consistently lower glomerular filtration rates than those with higher serum bilirubin. Serum bilirubin levels were significantly associated with transplantation outcomes, suggesting that bilirubin could represent a therapeutic target for improving long-term transplant outcomes.
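The sex-specific quartile grouping described above can be sketched with pandas; a minimal example assuming hypothetical column names (`sex`, `mean_bilirubin`) and invented values, not the study's dataset:

```python
import numpy as np
import pandas as pd

# Hypothetical cohort: mean serum bilirubin (mg/dL) over months 3-12 post-transplant.
rng = np.random.default_rng(7)
n = 1628
df = pd.DataFrame({
    "sex": rng.choice(["male", "female"], size=n),
    "mean_bilirubin": np.round(rng.lognormal(mean=np.log(0.63), sigma=0.3, size=n), 2),
})

# Sex-specific quartiles: rank each patient within their own sex group, so quartile 1
# is the lowest-bilirubin quarter of that sex (the group reported to carry higher risk).
df["bilirubin_quartile"] = (
    df.groupby("sex")["mean_bilirubin"]
      .transform(lambda s: pd.qcut(s, 4, labels=False) + 1)
)

print(pd.crosstab(df["sex"], df["bilirubin_quartile"]))
print(df.groupby(["sex", "bilirubin_quartile"])["mean_bilirubin"].median())
```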


2021, Vol. 10 (13), pp. 2854
Author(s): Fernanda Rodrigues, J. Swarte, Rianne Douwes, Tim Knobbe, Camilo Sotomayor, ...

Background: Diarrhea is common among kidney transplant recipients (KTR). Exhaled hydrogen (H2) is a surrogate marker of small bowel dysbiosis, which may drive diarrhea. We studied the relationship between exhaled H2 and diarrhea in KTR, and explored potential clinical and dietary determinants. Methods: Clinical, laboratory, and dietary data were analyzed from 424 KTR participating in the TransplantLines Biobank and Cohort Study (NCT03272841). Fasting exhaled H2 concentration was measured using a Quintron model DP gas chromatograph. Diarrhea was defined as fast transit time (types 6 and 7 on the Bristol Stool Form Scale, BSFS) with three or more episodes per day. We studied the association between exhaled H2 and diarrhea with multivariable logistic regression analysis, and explored potential determinants using linear regression. Results: KTR (55.4 ± 13.2 years, 60.8% male, mean eGFR 49.8 ± 19.1 mL/min/1.73 m2) had a median exhaled H2 of 11 (5.0–25.0) ppm. Signs of small intestinal bacterial overgrowth (exhaled H2 ≥ 20 ppm) were present in 31.6% of the KTR, and 33.0% had diarrhea. Exhaled H2 was associated with an increased risk of diarrhea (odds ratio 1.51, 95% confidence interval 1.07–2.14 per log2 ppm, p = 0.02). Polysaccharide intake was independently associated with higher H2 (std. β 0.24, p = 0.01), and a trend for an association with proton-pump inhibitor use was observed (std. β 0.16, p = 0.05). Conclusion: Higher exhaled H2 is associated with an increased risk of diarrhea in KTR. Our findings set the stage for further studies investigating the relationship between dietary factors, small bowel dysbiosis, and diarrhea after kidney transplantation.
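The odds ratio reported "per log2 ppm" implies that exhaled H2 entered the logistic model on a log2 scale, so the OR describes the change in the odds of diarrhea per doubling of exhaled H2. Below is a minimal statsmodels sketch of such a model, using invented data and hypothetical column names (`h2_ppm`, `diarrhea`) rather than the TransplantLines data:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Invented data: fasting exhaled H2 (ppm, roughly log-normal around the reported
# median of 11 ppm) and a binary diarrhea indicator with roughly one-third prevalence.
rng = np.random.default_rng(1)
n = 424
h2_ppm = rng.lognormal(mean=np.log(11), sigma=0.9, size=n)
p_diarrhea = 1 / (1 + np.exp(-(-1.9 + 0.35 * np.log2(h2_ppm))))
diarrhea = rng.binomial(1, p_diarrhea)
df = pd.DataFrame({"h2_ppm": h2_ppm, "diarrhea": diarrhea})

# Logistic regression on log2-transformed H2: exp(coef) is the odds ratio per
# doubling of exhaled H2 (the "per log2 ppm" OR quoted in the abstract).
model = smf.logit("diarrhea ~ np.log2(h2_ppm)", data=df).fit(disp=False)
print(np.exp(model.params))          # odds ratios
print(np.exp(model.conf_int()))      # 95% confidence intervals
```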


2018
Author(s): Chin Hai Teo, Chirk Jenn Ng, Sin Kuang Lo, Chip Dong Lim, Alan White

BACKGROUND Globally, the uptake of health screening is suboptimal, especially in men and those of younger age. In view of increasing internet access and mobile phone ownership, ScreenMen, a mobile Web app, was developed to improve health screening uptake in men. OBJECTIVE This study aimed to evaluate the utility and usability of ScreenMen. METHODS This study used both qualitative and quantitative methods. Healthy men working in a banking institution were recruited to participate in this study. They were purposively sampled according to job position, age, education level, and screening status. Men were asked to use ScreenMen independently while their on-screen activity was recorded. Once they had completed this, retrospective think-aloud with playback was conducted to obtain their feedback, and they were asked to complete the System Usability Scale (SUS). Intention to undergo screening pre- and postintervention was also measured. Qualitative data were analyzed using a framework approach followed by thematic analysis. For quantitative data, the mean SUS score was calculated, and the change in intention to undergo screening was analyzed using the McNemar test. RESULTS In total, 24 men participated in this study. On the basis of the qualitative data, men found ScreenMen useful as they could learn more about their health risks and screening. They found ScreenMen convenient to use, which might trigger men to undergo screening. In terms of usability, men thought that ScreenMen was user-friendly and easy to understand. The key revision on utility was the addition of a reminder function, whereas for usability, the revisions were in terms of attracting and gaining users’ trust, improving learnability, and making ScreenMen usable to all types of users. To attract men to use it, ScreenMen was introduced to users in terms of improving health instead of going for screening. Another important revision was emphasizing the screening tests the users do not need, instead of just informing them about the screening tests they need. A Quick Assessment Mode was also added for users with limited attention span. The quantitative data showed that 8 out of 23 men (35%) planned to attend screening earlier than intended after using ScreenMen. Furthermore, 4 out of 12 (33%) men who were in the precontemplation stage changed to either the contemplation or preparation stage after using ScreenMen (P=.13). In terms of usability, the mean SUS score of 76.4 (SD 7.72) indicated that ScreenMen had good usability. CONCLUSIONS This study showed that ScreenMen was acceptable to men in terms of its utility and usability. The preliminary data suggest that ScreenMen might increase men’s intention to undergo screening. This paper also presents key lessons learned from the beta testing, which are useful for public health experts and researchers when developing a user-centered mobile Web app.
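The usability result above rests on the standard System Usability Scale scoring rule (odd items contribute the rating minus 1, even items contribute 5 minus the rating, and the sum is multiplied by 2.5). The sketch below applies that rule to invented participant ratings for illustration; it is not the study's analysis code.

```python
import numpy as np

def sus_score(responses):
    """Standard SUS score (0-100) for one respondent's ten 1-5 Likert ratings."""
    r = np.asarray(responses)
    odd_items = r[0::2] - 1      # items 1, 3, 5, 7, 9 (positively worded)
    even_items = 5 - r[1::2]     # items 2, 4, 6, 8, 10 (negatively worded)
    return (odd_items.sum() + even_items.sum()) * 2.5

# Invented ratings for two participants (the study reports a mean of 76.4, SD 7.72).
participants = [
    [4, 2, 4, 1, 5, 2, 4, 2, 4, 2],
    [5, 1, 4, 2, 4, 1, 5, 2, 4, 1],
]
scores = [sus_score(p) for p in participants]
print(scores, "mean:", np.mean(scores))
```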


2021, Vol. 5 (11), pp. 1009-1013
Author(s): Eriawan Agung Nugroho, Erwin Wibowo, Prathita Amanda Aryani

Background: Chronic kidney disease (CKD) is a rising health concern worldwide, especially in Indonesia. The treatment of choice for end-stage renal disease is kidney transplantation [1]. Numerous studies have shown that prolonged total ischemic time may cause hypoxia of the graft tissue and increase the risk of ischemia-reperfusion injury (IRI) and delayed graft function (DGF) [2]. A higher body mass index in kidney transplant recipients may prolong the duration of the procedure, as well as the total ischemic time. This study aimed to determine the correlation between prolonged total ischemic time and body mass index. Method: This was an observational, cross-sectional analysis at Kariadi General Hospital Semarang involving patients who underwent kidney transplantation from January 2016 to December 2019. Total ischemic time was recorded intraoperatively, and body mass index data were obtained from medical records. Data were analyzed with SPSS 23.0, and the Spearman correlation was used for hypothesis testing. Result: This study included 25 kidney transplant recipients. The mean total ischemic time was 43.27 ± 6.63 minutes. There was a significant positive correlation between prolonged ischemic time and body mass index (r = 0.506; p = 0.010). Conclusion: Prolonged total ischemic time was positively correlated with increased body mass index, and this correlation was statistically significant.
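The hypothesis test named above is Spearman's rank correlation, available in SciPy. A minimal sketch on invented values for 25 recipients (the variable names and numbers are placeholders, not the study's data):

```python
import numpy as np
from scipy import stats

# Invented values for 25 recipients: body mass index (kg/m2) and total ischemic time (min).
rng = np.random.default_rng(2)
bmi = rng.normal(25, 4, size=25)
ischemic_time = 43 + 0.8 * (bmi - 25) + rng.normal(0, 5, size=25)

# Spearman's rank correlation coefficient and two-sided p-value.
rho, p_value = stats.spearmanr(bmi, ischemic_time)
print(f"r = {rho:.3f}, p = {p_value:.3f}")
```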


2017, Vol. 46 (4), pp. 343-354
Author(s): Ngan N. Lam, Amit X. Garg, Greg A. Knoll, S. Joseph Kim, Krista L. Lentine, ...

Background: The implications of venous thromboembolism (VTE) for morbidity and mortality in kidney transplant recipients are not well described. Methods: We conducted a retrospective study using linked healthcare databases in Ontario, Canada, to determine the risk and complications of VTE in kidney transplant recipients from 2003 to 2013. We compared the incidence rate of VTE in recipients (n = 4,343) and a matched (1:4) sample of the general population (n = 17,372). For recipients with evidence of a VTE posttransplant, we compared adverse clinical outcomes (death, graft loss) to matched (1:2) recipients without evidence of a VTE posttransplant. Results: During a median follow-up of 5.2 years, 388 (8.9%) recipients developed a VTE compared to 254 (1.5%) in the matched general population (16.3 vs. 2.4 events per 1,000 person-years; hazard ratio [HR] 7.1, 95% CI 6.0-8.4; p < 0.0001). Recipients who experienced a posttransplant VTE had a higher risk of death (28.5 vs. 11.2%; HR 4.1, 95% CI 2.9-5.8; p < 0.0001) and death-censored graft loss (13.1 vs. 7.5%; HR 2.3, 95% CI 1.4-3.6; p = 0.0006) compared to matched recipients who did not experience a posttransplant VTE. Conclusions: Kidney transplant recipients have a sevenfold higher risk of VTE compared to the general population, with VTE conferring an increased risk of death and graft loss.
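As a quick arithmetic check on the reported association, the crude incidence rate ratio implied by the two rates can be computed directly; this ignores matching and censoring, so it only approximates the Cox hazard ratio of 7.1 reported above.

```python
# Crude incidence rates from the abstract (events per 1,000 person-years).
recipient_rate = 16.3   # kidney transplant recipients
general_rate = 2.4      # matched general population

# Crude incidence rate ratio; the matched Cox model reports HR 7.1 (95% CI 6.0-8.4).
rate_ratio = recipient_rate / general_rate
print(f"crude rate ratio = {rate_ratio:.1f}")   # ~6.8, consistent with a sevenfold higher risk
```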


Author(s): Sai Sindhu Thangaraj, Helle Charlotte Thiesson, Per Svenningsen, Jane Stubbe, Yaseelan Palarasah, ...

Kidney transplantation is associated with an increased risk of cardiovascular morbidity. Interleukin-17A (IL-17A) mediates kidney injury. Aldosterone promotes T-helper-17 (Th-17) lymphocyte differentiation and IL-17A production through the mineralocorticoid receptor (MR). In this exploratory, post-hoc substudy, it was hypothesized that 1-year intervention with the MR antagonist spironolactone lowers IL-17A and related cytokines and reduces epithelial injury in kidney transplant recipients. Plasma and urine samples were obtained from kidney transplant recipients enrolled in a double-blind randomized clinical trial testing spironolactone (n=39) versus placebo (n=41). Plasma concentrations of the cytokines IFN-γ, IL-17A, TNF-α, IL-6, IL-1β, and IL-10 were determined before and after 1 year of treatment. Urinary calbindin, clusterin, KIM-1, osteoactivin, TFF3, and VEGF/creatinine ratios were analyzed. Blood pressure and plasma aldosterone concentration at inclusion were not related to plasma cytokines or injury markers. None of the cytokines changed in plasma after spironolactone intervention. Plasma IL-17A increased in the placebo group. Spironolactone induced an increase in plasma K+ (0.4 ± 0.4 mmol/L). This increase did not correlate with changes in plasma IL-17A or in urinary calbindin and TFF3. Ongoing treatment at inclusion with angiotensin-converting-enzyme inhibitors and/or angiotensin II receptor blockers was not associated with changed levels of IL-17A and injury markers and had no effect on the response to spironolactone. Urinary calbindin and TFF3 decreased in the spironolactone group, with no difference in between-group analyses. In conclusion, irrespective of ongoing ANGII inhibition, spironolactone has no effect on plasma IL-17A and related cytokines or urinary injury markers in kidney transplant recipients.


2020, Vol. 31 (6), pp. 1150-1156
Author(s):  

Background: The novel SARS-CoV-2 virus has caused a global pandemic of coronavirus disease 2019 (COVID-19). Although immunosuppressed individuals are thought to be at an increased risk of severe disease, little is known about their clinical presentation, disease course, or outcomes. Methods: We report 15 kidney transplant recipients from the Columbia University kidney transplant program who required hospitalization for confirmed COVID-19, and describe their management, clinical course, and outcomes. Results: Patients presented most often with a fever (87%) and/or cough (67%). Initial chest x-ray most commonly showed bilateral infiltrates, but 33% had no acute radiographic findings. Patients were managed with immunosuppression reduction and the addition of hydroxychloroquine and azithromycin. Although 27% of our patients needed mechanical ventilation, over half were discharged home by the end of follow-up. Conclusions: Kidney transplant recipients with COVID-19 have presentations that are similar to those of the general population. Our current treatment protocol appears to be associated with favorable outcomes, but longer follow-up of a larger cohort of patients is needed.


2020, Vol. 10 (3), pp. 139-146
Author(s): Ramy M. Hanna, Farid Abd-El-Malak, Ammar Alnaser, Rumi Cader, Julie M. Yabu

Kidney transplant recipients require lifelong immunosuppression to prevent organ rejection. This intervention, however, decreases cellular immunity and, in turn, increases the risk of developing herpes zoster (HZ) from reactivation of latent varicella zoster virus. HZ commonly presents as a painful rash in a dermatomal distribution, followed by post-herpetic neuralgia. In immunosuppressed individuals, the presentation can be atypical and vary in severity depending on the degree of immunosuppression and the host immune response. We present the clinical course of 3 kidney transplant recipients who developed HZ at different times post-transplant and with varying clinical manifestations. The balance between maintaining immunosuppression and preventing or subsequently treating disseminated disease is discussed.

