Juvenile Idiopathic Arthritis Associated Uveitis

Children ◽  
2021 ◽  
Vol 8 (8) ◽  
pp. 646
Author(s):  
Emil Carlsson ◽  
Michael W. Beresford ◽  
Athimalaipet V. Ramanan ◽  
Andrew D. Dick ◽  
Christian M. Hedrich

Juvenile idiopathic arthritis (JIA) is the most common childhood rheumatic disease. The development of associated uveitis represents a significant risk for serious complications, including permanent loss of vision. Early initiation of treatment is important for controlling JIA-uveitis, but the disease can present asymptomatically, making frequent screening necessary for patients at risk. As our understanding of pathogenic drivers is currently incomplete, it is difficult to assess which JIA patients are at risk of developing uveitis. The identification of specific risk factors for JIA-associated uveitis is an important field of research, and in this review we highlight the genomic, transcriptomic, and proteomic factors identified as potential uveitis risk factors in JIA and discuss therapeutic strategies.

2019 ◽  
Vol 112 (7) ◽  
pp. 720-727 ◽  
Author(s):  
Lucas K Vitzthum ◽  
Paul Riviere ◽  
Paige Sheridan ◽  
Vinit Nalawade ◽  
Rishi Deka ◽  
...  

Abstract Background Although opioids play a critical role in the management of cancer pain, the ongoing opioid epidemic has raised concerns regarding their persistent use and abuse. We lack data-driven tools in oncology to understand the risk of adverse opioid-related outcomes. This project seeks to identify clinical risk factors and create a risk score to help identify patients at risk of persistent opioid use and abuse. Methods Within a cohort of 106 732 military veteran cancer survivors diagnosed between 2000 and 2015, we determined rates of persistent posttreatment opioid use, diagnoses of opioid abuse or dependence, and admissions for opioid toxicity. A multivariable logistic regression model was used to identify patient, cancer, and treatment risk factors associated with adverse opioid-related outcomes. Predictive risk models were developed and validated using a least absolute shrinkage and selection operator regression technique. Results The rate of persistent opioid use in cancer survivors was 8.3% (95% CI = 8.1% to 8.4%); the rate of opioid abuse or dependence was 2.9% (95% CI = 2.8% to 3.0%); and the rate of opioid-related admissions was 2.1% (95% CI = 2.0% to 2.2%). On multivariable analysis, several patient, demographic, and cancer and treatment factors were associated with risk of persistent opioid use. Predictive models showed a high level of discrimination when identifying individuals at risk of adverse opioid-related outcomes including persistent opioid use (area under the curve [AUC] = 0.85), future diagnoses of opioid abuse or dependence (AUC = 0.87), and admission for opioid abuse or toxicity (AUC = 0.78). Conclusion This study demonstrates the potential to predict adverse opioid-related outcomes among cancer survivors. With further validation, personalized risk-stratification approaches could guide management when prescribing opioids in cancer patients.
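The risk-scoring approach described above can be illustrated with a minimal sketch: once a (LASSO-selected) logistic model is fitted, a patient's predictors map to a predicted probability of persistent opioid use. The coefficients and factor names below are hypothetical placeholders, not the study's estimates.

```python
import math

# Hypothetical coefficients for a fitted logistic risk model of persistent
# post-treatment opioid use; values are illustrative, not the study's.
COEFFS = {
    "intercept": -3.0,
    "prior_opioid_use": 1.2,
    "depression": 0.6,
    "tobacco_use": 0.5,
    "chemotherapy": 0.4,
}

def risk_of_persistent_use(patient: dict) -> float:
    """Map binary risk factors to a probability via the logistic function."""
    z = COEFFS["intercept"]
    for factor, beta in COEFFS.items():
        if factor != "intercept" and patient.get(factor, 0):
            z += beta
    return 1.0 / (1.0 + math.exp(-z))

# Comparing a multi-factor patient against a baseline patient:
high_risk = risk_of_persistent_use(
    {"prior_opioid_use": 1, "depression": 1, "tobacco_use": 1}
)
low_risk = risk_of_persistent_use({})
```

In the study itself, the LASSO step decides which coefficients survive (shrinking the rest to zero); the scoring step afterwards is exactly this weighted sum passed through the logistic function.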


2011 ◽  
Vol 2011 ◽  
pp. 1-7 ◽  
Author(s):  
Debbie L. Cohen ◽  
Raymond R. Townsend

Hypertension, diabetes, and proteinuria are well-recognized risk factors for progressive kidney function loss. However, despite excellent antihypertensive and antidiabetic drug therapies, which also often lower urinary protein excretion, there remains a significant reservoir of patients with chronic kidney disease who are at high risk for progression to end-stage kidney disease. This has led to the search for less traditional cardiovascular risk factors that will help stratify patients at risk for more rapid kidney disease progression. Among these are noninvasive estimates of vascular structure and function. Arterial stiffness, manifested by the pulse wave velocity in the aorta, has been established in a number of studies as a significant risk factor for kidney disease progression and cardiovascular endpoints. Much less well studied in chronic kidney disease are measures of central arterial pressures. In this paper we cover the physiology behind the generation of the central pulse wave contour and the studies available using these approaches and conclude with some speculations on the rationale for why measurements of central pressure may be informative for the study of chronic kidney disease progression.


BJPsych Open ◽  
2021 ◽  
Vol 7 (6) ◽  
Author(s):  
Rachael W. Taylor ◽  
Rebecca Strawbridge ◽  
Allan H. Young ◽  
Roland Zahn ◽  
Anthony J. Cleare

Background Treatment-resistant depression (TRD) is classically defined according to the number of suboptimal antidepressant responses experienced, but multidimensional assessments of TRD are emerging and may confer some advantages. Patient characteristics have been identified as risk factors for TRD but may also be associated with TRD severity. The identification of individuals at risk of severe TRD would support appropriate prioritisation of intensive and specialist treatments. Aims To determine whether TRD risk factors are associated with TRD severity when assessed multidimensionally using the Maudsley Staging Method (MSM), and univariately as the number of antidepressant non-responses, across three cohorts of individuals with depression. Method Three cohorts of individuals without significant TRD, with established TRD and with severe TRD, were assessed (n = 528). Preselected characteristics were included in linear regressions to determine their association with each outcome. Results Participants with more severe TRD according to the MSM had a lower age at onset, fewer depressive episodes and more physical comorbidities. These associations were not consistent across cohorts. The number of episodes was associated with the number of antidepressant treatment failures, but the direction of association varied across the cohorts studied. Conclusions Several risk factors for TRD were associated with the severity of resistance according to the MSM. Fewer were associated with the raw number of inadequate antidepressant responses. Multidimensional definitions may be more useful for identifying patients at risk of severe TRD. The inconsistency of associations across cohorts has potential implications for the characterisation of TRD.


Nutrients ◽  
2020 ◽  
Vol 12 (12) ◽  
pp. 3745
Author(s):  
Pamela Klassen ◽  
Vickie Baracos ◽  
Leah Gramlich ◽  
Gregg Nelson ◽  
Vera Mazurak ◽  
...  

Pre-operative nutrition screening is recommended to identify cancer patients at risk of malnutrition, which is associated with poor outcomes. Low muscle mass (sarcopenia) and lipid infiltration of muscle cells (myosteatosis) are similarly associated with poor outcomes but are not routinely screened for. We investigated the prevalence of sarcopenia and myosteatosis across the nutrition screening triage categories of the Patient-Generated Subjective Global Assessment Short Form (PG-SGASF) in a pre-operative colorectal cancer (CRC) cohort. Data were prospectively collected from patients scheduled for surgery at two sites in Edmonton, Canada. PG-SGASF scores ≥ 4 identified patients at risk for malnutrition; sarcopenia and myosteatosis were identified using computed tomography (CT) analysis. Patients (n = 176; mean age 63.8 ± 12.0 years; 52.3% male; 90.3% with stage I–III disease) were included. Overall, 25.2% had a PG-SGASF score ≥ 4. Sarcopenia alone, myosteatosis alone, or both were identified in 14.0%, 27.3%, and 6.4% of patients, respectively. Sarcopenia and/or myosteatosis were identified in 43.4% of those with a PG-SGASF score < 4 and in 58.5% of those with a score ≥ 4. Overall, 32.9% of the cohort had sarcopenia and/or myosteatosis despite a PG-SGASF score < 4. CT-defined sarcopenia and myosteatosis are prevalent in pre-operative CRC patients, regardless of the presence of traditional nutrition risk factors (weight loss, problems eating); therefore, CT image analysis adds value to nutrition screening by identifying patients with other risk factors for poor outcomes.
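As a sketch of the triage logic described here, the questionnaire and the CT analysis act as two independent screens; the field names and structure below are assumed for illustration, not taken from the study's data model.

```python
# Combined triage sketch: PG-SGASF flags scores >= 4, while CT-derived
# sarcopenia/myosteatosis flag body-composition risk independently.
# Field names are hypothetical.
def nutrition_triage(patient: dict) -> set:
    flags = set()
    if patient.get("pg_sgasf_score", 0) >= 4:
        flags.add("malnutrition_risk")
    if patient.get("sarcopenia") or patient.get("myosteatosis"):
        flags.add("low_muscle_risk")
    return flags

# The cohort's key finding in miniature: a patient below the PG-SGASF
# threshold can still be flagged by CT analysis.
missed_by_questionnaire = nutrition_triage(
    {"pg_sgasf_score": 2, "myosteatosis": True}
)
```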


2018 ◽  
Vol 8 (6) ◽  
pp. 468-471 ◽  
Author(s):  
Martha A. Mulvey ◽  
Aravindhan Veerapandiyan ◽  
David A. Marks ◽  
Xue Ming

Background Prior studies have reported that patients with epilepsy have a higher prevalence of obstructive sleep apnea (OSA), which contributes to poor seizure control. Detection and treatment of OSA can improve seizure control in some patients with epilepsy. In this study, we sought to develop, implement, and evaluate the effectiveness of an electronic health record (EHR) alert to screen for OSA in patients with epilepsy. Methods A 3-month retrospective chart review was conducted of all patients with epilepsy >18 years of age who were evaluated in our epilepsy clinics prior to the intervention. An assessment for obstructive sleep apnea (AOSA), consisting of 12 recognized risk factors for OSA, was subsequently developed and embedded in the EHR. The AOSA was utilized for a 3-month period. Patients identified with 2 or more risk factors were referred for polysomnography. A comparison was made to determine whether there was a difference in the number of patients at risk for OSA detected and referred for polysomnography with and without an EHR alert to screen for OSA. Results There was a significant increase in OSA patient recognition. Prior to the EHR alert, 25/346 (7.23%) patients with epilepsy were referred for polysomnography. Postintervention, 405/414 patients were screened using the EHR alert for AOSA and 134/405 (33.1%) were referred for polysomnography (p < 0.001). Conclusion An intervention with AOSA cued in the EHR markedly improved the identification of epilepsy patients at risk for OSA and their referral for polysomnography.
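The alert's referral rule (two or more of twelve risk factors) is straightforward to sketch; the factor names below are a plausible subset, since the abstract does not enumerate the twelve AOSA items.

```python
# Illustrative subset of recognized OSA risk factors; the study's AOSA
# comprised 12 such items, not necessarily these.
OSA_RISK_FACTORS = [
    "snoring", "witnessed_apneas", "daytime_sleepiness",
    "obesity", "hypertension", "large_neck_circumference",
]

def should_refer_for_polysomnography(patient_factors: set) -> bool:
    """Apply the study's referral rule: 2 or more risk factors present."""
    count = sum(1 for f in OSA_RISK_FACTORS if f in patient_factors)
    return count >= 2
```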


2019 ◽  
Vol 40 (02) ◽  
pp. 108-121 ◽  
Author(s):  
Jessica Paken ◽  
Cyril Govender ◽  
Mershen Pillay ◽  
Vikash Sewram

Abstract Cisplatin, an effective antineoplastic drug used in the treatment of many cancers, has ototoxic potential, placing cancer patients receiving this treatment at risk of hearing loss. It is therefore important for the health care professionals managing these patients to be aware of cisplatin's ototoxic properties and clinical signs in order to identify patients at risk of developing a hearing impairment. Eighty-five English peer-reviewed articles and two books, from January 1975 to July 2015, were identified from PubMed, ScienceDirect, and EBSCOhost. An overview of cisplatin-associated ototoxicity, namely its clinical features, incidence rates, molecular and cellular mechanisms, and risk factors, is presented in this article. This review further highlights the importance of a team-based approach to complement an audiological monitoring program in reducing any further loss in the quality of life of affected patients, as there is currently no otoprotective agent routinely recommended for the prevention of cisplatin-associated ototoxicity.


Blood ◽  
2007 ◽  
Vol 110 (11) ◽  
pp. 1876-1876
Author(s):  
Steven B. Deitelzweig ◽  
Jay Lin ◽  
Josh Benner ◽  
Russ Becker

Abstract Background: Hospitalized medical patients are at significant risk of venous thromboembolism (VTE). Although evidence-based guidelines exist which provide recommendations for thromboprophylaxis in hospitalized medical patients, the optimum regimen for prophylaxis is not clear. We have therefore created a model, based on established literature, which examines the 2-year clinical outcomes following no prophylaxis, thromboprophylaxis with unfractionated heparin (UFH), or thromboprophylaxis with low-molecular-weight heparin (LMWH) in medical patients at risk of VTE. Methods: A decision-analytic model was developed that replicates and extends an existing, published VTE model (McGarry et al. Am J Manag Care. 2004;10:632-42) from 30 days to 2 years. Hypothetical cohorts of 10,000 medically ill patients at risk of VTE (according to MEDENOX criteria) were assembled, using a Markov chain model with resampling, to compare the rates of primary VTE events and related outcomes at 90 days and VTE complications and recurrent events at 2 years. Clinical outcomes were estimated from published clinical trial and observational data, and compared between three interventions: no prophylaxis, UFH, and the LMWH enoxaparin. Outcomes included in the analysis were clinical or venographically detected primary VTE, major and minor bleeds, asymptomatic or symptomatic heparin-induced thrombocytopenia, and death within the first 90 days, as well as VTE recurrence, post-thrombotic syndrome, pulmonary hypertension, and death within 2 years. Sensitivity and threshold analyses were performed to test the general applicability of the model. The simulation model was run using TreeAge software (Williamstown, USA). Results: VTE rates and death were lowest in the enoxaparin prophylaxis cohort, followed by the UFH and no prophylaxis cohorts, respectively (Table 1). Adverse events were lowest in the no prophylaxis group, followed by the enoxaparin group and the UFH group (Table 1).
Conclusion: In this Markov model, based on robust data from clinical trials and observational studies, prophylaxis with enoxaparin reduced VTE occurrence and mortality over two years when compared with no prophylaxis or UFH prophylaxis in hospitalized medical patients at risk of VTE. Enoxaparin was also associated with a reduced incidence of adverse events when compared with UFH.

Table 1. Two-year outcomes in simulated cohorts of hospitalized medical patients at risk of VTE

Outcome (n)                  Enoxaparin (n=10,000)   UFH (n=10,000)   No prophylaxis (n=10,000)
VTE at 2 years               683                     791              1787
  DVT                        545                     633              1426
  PE                         138                     158              361
Death                        1573                    1600             1745
Adverse events at 90 days    364                     725              314
  Minor bleed                285                     510              244
  Major bleed                65                      116              55
  Asymptomatic HIT           6                       45               7
  Symptomatic HIT            8                       54               8
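The Markov-chain-with-resampling design can be pictured as a per-patient monthly simulation: each hypothetical patient either stays event-free, suffers a VTE, or dies, with strategy-dependent probabilities. A minimal sketch with illustrative transition probabilities (hypothetical values, not the published model's inputs):

```python
import random

# Illustrative monthly transition probabilities; hypothetical values,
# not the published model's inputs.
P_VTE = {"no_prophylaxis": 0.008, "ufh": 0.0035, "enoxaparin": 0.003}
P_DEATH = 0.007  # monthly background mortality, same for all strategies

def simulate_cohort(strategy: str, n: int = 10_000, months: int = 24,
                    seed: int = 42) -> tuple:
    """Return (first VTE events, deaths) over the horizon for one strategy."""
    rng = random.Random(seed)
    vte = deaths = 0
    for _ in range(n):
        for _ in range(months):
            if rng.random() < P_DEATH:
                deaths += 1
                break
            if rng.random() < P_VTE[strategy]:
                vte += 1
                break  # count only the first VTE event per patient
    return vte, deaths

results = {s: simulate_cohort(s) for s in P_VTE}
```

Even with made-up numbers, the qualitative ordering of Table 1 emerges: the strategy with the lowest monthly VTE hazard accumulates the fewest events over the 2-year horizon.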


2014 ◽  
Vol 71 (8) ◽  
pp. 757-766
Author(s):  
Jelena Nikolic ◽  
Tatjana Loncar-Turukalo ◽  
Srdjan Sladojevic ◽  
Marija Marinkovic ◽  
Zlata Janjic

Background/Aim. The lack of effective therapy for advanced stages of melanoma emphasizes the importance of preventive measures and of screening the population at risk. Identifying individuals at high risk should allow targeted screening and follow-up involving those who would benefit most. The aim of this study was to identify the most significant factors for melanoma prediction in our population and to create prognostic models for the identification and differentiation of individuals at risk. Methods. This case-control study included 697 participants (341 patients and 356 controls) who underwent an extensive interview and skin examination to assess risk factors for melanoma. Pairwise univariate statistical comparison was used for the coarse selection of the most significant risk factors. These factors were fed into logistic regression (LR) and alternating decision tree (ADT) prognostic models, which were assessed for their usefulness in identifying patients at risk of developing melanoma. Validation of the LR model was done with the Hosmer-Lemeshow test, whereas the ADT was validated by 10-fold cross-validation. The achieved sensitivity, specificity, accuracy, and AUC for both models were calculated. A melanoma risk score (MRS) based on the outcome of the LR model is presented. Results. The LR model showed that the following risk factors were associated with melanoma: sunbed use (OR = 4.018; 95% CI 1.724-9.366 for those who sometimes used sunbeds), solar damage of the skin (OR = 8.274; 95% CI 2.661-25.730 for those with severe solar damage), hair color (OR = 3.222; 95% CI 1.984-5.231 for light brown/blond hair), the number of common naevi (over 100 naevi: OR = 3.57; 95% CI 1.427-8.931), the number of dysplastic naevi (1 to 10 dysplastic naevi: OR = 2.672; 95% CI 1.572-4.540; more than 10: OR = 6.487; 95% CI 1.993-21.119), Fitzpatrick's phototype, and the presence of congenital naevi.
Red hair, phototype I, and large congenital naevi were present only in melanoma patients and thus were strongly associated with melanoma. The percentage of correctly classified subjects in the LR model was 74.9%, with sensitivity 71%, specificity 78.7%, and AUC 0.805. For the ADT, the percentage of correctly classified instances was 71.9%, with sensitivity 71.9%, specificity 79.4%, and AUC 0.808. Conclusion. Application of different models for risk assessment and prediction of melanoma should provide an efficient and standardized tool for clinicians. The presented models offer effective discrimination of individuals at high risk, transparent decision making, and real-time implementation suitable for clinical practice. Continuous growth of the melanoma database would allow further adjustments and enhancements in model accuracy, as well as the possibility of successfully applying more advanced data mining algorithms.


Author(s):  
Fahad Shabbir Ahmed ◽  
Raza-Ul-Mustafa ◽  
Liaqat Ali ◽  
Imad-ud-Deen ◽  
Tahir Hameed ◽  
...  

Abstract Introduction Diverticulitis is the inflammation and/or infection of small pouches known as diverticula that develop along the walls of the intestines. Patients with diverticulitis are at risk of mortality as high as 17% with abscess formation and 45% with secondary perforation; patients admitted to inpatient services are at particular risk of complications, including mortality. We developed a deep neural network (DNN) based machine learning framework that predicts premature death in patients admitted with diverticulitis, using electronic health records (EHR) first to identify statistically significant risk factors and then to train the DNN. Methods Our proposed framework (Deep FLAIM) is a two-phase hybrid framework. In the first phase, we used the National Inpatient Sample 2014 dataset to extract patients with diverticulitis with and without hemorrhage (ICD-9 codes 562.11 and 562.13, respectively) and analyzed different risk factors for statistical significance with univariate and multivariate analyses, generating hazard ratios to rank the diverticulitis-associated risk factors. In the second phase, we applied a deep neural network model to predict death. Additionally, we compared the performance of the proposed system against a conventional machine learning model, logistic regression (LR). Results A total of 128,258 patients were included. We tested 64 variables using univariate and multivariate (age, gender, and ethnicity-adjusted) Cox regression; only 16 factors were statistically significant in both analyses.
The mortality prediction of our DNN outperformed conventional machine learning (logistic regression) in terms of AUC (0.977 vs 0.904), training accuracy (0.931 vs 0.900), testing accuracy (0.930 vs 0.910), sensitivity (90% vs 88%), and specificity (95% vs 93%). Conclusion Our Deep FLAIM framework can predict mortality in patients admitted to the hospital with diverticulitis with high accuracy. The proposed framework can be extended to predict premature death for other diseases.
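The comparison above rests on standard confusion-matrix metrics; as a reference sketch (the counts in the example are arbitrary, not the study's):

```python
def classification_metrics(tp: int, fp: int, tn: int, fn: int) -> tuple:
    """Sensitivity, specificity, and accuracy from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# Arbitrary example counts: 90 true positives, 7 false positives,
# 93 true negatives, 10 false negatives.
sens, spec, acc = classification_metrics(90, 7, 93, 10)
```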


2021 ◽  
Vol 42 (Supplement_1) ◽  
Author(s):  
C V Madsen ◽  
B Leerhoey ◽  
L Joergensen ◽  
C S Meyhoff ◽  
A Sajadieh ◽  
...  

Abstract Introduction Post-operative atrial fibrillation (POAF) is currently considered a phenomenon rather than a definite diagnosis. Nevertheless, POAF is associated with an increased rate of complications, including stroke and mortality. The incidence of POAF in acute abdominal surgery has not been reported, and prediction of patients at risk has not previously been attempted. Purpose We aim to report the incidence of POAF after acute abdominal surgery and to provide a POAF prediction model based on pre-surgery risk factors. Methods This was a prospective, single-centre cohort study of unselected adult patients referred for acute, general abdominal surgery. Consecutive patients (>16 years) were included during a three-month period. No exclusion criteria were applied. Follow-up was based on chart reviews, including medical history, vital signs, blood samples, and electrocardiograms. Chart reviews were performed prior to surgery, at discharge, and three months after surgery. Atrial fibrillation was diagnosed by specialists in Cardiology or Anaesthesiology on ECG or cardiac rhythm monitoring (≥30 seconds duration). Multiple logistic regression with backward stepwise selection was used for model development. Receiver operating characteristic (ROC) curves, including the area under the curve (AUC), were produced. The study was approved by the Regional Ethics Committee (H-19033464) and complied with the principles of the Declaration of Helsinki of the World Medical Association. Results In total, 466 patients were included. Mean (±SD) age was 51.2 (20.5) years, 194 (41.6%) were female, and cardiovascular comorbidity was present in approximately 10% of patients. The overall incidence of POAF was 5.8% (27/466), and no cases were observed in patients <60 years; the incidence was 15.7% (27/172) in patients ≥60 years. Prolonged hospitalization and death were observed in 40.7% of patients with POAF vs 8.4% of patients without POAF (p<0.001).
Significant age-adjusted risk factors were previous atrial fibrillation (odds ratio (OR) 6.84 [2.73; 17.18], p<0.001), known diabetes mellitus (OR 3.49 [1.40; 8.69], p=0.007), and chronic kidney disease (OR 3.03 [1.20; 7.65], p=0.019). A prediction model based on age, previous atrial fibrillation, diabetes mellitus, and chronic kidney disease was produced (Figure 1), and ROC analysis displayed an AUC of 88.26% (Figure 2). Conclusions A simple risk-stratification model such as the one provided can aid clinicians in identifying patients at risk of developing POAF in relation to acute abdominal surgery. This is important, as patients who develop POAF are more likely to experience complications, such as prolonged hospitalization and death. Closer monitoring of heart rhythm and vital signs should be considered in at-risk patients older than 60 years. Model validation is warranted. Funding Acknowledgement Type of funding sources: None.
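The reported AUC of 88.26% has a direct probabilistic reading: the chance that a randomly chosen POAF case is assigned a higher predicted risk than a randomly chosen non-case. A minimal pairwise implementation of that statistic (illustrative, not the study's ROC code):

```python
def auc(case_scores, control_scores) -> float:
    """Area under the ROC curve via pairwise comparison of risk scores."""
    wins = ties = 0
    for case in case_scores:
        for control in control_scores:
            if case > control:
                wins += 1
            elif case == control:
                ties += 1
    return (wins + 0.5 * ties) / (len(case_scores) * len(control_scores))
```

With perfect separation the statistic is 1.0, and indistinguishable scores give 0.5, which is why an AUC near 0.88 indicates useful, though imperfect, discrimination.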

