A high number of diseases and consultations: a warning signal for GPs when following up a multimorbid patient

2019 ◽  
Author(s):  
Paul Aujoulat ◽  
Patrice NABBE ◽  
Sophie LALANDE ◽  
Delphine LE GOFF ◽  
Jeremy DERRIENIC ◽  
...  

Abstract Background: The European General Practitioners Research Network (EGPRN) designed and validated a comprehensive definition of multimorbidity using a systematic literature review and qualitative research throughout Europe. Detecting risk factors for decompensation would be an interesting challenge for family physicians (FPs) in the management of multimorbid patients. The purpose of the survey was to assess which items belonging to the EGPRN multimorbidity definition could help to identify patients at risk of decompensation in a pilot cohort study with a 24-month follow-up among primary care outpatients. Methods: 131 patients meeting the multimorbidity definition were included over two inclusion periods between 2014 and 2015. During the 24-month follow-up, "decompensation" or "nothing to report" status was collected. A logistic regression, following a Cox model, was then performed to identify risk factors for decompensation. Results: After 24 months of follow-up, 120 patients were analyzed and 3 clusters were identified. 44 patients, representing 36.6% of the population, were still alive and had not been hospitalized for a period exceeding 6 days. Two variables were significantly linked to decompensation: the number of visits to the FP per year (HR 1.06, 95% CI 1.03-1.10, p-value <0.001) and the total number of diseases (HR 1.12, 95% CI 1.013-1.33, p-value = 0.039). Conclusion: FPs should be aware that a high number of consultations and a high total number of diseases are linked to severe outcomes such as death or unplanned hospitalization. A large-scale cohort in primary care seems feasible to confirm these results.
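Under a proportional-hazards model such as the Cox regression used here, a per-unit hazard ratio is multiplicative across units of the covariate. A minimal sketch (the function name is ours, not from the study) of how a per-visit HR of 1.06 compounds over additional yearly visits:

```python
def compounded_hr(hr_per_unit: float, units: float) -> float:
    """Hazard ratio for `units` extra increments of a covariate under a
    proportional-hazards (log-linear) model: HR ** units."""
    return hr_per_unit ** units

# With HR = 1.06 per extra FP visit/year, ten additional visits/year imply
# roughly a 1.8-fold decompensation hazard.
ten_visit_hr = compounded_hr(1.06, 10)
```

The same arithmetic applies to the per-disease HR of 1.12: each extra disease multiplies the baseline hazard by 1.12.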

Cancers ◽  
2021 ◽  
Vol 13 (9) ◽  
pp. 2242
Author(s):  
Charlotte M. Heidsma ◽  
Diamantis I. Tsilimigras ◽  
Flavio Rocha ◽  
Daniel E. Abbott ◽  
Ryan Fields ◽  
...  

Background: Identifying patients at risk for early recurrence (ER) following resection for pancreatic neuroendocrine tumors (pNETs) might help to tailor adjuvant therapies and surveillance intensity in the post-operative setting. Methods: Patients undergoing surgical resection for pNETs between 1998–2018 were identified using a multi-institutional database. Using a minimum p-value approach, the optimal cut-off value of recurrence-free survival (RFS) was determined based on the difference in post-recurrence survival (PRS). Risk factors for early recurrence were identified. Results: Among 807 patients who underwent curative-intent resection for pNETs, the optimal length of RFS to define ER was identified at 18 months (lowest p-value of 0.019). Median RFS was 11.0 months (95% CI: 8.5–12.6) among ER patients (n = 49) versus 41.0 months (95% CI: 35.0–45.9) among non-ER patients (n = 77). Median PRS was worse among ER patients compared with non-ER patients (42.6 months vs. 81.5 months, p = 0.04). On multivariable analysis, tumor size (OR: 1.20, 95% CI: 1.05–1.37, p = 0.007) and positive lymph nodes (OR: 4.69, 95% CI: 1.41–15.58, p = 0.01) were independently associated with ER. Conclusion: An evidence-based cut-off value for ER after surgery for pNET was defined at 18 months. These data emphasize the importance of close follow-up in the first two years after surgery.
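The minimum p-value approach scans candidate RFS cutoffs and keeps the one that most sharply separates post-recurrence survival between early- and late-recurrence groups. A simplified stdlib-only sketch, substituting a two-sample z-test on PRS values for the censoring-aware log-rank test that is typically used in practice (function names and data are illustrative):

```python
import math
from statistics import mean, stdev

def z_test_p(a, b):
    """Two-sample z-test p-value (normal approximation); a stand-in for the
    log-rank test normally used with censored survival times."""
    se = math.sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
    z = (mean(a) - mean(b)) / se
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def best_cutoff(rfs_months, prs_months, candidates):
    """Minimum p-value approach: return the (cutoff, p-value) pair that
    minimizes p over candidate early-recurrence definitions."""
    best = None
    for c in candidates:
        early = [p for r, p in zip(rfs_months, prs_months) if r <= c]
        late = [p for r, p in zip(rfs_months, prs_months) if r > c]
        if len(early) < 2 or len(late) < 2:
            continue  # need at least two observations per group for stdev
        p = z_test_p(early, late)
        if best is None or p < best[1]:
            best = (c, p)
    return best
```

Published applications usually also correct the minimum p-value for the multiple cutoffs tested, which this sketch omits.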


2019 ◽  
Vol 30 (3) ◽  
pp. 402-407
Author(s):  
Daphne M Stol ◽  
Monika Hollander ◽  
Ilse F Badenbroek ◽  
Mark M J Nielen ◽  
François G Schellevis ◽  
...  

Abstract Background Early detection and treatment of cardiometabolic diseases (CMD) in high-risk patients is a promising preventive strategy to anticipate the increasing burden of CMD. The Dutch guideline ‘the prevention consultation’ provides a framework for stepwise CMD risk assessment and detection in primary care. The aim of this study was to assess the outcome of this program in terms of newly diagnosed CMD. Methods A cohort study among 30 934 patients, aged 45–70 years without known CMD or CMD risk factors, who were invited for the CMD detection program within 37 general practices. Patients filled out a CMD risk score (step 1), were referred for additional risk profiling in case of high risk (step 2) and received lifestyle advice and (pharmacological) treatment if indicated (step 3). During 1-year follow-up newly diagnosed CMD, prescriptions and abnormal diagnostic tests were assessed. Results Twelve thousand seven hundred and thirty-eight patients filled out the risk score of which 865, 6665 and 5208 had a low, intermediate and high CMD risk, respectively. One thousand seven hundred and fifty-five high-risk patients consulted the general practitioner, in 346 of whom a new CMD was diagnosed. In an additional 422 patients a new prescription and/or abnormal diagnostic test were found. Conclusions Implementation of the CMD detection program resulted in a new CMD diagnosis in one-fifth of high-risk patients who attended the practice for completion of their risk profile. However, the potential yield of the program could be higher given the considerable number of additional risk factors—such as elevated glucose, blood pressure and cholesterol levels—found, requiring active follow-up and presumably treatment in the future.
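The three-step flow described above (self-completed risk score, then risk profiling at the practice for high scorers, then lifestyle advice and treatment) can be sketched as a small triage function. The thresholds and return strings below are illustrative placeholders, not the Dutch guideline's actual cut-offs:

```python
def cmd_triage(risk_score, profiling_confirms_high_risk=None):
    """Sketch of the stepwise 'prevention consultation' CMD detection flow.

    Step 1: self-completed risk score; step 2: additional risk profiling for
    high scorers; step 3: lifestyle advice and (pharmacological) treatment.
    All numeric thresholds here are invented for illustration.
    """
    if risk_score < 0.3:
        return "low risk: no further action"
    if risk_score < 0.6:
        return "intermediate risk: lifestyle advice"
    if profiling_confirms_high_risk is None:
        return "high risk: invite for additional risk profiling"   # step 2
    if profiling_confirms_high_risk:
        return "confirmed high risk: lifestyle advice + treatment"  # step 3
    return "profiling normal: routine follow-up"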


Circulation ◽  
2018 ◽  
Vol 137 (suppl_1) ◽  
Author(s):  
Di Zhao ◽  
Eliseo Guallar ◽  
Elena Blasco-Colmenares ◽  
Nona Sotoodehnia ◽  
Wendy Post

Background: In hospital-based studies and in studies of participants with pre-existing conditions, African Americans have a higher risk of in- and out-of-hospital sudden cardiac death (SCD) compared with Whites. However, the risk of SCD of African Americans and Whites has never been compared in large-scale community-based cohort studies. Objective: To compare the risk of SCD among African Americans and Whites, and to evaluate the risk factors that may explain racial differences in incidence. Methods: Cohort study of 3,838 African Americans and 11,245 Whites participating in ARIC. Race was self-reported. SCD cases were defined as a sudden pulseless condition from a cardiac cause in a previously stable individual and were adjudicated by an expert committee. The mediation effect of covariates was calculated using a bootstrapping method. Cox proportional hazards models were adjusted for demographics, socioeconomic status, cardiovascular disease (CVD), and electrocardiographic risk factors. Results: The mean (SD) age was 53.6 (5.8) years for African Americans and 54.4 (5.7) years for Whites. During 25.3 years of follow-up, 215 African Americans and 332 Whites experienced SCD. In multivariable-adjusted models, the HRs (95% CI) for SCD comparing African Americans and Whites were 1.70 (1.37, 2.10) overall, 2.00 (1.40, 2.84) in women, and 1.46 (1.10, 1.92) in men (p-value for race-by-sex interaction = 0.02; Table). CVD and electrocardiographic risk factors explained 36.6% (21.4, 51.8%) of the excess risk of SCD in African Americans, leaving a large proportion of the racial difference unexplained. Conclusions: The risk of SCD in community-dwelling African Americans was significantly higher than in Whites, particularly among women. CVD risk factors, including the higher prevalence of obesity, diabetes, hypertension and LV hypertrophy in African Americans, explained only a small fraction of this difference. Further research is needed to identify factors responsible for race differences in SCD and to implement prevention strategies in high-risk minorities.
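The mediation percentages quoted above were estimated by bootstrapping. The resampling core is simple; a generic percentile-bootstrap confidence interval in stdlib Python (the statistic here is a plain mean for illustration, not the study's mediation estimator):

```python
import random

def bootstrap_ci(values, stat=lambda v: sum(v) / len(v),
                 n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for `stat` over `values`: resample with
    replacement, compute the statistic each time, take the alpha/2 and
    1 - alpha/2 quantiles of the resampled statistics."""
    rng = random.Random(seed)
    boots = sorted(
        stat([rng.choice(values) for _ in values]) for _ in range(n_boot)
    )
    lo = boots[int(alpha / 2 * n_boot)]
    hi = boots[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

For a mediation analysis, `stat` would instead recompute the proportion of excess risk explained on each resampled cohort.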


2021 ◽  
pp. 72-75
Author(s):  
Hung-Chune Maa ◽  
Pham van Tuyen ◽  
Yen-Lin Chen ◽  
Yao-Nan Yuan

INTRODUCTION: Microspherule protein 1 (MCRS1) acts as a cancer gene. MCRS1 is associated with poor prognosis in several types of cancer, including colorectal cancer, hepatocellular carcinoma, glioma, and non-small cell lung cancer. In the current study, we try to shed light on the role of MCRS1 in extrahepatic cholangiocarcinoma. METHODS: We retrospectively selected 13 patients who were diagnosed with extrahepatic cholangiocarcinoma. All clinical charts and histopathology reports were reviewed and recorded for age, gender, tumor size, surgical margin status, lymph node metastasis, distant metastasis and TNM staging. All patients were followed for 1–10 years. The median follow-up period was 3.2 years. RESULTS: The expression level of MCRS1 was significantly higher in the tumor than in the non-tumor tissue. In the Kaplan-Meier survival plot, the high MCRS1 expression group showed poorer survival probability (p-value = 0.020). The hazard ratio for the high MCRS1 expression group was 8.393 compared with the low expression group, with a borderline p-value of 0.05. CONCLUSION: MCRS1 serves as a poor prognostic factor. In further analysis, no correlation was found with proliferation, apoptosis, angiogenesis or EMT markers; the reason may be the small sample size, and a large-scale study in the future is mandatory.
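The Kaplan-Meier plot referenced above is a product-limit estimate of the survival function. A stdlib-only sketch (function and variable names are ours) that handles right-censored observations:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) survival estimator.

    `times[i]` is the follow-up time for patient i; `events[i]` is True for
    a death and False for a censored observation. Returns (time, S(t))
    pairs at each event time.
    """
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk, s, curve = len(times), 1.0, []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = censored = 0
        while i < len(order) and times[order[i]] == t:
            if events[order[i]]:
                deaths += 1
            else:
                censored += 1
            i += 1
        if deaths:
            s *= (at_risk - deaths) / at_risk  # product-limit step
            curve.append((t, s))
        at_risk -= deaths + censored
    return curve
```

Comparing two such curves (high vs. low MCRS1 expression) with a log-rank test yields p-values like the 0.020 reported here.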


Author(s):  
Abid Abdullah ◽  
Nafees Ahmad ◽  
Muhammad Atif ◽  
Shereen Khan ◽  
Abdul Wahid ◽  
...  

Abstract Background This study aimed to evaluate treatment outcomes and factors associated with unsuccessful outcomes among pediatric tuberculosis (TB) patients (age ≤14 years). Methods This was a retrospective cohort study conducted in three districts (Quetta, Zhob and Killa Abdullah) of Balochistan, Pakistan. All childhood TB patients enrolled for treatment at Bolan Medical Complex Hospital (BMCH) Quetta and the District Headquarter Hospitals of Zhob and Killa Abdullah from 1 January 2016 to 31 December 2018 were included in the study and followed until their treatment outcomes were reported. Data were collected through a purpose-developed standardized data collection form and analyzed using SPSS 20. A p-value <0.05 was considered statistically significant. Results Out of 5152 TB patients enrolled at the study sites, 2184 (42.4%) were children. Among them, 1941 childhood TB patients with complete medical records were included in the study. The majority of the study participants were <5 years old (66.6%) and had pulmonary TB (PTB; 65%). A total of 45 (2.3%) patients were cured, 1680 (86.6%) completed treatment, 195 (10%) were lost to follow-up, 15 (0.8%) died, 5 (0.3%) failed treatment and 1 (0.1%) was not evaluated for outcome. In multivariate binary logistic regression analysis, treatment at BMCH Quetta (OR = 25.671, p-value < 0.001), rural residence (OR = 3.126, p-value < 0.001) and extra-PTB (OR = 1.619, p-value = 0.004) emerged as risk factors for unsuccessful outcomes. Conclusion The study sites collectively reached the World Health Organization's target of treatment success (>85%). Loss to follow-up was the major reason for unsuccessful outcomes. Special attention to patients with identified risk factors for unsuccessful outcomes may improve outcomes further.
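The ORs quoted above come from multivariable logistic regression; the crude building block is the 2x2-table odds ratio. A minimal sketch with invented counts (not the study's data):

```python
def odds_ratio(exposed_events, exposed_nonevents,
               unexposed_events, unexposed_nonevents):
    """Unadjusted odds ratio from a 2x2 exposure-outcome table:
    (a * d) / (b * c). Multivariable models adjust this for confounders;
    this is the raw, single-factor version."""
    return (exposed_events * unexposed_nonevents) / \
           (exposed_nonevents * unexposed_events)

# Hypothetical counts for illustration: 20/100 rural vs 5/100 urban
# children with unsuccessful outcomes.
crude_or = odds_ratio(20, 80, 5, 95)
```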


Author(s):  
Joseph Freer ◽  
Hassan Mahomed ◽  
Anthony Westwood

Abstract Background In South Africa, Cape Town's health facilities are stretched by the volume of cases of diarrhoea during the summer months, particularly with severely dehydrated children, who often require complex inpatient management. The prevalence of severe disease among children living in the settlements around Cape Town is particularly high. Methods An observational study of a systematic sample of children under 5 who presented with diarrhoea to any primary care facility in Khayelitsha, an informal settlement of Cape Town, and were referred to secondary care between 1 November 2015 and 30 April 2016. We recruited participants from the sub-district office, identified risk factors associated with the index presentation, captured the triage and management of patients in primary care and investigated post-discharge follow-up. Results We recruited 87 children into the study, out of a total of 115 cases of severe dehydration. There was a significantly higher proportion of households in this group with no income than in Khayelitsha overall (65% vs. 47.4%; p < 0.001). In the sample, HIV-exposed, uninfected children were younger than unexposed children (median 9.44 months in exposed vs. 17.36 months in unexposed; p = 0.0015) and were more likely to be malnourished (weight-for-age Z-score (WAZ) < −2) [13 cases exposed vs. 8 cases unexposed (p = 0.04)]. Outreach staff were able to trace only 33.3% of children at home following discharge, yet 65% of children attended follow-up appointments in clinics. Conclusions This cohort of children with diarrhoeal disease complicated by severe dehydration was a particularly socially deprived group. The results demonstrating zero vertical transmission of HIV in this very socioeconomically deprived area of Cape Town are encouraging. In the HIV-exposed, uninfected group, children were younger and had a higher prevalence of malnutrition, which should be the subject of future research, especially given existing evidence for immunological differences in children exposed to HIV in utero. Locating children with severe diarrhoea post-discharge was challenging, and further research is needed on the cost-effectiveness and outcomes of different follow-up approaches.
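The malnutrition flag used above is a weight-for-age Z-score below −2; in practice the reference median and SD come from the WHO growth standards for the child's age and sex. A sketch with invented reference values (not actual WHO figures):

```python
def weight_for_age_z(weight_kg, ref_median_kg, ref_sd_kg):
    """Weight-for-age Z-score: standard deviations from the reference
    median for the child's age and sex (reference values supplied by
    the caller, e.g. from WHO growth-standard tables)."""
    return (weight_kg - ref_median_kg) / ref_sd_kg

def is_underweight(weight_kg, ref_median_kg, ref_sd_kg):
    """Cut-off used in the study: WAZ < -2."""
    return weight_for_age_z(weight_kg, ref_median_kg, ref_sd_kg) < -2
```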


Blood ◽  
2020 ◽  
Vol 136 (Supplement 1) ◽  
pp. 28-28
Author(s):  
Carmen Landry ◽  
Jon Dorling ◽  
Ketan Kulkarni ◽  
Marsha Campbell-Yeo ◽  
Michael Vincer ◽  
...  

Background: Iron is an essential micronutrient, especially in infants and young children, and is required for erythropoiesis and development of the central nervous system. However, iron deficiency (ID) is the most common micronutrient deficiency worldwide. ID and iron deficiency anemia (IDA) have been associated with poor neurodevelopmental and behavioural outcomes later in life. Preterm infants are particularly at risk of developing ID in early life due to lower iron stores at birth, accelerated growth in the first weeks of life and multiple phlebotomies while in hospital. Therefore, international recommendations suggest prophylactic iron therapy of 2-4 mg/kg/day starting at 2-6 weeks of age until at least 6-12 months in preterm and low birth weight infants. This prophylactic iron supplementation has been shown to be effective at reducing the incidence of ID and IDA. However, the published work mainly involves moderate to late preterm infants, and research is lacking on iron status after discharge in very preterm infants (VPIs, <31 weeks gestational age). Based on our previous work, 32% of VPIs were iron deficient at 4-6 months corrected age despite this early supplementation. Since the development of ID may have permanent detrimental effects on the developing brain of these high-risk preterm infants, knowledge of risk factors for ID is also important to identify strategies focused on its prevention. Objective: To investigate the risk factors associated with the development of ID. Methods: A retrospective cohort study was conducted at the IWK Health Centre using a population-based provincial Perinatal Follow-Up Program database. All live-born VPIs born in Nova Scotia between 2005-2018 were included. Patients with congenital malformations, chromosomal anomalies, or who died prior to outcome assessment were excluded. As a standard of care, all these infants were started on prophylactic iron supplements (2-3 mg/kg/day) at 2-4 weeks of chronological age. Iron dosage was regularly adjusted during the hospital stay as guided by serum ferritin levels. At discharge, it was recommended to continue iron prophylaxis until 9-12 months corrected age. All these infants underwent a blood test during their first neonatal follow-up visit at 4-6 months corrected age to check hemoglobin, reticulocyte count and serum ferritin. ID was defined as serum ferritin <20 µg/L or <12 µg/L at 4 and 6 months, respectively. A univariate analysis was performed using a series of single-variable logistic regression models to identify the factors associated with the presence of ID. Factors with a p-value <0.20 in the univariate analysis were entered into a multivariable risk model for the occurrence of ID using a backwards selection procedure. Variables with a p-value <0.05 were retained. Results: Of 411 infants included in the study, 32.1% (n=132) had ID. The prevalence of ID decreased over time (37.6% in the 2005-2011 cohort vs 25.8% in the 2012-2018 cohort). Table 1 compares the antenatal and neonatal characteristics of the ID and non-ID groups. Table 2 compares sociodemographic and clinical variables at the time of follow-up for the two groups. Independent risk factors for ID were gestational age <27 weeks (vs >27 weeks) (OR: 1.7 (1.0-2.9), p=0.04) and gestational hypertension (OR: 2.1 (1.2-3.7), p=0.009). Independent factors protective against ID were mixed feeding (breast milk and formula, compared with formula alone) (OR: 0.5 (0.2-0.9), p=0.021) and iron supplementation at follow-up (OR: 0.5 (0.3-0.9), p=0.02). Conclusion(s): Despite prophylactic iron supplementation, one-third of VPIs had ID at 4-6 months corrected age. Gestational hypertension in the mother and gestational age <27 weeks were independent risk factors for ID. In addition, despite adjusting for iron supplementation at follow-up, the formula-feeding group was more likely to have ID than the mixed-feeding group. This may be because of sub-therapeutic iron intake in formula-fed infants. It is often thought that formula milk has sufficient iron to meet the demands of growing infants, so they are less likely to receive higher doses of supplemental iron beyond what is contained in the formula. However, the iron in formula may not have the same bioavailability as that in breast milk. Future prospective studies are required to validate these observations. Nonetheless, the study identified important areas for mitigating ID in VPIs. Disclosures: No relevant conflicts of interest to declare.
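The study's follow-up definition of ID maps corrected age to a ferritin threshold. A sketch of that rule (the study specifies only the 4- and 6-month thresholds; the exact age boundary for switching between them is our assumption for illustration):

```python
def iron_deficient(ferritin_ug_per_l, corrected_age_months):
    """ID per the study definition: serum ferritin <20 ug/L at 4 months or
    <12 ug/L at 6 months corrected age. Using <=4 months as the boundary
    between the two thresholds is an illustrative assumption."""
    threshold = 20 if corrected_age_months <= 4 else 12
    return ferritin_ug_per_l < threshold
```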


2013 ◽  
Vol 31 (15_suppl) ◽  
pp. e19538-e19538
Author(s):  
Claudio J. Flores ◽  
Luis Augusto Casanova

e19538 Background: To evaluate prognostic factors in patients with primary non-Hodgkin lymphoma (NHL). Methods: We retrospectively analyzed prognostic factors (PFs) for overall survival (OS) in 2160 patients (pts) with NHL treated at INEN between 1990-2002. PFs were determined using a Cox model with P-splines. Results: The median age was 54 years (range 14-96) and 51% were male. The majority of the pts had good performance status (73%, WHO 0-1). The Ann Arbor stage was I-II in 51% and III-IV in 49%, and B symptoms were present in 38% of the pts. Hemoglobin (Hb) was low in 48%. Leukocytes (WBC), lymphocytes and LDH were elevated in 17.7%, 7.7% and 60% of pts, respectively. Of all patients, 709 (32.8%) had died. The median follow-up was 12.6 months, with a median survival of 61.8 months and survival rates at 5 and 10 years of 51.2% and 41.7%, respectively. PFs identified were age, sex, Zubrod score, clinical stage, Hb, leukocytes, lymphocytes and LDH. The table shows the p-values and hazard ratios (HR). The effect of continuous covariates on log(HR) is non-linear. The cutoff points of highest risk (HR >1) match the clinically defined ones: age >60 yrs, Hb <12 g/dL and LDH >240 IU/L. Both leukocytes and lymphocytes have two higher-risk breakpoints: leukocytes >3×10³ and >10×10³, lymphocytes <20% and >60%. Conclusions: PFs for OS in our group of pts were similar to other reports in NHL. Age, Hb, leukocytes and lymphocytes are relevant to OS and showed a non-linear effect on log(HR). [Table: see text]
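The clinically aligned cutoffs reported above translate directly into high-risk flags. A sketch (flag names and the function are ours; for simplicity the WBC rule keeps only the upper breakpoint, and units follow the abstract: g/dL for Hb, IU/L for LDH):

```python
def nhl_risk_flags(age_years, hb_g_dl, ldh_iu_l, wbc_e3_ul, lymph_pct):
    """High-risk indicators at the abstract's breakpoints: age >60 y,
    Hb <12 g/dL, LDH >240 IU/L, WBC >10x10^3 (upper of the two reported
    breakpoints), lymphocytes <20% or >60%."""
    return {
        "age": age_years > 60,
        "hb": hb_g_dl < 12,
        "ldh": ldh_iu_l > 240,
        "wbc": wbc_e3_ul > 10,
        "lymphocytes": lymph_pct < 20 or lymph_pct > 60,
    }
```

In the actual analysis these effects enter the Cox model as smooth P-spline terms rather than hard thresholds; the cutoffs summarize where log(HR) crosses into elevated risk.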


2020 ◽  
Vol 38 (4_suppl) ◽  
pp. 332-332
Author(s):  
Meena Sadaps ◽  
Neal Mehta ◽  
Michael J. McNamara ◽  
Alok A. Khorana ◽  
Amit Bhatt

332 Background: Adjuvant therapy after endoscopic resection (ER) of T1 esophageal adenocarcinoma (EAC) in non-surgical candidates is largely based on the risk of lymph node metastasis (LNM). Risk factors for LNM in T1 EAC are not clearly defined. Our aim is to evaluate risk factors for LNM in T1 EAC patients following esophagectomy or ER with ≥5 years of follow-up. Methods: This is a retrospective analysis at a large tertiary referral center. Our pathology database identified patients who underwent esophagectomy or ER with ≥5 years of follow-up, with histologically proven T1 EAC, from 2010-2017. Patients were excluded if they (a) received chemoradiation prior to esophagectomy or before/after ER or (b) had any other primary cancer treated within the previous 5 years. Specimens were reviewed by an expert GI pathologist for accuracy. Results: Of 80 patients [85% male], 61 (76%) underwent esophagectomy and 19 (24%) underwent ER. Twelve (15%) developed LNM per study criteria. Tumor size was significantly associated with risk of LNM (p-value = 0.014; Table). No other factors, including lymphovascular invasion, differentiation on pathology, macroscopic appearance, infiltrative growth pattern, or tumor distance from the gastroesophageal junction, were significant risk factors for LNM. Conclusions: In T1 EAC, tumor size appears to be a significant risk factor for LNM at five years following surgical or endoscopic resection. Adjuvant therapy should be considered in patients with larger tumors. [Table: see text]


2002 ◽  
Vol 32 (4) ◽  
pp. 595-607 ◽  
Author(s):  
K. BARKOW ◽  
W. MAIER ◽  
T. B. ÜSTÜN ◽  
M. GÄNSICKE ◽  
H.-U. WITTCHEN ◽  
...  

Background. Studies that examined community samples have reported several risk factors for the development of depressive episodes. The few studies that have been performed on primary care samples were mostly cross-sectional. Most samples originated from highly developed industrial countries. This is the first study to prospectively investigate the risk factors for depressive episodes in an international primary care sample. Methods. A stratified primary care sample of initially non-depressed subjects (N = 2445) from 15 centres from all over the world was examined for the presence or absence of a depressive episode (ICD-10) at the 12-month follow-up assessment. The initial measures addressed sociodemographic variables, psychological/psychiatric problems and social disability. Logistic regression analysis was carried out to determine their relationship with the development of new depressive episodes. Results. At the 12-month follow-up, 4.4% of primary care patients met ICD-10 criteria for a depressive episode. Logistic regression analysis revealed that recognition by the general practitioner as a psychiatric case, repeated suicidal thoughts, previous depressive episodes, the number of chronic organic diseases, poor general health, and a full or subthreshold ICD-10 disorder were related to the development of new depressive episodes. Conclusions. Psychological/psychiatric problems were found to play the most important role in the prediction of depressive episodes, while sociodemographic variables were of lower importance. Differences compared with other studies might be due to our prospective design and possibly also to our culturally different sample. The applied stratification procedures, which resulted in a sample at high risk of developing depression, might be a limitation of our study.

