High prevalence of antipsychotic use among dementia patients in German neurological and psychiatric practices

2017, Vol. 85 (06), pp. 345-351
Author(s): Jens Bohlken, Anke Booker, Karel Kostev

Abstract Background The aim of the present study was to examine the frequency of antipsychotic use among dementia patients living both in nursing homes and at home, and to determine the factors that lead to the first use of antipsychotics after a dementia diagnosis. Methods This study included patients aged 60 years or older with a first documented diagnosis of dementia of any origin (index date) by neuropsychiatric specialists in the Disease Analyzer database (IMS Health). The primary outcome was the proportion of patients who received a first antipsychotic prescription after the index date. Kaplan-Meier analyses were used to examine the time to initiation of antipsychotic therapy as a function of age and nursing home residence. A Cox proportional hazards regression model was used to estimate the relationship between the probability of therapy initiation and predefined demographic and clinical variables. Results A total of 14,915 patients with dementia (mean age 80.3 years, 34.7% male, 53.3% living in nursing homes) were included in the study. Within two years of the index date, 47.7% of the dementia patients were treated with antipsychotics. Nursing home residence, higher age, organic mental disorders, organic personality disorders, vascular dementia, and private insurance status were positively associated with the initiation of antipsychotic treatment. Conclusion The prevalence of antipsychotic use among dementia patients, particularly those living in nursing homes, is very high. Further studies, including qualitative investigations, are needed to understand and explain the reasons for this prescribing behavior.
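As background for this and several of the abstracts that follow (the formula is standard and is not spelled out in the abstract itself), the Cox proportional hazards model relates the instantaneous rate of the event, here the first antipsychotic prescription after the index date, to covariates such as age and nursing home residence:

$$
h(t \mid x) = h_0(t)\,\exp\bigl(\beta_1 x_1 + \dots + \beta_p x_p\bigr), \qquad \mathrm{HR}_j = e^{\beta_j},
$$

where $h_0(t)$ is an unspecified baseline hazard; a hazard ratio $\mathrm{HR}_j > 1$ means that covariate $j$ is associated with earlier initiation of antipsychotic therapy.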

2020, Vol. 41 (Supplement_2)
Author(s): I.D. Poveda Pinedo, I. Marco Clement, O. Gonzalez, I. Ponz, A.M. Iniesta, ...

Abstract Background Parameters such as peak VO2, the VE/VCO2 slope, and the OUES have previously been described as prognostic in heart failure (HF). The aim of this study was to identify further prognostic factors obtainable from cardiopulmonary exercise testing (CPET) in HF patients. Methods A retrospective analysis of HF patients who underwent CPET from January to November 2019 in a single centre was performed. The PETCO2 gradient was defined as the difference between final PETCO2 and baseline PETCO2. HF events were defined as decompensated HF requiring hospital admission or IV diuretics, or decompensated HF resulting in death. Results A total of 64 HF patients were assessed by CPET; HF events occurred in 8 (12.5%) patients. Baseline characteristics are shown in table 1. Patients with HF events had a negative PETCO2 gradient, whereas patients without events showed a positive PETCO2 gradient (−1.5 [IQR −4.8, 2.3] vs 3 [IQR 1, 5] mmHg; p=0.004). A multivariate Cox proportional hazards regression analysis revealed that the PETCO2 gradient was an independent predictor of HF events (HR 0.74, 95% CI [0.61–0.89]; p=0.002). Kaplan-Meier curves showed a significantly higher incidence of HF events in patients with negative gradients, p=0.002 (figure 1: time to first HF event). Conclusion The PETCO2 gradient was shown to be a prognostic CPET parameter in HF patients in our study. Patients with negative gradients had worse outcomes, experiencing more HF events. Funding Acknowledgement Type of funding source: None
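A minimal sketch of how the PETCO2 gradient and its hazard ratio could be computed; the abstract does not name any software, so the Python lifelines library is used here as a stand-in, and all column names and values are hypothetical:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Tiny illustrative CPET dataset: one row per patient (values are made up).
df = pd.DataFrame({
    "petco2_baseline": [34.0, 31.5, 36.0, 29.0, 33.0, 30.5],  # mmHg at rest
    "petco2_final":    [37.0, 29.5, 41.0, 29.5, 32.5, 31.0],  # mmHg at peak exercise
    "months":          [12.0, 3.0, 12.0, 5.0, 12.0, 2.0],     # follow-up time
    "hf_event":        [0, 1, 0, 1, 0, 1],                    # 1 = decompensated HF event
})

# PETCO2 gradient = final minus baseline value (negative means a fall during exercise).
df["petco2_gradient"] = df["petco2_final"] - df["petco2_baseline"]

# Cox model with the gradient as a continuous predictor; exp(coef) is the hazard
# ratio per 1 mmHg increase in the gradient (the abstract reports HR 0.74).
cph = CoxPHFitter()
cph.fit(df[["petco2_gradient", "months", "hf_event"]],
        duration_col="months", event_col="hf_event")
cph.print_summary()
```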


2021, pp. 1-9
Author(s): Leonard Naymagon, Douglas Tremblay, John Mascarenhas

Data supporting the use of etoposide-based therapy in hemophagocytic lymphohistiocytosis (HLH) arise largely from pediatric studies. There is a lack of comparable data among adult patients with secondary HLH. We conducted a retrospective study to assess the impact of etoposide-based therapy on outcomes in adult secondary HLH. The primary outcome was overall survival. The log-rank test was used to compare Kaplan-Meier distributions of time-to-event outcomes. Multivariable Cox proportional hazards modeling was used to estimate adjusted hazard ratios (HRs) with 95% confidence intervals (CIs). Ninety adults with secondary HLH seen between January 1, 2009, and January 6, 2020, were included. Forty-two patients (47%) received etoposide-based therapy, while 48 (53%) received treatment only for their inciting proinflammatory condition. Thirty-three patients in the etoposide group (72%) and 32 in the no-etoposide group (67%) died during follow-up. Median survival in the etoposide and no-etoposide groups was 1.04 and 1.39 months, respectively. There was no significant difference in survival between the etoposide and no-etoposide groups (log-rank p = 0.4146). On multivariable analysis, there was no association between treatment with etoposide and survival (HR for death with etoposide = 1.067, 95% CI: 0.633–1.799, p = 0.8084). Use of etoposide-based therapy was not associated with improvement in outcomes in this large cohort of adult secondary HLH patients.
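A minimal sketch of the Kaplan-Meier/log-rank comparison described above, using the Python lifelines library as a stand-in for whatever software the authors used; the data and column names are hypothetical:

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

# Hypothetical survival data: months from HLH diagnosis to death or censoring.
df = pd.DataFrame({
    "months":    [1.0, 0.5, 2.5, 1.4, 6.0, 0.8, 3.2, 1.1],
    "died":      [1, 1, 0, 1, 0, 1, 0, 1],      # 1 = death observed, 0 = censored
    "etoposide": [1, 1, 1, 0, 0, 0, 1, 0],      # treatment-group indicator
})
treated = df[df["etoposide"] == 1]
untreated = df[df["etoposide"] == 0]

# Kaplan-Meier estimate of median survival in the etoposide group.
kmf = KaplanMeierFitter()
kmf.fit(treated["months"], event_observed=treated["died"], label="etoposide")
print(kmf.median_survival_time_)

# Log-rank test of the null hypothesis that the two survival curves are equal
# (the abstract reports p = 0.4146 for this comparison).
result = logrank_test(treated["months"], untreated["months"],
                      event_observed_A=treated["died"],
                      event_observed_B=untreated["died"])
print(result.p_value)
```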


Risks, 2021, Vol. 9 (7), pp. 121
Author(s): Beata Bieszk-Stolorz, Krzysztof Dmytrów

The aim of our research was to compare the intensity of the decline and subsequent rise in the value of major stock indices during the SARS-CoV-2 coronavirus pandemic in 2020. The survival analysis methods used to assess the risk of decline and the chance of rise of the indices were the Kaplan–Meier estimator, the logit model, and the Cox proportional hazards model. We observed the highest intensity of decline in the European stock exchanges, followed by the American and the Asian plus Australian ones (after the fourth and eighth week since the peak). The risk of decline was highest in America, then in Europe, followed by Asia and Australia; it was lowest in Africa. The intensity of increase was highest in the fourth and eleventh weeks after the minimum value had been reached. The odds of increase were highest in the American stock exchanges, followed by the European and Asian (including Australia and Oceania) ones, and lowest in the African ones. The odds and intensity of increase in the stock exchange indices varied from continent to continent. The increase was faster than the initial decline.
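A minimal sketch of how a time-to-decline analysis of this kind could be set up; the weekly values, the 80%-of-peak decline threshold, and the lifelines library are all illustrative assumptions, not details taken from the paper:

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical weekly closing values for a few indices, starting at the pre-pandemic peak.
weekly = {
    "index_A": [100, 96, 88, 79, 72, 70, 74, 80],
    "index_B": [100, 97, 93, 90, 86, 84, 85, 87],
}
THRESHOLD = 0.80  # event = index falls below 80% of its peak (illustrative cutoff)

durations, events = [], []
for name, values in weekly.items():
    peak = values[0]
    weeks_below = [w for w, v in enumerate(values) if v < THRESHOLD * peak]
    if weeks_below:
        durations.append(weeks_below[0])   # week of first crossing below the threshold
        events.append(1)                   # decline event observed
    else:
        durations.append(len(values) - 1)  # censored at the end of the observation window
        events.append(0)

# Kaplan-Meier estimate of how long indices "survive" above the threshold.
kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events, label="indices above 80% of peak")
print(kmf.survival_function_)
```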


BMC Nutrition, 2021, Vol. 7 (1)
Author(s): Akiko Nakanishi, Erika Homma, Tsukasa Osaki, Ri Sho, Masayoshi Souri, ...

Abstract Background Dairy products are known as health-promoting foods. This study prospectively examined the association between milk and yogurt intake and mortality in a community-based population. Methods The study population comprised 14,264 subjects aged 40–74 years who participated in an annual health checkup. The frequency of yogurt and milk intake was categorized as none (< 1/month), low (< 1/week), moderate (1–6/week), and high (> 1/day). The association between yogurt and milk intake and total, cardiovascular, and cancer-related mortality was determined using the Cox proportional hazards model. Results During the follow-up period, there were 265 total deaths, 40 cardiovascular deaths, and 90 cancer-related deaths. Kaplan–Meier analysis showed that total mortality in the high/moderate/low yogurt intake and moderate/low milk intake groups was lower than in the no-intake groups (log-rank, P < 0.01). In the multivariate Cox proportional hazards analysis adjusted for possible confounders, the hazard ratio (HR) for total mortality was significantly decreased in the high/moderate yogurt intake groups (HR: 0.62, 95% confidence interval [CI]: 0.42–0.91 for high intake; HR: 0.70, 95% CI: 0.49–0.99 for moderate intake) and the moderate milk intake group (HR: 0.67, 95% CI: 0.46–0.97) compared with the groups with no yogurt or milk intake. A similar association was observed for cancer-related mortality, but not for cardiovascular mortality. Conclusions Our study showed that yogurt and milk intake was independently associated with a decrease in total and cancer-related mortality in this Japanese population.
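A minimal sketch of a Cox model with a multi-level intake variable, dummy-coded against the no-intake reference group; the column names, the tiny dataset, and the single confounder are illustrative stand-ins for the study's actual adjustment set:

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical follow-up data with a four-level yogurt-intake category.
df = pd.DataFrame({
    "years":  [4.0, 2.5, 3.1, 1.8, 5.2, 8.0, 8.0, 8.0, 8.0, 8.0],
    "death":  [1, 1, 1, 1, 1, 0, 0, 0, 0, 0],
    "yogurt": ["high", "none", "moderate", "none", "low",
               "moderate", "none", "high", "low", "none"],
    "age":    [68, 72, 55, 75, 60, 66, 58, 70, 52, 49],  # one illustrative confounder
})

# Dummy-code intake with "none" as the reference category.
dummies = (pd.get_dummies(df["yogurt"], prefix="yogurt")
             .drop(columns="yogurt_none")
             .astype(float))
model_df = pd.concat([df[["years", "death", "age"]], dummies], axis=1)

# Each exp(coef) is the hazard ratio for that intake level versus no intake,
# adjusted for the other covariates (the abstract reports HR 0.62 for high intake).
cph = CoxPHFitter()
cph.fit(model_df, duration_col="years", event_col="death")
cph.print_summary()
```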


Author(s): Majdi Imterat, Tamar Wainstock, Eyal Sheiner, Gali Pariente

Abstract Recent evidence suggests that a long inter-pregnancy interval (IPI: the time interval between a live birth and the estimated time of conception of the subsequent pregnancy) poses a risk for adverse short-term perinatal outcomes. We aimed to study the effect of short (<6 months) and long (>60 months) IPIs on long-term cardiovascular morbidity of the offspring. A population-based cohort study was performed in which all singleton live births in parturients with at least one previous birth were included. Hospitalizations of the offspring up to the age of 18 years involving cardiovascular disease were evaluated according to IPI length. An intermediate interval, between 6 and 60 months, was considered the reference. Kaplan–Meier survival curves were used to compare the cumulative morbidity incidence between the groups, and a Cox proportional hazards model was used to control for confounders. During the study period, 161,793 deliveries met the inclusion criteria. Of them, 14.1% (n = 22,851) occurred in parturients following a short IPI, 78.6% (n = 127,146) following an intermediate IPI, and 7.3% (n = 11,796) following a long IPI. Total hospitalizations of the offspring involving cardiovascular morbidity were comparable between the groups. The Kaplan–Meier survival curves demonstrated similar cumulative incidences of cardiovascular morbidity in all groups. In a Cox proportional hazards model, short and long IPIs did not emerge as independent risk factors for later pediatric cardiovascular morbidity of the offspring (adjusted HR 0.97, 95% CI 0.80–1.18; adjusted HR 1.01, 95% CI 0.83–1.37, for short and long IPI, respectively). In our population, extreme IPIs do not appear to affect long-term cardiovascular hospitalizations of offspring.


2020, Vol. 41 (Supplement_2)
Author(s): S. Kochav, R.C. Chen, J.M.D. Dizon, J.A.R. Reiffel

Abstract Background Theoretical concern exists regarding AV block (AVB) with class I antiarrhythmics (AADs) when bundle branch block (BBB) is present. Whether this is substantiated in real-world populations is unknown. Purpose To determine the relationship between type of AAD and incidence of AVB in patients with preexisting BBB. Methods We retrospectively studied all patients with BBB who received class I and III AADs between 1997–2019 to compare the incidence of AVB. We defined the index time as first exposure to either drug class and excluded patients with prior AVB or exposure to both classes. The time-at-risk window ended at the first outcome occurrence or when patients were no longer observed in the database. We estimated hazard ratios for incident AVB using Cox proportional hazards models with propensity score stratification, adjusting for over 32,000 covariates from the electronic health record. Kaplan-Meier methods were used to determine treatment effects over time. Results Of 40,120 individuals with BBB, 148 were exposed to a class I AAD and 2,401 to a class III AAD. Over nearly 4,200 person-years of follow-up, there were 22 and 620 outcome events in the class I and class III cohorts, respectively (Figure). In adjusted analyses, AVB risk was markedly lower in patients exposed to class I AADs compared with class III (HR 0.48 [95% CI 0.30–0.75]). Conclusion Among patients with BBB, exposure to class III AADs was strongly associated with a greater risk of incident AVB. This likely reflects differences in the natural history of patients receiving class I vs class III AADs rather than adverse class III effects; however, the lack of worse outcomes acutely with class I AADs suggests that they may be safer in BBB than suspected. Funding Acknowledgement Type of funding source: None
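A minimal sketch of a propensity-score-stratified Cox analysis of the kind described above, with only two illustrative covariates (the study adjusted for over 32,000) and hypothetical column names; lifelines and scikit-learn stand in for the authors' actual tooling:

```python
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

# Hypothetical cohort: exposure to a class I (1) vs class III (0) AAD, plus covariates.
df = pd.DataFrame({
    "class_I":  [1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 1, 0],
    "age":      [70, 62, 75, 58, 68, 73, 60, 77, 65, 71, 66, 59],
    "diabetes": [0, 1, 0, 1, 1, 0, 1, 0, 1, 0, 0, 1],
    "years":    [2.5, 1.0, 3.0, 2.0, 0.8, 1.6, 2.8, 0.5, 2.2, 3.0, 1.2, 2.6],
    "avb":      [0, 1, 0, 1, 1, 1, 0, 1, 1, 0, 0, 1],   # incident AV block
})

# Step 1: propensity score = probability of class I exposure given the covariates.
ps_model = LogisticRegression().fit(df[["age", "diabetes"]], df["class_I"])
df["ps"] = ps_model.predict_proba(df[["age", "diabetes"]])[:, 1]

# Step 2: bin the propensity score into strata and fit a Cox model with a
# separate baseline hazard per stratum; exp(coef) for class_I plays the role
# of the reported adjusted hazard ratio.
df["ps_stratum"] = pd.qcut(df["ps"], q=2, labels=False)
cph = CoxPHFitter()
cph.fit(df[["class_I", "years", "avb", "ps_stratum"]],
        duration_col="years", event_col="avb", strata=["ps_stratum"])
cph.print_summary()
```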


2021, Vol. 39 (15_suppl), pp. 4142-4142
Author(s): Lucy Xiaolu Ma, Gun Ho Jang, Amy Zhang, Robert Edward Denroche, Anna Dodd, ...

Background: KRAS mutations (m) (KRASm) are present in over 90% of pancreatic adenocarcinomas (PDAC), with a predominance of G12 substitutions. KRAS wildtype (WT) PDAC relies on alternate oncogenic drivers, and the prognostic impact of these remains unknown. We evaluated alterations in WT PDAC and explored the impact of specific KRASm and WT status on survival. Methods: WGS and RNAseq were performed on 570 patients (pts) ascertained through our translational research program from 2012-2021, of which 443 were included for overall survival (OS) analyses. This included 176 pts with resected and 267 pts with advanced PDAC enrolled on the COMPASS trial (NCT02750657). The latter cohort underwent biopsies prior to treatment with first line gemcitabine-nab-paclitaxel or mFOLFIRINOX as per physician choice. The Kaplan-Meier and Cox proportional hazards methods were used to estimate OS. Results: KRAS WT PDAC (n = 52) represented 9% of pts, and these cases tended to be younger than pts with KRASm (median age 61 vs 65 years, p = 0.1). In resected cases, the most common alterations in WT PDAC (n = 23) included GNASm (n = 6) and BRAFm/fusions (n = 5). In advanced WT PDAC (n = 27), alterations in BRAF (n = 11) and ERBB2/3/4 (n = 6) were most prevalent. Oncogenic fusions (NTRK, NRG1, BRAF/RAF, ROS1, others) were identified in 9 pts. The BRAF in-frame deletion p.486_491del represented the most common single variant in WT PDAC, with organoid profiling revealing sensitivity to both 3rd generation BRAF inhibitors and MEK inhibition. In resected PDAC, multivariable analyses documented higher stage (p = 0.043), lack of adjuvant chemotherapy (p < 0.001), and the KRAS G12D variant (p = 0.004) as poor prognostic variables. In advanced disease, neither WT PDAC nor KRAS-specific alleles had an impact on prognosis (median OS WT = 8.5 mths, G12D = 8.2, G12V = 10.0, G12R = 12.0, others = 9.2, p = 0.73); the basal-like RNA subtype conferred inferior OS (p < 0.001). A targeted therapeutic approach following first line chemotherapy was undertaken in 10% of pts with advanced PDAC: MMRd (n = 1), homologous recombination deficiency (HRD) (n = 19), KRASG12C (n = 1), CDK4/6 amplification (n = 3), ERBB family alterations (n = 2), BRAF variants (n = 2). OS in this group was superior (14.7 vs 8.8 mths, p = 0.04), mainly driven by HRD-PDAC, where KRASm were present in 89%. Conclusions: In our dataset, KRAS G12D was associated with inferior OS in resected PDAC; however, KRAS mutational status was not prognostic in advanced disease. This suggests that improved OS in the WT PDAC population can only be achieved if there is accelerated access to targeted drugs for pts.


2021, Vol. 8
Author(s): Xuejin Gao, Li Zhang, Siwen Wang, Yaqin Xiao, Deshuai Song, ...

Background: Patients with short bowel syndrome (SBS) are at high risk of cholestasis or cholelithiasis. This study aimed to determine the incidence, risk factors, and clinical consequences of cholelithiasis in adults with SBS over an extended period. Methods: All eligible adults diagnosed with SBS and admitted to a tertiary hospital center between January 2010 and December 2019 were retrospectively identified from the hospital records database. Kaplan–Meier analysis was used to estimate the cumulative incidence of cholelithiasis during the 10-year period. To assess risk factors for cholelithiasis, we used a multivariate Cox proportional hazards model estimating hazard ratios (HRs) with 95% confidence intervals (95% CIs). Results: This study enrolled 345 eligible patients with SBS. Kaplan–Meier analysis revealed that 72 patients (20.9%) developed cholelithiasis during the 10-year observation period. Multivariate analysis using the Cox proportional hazards model revealed that a remnant jejunum (HR = 2.163; 95% CI: 1.156–4.047, p = 0.016) and parenteral nutrition dependence (HR = 1.783; 95% CI: 1.077–2.952, p = 0.025) were independent risk factors for cholelithiasis in adults with SBS. Twenty-eight patients developed symptoms and/or complications in the cholelithiasis group. The proportions of acute cholecystitis or cholangitis and of acute pancreatitis were significantly higher in the cholelithiasis group than in the non-cholelithiasis group (31.9 vs. 7.7%, p < 0.01; and 6.9 vs. 1.1%, p = 0.003, respectively). Conclusion: Because of the adverse clinical consequences of cholelithiasis, adult patients with SBS should be closely monitored, and preventive interventions should be considered. Clinical Trial Registration: www.ClinicalTrials.gov, identifier: NCT04867538.


2021, Vol. 21 (1)
Author(s): Judy Tung, Musarrat Nahid, Mangala Rajan, Lia Logio

Abstract Background Academic medical centers invest considerably in faculty development efforts to support the career success and promotion of their faculty and to minimize faculty attrition. This study evaluated the impact of a faculty development program called the Leadership in Academic Medicine Program (LAMP) on participants' (1) self-ratings of efficacy, (2) promotion in academic rank, and (3) institutional retention. Method Participants from the 2013–2020 LAMP cohorts were surveyed before and after the program to assess their level of agreement with statements spanning the domains of self-awareness, self-efficacy, and satisfaction with work and the work environment. Pre- and post-program responses were compared using McNemar's tests. Changes in scores across gender were compared using Wilcoxon rank-sum/Mann-Whitney tests. LAMP participants were matched to nonparticipant controls by gender, rank, department, and time of hire to compare promotions in academic rank and departures from the organization. Kaplan-Meier curves and Cox proportional hazards models were used to examine differences. Results There were significant improvements in almost all self-ratings on the program surveys (p < 0.05). The greatest improvements were seen in "understand the promotions process" (36% vs. 94%), "comfortable negotiating" (35% vs. 74%), and "time management" (55% vs. 92%). There were no statistically significant differences in improvements by gender; however, women faculty rated themselves lower on all pre-program items compared with men. A significant difference was found in time to next promotion (p = 0.003) between LAMP participants and controls. Kaplan-Meier analysis demonstrated that LAMP faculty achieved their next promotion more often and faster than controls. Cox proportional hazards analyses found that LAMP faculty were 61% more likely to be promoted than controls (hazard ratio [HR] 1.61, 95% confidence interval [CI] 1.16–2.23, p = 0.004). A significant difference was also found in time to departure (p < 0.0001), with LAMP faculty retained more often and for longer periods. LAMP faculty were 77% less likely to leave compared with controls (HR 0.23, 95% CI 0.16–0.34, p < 0.0001). Conclusions LAMP is an effective faculty development program as measured subjectively by participant self-ratings and objectively through comparative improvements in academic promotions and institutional retention.


2015, Vol. 25 (4), pp. 751-757
Author(s): Hitoshi Hareyama, Kenichi Hada, Kumiko Goto, Sawako Watanabe, Minako Hakoyama, ...

Objective Lower extremity lymphedema (LEL) is a major long-term complication of radical surgery. We aimed to estimate the incidence and grading of LEL in women who underwent lymphadenectomy and to evaluate risk factors associated with LEL. Materials and Methods We retrospectively reviewed 358 patients with cervical, endometrial, and ovarian cancer who underwent transabdominal complete systematic pelvic and para-aortic lymphadenectomy between 1997 and 2011. Lower extremity lymphedema was graded according to the criteria of the International Society of Lymphology. The incidence of LEL and its correlation with various clinical characteristics were investigated using Kaplan-Meier survival and Cox proportional hazards methods. Results The overall incidence of LEL was 21.8% (stage 1, 60%; stage 2, 32%; and stage 3, 8%). Cumulative incidence increased with the observation period: 12.9% at 1 year, 20.3% at 5 years, and 25.4% at 10 years. Age, cancer type, stage (International Federation of Gynecology and Obstetrics), body mass index, hysterectomy type, lymphocyst formation, lymph node metastasis, and chemotherapy were not associated with LEL. Multivariate analysis confirmed that removal of circumflex iliac lymph nodes (hazard ratio [HR], 4.28; 95% confidence interval [CI], 2.09–8.77; P < 0.0001), cellulitis (HR, 3.48; 95% CI, 2.03–5.98; P < 0.0001), and the number of removed lymph nodes (HR, 0.99; 95% CI, 0.98–0.99; P = 0.038) were independent risk factors for LEL. Conclusions Postoperative LEL incidence increased over time. The present study showed a significant association of removal of circumflex iliac lymph nodes and cellulitis with the incidence of LEL. Multicenter or prospective studies are required to clarify treatment efficacies.
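A minimal sketch of how time-point cumulative incidences such as those above (12.9% at 1 year, 20.3% at 5 years, 25.4% at 10 years) can be read off a Kaplan-Meier fit; the data are hypothetical and the lifelines library is an assumed stand-in for the authors' software:

```python
import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical follow-up: years from lymphadenectomy to LEL onset or censoring.
df = pd.DataFrame({
    "years": [0.5, 2.0, 4.5, 7.0, 9.5, 10.0, 3.0, 10.0],
    "lel":   [1,   1,   0,   1,   1,   0,    1,   0],   # 1 = lymphedema observed
})

kmf = KaplanMeierFitter()
kmf.fit(df["years"], event_observed=df["lel"], label="LEL-free")

# Cumulative incidence at a fixed time point = 1 - survival probability at that time.
for t in (1, 5, 10):
    incidence = 1 - kmf.survival_function_at_times(t).iloc[0]
    print(f"cumulative incidence at {t} years: {incidence:.1%}")
```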

