Benefits of a Secondary Prevention Program in Suicide

Crisis ◽  
2016 ◽  
Vol 37 (4) ◽  
pp. 281-289 ◽  
Author(s):  
Adriana Farré ◽  
Maria J. Portella ◽  
Luis De Angel ◽  
Ana Díaz ◽  
Javier de Diego-Adeliño ◽  
...  

Abstract. Background: The effectiveness of suicide intervention programs has not been assessed with experimental designs. Aim: To determine the risk of suicide reattempts in patients engaged in a secondary prevention program. Method: We included 154 patients with suicidal behavior in a quasi-experimental study with a nontreatment concurrent control group. In all, 77 patients with suicidal behavior underwent the Suicide Behavior Prevention Program (SBPP), which includes specialized early assistance during a period of 3–6 months. A matched sample of 77 patients with suicidal behavior was selected that did not undergo any specific suicide prevention program. Data on sociodemographics, clinical characteristics, and suicidal behavior were collected at baseline (before SBPP) and at 12 months. Results: After 12 months, SBPP patients showed a 67% lower relative risk of reattempt (χ2 = 11.75, p = .001; RR = 0.33, 95% CI = 0.17–0.66). Cox proportional hazards models revealed that patients under SBPP made a new suicide attempt significantly later than control patients did (hazard ratio = 0.293, 95% CI = 0.138–0.624, p = .001). The effect was even stronger among first attempters. Limitations: Sampling was naturalistic and patients were not randomized. Conclusion: The SBPP was effective in delaying and preventing suicide reattempts, at least within the first year after the index suicidal behavior. In light of our results, implementation of suicide prevention programs is strongly advisable.
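
The reported relative risk and χ² come from a standard 2×2 contingency analysis. A minimal sketch: the cell counts below are not taken from the paper; they are back-calculated to be consistent with the reported figures (77 patients per arm, RR ≈ 0.33, χ² ≈ 11.75) and are illustrative only.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table: rows = (SBPP, control), cols = (reattempt, no reattempt).
# Counts are illustrative, chosen only to match the reported summary statistics.
table = np.array([[9, 68],
                  [27, 50]])

risk_sbpp = table[0, 0] / table[0].sum()   # reattempt risk in the SBPP arm
risk_ctrl = table[1, 0] / table[1].sum()   # reattempt risk in the control arm
rr = risk_sbpp / risk_ctrl                 # relative risk

chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"RR = {rr:.2f}, chi2 = {chi2:.2f}, p = {p:.4f}")
```

Note that the uncorrected χ² (no Yates continuity correction) is what reproduces the reported value here.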

2019 ◽  
Vol 8 (4) ◽  
pp. 398-406 ◽  
Author(s):  
Elena Izkhakov ◽  
Joseph Meyerovitch ◽  
Micha Barchana ◽  
Yacov Shacham ◽  
Naftali Stern ◽  
...  

Objective Thyroid cancer (TC) survivors may be at risk of subsequent cardiovascular and cerebrovascular (CaV&CeV) morbidity. The 2009 American Thyroid Association (ATA) guidelines recommended less aggressive treatment for low-risk TC patients. The aim of this study was to assess the atherosclerotic CaV&CeV outcomes of Israeli TC survivors compared with individuals with no thyroid disease, and to compare outcomes before (2000–2008) and after (2009–2011) implementation of the 2009 ATA guidelines. Methods All members of the largest Israeli healthcare organization who were diagnosed with TC from 1/2000 to 12/2014 (study group) and age- and sex-matched members with no thyroid disease (controls) were included. Adjusted hazard ratios (HRs) and 95% confidence intervals (95% CIs) were calculated using Cox proportional hazards models. Results The mean follow-up was 7.6 ± 4.2 and 7.8 ± 4.1 years for the study (n = 5,677, 79% women) and control (n = 23,962) groups, respectively. The former had an increased risk of new atherosclerotic CaV&CeV events (adjusted HR 1.26, 95% CI 1.15–1.39). The 5-year incidence of CaV&CeV was lower (adjusted HR 0.49, 95% CI 0.38–0.62) from 2009 to 2011 compared with 2000 to 2008, but remained higher in the study group than in the control group (adjusted HR 1.5, 95% CI 1.14–1.69). Conclusions This large Israeli population-based cohort study showed greater atherosclerotic CaV&CeV morbidity in TC survivors than in individuals with no thyroid disease. There was a trend toward a decreased 5-year incidence of atherosclerotic CaV&CeV events among TC survivors following implementation of the 2009 ATA guidelines, but the incidence remained higher than in the general population.


2021 ◽  
Author(s):  
Je Hun Song ◽  
Hyuk Huh ◽  
Eunjin Bae ◽  
Jeonghwan Lee ◽  
Jung Pyo Lee ◽  
...  

Abstract Background: Hyperhomocysteinemia (HHcy) is considered a risk factor for cardiovascular disease (CVD) as well as for chronic kidney disease (CKD). In this study, we investigated the association between serum homocysteine (Hcy) level and mortality according to the presence of CKD. Methods: Our study included data from 9,895 participants in the 1996–2016 National Health and Nutrition Examination Surveys (NHANES). Linked mortality data were also included, and participants were classified into four groups according to Hcy level. Multivariable-adjusted Cox proportional hazards models using propensity scores were used to examine dose-response associations between Hcy level and mortality. Results: Of the 9,895 participants, 1,032 (21.2%) were diagnosed with CKD. In a multivariate Cox regression analysis including all participants, Hcy level was associated with all-cause mortality compared with the 1st quartile in Model 3 (2nd quartile: hazard ratio (HR) 1.751, 95% confidence interval (CI) 1.348-2.274, p<0.001; 3rd quartile: HR 2.220, 95% CI 1.726-2.855, p<0.001; 4th quartile: HR 3.776, 95% CI 2.952-4.830, p<0.001). In the non-CKD group, there was a significant association with all-cause mortality; however, this finding was not observed in the CKD group. The observed pattern was similar after propensity score matching. In the non-CKD group, overall mortality increased in proportion to Hcy concentration (2nd quartile: HR 2.195, 95% CI 1.299-3.709, p = 0.003; 3rd quartile: HR 2.607, 95% CI 1.570-4.332, p<0.001; 4th quartile: HR 3.720, 95% CI 2.254-6.139, p<0.001). However, the risk of all-cause mortality according to the quartile of Hcy level did not increase in the CKD group. Conclusion: This study found an association between Hcy level and mortality only in the non-CKD group. This altered risk-factor pattern may be attributed to the protein-energy wasting or chronic inflammation that accompanies CKD.
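
Several of the studies in this listing estimate hazard ratios with Cox models. As a sketch of what such an estimate is, the following minimal single-covariate Cox fit maximizes the partial likelihood (Breslow handling of ties) and returns the HR as exp(β). The data are a hypothetical toy example, not the NHANES sample, and this ignores the multivariable adjustment the studies use.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def cox_beta(times, events, x):
    """Maximum partial-likelihood estimate of beta for a single 0/1 covariate
    (Breslow handling of ties); the hazard ratio is exp(beta)."""
    times, events, x = (np.asarray(a, dtype=float) for a in (times, events, x))

    def neg_log_pl(b):
        ll = 0.0
        for i in np.where(events == 1)[0]:
            risk = times >= times[i]          # risk set at this event time
            ll += b * x[i] - np.log(np.sum(np.exp(b * x[risk])))
        return -ll

    return minimize_scalar(neg_log_pl, bounds=(-5.0, 5.0), method="bounded").x

# Hypothetical follow-up data: exposed subjects (x = 1) tend to fail earlier
t = [1, 2, 3, 4, 5, 6, 7, 8]
e = [1, 1, 1, 1, 1, 1, 1, 1]   # 1 = event, 0 = censored
x = [1, 0, 1, 0, 1, 0, 1, 0]
beta = cox_beta(t, e, x)
print(f"estimated HR = {np.exp(beta):.2f}")
```

With interleaved failure times the estimate is finite and positive, i.e. an HR above 1 for the exposed group.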


2020 ◽  
Vol 38 (15_suppl) ◽  
pp. e21071-e21071
Author(s):  
Matthew C Lee ◽  
Dimitre C Stefanov ◽  
Mallorie B Angert ◽  
Erica C Cohn ◽  
Nina Kohn ◽  
...  

e21071 Background: Stage I patients (pts) have 5-year survival ranging from 50% to 75%, suggesting heterogeneity within the stage. While the American Joint Committee on Cancer 8th edition upstages tumors with visceral pleural invasion (VPI) to IB, other histological features, namely lymphovascular invasion (LVI), micropapillary pattern (MIP), spread through air spaces (STAS) & neuroendocrine differentiation (NE), may also affect prognosis. This retrospective single-institution study evaluated the influence of these factors, along with the pt variables age, gender, smoking, Charlson comorbidity index (CCI) & chemotherapy (CT), on recurrence & mortality. Methods: 351 resected stage I cases from 2015-2019 were included. Data were summarized as means (standard deviation/SD) or percentages. Associations between variables & outcomes (measured from diagnosis till event or last visit if no event) were investigated using univariate & multiple Cox proportional hazards models. Survival curves were compared using the log-rank test when the proportional hazards assumption was not satisfied. All predictors were included in the multiple Cox regression models based on their clinical importance. P < 0.05 was considered statistically significant. SAS 9.4 (SAS Institute, Cary, NC) was used for the analysis. Results: Mean age was 69.62 years (9.83). The majority were female (57.3%), smokers (76.9%), & had adenocarcinoma (AC) (78.6%). 39% had COPD & mean CCI was 6.3 (1.74). 193 (55%) pts had lobectomy or a larger procedure while 158 (45%) had sub-lobar resection. 45 (12.8%) pts received CT. Recurrence & death occurred in 33 (9.4%) & 15 (4.3%) pts, respectively. Univariate models indicated higher recurrence risk with NE (HR = 4.18, 95% CI 1.47-11.9, p = 0.0075), LVI (HR = 2.68, 95% CI 1.03-6.94, p = 0.0423), COPD (HR = 3.28, 95% CI 1.56-6.9, p = 0.0017), age (HR = 1.05, 95% CI 1.01-1.09, p = 0.0212), & CCI (HR = 1.57, 95% CI 1.35-1.83, p < .0001).
CT was also associated with increased recurrence risk (HR = 8.61, 95% CI 4.28-17.33, p < .0001). The multivariable model for recurrence retained significance for CT & CCI. Age (HR = 1.07, 95% CI 1.01-1.14, p = 0.0312) & CCI (HR = 1.27, 95% CI 1.02-1.59, p = 0.0347) were associated with mortality in univariate models. Multivariate analysis for mortality was not feasible owing to the small number of events. Conclusions: Histological features other than VPI may be associated with recurrence. Pts who received CT had increased recurrence, but they possibly had multiple risk factors or other adverse features not assessed here. Limitations include the retrospective design, limited sample size & small number of events.
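
The log-rank test used above when proportional hazards fail reduces to an observed-versus-expected calculation over the pooled event times. A hand-rolled sketch on hypothetical recurrence times (months), not the study's data:

```python
import numpy as np

def logrank_chi2(t1, e1, t2, e2):
    """Two-sample log-rank statistic (1 df); e = 1 event, 0 censored."""
    t1, e1, t2, e2 = (np.asarray(a) for a in (t1, e1, t2, e2))
    event_times = np.unique(np.concatenate([t1[e1 == 1], t2[e2 == 1]]))
    o1 = exp1 = var = 0.0
    for t in event_times:
        n1, n2 = np.sum(t1 >= t), np.sum(t2 >= t)   # at risk in each group
        n = n1 + n2
        d1 = np.sum((t1 == t) & (e1 == 1))
        d = d1 + np.sum((t2 == t) & (e2 == 1))      # total events at t
        o1 += d1                                     # observed in group 1
        exp1 += d * n1 / n                           # expected under H0
        if n > 1:                                    # hypergeometric variance
            var += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return (o1 - exp1) ** 2 / var

# Hypothetical data: group 1 recurs earlier than group 2
chi2 = logrank_chi2([1, 2], [1, 1], [3, 4], [1, 1])
print(f"chi2 = {chi2:.3f}")
```

The statistic is referred to a χ² distribution with 1 degree of freedom; in practice one would use a survival library rather than this hand computation.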


2018 ◽  
Vol 47 (5) ◽  
pp. 317-324 ◽  
Author(s):  
Xiaoxiao Yang ◽  
Yijing Tong ◽  
Hao Yan ◽  
Zhaohui Ni ◽  
Jiaoqi Qian ◽  
...  

Background: To evaluate the predictive value of dialysate interleukin-6 (IL-6), a marker of local subclinical intraperitoneal inflammation, for the development of peritonitis in continuous ambulatory peritoneal dialysis (CAPD) patients. Methods: Stable prevalent CAPD patients were enrolled in this prospective study. The IL-6 concentration in the overnight effluent was determined and expressed as the IL-6 appearance rate (IL-6AR). Patients were divided into 2 groups according to the median IL-6AR and prospectively followed up until the first episode of peritonitis, cessation of PD, or the end of the study (December 30, 2017). The utility of IL-6AR in predicting peritonitis-free survival was analyzed using Kaplan-Meier and Cox proportional hazards models. Results: A total of 149 patients were enrolled, including 72 males (48%), with mean age 52.0 ± 13.6 years and median PD duration 26 (5.9–45.5) months. During follow-up, 7,923 patient-months were observed and 154 episodes of peritonitis occurred in 82 patients. Previous peritonitis episodes were significantly associated with log dialysate IL-6AR levels (β = 0.187 [0.022–0.299], p = 0.023). Patients in the high IL-6AR group showed significantly inferior peritonitis-free survival compared with their counterparts in the low IL-6AR group (48.8 vs. 67.7 months, p = 0.026), as well as a higher percentage of peritonitis treatment failure (20.3 vs. 9.3%, p = 0.049). A multivariate Cox regression showed that high dialysate IL-6AR (hazard ratio [HR] 1.247 [1.052–1.478]; p = 0.011) and high serum C-reactive protein (HR 1.072 [1.005–1.144]; p = 0.036) were independent risk factors for inferior peritonitis-free survival. Conclusion: This prospective study suggested that dialysate IL-6 level, a marker of intraperitoneal inflammation, might be a potential predictor of peritonitis development in patients undergoing PD.


2021 ◽  
pp. 152692482110246
Author(s):  
Amanda Vinson ◽  
Alyne Teixeira ◽  
Bryce Kiberd ◽  
Karthik Tennankore

Background: Leukopenia occurs frequently following kidney transplantation and is associated with adverse clinical outcomes, including increased infectious risk. In this study we sought to characterize the causes and complications of leukopenia following kidney transplantation. Methods: In a cohort of adult patients (≥18 years) who underwent kidney transplant from Jan 2006 to Dec 2017, we used univariable Cox proportional hazards models to identify predictors of post-transplant leukopenia (WBC < 3,500/mm3). Factors associated with post-transplant leukopenia were then included in a multivariable backwards stepwise selection process to create a prediction model for the outcome of interest. Cox regression analyses were subsequently used to determine whether post-transplant leukopenia was associated with complications. Results: Of 388 recipients, 152 (39%) developed post-transplant leukopenia. Factors associated with leukopenia included antithymocyte globulin as induction therapy (HR 3.32, 95% CI 2.25-4.91), valganciclovir (HR 1.84, 95% CI 1.25-2.70), tacrolimus (HR 3.05, 95% CI 1.08-8.55), prior blood transfusion (HR 1.17 per unit, 95% CI 1.09-1.25), and donor age (HR 1.02 per year, 95% CI 1.00-1.03). Cytomegalovirus infection occurred in 26 patients with leukopenia (17.1%). Other than cytomegalovirus infection, leukopenia was not associated with post-transplant complications. Conclusion: Leukopenia occurred commonly post-transplant and was associated with both modifiable and non-modifiable pre-transplant factors.


Rheumatology ◽  
2019 ◽  
Vol 59 (8) ◽  
pp. 1889-1897 ◽  
Author(s):  
So Young Kim ◽  
Min Chanyang ◽  
Dong Jun Oh ◽  
Hyo Geun Choi

Abstract Objective To investigate the bidirectional relationship between RA and depression. Methods Data from the Korean Health Insurance Review and Assessment Service – National Sample Cohort from 2002 to 2013 were analysed. Patients ≥20 years of age were included. Study I was conducted with 38 087 depression patients and 152 348 matched control participants. Study II was conducted with 7385 RA patients and 29 540 matched control participants. Stratified Cox proportional hazards models were used to estimate the hazard ratios (HRs) for RA in depression patients (study I) and for depression in RA patients (study II). The data were adjusted for the Charlson comorbidity index, excluding rheumatic disease. Subgroups were also analysed according to age and sex. Results A total of 0.7% (1260/38 087) of the depression group and 0.6% (883/152 348) of the control I group had RA (P = 0.02). The HR for RA in the depression group was not significantly higher than that in the control I group. In study II, 5.5% (408/7385) of the RA group and 4.3% (1246/29 540) of the control II group presented with depression (P < 0.001). The RA patients showed an adjusted depression HR that was 1.20 times higher (95% CI 1.07, 1.34; P = 0.002) than that of the control group. The subgroups of RA patients aged >30 years and of women showed higher depression HRs than the corresponding control subgroups. Conclusion RA increased the risk of depression; however, depression did not increase the risk of RA in the Korean adult population.


2021 ◽  
Vol 10 (7) ◽  
pp. 1466
Author(s):  
Den-Ko Wu ◽  
Kai-Shan Yang ◽  
James Cheng-Chung Wei ◽  
Hei-Tung Yip ◽  
Renin Chang ◽  
...  

The potential association between appendectomy and non-typhoidal Salmonella (NTS) infection has not been elucidated. We hypothesized that appendectomy may be associated with gut vulnerability to NTS. Data were retrospectively collected from the Taiwan National Health Insurance Research Database to describe the incidence rates of NTS infection requiring hospital admission among patients with and without an appendectomy. A total of 208,585 individuals aged ≥18 years with an appendectomy were enrolled from January 2000 to December 2012 and compared with a control group of 208,585 individuals who had never undergone appendectomy, matched 1:1 by propensity score on index year, age, sex, occupation, and comorbidities. Appendectomy was defined by International Classification of Diseases, Ninth Revision, Clinical Modification procedure codes. The main outcome was hospitalization for NTS infection. Cox proportional hazards models were applied to estimate the hazard ratios (HRs) and 95% confidence intervals (CIs). Two sensitivity analyses were conducted for cross-validation. Of the 417,170 participants (215,221 (51.6%) male), 208,585 (50.0%) had an appendectomy, and 112 developed NTS infection requiring hospitalization. In the fully adjusted multivariable Cox proportional hazards regression model, the appendectomy group had an increased risk of NTS infection (adjusted HR (aHR), 1.61; 95% CI, 1.20–2.17). Females and individuals aged 18 to 30 years with a history of appendectomy had a significantly higher risk of NTS than the control group (aHR, 1.92; 95% CI, 1.26–2.93 and aHR, 2.67; 95% CI, 1.41–5.07, respectively). In this study, appendectomy was positively associated with subsequent hospitalization for NTS. The mechanism behind this association remains uncertain, and further studies are needed to clarify the interactions between appendectomy and NTS.
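
The 1:1 propensity-score matching described above can be sketched as a greedy nearest-neighbour pairing within a caliper. The scores below are hypothetical stand-ins; in practice they would come from a logistic model of appendectomy on index year, age, sex, occupation, and comorbidities.

```python
import numpy as np

def greedy_match(ps_treated, ps_control, caliper=0.05):
    """1:1 greedy nearest-neighbour matching on the propensity score.
    Returns (treated_index, control_index) pairs; each control is used once,
    and a treated subject is dropped if no control lies within the caliper."""
    ps_control = np.asarray(ps_control, dtype=float)
    used = np.zeros(len(ps_control), dtype=bool)
    pairs = []
    for i, p in enumerate(ps_treated):
        d = np.abs(ps_control - p)
        d[used] = np.inf                 # already-matched controls unavailable
        j = int(np.argmin(d))
        if d[j] <= caliper:
            pairs.append((i, j))
            used[j] = True
    return pairs

# Hypothetical propensity scores for two treated and three control subjects
print(greedy_match([0.20, 0.80], [0.19, 0.50, 0.81]))
```

Greedy matching is order-dependent; optimal matching algorithms avoid that at higher computational cost.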


2017 ◽  
Vol 35 (6_suppl) ◽  
pp. 181-181 ◽  
Author(s):  
David Frazier Jarrard ◽  
Yu-Hui Chen ◽  
Glenn Liu ◽  
Michael Anthony Carducci ◽  
Mario A. Eisenberger ◽  
...  

181 Background: To evaluate whether metformin (Met), a widely used, nontoxic oral antidiabetic drug with putative anticancer properties, leads to improvements in prostate cancer (PC) outcomes in the CHAARTED trial. Methods: In the CHAARTED database, where metformin use at baseline was recorded prospectively, we identified patients with metastatic PC who underwent either ADT alone or ADT plus docetaxel (D) chemotherapy. Cox proportional hazards models were used to determine the effect of metformin on outcomes. Results: A total of 788 patients (median age, 63 y) had complete data after randomization. Comparison of ADT+D+Met (n = 39) to ADT+D (n = 357) and of ADT+Met (n = 29) to ADT alone (n = 363) revealed similar clinicopathologic characteristics. Cause of death was PC in 13 (81%) of the ADT+D+Met, 72 (85%) of the ADT+D, 9 (82%) of the ADT+Met, and 105 (84%) of the ADT-alone groups. See table for PC outcomes and overall survival by metformin use. Cox regression analysis for overall survival, stratified by the stratification factors at randomization, demonstrated that Met use was associated with a trend toward worse overall survival (HR 1.47; 95% CI: [0.95, 2.26]; p = 0.08) with adjustment for treatment arm and prior local therapy. In contrast, ADT+D use (HR 0.62; 95% CI: [0.47, 0.81]) and prior local therapy with surgery or radiation (HR 0.56; 95% CI: [0.38, 0.82]) were associated with improved survival. Conclusions: In this study, baseline metformin did not improve PC outcomes. Partial support and drug supply by Sanofi. Clinical trial information: NCT00309985. [Table: see text]


2020 ◽  
Vol 9 (1) ◽  
pp. 57-77
Author(s):  
Andreas Ledwon ◽  
Clemens Jäger

This study evaluates three corporate failure prediction models using the latest available data on corporate insolvencies for the non-financial constituents of the CDAX. We estimate semiparametric Cox proportional hazards models under the Andersen-Gill counting process (AG-CP) formulation to explore the importance of accounting and financial ratios, as well as industry effects, in detecting potential insolvencies. The contribution of this paper is twofold. First, the literature on corporate default prediction is manifold and predominantly focused on U.S. data, so academic contributions based on German-listed companies are limited. To the best of our knowledge, we are the first to conduct a thorough out-of-sample comparison of Cox regression models under AG-CP based on a unique dataset of non-financial constituents subject to the German insolvency statute (“InsO”). Relying on a parsimonious accounting-based approach inspired by Altman (1968) and Ohlson (1980) proves merely adequate: the variable selections of Shumway (2001) and Campbell et al. (2008) deliver the best discriminatory power and calibration results. In particular, a combination of pure accounting ratios augmented with market-driven information in Model (2) yields superior accuracy rates in the top deciles. However, in-sample empirical results underscore the importance of market-based indicators, as all accounting ratios enter statistically insignificantly. Second, we test to what extent industry variables improve the discriminatory power and forecasting accuracy of the fitted models. Contrary to the findings of Chava & Jarrow (2004), our research implies that industry grouping adds marginal predictive power and no overall improvement in accuracy rates when market variables are already included in the probability of default (PD) model.
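
Discriminatory power of a default model of this kind is commonly summarized by a rank-based AUC: the probability that a firm that later defaults is assigned a higher predicted PD than a firm that survives. A minimal pairwise sketch on hypothetical predicted PDs (not the paper's models or data):

```python
def pairwise_auc(pd_default, pd_survive):
    """Rank-based AUC: probability that a defaulting firm receives a higher
    predicted PD than a surviving firm; ties count as half a win."""
    wins = sum((p > q) + 0.5 * (p == q)
               for p in pd_default for q in pd_survive)
    return wins / (len(pd_default) * len(pd_survive))

# Hypothetical one-year PDs from a fitted hazard model
print(pairwise_auc([0.9, 0.6, 0.4], [0.1, 0.3, 0.5]))
```

An AUC of 0.5 means no discrimination; accuracy rates in top deciles, as used in the paper, instead measure how many defaulters fall in the riskiest score buckets.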


Author(s):  
Mukesh Ranjan ◽  
Laxmi Kant Dwivedi ◽  
Rahul Mishra ◽  
Brajesh

Higher infant mortality among tribal populations in India is well documented. However, few studies have compared the factors associated with infant mortality in tribal populations with those in non-tribal populations. In the present paper, Cox proportional hazards models were employed to examine factors influencing infant mortality in tribal and non-tribal populations in the Central and Eastern Indian states, using data from the District Level Household Survey-III in 2007-2008. Characteristics of mothers, infants, and households/communities, plus a program variable reflecting the place of pregnancy registration, were included in the analyses. We found that the gap in infant mortality between tribal and non-tribal populations was substantial in the early months after birth, narrowed between the fourth and eighth months, and widened slightly afterwards. Cox regression models show that while some factors were similarly associated with infant mortality in tribes and non-tribes, distinctive differences between tribal and non-tribal populations were striking. Sex of the infant, breastfeeding with colostrum, and age of the mother at birth acted similarly in tribes and non-tribes, yet factors such as state of residence, wealth, religion, place of residence, mother’s education, and birth order behaved differently. The program factor was non-significant in both tribal and non-tribal populations.

