Potentially important periods of change in the development of social and role functioning in youth at clinical high risk for psychosis

2017
Vol 30 (1)
pp. 39–47
Author(s):
Eva Velthorst
Jamie Zinberg
Jean Addington
Kristin S. Cadenhead
Tyrone D. Cannon
...

Abstract The developmental course of daily functioning prior to first psychosis onset remains poorly understood. This study explored age-related periods of change in social and role functioning. The longitudinal study included youth (aged 12–23; mean follow-up = 1.19 years) at clinical high risk (CHR) for psychosis (converters [CHR-C], n = 83; nonconverters [CHR-NC], n = 275) and a healthy control group (n = 164). Mixed-model analyses were performed to determine age-related differences in social and role functioning. We limited our analyses to functioning before psychosis conversion; thus, data of CHR-C participants gathered after psychosis onset were excluded. In controls, social and role functioning improved over time. From at least age 12, functioning in CHR was poorer than in controls, and this lag persisted over time. Between ages 15 and 18, social functioning in CHR-C stagnated and diverged from that of CHR-NC, who continued to improve (p = .001). Subsequently, CHR-C lagged behind in improvement between ages 21 and 23, further distinguishing them from CHR-NC (p < .001). A similar period of stagnation was apparent for role functioning, but to a lesser extent (p = .007). The results remained consistent when we accounted for the time to conversion. Our findings suggest that CHR-C start lagging behind CHR-NC in social and role functioning in adolescence, followed by a period of further stagnation in adulthood.
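
For readers who want to see what the mixed-model step can look like in practice, here is a minimal Python sketch (statsmodels) of a model of this kind: a random intercept per participant and fixed effects for group, age, and their interaction. The file and column names (chr_functioning_long.csv, subject_id, group, age, social_func) are hypothetical, and the sketch simplifies the authors' actual models.

```python
# Minimal sketch of a linear mixed model for age-related change in social
# functioning, in the spirit of the analysis described above.
# File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("chr_functioning_long.csv")  # hypothetical long-format data

# Random intercept per participant; fixed effects for group, age, and their
# interaction, so group trajectories can diverge with age.
model = smf.mixedlm(
    "social_func ~ C(group) * age",
    data=df,
    groups=df["subject_id"],
)
result = model.fit()
print(result.summary())
```

Specific age windows (e.g. 15–18 or 21–23) could be probed by binning age into categorical periods before fitting, at the cost of assuming piecewise-constant effects within each bin.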

2016
Vol 26 (3)
pp. 287–298
Author(s):
T. H. Zhang
H. J. Li
K. A. Woodberry
L. H. Xu
Y. Y. Tang
...

Background. Chinese psychiatrists have gradually started to focus on individuals deemed to be at 'clinical high risk (CHR)' for psychosis; however, it is still unknown how often individuals identified as CHR in a country background different from those previously studied transition to psychosis. The objectives of this study were to examine baseline characteristics and the timing of symptom onset, help-seeking, and transition to psychosis over a 2-year period in China. Method. The presence of CHR was determined with the Structured Interview for Prodromal Syndromes (SIPS) at the participants' first visit to the mental health services. A total of 86 of 117 CHR participants (73.5%) completed a clinical follow-up of at least 2 years. Conversion was determined using the SIPS criterion of the presence of psychotic symptoms. Analyses examined baseline demographic and clinical predictors of psychosis and the trajectory of symptoms over time. Kaplan–Meier survival analysis and log-rank tests were performed to illustrate the relationship of baseline data to conversion or non-conversion over time. Cox regression was performed to identify baseline predictors of conversion by the 2-year follow-up. Results. In total, 25 (29.1%) of the 86 completers transitioned to a psychotic disorder over the course of follow-up. Among the CHR sample, the mean time between attenuated symptom onset and professional help-seeking was about 4 months, and converters developed fully psychotic symptoms about 12 months after symptom onset. Compared with those CHR participants whose risk syndromes remitted over the course of the study, converters had significantly longer delays (p = 0.029) before their first help-seeking visit to a professional. At baseline assessment, the conversion subgroup was younger, had poorer functioning, higher total SIPS positive symptom scores, and a longer duration of untreated prodromal symptoms, and was more often given psychosis-related diagnoses and subsequently prescribed antipsychotics in the clinic. Conclusions. Chinese CHR individuals, identified primarily by a novel clinical screening approach, had a 2-year transition rate comparable with those of specialised help-seeking samples worldwide. Early clinical intervention with this functionally deteriorating population suffering from attenuated psychotic symptoms is a next step in applying the CHR construct in China.
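
As an illustration of the survival workflow the abstract describes (Kaplan–Meier curves, log-rank tests, Cox regression on baseline predictors), here is a minimal Python sketch using lifelines. File and column names are hypothetical, and the predictor set is illustrative rather than the study's exact variable list.

```python
# Minimal sketch of the survival workflow described above.
# File and column names are hypothetical.
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("chr_followup.csv")  # months_to_event, converted (0/1), baseline predictors

# Kaplan-Meier curve for time to conversion
kmf = KaplanMeierFitter()
kmf.fit(df["months_to_event"], event_observed=df["converted"])

# Log-rank test comparing long vs. short help-seeking delay (hypothetical median split)
long_delay = df["help_seeking_delay_months"] > df["help_seeking_delay_months"].median()
res = logrank_test(
    df.loc[long_delay, "months_to_event"], df.loc[~long_delay, "months_to_event"],
    event_observed_A=df.loc[long_delay, "converted"],
    event_observed_B=df.loc[~long_delay, "converted"],
)
print("log-rank p =", res.p_value)

# Cox regression with baseline predictors of conversion
cph = CoxPHFitter()
cph.fit(
    df[["months_to_event", "converted", "age", "sips_positive_total",
        "gaf_baseline", "help_seeking_delay_months"]],
    duration_col="months_to_event",
    event_col="converted",
)
cph.print_summary()
```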


2020
Vol 46 (Supplement_1)
pp. S57–S58
Author(s):
Kate Haining
Gina Brunner
Ruchika Gajwani
Joachim Gross
Andrew Gumley
...

Abstract Background Research in individuals at clinical-high risk for psychosis (CHR-P) has focused on developing algorithms to predict transition to psychosis. However, it is becoming increasingly important to address other outcomes, such as the level of functioning of CHR-P participants. To address this important question, this study investigated the relationship between baseline cognitive performance and functional outcome between 6–12 months in a sample of CHR-P individuals using a machine-learning approach to identify features that are predictive of long-term functional impairments. Methods Data was available for 111 CHR-P individuals at 6–12 months follow-up. In addition, 47 CHR-negative (CHR-N) participants who did not meet CHR criteria and 55 healthy controls (HCs) were recruited. CHR-P status was assessed using the Comprehensive Assessment of At-Risk Mental States (CAARMS) and the Schizophrenia Proneness Instrument, Adult version (SPI-A). Cognitive assessments included the Brief Assessment of Cognition in Schizophrenia (BACS) and the Penn Computerized Neurocognitive Battery (CNB). Global, social and role functioning scales were used to measure functional status. CHR-P individuals were divided into good functional outcome (GFO, GAF ≥ 65) and poor functional outcome groups (PFO, GAF &lt; 65). Feature selection was performed using LASSO regression with the LARS algorithm and 10-fold cross validation with GAF scores at baseline as the outcome variable. The following features were identified as predictors of GAF scores at baseline: verbal memory, verbal fluency, attention, emotion recognition, social and role functioning and SPI-A distress. This model explained 47% of the variance in baseline GAF scores. In the next step, Support Vector Machines (SVM), Linear Discriminant Analysis (LDA), Logistic Regression (LR), Gaussian Naïve Bayes (GNB), and Random Forest (RF) classifiers with 10-fold cross validation were then trained on those features with GAF category at follow-up used as the binary label column. Models were compared using a calculated score incorporating area under the curve (AUC), accuracy, and AUC consistency across runs, whereby AUC was given a higher weighting than accuracy due to class imbalance. Results CHR-P individuals had slower motor speed, reduced attention and processing speed and increased emotion recognition reaction times (RTs) compared to HCs and reduced attention and processing speed compared to CHR-Ns. At follow-up, 66% of CHR-P individuals had PFO. LDA emerged as the strongest classifier, showing a mean AUC of 0.75 (SD = 0.15), indicating acceptable classification performance for GAF category at follow-up. PFO was detected with a sensitivity of 75% and specificity of 58%, with a total mean weighted accuracy of 68%. Discussion The CHR-P state was associated with significant impairments in cognition, highlighting the importance of interventions such as cognitive remediation in this population. Our data suggest that the development of features using machine learning approaches is effective in predicting functional outcomes in CHR-P individuals. Greater levels of accuracy, sensitivity and specificity might be achieved by increasing training sets and validating the classifier with external data sets. Indeed, machine learning methods have potential given that trained classifiers can easily be shared online, thus enabling clinical professionals to make individualised predictions.
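
A compact sketch of the two-stage approach described in the Methods: LASSO with the LARS algorithm (10-fold CV) to select features against baseline GAF, then a cross-validated LDA classifier for the binary follow-up outcome scored by AUC. It uses scikit-learn; the file, column, and candidate feature names are hypothetical, and the composite model-comparison score described above is not reproduced here.

```python
# Minimal sketch: LASSO (LARS) feature selection, then LDA classification of
# poor vs. good functional outcome at follow-up. Names are hypothetical.
import pandas as pd
from sklearn.linear_model import LassoLarsCV
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score, StratifiedKFold
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("chr_baseline.csv")  # hypothetical
candidates = ["verbal_memory", "verbal_fluency", "attention", "motor_speed",
              "emotion_recognition", "social_func", "role_func", "spia_distress"]

# Stage 1: LASSO with the LARS algorithm, 10-fold CV, baseline GAF as outcome
lasso = LassoLarsCV(cv=10).fit(df[candidates], df["gaf_baseline"])
selected = [f for f, coef in zip(candidates, lasso.coef_) if coef != 0]
print("selected features:", selected)

# Stage 2: LDA on the selected features; label = poor functional outcome
# (GAF < 65) at follow-up, scored by ROC AUC over 10 stratified folds
y = (df["gaf_followup"] < 65).astype(int)
clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
auc = cross_val_score(clf, df[selected], y,
                      cv=StratifiedKFold(n_splits=10, shuffle=True, random_state=0),
                      scoring="roc_auc")
print(f"mean AUC = {auc.mean():.2f} (SD = {auc.std():.2f})")
```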


2020
pp. 1–9
Author(s):
Andrea Raballo
Michele Poletti
Antonio Preti

Abstract Background The clinical high-risk (CHR) for psychosis paradigm is changing psychiatric practice. However, a widespread confounder, i.e. baseline exposure to antipsychotics (AP) in CHR samples, is systematically overlooked. Such exposure might mitigate the initial clinical presentation, increase the heterogeneity within CHR populations, and confound the evaluation of transition to psychosis at follow-up. This is the first meta-analysis examining the prevalence of ongoing AP treatment at baseline in CHR cohorts and its prognostic impact on transition to psychosis. Methods Major databases were searched for articles published until 20 April 2020. The variance-stabilizing Freeman-Tukey double arcsine transformation was used to estimate prevalence. The binary outcome of transition to psychosis by group was estimated with the risk ratio (RR), and the inverse variance method was used for pooling. Results Fourteen studies were eligible for qualitative synthesis, including 1588 CHR individuals. Of the pooled CHR sample, 370 individuals (23.3%) were already exposed to AP at the time of CHR status ascription. Transition to full-blown psychosis at follow-up occurred in 112 (29%; 95% CI 24–34%) of the AP-exposed CHR participants as compared with 235 (16%; 14–19%) of the AP-naïve CHR participants. AP-exposed CHR had a higher RR of transition to psychosis (RR = 1.47; 95% CI 1.18–1.83; z = 3.48; p = 0.0005), with no influence of age, gender ratio, overall sample size, duration of follow-up, or study quality. Conclusions Baseline AP exposure in CHR samples is substantial and is associated with a higher imminent risk of transition to psychosis. Therefore, such exposure should be regarded as a non-negligible red flag for clinical risk management.
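
To make the pooling machinery concrete, here is a minimal numpy sketch of the two estimation steps named in the Methods: the Freeman-Tukey double arcsine transformation for prevalence, and inverse-variance pooling of log risk ratios. The study counts below are placeholders, not the values from the paper, and a fixed-effect pool is shown for brevity; the meta-analysis itself may have used a random-effects model.

```python
# Minimal sketch of Freeman-Tukey prevalence pooling and inverse-variance RR
# pooling. Per-study counts are hypothetical placeholders.
import numpy as np

# --- Prevalence of baseline AP exposure: Freeman-Tukey double arcsine ---
exposed = np.array([30, 12, 55])    # hypothetical per-study counts of AP-exposed CHR
n_chr = np.array([120, 60, 200])    # hypothetical per-study CHR sample sizes

ft = np.arcsin(np.sqrt(exposed / (n_chr + 1))) + np.arcsin(np.sqrt((exposed + 1) / (n_chr + 1)))
ft_var = 1.0 / (n_chr + 0.5)
w_ft = 1.0 / ft_var
ft_pooled = np.sum(w_ft * ft) / np.sum(w_ft)   # fixed-effect inverse-variance pool
prev_approx = np.sin(ft_pooled / 2) ** 2       # simple back-transform; exact Miller (1978) inversion omitted
print(f"pooled prevalence (approx.) = {prev_approx:.3f}")

# --- Transition to psychosis: risk ratio, AP-exposed vs AP-naive ---
a, n1 = np.array([10, 4, 20]), np.array([30, 12, 55])     # converters / total, AP-exposed (hypothetical)
c, n2 = np.array([12, 6, 25]), np.array([90, 48, 145])    # converters / total, AP-naive (hypothetical)

log_rr = np.log((a / n1) / (c / n2))
var_log_rr = 1 / a - 1 / n1 + 1 / c - 1 / n2
w_rr = 1.0 / var_log_rr
pooled_log_rr = np.sum(w_rr * log_rr) / np.sum(w_rr)
se = np.sqrt(1.0 / np.sum(w_rr))
ci_lo, ci_hi = pooled_log_rr - 1.96 * se, pooled_log_rr + 1.96 * se
print(f"pooled RR = {np.exp(pooled_log_rr):.2f} (95% CI {np.exp(ci_lo):.2f}-{np.exp(ci_hi):.2f})")
```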


CNS Spectrums
2017
Vol 24 (03)
pp. 333–337
Author(s):
Maiara Zeni-Graiff
Adiel C. Rios
Pawan K. Maurya
Lucas B. Rizzo
Sumit Sethi
...

Introduction Oxidative stress has been documented in chronic schizophrenia and in the first episode of psychosis, but there are very few data on oxidative stress prior to disease onset. Objective This work aimed to compare serum levels of superoxide dismutase (SOD) and glutathione peroxidase (GPx) in young individuals at ultra-high risk (UHR) of developing psychosis with a healthy control (HC) group. Methods Thirteen UHR subjects and 29 age- and sex-matched healthy controls were enrolled in this study. Clinical assessment included the Comprehensive Assessment of At-Risk Mental States (CAARMS), the Semi-Structured Clinical Interview for DSM-IV Axis I (SCID-I) or the Kiddie-SADS-Present and Lifetime Version (K-SADS-PL), and the Global Assessment of Functioning (GAF) scale. SOD and GPx activities were measured in serum by spectrophotometry using enzyme-linked immunosorbent assay kits. Results After adjusting for age and years of education, SOD and GPx activities were significantly lower in the UHR group than in the healthy control group (rate ratio [RR] = 0.330, 95% CI 0.187–0.584, p < 0.001 and RR = 0.509, 95% CI 0.323–0.803, p = 0.004, respectively). There were also positive correlations between GAF functioning scores and GPx and SOD activities. Conclusion Our results suggest that oxidative imbalances could be present prior to the onset of full-blown psychosis, including in at-risk stages. Future studies should replicate and expand these results.
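
The Results report covariate-adjusted rate ratios for enzyme activity. One plausible way to obtain such ratios, sketched here purely as an assumption about the modelling rather than the paper's actual analysis, is a log-link Gamma GLM, whose exponentiated coefficients read as ratios. File and column names are hypothetical.

```python
# Hedged sketch: covariate-adjusted ratio estimates for a positive continuous
# outcome (enzyme activity) via a Gamma GLM with a log link. This is an
# assumption about the modelling approach, not the paper's code.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("uhr_oxidative_stress.csv")  # group (UHR/HC), age, education_years, sod, gpx

model = smf.glm(
    "sod ~ C(group, Treatment(reference='HC')) + age + education_years",
    data=df,
    family=sm.families.Gamma(link=sm.families.links.Log()),
).fit()

# Exponentiated group coefficient reads as the ratio of SOD activity, UHR vs HC
print(np.exp(model.params))
print(np.exp(model.conf_int()))
```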


Trials
2021
Vol 22 (1)
Author(s):
Lena Violetta Krämer
Nadine Eschrig
Lena Keinhorst
Luisa Schöchlin
Lisa Stephan
...

Abstract Background Many students in Germany do not meet recommended amounts of physical activity. In order to promote physical activity in students, web-based interventions are increasingly implemented. Yet, evidence on the effectiveness of web-based interventions in university students is limited. Our study aims to investigate a web-based intervention for students. The intervention is based on the Health Action Process Approach (HAPA), which discriminates between processes of intention formation (motivational processes) and processes of intention implementation (volitional processes). The primary outcome is change in physical activity; secondary outcomes are the motivational and volitional variables proposed by the HAPA, as well as quality of life and depressive symptoms. Methods A two-armed randomized controlled trial (RCT) of parallel design is conducted. Participants are recruited via the internet platform StudiCare (www.studicare.com). After the baseline assessment (t1), participants are randomized to either the intervention group (immediate access to the web-based intervention) or the control group (access only after the follow-up assessment). Four weeks later, the post-assessment (t2) is performed in both groups, followed by a follow-up assessment (t3) 3 months later. Assessments take place online. Main outcome analyses will follow an intention-to-treat principle by including all randomized participants in the analyses. Outcomes will be analysed using a linear mixed model, assuming data are missing at random. The mixed model will include group, time, and the interaction of group and time as fixed effects, and participant and university as random effects. Discussion This study is a high-quality RCT with three assessment points and intention-to-treat analysis, meeting the state of the art for effectiveness studies. Recruitment covers almost 20 universities in three countries, leading to high external validity. The results of this study will be of great relevance for student health campaigns, as they reflect the effectiveness of self-help interventions for young adults with regard to behaviour change as well as motivational and volitional determinants. From a lifespan perspective, it is important to help students find their way into regular physical activity. Trial registration German Clinical Trials Register (DRKS), DRKS00016889. Registered on 28 February 2019.
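
A minimal Python/statsmodels sketch of the planned mixed model (group, time, and their interaction as fixed effects; participant and university as random effects). Crossed random effects are expressed here as variance components within a single dummy grouping, a common statsmodels workaround; file and column names are hypothetical and the outcome coding is simplified.

```python
# Minimal sketch of the planned linear mixed model for the RCT outcome.
# File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("studicare_pa_long.csv")  # long format: one row per participant per assessment

# Single dummy group so participant and university both enter as
# variance components (a workaround for crossed random effects).
df["all"] = 1
model = smf.mixedlm(
    "physical_activity ~ C(group) * C(time)",
    data=df,
    groups="all",
    vc_formula={
        "participant": "0 + C(participant_id)",
        "university": "0 + C(university)",
    },
)
result = model.fit()
print(result.summary())
```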


2018
Vol 28 (7)
pp. 957–971
Author(s):
Michele Poletti
Lorenzo Pelizza
Silvia Azzali
Federica Paterlini
Sara Garlassi
...

2021
Vol 9
Author(s):
Kelly Guedes de Oliveira Scudine
Camila Nobre de Freitas
Kizzy Silva Germano Nascimento de Moraes
Silvana Bommarito
Rosana de Fátima Possobon
...

It is well recognized that a pacifier habit leads to occlusal and orofacial functional changes in children. However, the effects of interrupting a prolonged pacifier habit on the development of the dento-facial complex have not yet been fully characterized. Thus, the aim of this study was to investigate the influence of pacifier removal on aspects of oro-dentofacial morphology and function in preschool children. A pacifier group (n = 28) and a control group (n = 32) of 4-year-old children, with and without a pacifier habit respectively, were followed up by a group of dentists and speech therapists at baseline and at 6 and 12 months after habit removal. Bite force and lip pressure were assessed using digital systems, and breathing and speech functions were evaluated using validated protocols, together with measurements of dental casts and facial anthropometry. A two-way mixed-model ANOVA was used for data analysis. After 12 months, a decrease in malocclusion frequency was observed in the pacifier group. Additionally, a change over time was observed in facial, intermolar, and palate depth measurements, as well as in bite and lip forces and speech function scores, which increased in both groups (p < 0.01). The upper and lower intercanine widths and breathing scores differed between groups at baseline and changed over time, reducing the differences. Speech distortions were more frequent in the pacifier group at baseline and decreased over time (p < 0.05). The interruption of the pacifier habit improved the maxillary and mandibular intercanine widths, as well as breathing and speech functions, overcoming the oro-dentofacial changes found. Trial Registration: This clinical trial was registered in the Brazilian Clinical Trials Registry (ReBEC; http://www.ensaiosclinicos.gov.br/), protocol no. RBR-728MJ2.
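
For illustration, a two-way mixed-model ANOVA of this kind (between-subject factor: group; within-subject factor: time) can be run as follows with pingouin; the outcome, file, and column names are hypothetical.

```python
# Minimal sketch of a two-way mixed-model ANOVA (group x time).
# File and column names are hypothetical.
import pandas as pd
import pingouin as pg

df = pd.read_csv("pacifier_followup_long.csv")  # one row per child per visit

aov = pg.mixed_anova(
    data=df,
    dv="intercanine_width",   # hypothetical outcome column
    within="time",            # baseline, 6 months, 12 months
    subject="child_id",
    between="group",          # pacifier vs. control
)
print(aov)

# Follow-up pairwise comparisons for the group x time interaction
posthoc = pg.pairwise_tests(data=df, dv="intercanine_width", within="time",
                            subject="child_id", between="group", padjust="bonf")
print(posthoc)
```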


2019
Vol 40 (Supplement_1)
Author(s):
F Posch
T Glantschnig
S Firla
M Smolle
M Balic
...
...  

Abstract Background Monitoring left-ventricular ejection fraction (LVEF) is a routinely practiced strategy to survey patients with breast cancer (BC) for cardiotoxic treatment effects. However, whether the LVEF as a single measurement or as a trajectory over time is truly sufficient to identify patients at high risk of cardiotoxicity is currently debated. Purpose To quantify the prognostic impact of LVEF and its change over time for predicting cardiotoxicity in women with HER2+ early BC. Methods We analyzed 1,136 echocardiography reports from 185 HER2+ early BC patients treated with trastuzumab ± chemoimmunoendocrine therapy in the neoadjuvant/adjuvant setting (Table 1). Cardiotoxicity was defined as a 10% decline in LVEF to below 50%. Results Median baseline LVEF was 64% (25th–75th percentile: 60–69). Nineteen patients (10%) experienced cardiotoxicity (asymptomatic n=12, symptomatic n=7, during treatment n=19, treatment modification/termination n=14). Median time to cardiotoxicity was 6.7 months, and the median LVEF decline in patients with cardiotoxicity was 18%. One-year cardiotoxicity risk was 7.6% in the 35 patients with a baseline LVEF≥60% and 24.5% in the 150 patients with a baseline LVEF<60% (hazard ratio (HR)=3.45, 95% CI: 1.35–8.75, Figure 1). During treatment, LVEF declined significantly faster in patients who developed cardiotoxicity than in patients without cardiotoxicity (1.3%/month vs. 0.1%/month, p<0.0001). A higher rate of LVEF decline predicted a higher cardiotoxicity risk (HR per 0.1%/month faster LVEF decline=2.50, 95% CI: 1.31–4.76, p=0.005), and cardiotoxicity risk increased by a factor of 1.7 per 5% absolute LVEF decline from baseline to first follow-up (HR=1.70, 95% CI: 1.30–2.38, p<0.0001). Thirty-six patients (19%) developed an LVEF decline of at least 5% from baseline to first follow-up ("early LVEF decline"). One-year cardiotoxicity risk was 6.8% in those without an early LVEF decline and a baseline LVEF≥60% (n=117), 15.7% in those without an early LVEF decline and a baseline LVEF<60% (n=65), and 66.7% in those with an early LVEF decline and a baseline LVEF<60% (n=3), respectively (log-rank p<0.0001). Table 1 (baseline characteristics): age, median [IQR], 55 [49–65] years; estrogen receptor positive, 124 (67%); neoadjuvant setting, 103 (56%). Figure 1: Risk of cardiotoxicity. Conclusion Both a single LVEF measurement and the rate of LVEF decline strongly predict cardiotoxicity in early BC patients undergoing HER2-targeted therapy. Routine LVEF monitoring identifies individuals at high risk of cardiotoxicity who may benefit from more sensitive screening techniques such as strain imaging.
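
To illustrate the rate-of-decline analysis, the sketch below estimates a per-patient LVEF slope from serial echocardiograms and relates it to time-to-cardiotoxicity with a Cox model (lifelines). File and column names are hypothetical, and a real analysis would need to censor post-event measurements and handle patients with only one or two echoes.

```python
# Minimal sketch: per-patient LVEF slope (%/month) from serial echoes,
# then a Cox model for time to cardiotoxicity. Names are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

echo = pd.read_csv("echo_reports.csv")        # patient_id, months_since_start, lvef
outcomes = pd.read_csv("cardiotox.csv")       # patient_id, months_to_event, cardiotox (0/1)

def lvef_slope(g):
    # least-squares slope of LVEF over time, in % per month
    return np.polyfit(g["months_since_start"], g["lvef"], 1)[0]

slopes = (echo.groupby("patient_id").apply(lvef_slope)
              .rename("lvef_slope").reset_index())
data = outcomes.merge(slopes, on="patient_id")

cph = CoxPHFitter()
cph.fit(data[["months_to_event", "cardiotox", "lvef_slope"]],
        duration_col="months_to_event", event_col="cardiotox")
cph.print_summary()   # hazard ratio per 1 %/month change in LVEF slope
```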


2019
Vol 59 (7)
pp. 3045–3058
Author(s):
Julia Baudry
Johannes F. Kopp
Heiner Boeing
Anna P. Kipp
Tanja Schwerdtle
...

Abstract Purpose We aimed to evaluate age-dependent changes in six trace elements (TE) [manganese (Mn), iron (Fe), zinc (Zn), copper (Cu), iodine (I), and selenium (Se)] over a 20-year period. Methods TE concentrations were measured by inductively coupled plasma tandem mass spectrometry in repeated serum samples taken at baseline and after 20 years of follow-up from 219 healthy participants of the EPIC-Potsdam study. For each TE, absolute and relative differences were calculated between the two time points, as well as the proportion of individuals within normal reference ranges. The interdependence between age-related TE differences was investigated using principal component analysis (PCA). Relationships between selected factors (lifestyle, sociodemographic, anthropometric factors, and hypertension) and the corresponding TE longitudinal variability were examined using multivariable linear regression models. Results The median age of our study sample was 58.32 years (4.42) at baseline, and 40% were female. Median Mn, Zn, and Se concentrations and the Se-to-Cu ratio decreased significantly with aging, while median Fe, Cu, and I concentrations and the Cu-to-Zn ratio increased significantly. A substantial percentage of the participants, at both time points, had Zn concentrations below the reference range. The first PCA-extracted factor reflected the correlated decline in both Mn and Zn over time, while the second factor reflected the observed average increase in both Cu and I over time. Overall, none of the investigated factors were strong determinants of TE longitudinal variability, except possibly dietary supplement use and, for Fe, alcohol use. Conclusions In this population-based study of healthy elderly individuals, decreases in Mn, Zn, and Se concentrations and increases in Fe, Cu, and I concentrations were observed over 20 years of follow-up. Further research is required to investigate dietary determinants and markers of TE status, as well as the relationships between TE profiles and the risk of age-related diseases.
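
A minimal sketch of the PCA step described in the Methods: principal components of the within-person 20-year differences in the six trace elements, using scikit-learn. File and column names are hypothetical.

```python
# Minimal sketch: PCA on within-person 20-year trace-element differences.
# File and column names are hypothetical.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("epic_potsdam_te.csv")  # wide file with *_baseline / *_followup columns
elements = ["mn", "fe", "zn", "cu", "iodine", "se"]

# Absolute 20-year difference per trace element
diffs = pd.DataFrame({el: df[f"{el}_followup"] - df[f"{el}_baseline"] for el in elements})

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(diffs))

# Loadings show which element changes move together (e.g. a joint Mn/Zn decline)
loadings = pd.DataFrame(pca.components_.T, index=elements, columns=["PC1", "PC2"])
print(loadings)
print("explained variance ratio:", pca.explained_variance_ratio_)
```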

