Prevalence and Predictors of Self-Reported Neuropsychological Impairment in Patients Undergoing Hematopoietic Cell Transplantation (HCT) - Impact On Return to Work After HCT.

Blood, 2009, Vol 114 (22), pp. 808-808
Author(s): Alysia Bosworth, Lennie Wong, Sunita Patel, Doojduen Villaluna, Mitzi Gonzales, ...

Abstract 808 The growing population of HCT survivors may be at risk for neuropsychological impairment due to exposure to neurotoxic agents. HCT survivors frequently report problems with memory and attention (Cancer 2002;95:183-192), and even though patients are acutely aware of neuropsychological declines, these changes do not correlate well with impairment on standardized neuropsychological assessments (Bone Marrow Transplant 2005;36:695-702). This discrepancy suggests that the tests may not be sensitive enough to detect subtle changes that could nonetheless impact patients' societal reintegration, highlighting the importance of self-report instruments. The present study aims to assess the longitudinal trajectory of self-reported neuropsychological impairment in patients from pre-HCT to 1 year post-HCT; to evaluate the impact of demographic and clinical factors on self-reported neuropsychological impairment; to examine the relationship between self-reported neuropsychological impairment and objective data collected using standardized assessments; and to understand the relationship between self-reported impairment and return to work. Participants were 182 adult patients undergoing HCT for hematological malignancies. Mean age at HCT was 50 years (range, 18-73); 60% were male; 68% were non-Hispanic white; 62% received autologous HCT. Patients completed a 2-hour battery of standardized neuropsychological tests (domains: processing speed; immediate, general, and working memory; cognitive reserve; executive function) and a self-reported Neuropsychological Impairment Scale (NIS – scales: Global Measure of Impairment [GMI, an overall summary score], Cognitive Efficiency [COG], Attention [ATT], Memory [MEM], Learning-Verbal [L-V], Academic Skills [ACD]). Self-reported information on return to work was obtained at 6 months and 1 year after HCT. Demographic (sex, age, race/ethnicity, education, income, marital status) and clinical data (diagnosis, donor source, risk of relapse, conditioning exposures) were collected. Raw scores were converted to t-scores using normative data; individuals with t-scores more than 1 SD above the normative mean (t>60) were classified as impaired. Generalized estimating equations were used to examine longitudinal trends. The prevalence of domain-specific impairment at specified time points is shown in the Table. After adjusting for significant covariates, GMI worsened at 6 months and plateaued thereafter (p=0.04), and ATT worsened at 6 months but returned to baseline at 1 year (p=0.006) (Figure). Multivariate analyses revealed the following risk factors: at pre-HCT, female gender and less than high school education (higher MEM impairment, p=0.03 and p=0.05, respectively); at both 6 months and 1 year post-HCT, annual household income less than $20,000 (higher GMI impairment, p=0.02), exposure to total body irradiation (TBI: higher COG impairment, p=0.006, and higher ATT impairment, p=0.05), and female gender (p=0.05) and 4-year college education (p=0.058) (higher MEM impairment). Correlations between NIS scores and standardized assessments were weak (range, r=-0.3 to 0.09). At 6 months, 57% of the patients had not returned to work. Patients with COG impairment were less likely to return to work (p=0.05), while patients with higher cognitive reserve were more likely to return to work (p=0.03).
These results suggest that a significant proportion of patients undergoing HCT report neuropsychological impairment that may not be readily captured by standardized assessments. The present study identifies low household income, TBI, female gender, and college education as risk factors and describes the impact of self-reported neuropsychological impairment on the ability to return to work. This study therefore helps characterize a vulnerable population that needs to be followed closely so that timely intervention can ensure successful societal reintegration after HCT.

Table. Prevalence of neuropsychological impairment (t>60) by time point
                    GMI   COG   ATT   MEM   L-V   ACD
Pre-HCT             29%   33%   25%   25%   24%   22%
6 months (n=94)     32%   40%   32%   27%   33%   22%
1 year (n=69)       32%   29%   23%   29%   25%   23%

Disclosures: Forman: City of Hope: Employment.
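The impairment threshold (t > 60) and the generalized estimating equations named in the methods can be sketched roughly as follows. This is a minimal illustration, not the authors' analysis code; the file name, column names (patient_id, time_point, nis_gmi_t, and the covariates), and the exchangeable working correlation are assumptions.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per patient per assessment.
# Assumed columns: patient_id, time_point (0 = pre-HCT, 6, 12 months),
# nis_gmi_t (NIS Global Measure of Impairment t-score), plus covariates.
df = pd.read_csv("nis_longitudinal.csv")

# Impairment = t-score more than 1 SD above the normative mean (t > 60).
df["gmi_impaired"] = (df["nis_gmi_t"] > 60).astype(int)

# GEE with an exchangeable working correlation accounts for repeated
# measures on the same patient when testing the trend over time.
model = smf.gee(
    "gmi_impaired ~ C(time_point) + age_at_hct + C(sex) + C(income_lt_20k)",
    groups="patient_id",
    data=df,
    family=sm.families.Binomial(),
    cov_struct=sm.cov_struct.Exchangeable(),
)
print(model.fit().summary())
```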

2021
Author(s): Ekaterina Mosolova, Dmitry Sosin, Sergey Mosolov

During the COVID-19 pandemic, healthcare workers (HCWs) have been subject to increased workloads while also being exposed to many psychosocial stressors. In a systematic review we analyze the impact the pandemic has had on HCWs' mental state and the associated risk factors. Most studies reported high levels of depression and anxiety among HCWs worldwide; however, because of the wide range of assessment tools, cut-off scores, and numbers of frontline participants across studies, results were difficult to compare. Our study, based on two online surveys of 2195 HCWs from different regions of Russia during the spring and autumn epidemic outbreaks, found rates of anxiety, stress, depression, emotional exhaustion, depersonalization, and perceived stress of 32.3%, 31.1%, 45.5%, 74.2%, 37.7%, and 67.8%, respectively. Moreover, 2.4% of HCWs reported suicidal thoughts. The most common risk factors include female gender, working as a nurse, younger age, working for over 6 months, chronic diseases, smoking, high work demands, lack of personal protective equipment, low salary, lack of social support, isolation from family, and fear of relatives becoming infected. These results demonstrate the need for urgent supportive programs for HCWs fighting COVID-19 who fall into these higher-risk groups.


2021, Vol 13 (2), pp. 445
Author(s): Wen-Kuo Chen, Venkateswarlu Nalluri, Suresh Ma, Mei-Min Lin, Ching-Torng Lin

Different sources of risk factors can arise in sustainable supply chain management due to its complex nature. With limited resources, a telecommunication service firm cannot implement multiple improvement practices at once to overcome these risk factors. Firms should therefore evaluate the relationships among risk factors and explore the determinants of improvement measures. The purpose of the present study is to identify and analyze critical risk factors (CRFs) for enhancing sustainable supply chain management practices in the Indian telecommunication industry using interpretive structural modelling (ISM). Risk factors were identified through a literature survey, and then, with the help of experts, nine CRFs were selected using a fuzzy Delphi method (FDM). The relationships among these CRFs were analyzed using ISM, and the driving and dependence power of each CRF was assessed. Results indicate that both "government policies (laws and regulations)" and "the impact of rapid change in technology" are independent or key factors that affect the sustainability of the telecommunications supply chain. In addition, the results carry significant managerial implications for enhancing sustainability: the government should establish fair, transparent, and predictable laws and regulations to prevent risk in the telecommunications industry supply chain, and service providers should monitor rapidly evolving technologies and focus on technical learning and organizational capacity development to overcome the impact of technological changes. The contribution of this study is a novel approach to establishing a hierarchical structural model for an effective understanding of the relationships among CRFs and for exploring decisive risk factors, which can help telecom service providers better plan and design effective improvement strategies to enhance sustainable supply chain management.
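To illustrate the ISM step named above, the sketch below computes a final reachability matrix from an initial binary reachability matrix by transitive closure and derives each factor's driving and dependence power. The 4x4 matrix is invented for demonstration; it does not reproduce the study's nine CRFs or the expert-elicited relationships.

```python
import numpy as np

def transitive_closure(m: np.ndarray) -> np.ndarray:
    """Warshall-style transitive closure of a binary reachability matrix."""
    r = m.astype(bool).copy()
    n = r.shape[0]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                r[i, j] = r[i, j] or (r[i, k] and r[k, j])
    return r.astype(int)

# Invented initial reachability matrix for four factors (1 = row factor
# influences column factor; the diagonal is set to 1). The real study builds
# this from expert judgements on the nine CRFs.
initial = np.array([
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
])

final = transitive_closure(initial)

# Driving power = row sum; dependence power = column sum.
driving = final.sum(axis=1)
dependence = final.sum(axis=0)
for idx, (drv, dep) in enumerate(zip(driving, dependence), start=1):
    print(f"Factor {idx}: driving power = {drv}, dependence power = {dep}")
```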


Blood, 2009, Vol 114 (22), pp. 521-521
Author(s): F. Lennie Wong, Alysia Bosworth, Rose Danao, Doojduen Villaluna, Sunita Patel, ...

Abstract 521 Patients undergoing HCT are at risk for neurocognitive impairment because of exposure to potentially neurotoxic agents such as total body irradiation (TBI) and agents used for prophylaxis/treatment of graft vs. host disease (GvHD). However, extant studies reporting neurocognitive outcomes have been hindered by small sample sizes, retrospective or cross-sectional study designs, limited prospective post-HCT follow-up, and restriction of study populations to either autologous or allogeneic HCT recipients, with no ability to compare the two groups. Additionally, no prior studies have systematically assessed the impact of cognitive changes on the ability to return to work. This study addressed these gaps in knowledge by using a prospective longitudinal design to examine neurocognitive changes from pre-HCT to 2 years after HCT in 284 patients undergoing HCT between 2005 and 2008. The study examined clinical and demographic predictors of neurocognitive function as well as the impact of neurocognitive function on return to part-time (PT) or full-time (FT) work post-HCT. Mean age at HCT was 50 years (range, 18-73); 40% were female; 69% were non-Hispanic whites and 21% were Hispanics. Primary diagnoses included NHL (33%), MM (26%), AML (16%), HL (10%), ALL (7%), and other (8%); 63% received autologous, 16% allogeneic related, and 20% unrelated donor HCT. A standardized 2-hour battery of neurocognitive tests was administered to study participants at pre-HCT (n=284), 6 months (n=202), 1 year (n=173), and 2 years (n=97) post-HCT to assess neurocognitive functioning in 8 domains: processing speed, executive function, immediate memory, general memory, working memory, psychomotor speed, verbal speed, and verbal fluency. The Wechsler Abbreviated Scale of Intelligence was used to measure cognitive reserve. Test scores were age-adjusted using normative data. Changes in neurocognitive function over time and correlates of change were examined using linear mixed effects models. Demographic factors examined included age at HCT, sex, race/ethnicity, marital status, annual income at HCT, and education. Clinical factors included primary diagnosis, risk of relapse at HCT, stem cell source, presence of GvHD (for allogeneic HCT), conditioning (specific chemotherapeutic agents or TBI), and GvHD medications (allogeneic HCT only). Compared to pre-HCT, neurocognitive function generally remained unchanged or improved over the 2 years after HCT. At both pre- and post-HCT, higher cognitive reserve and education level were significantly associated with better neurocognitive function. Furthermore, women tended to have significantly better neurocognitive scores than men. However, after adjustment for significant covariates, allogeneic HCT recipients demonstrated significantly worse scores after HCT in processing speed, executive function, and verbal fluency when compared with autologous HCT recipients (p=0.001, p<0.05, p=0.04, respectively) (Figure). No differences were evident by GvHD status among allogeneic patients. Among patients who were employed prior to HCT, 66% were working either PT (27%) or FT (39%) at 1 year post-HCT. Patients with better scores in immediate memory (p=0.02), those who were either <35 or >55 years of age at HCT (p=0.05), and those who had a primary diagnosis of lymphoma (p=0.03) were more likely to return to work either FT or PT. On the other hand, patients with chronic GvHD were less likely to return to work (p=0.02).
Patients working FT at 1 y were more likely to have scored better for verbal speed (p=0.01); were more likely to have had pre-HCT income of $20,000 or more (p=0.002); and were less likely to have received allogeneic HCT (p=0.003). In summary, patients receiving allogeneic HCT are at risk for domain-specific neurocognitive deficits that persist for at least 2 years post-HCT and that are associated with inability to return to work. This study identifies vulnerable subgroups that could benefit from targeted surveillance and early intervention to facilitate smooth reintegration into society. Disclosures: No relevant conflicts of interest to declare.
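A minimal sketch of the linear mixed effects approach described above, assuming long-format data with hypothetical column names; a random intercept per patient stands in for the repeated-measures structure, and the fixed effects mirror the kinds of covariates listed rather than the exact model the authors fit.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format data: one row per patient per time point, with an
# age-adjusted domain score (here processing speed) and selected covariates.
df = pd.read_csv("neurocog_longitudinal.csv")

# Random intercept per patient; fixed effects for time, transplant type,
# cognitive reserve, education, and sex (illustrative covariate set only).
model = smf.mixedlm(
    "processing_speed ~ C(time_point) * C(hct_type) + cognitive_reserve"
    " + C(education) + C(sex)",
    data=df,
    groups=df["patient_id"],
)
print(model.fit().summary())
```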


2015, Vol 33 (29_suppl), pp. 239-239
Author(s): Kathryn Elizabeth Hudson, Habtamu Kassa Benecha, Kevin Leo Houck, Thomas William LeBlanc, Amy P Abernethy, ...

239 Background: Fatigue is a common and distressing effect of cancer and its treatment, potentially affecting quality of life (QOL) for years after treatment. However, the prevalence and persistence of fatigue among long-term survivors of non-Hodgkin lymphoma (NHL) remain unknown. We aimed to identify demographic, clinical, and psychosocial risk factors for persistent fatigue in this population. Methods: In 2010, surveys were mailed to 682 NHL survivors who had participated in a study 5 years earlier; respondents were, on average, 10.4 years post diagnosis. Standardized measures of QOL, symptoms, medical history, and demographic variables were reported at both time points. We defined significant fatigue conservatively as 0.5 standard deviations below the SF-36 scale's cutoff for fatigue, and persistent fatigue as significant fatigue at both time points. Chi-square tests, t-tests, and logistic regression were used to determine risk factors and predictors of persistent fatigue. Results: 30.8% (n = 172) and 33.0% (n = 186) of patients reported significant fatigue at time points 1 and 2, respectively; 20% of patients had persistent fatigue. Patients with persistent fatigue were more likely to report female gender, income < $30,000, less than college education, less exercise, active disease, chemotherapy, at least one recurrence of their disease, less social support, an average of 3.8 more comorbidities, and significantly more posttraumatic stress than those without persistent fatigue (all p < .05). Logistic regression showed that less than college education, more comorbidities, less exercise, and more posttraumatic stress were independent predictors of persistent fatigue (all p < .05). Conclusions: Fatigue plagues one-third of NHL survivors and persists in one-fifth of this population even years after diagnosis. These findings could inform clinical practice in NHL survivorship and highlight targets for intervention.
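The fatigue definition and the logistic regression step could look roughly like the sketch below. The SF-36 cutoff value, file name, column names, and covariates are placeholders, not the study's actual operationalization.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical wide-format survey data: SF-36 vitality scores at two time
# points plus candidate predictors. Column names are illustrative only.
df = pd.read_csv("nhl_survivor_survey.csv")

# Significant fatigue: score more than 0.5 SD below a chosen SF-36 cutoff;
# persistent fatigue: significant fatigue at both time points.
cutoff = 45                      # placeholder SF-36 vitality cutoff
threshold = cutoff - 0.5 * df["vitality_t1"].std()
df["persistent_fatigue"] = (
    (df["vitality_t1"] < threshold) & (df["vitality_t2"] < threshold)
).astype(int)

# Logistic regression for independent predictors of persistent fatigue.
model = smf.logit(
    "persistent_fatigue ~ C(education_lt_college) + n_comorbidities"
    " + exercise_hours_per_week + ptsd_symptom_score",
    data=df,
)
print(model.fit().summary())
```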


2021, Vol 5 (Supplement_2), pp. 572-572
Author(s): Hayat Alzahrani, Kim Jackson, Ditte Hobbs, Julie Lovegrove

Abstract Objectives: To investigate the relationship between dietary nitrate consumption from vegetables (root and green leafy varieties), drinking water, and cured meat, and cardiovascular disease (CVD) risk factors in a representative UK population, and to determine whether the source (vegetables vs cured meats) affects these relationships. Methods: For this analysis, we used data from the UK cross-sectional National Diet and Nutrition Survey (NDNS) years 1–8, which included 3407 men and women aged 19–64 y. Because data available in dietary analysis software on nitrate levels in vegetables and vegetable-based foods are very limited, a comprehensive database was first developed to evaluate the nitrate and nitrite levels in water, vegetables, cured meats, and composite dishes, to more accurately estimate the dietary nitrate intakes of the participants. The population was then classified into quartiles of increasing daily nitrate intake from vegetables (including drinking water) and from meats. ANCOVA was used to determine the relationship between the level of nitrate intake from each dietary source and available biomarkers of CVD risk (BP, lipid profile, C-reactive protein (CRP), anthropometric measures, and glycaemic control). Results: Across increasing quartiles of dietary nitrate intake from vegetables, there were significant differences in systolic (P = 0.038) and diastolic (P = 0.014) BP, with significantly lower BP in Q3 than in all other quartiles. Furthermore, nitrate intake from vegetables was significantly associated with lower glucose, glycated haemoglobin, CRP, and total cholesterol concentrations in Q4 compared with Q1 (p = 0.046, p = 0.01, p = 0.03, and p = 0.04, respectively). In contrast, there were no differences in CVD markers, including BP, across quartiles of nitrate intake from meats. Conclusions: Our findings suggest the source of dietary nitrate may play an important role in determining the relationship with BP, with an intake of 95–130 mg/day from vegetables and drinking water associated with lower BP. Funding Sources: Hayat Alzahrani was supported by King Saud University (Saudi Arabia).
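The quartile split and ANCOVA described in the methods might be implemented along these lines; the file name, column names, outcome, and adjustment covariates are assumptions rather than the NDNS variables actually used.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical per-participant data: estimated daily nitrate intake from
# vegetables (incl. drinking water) and CVD risk markers.
df = pd.read_csv("ndns_nitrate.csv")

# Quartiles of vegetable-derived nitrate intake.
df["veg_nitrate_q"] = pd.qcut(
    df["veg_nitrate_mg_day"], 4, labels=["Q1", "Q2", "Q3", "Q4"]
)

# ANCOVA: systolic BP across quartiles, adjusted for example covariates.
fit = smf.ols(
    "systolic_bp ~ C(veg_nitrate_q) + age + C(sex) + bmi + energy_kcal",
    data=df,
).fit()
print(anova_lm(fit, typ=2))
```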


2020
Author(s): Xilin Jiang, Chris Holmes, Gil McVean

Abstract Inherited genetic variation contributes to individual risk for many complex diseases and is increasingly being used for predictive patient stratification. Recent work has shown that genetic factors are not equally relevant to human traits across age and other contexts, though the reasons for such variation are not clear. Here, we introduce methods to infer the form of the relationship between genetic risk for disease and age and to test whether all genetic risk factors behave similarly. We use a proportional hazards model within an interval-based censoring methodology to estimate age-varying individual variant contributions to genetic risk for 24 common diseases within the British ancestry subset of UK Biobank, applying a Bayesian clustering approach to group variants by their risk profile over age and permutation tests for age dependency and multiplicity of profiles. We find evidence for age-varying risk profiles in nine diseases, including hypertension, skin cancer, atherosclerotic heart disease, hypothyroidism and calculus of gallbladder, several of which show evidence, albeit weak, for multiple distinct profiles of genetic risk. The predominant pattern is for genetic risk factors to have their greatest impact on early-onset disease, with a monotonic decrease over time for the majority of variants, although the magnitude and form of the decrease vary among diseases. We show that these patterns cannot be explained by a simple model involving the presence of unobserved covariates such as environmental factors. We discuss possible models that can explain our observations and the implications for genetic risk prediction. Author summary: The genes we inherit from our parents influence our risk for almost all diseases, from cancer to severe infections. With the explosion of genomic technologies, we are now able to use an individual's genome to make useful predictions about future disease risk. However, recent work has shown that the predictive value of genetic information varies by context, including age, sex and ethnicity. In this paper we introduce, validate and apply new statistical methods for investigating the relationship between age and genetic risk. These methods allow us to ask questions such as whether risk is constant over time, precisely how risk changes over time, and whether all genetic risk factors have similar age profiles. By applying the methods to data from the UK Biobank, a prospective study of 500,000 people, we show that there is a tendency for genetic risk to decline with increasing age. We consider a series of possible explanations for the observation and conclude that there must be processes acting that we are currently unaware of, such as distinct phases of life in which genetic risk manifests itself, or interactions between genes and the environment.
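The paper's permutation test for age dependency can be illustrated with a deliberately simplified toy version: compare a variant's effect between early- and late-onset cases and assess the difference against a permutation null. Everything here (file name, columns, the crude effect measure) is a stand-in; the actual method uses an interval-censored proportional hazards model and Bayesian clustering, which are not reproduced.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Hypothetical case-level data: genotype dosage (0/1/2) for a single variant
# and age at diagnosis. Controls are omitted in this toy illustration.
df = pd.read_csv("cases_one_variant.csv")
df["late_onset"] = (
    df["age_at_diagnosis"] >= df["age_at_diagnosis"].median()
).astype(int)

def effect_difference(data: pd.DataFrame) -> float:
    """Mean dosage in late-onset minus early-onset cases: a crude proxy for
    an age-varying effect size."""
    late = data.loc[data["late_onset"] == 1, "dosage"].mean()
    early = data.loc[data["late_onset"] == 0, "dosage"].mean()
    return late - early

observed = effect_difference(df)

# Permutation null: shuffle onset labels and recompute the difference.
n_perm = 10_000
null = np.empty(n_perm)
perm = df.copy()
for i in range(n_perm):
    perm["late_onset"] = rng.permutation(perm["late_onset"].to_numpy())
    null[i] = effect_difference(perm)

p_value = (np.abs(null) >= abs(observed)).mean()
print(f"observed difference = {observed:.3f}, permutation p = {p_value:.4f}")
```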


2019, Vol 53 (12), pp. 1184-1191
Author(s): Logan M. Olson, Andrea M. Nei, Ross A. Dierkhising, David L. Joyce, Scott D. Nei

Background: Post–cardiac surgery bleeding can have devastating consequences, and it is unknown whether a warfarin-induced rapid rise in the international normalized ratio (INR) during the immediate postoperative period increases bleed risk. Objective: To determine the impact of warfarin-induced rapid-rise INR on post–cardiac surgery bleeding. Methods: This was a single-center, retrospective chart review of post–cardiac surgery patients initiated on warfarin at Mayo Clinic Hospital, Rochester. Patients were grouped based on the occurrence or absence of rapid-rise INR (increase ≥1.0 within 24 hours). The primary outcome compared bleed events between groups. Secondary outcomes assessed hospital length of stay (LOS) and identified risk factors associated with bleed events and rapid rise in INR. Results: During the study period, 2342 patients were included and 56 bleed events were evaluated. Bleed events were similar between the rapid-rise (n = 752) and non–rapid-rise (n = 1590) groups in both univariate (hazard ratio [HR] = 1.22; P = 0.594) and multivariable models (HR = 1.24; P = 0.561). Patients with rapid-rise INR had a longer LOS after warfarin administration (discharge HR = 0.84; P = 0.0002). The most common warfarin dose immediately prior to rapid rise was 5 mg. Risk factors for rapid-rise INR were low body mass index, female gender, and cross-clamp time. Conclusion and Relevance: This is the first report to assess warfarin-related rapid-rise INR in post–cardiac surgery patients; rapid rise correlated with hospital LOS but not with bleed events. Conservative warfarin dosing may be warranted until further research can be conducted.
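A sketch of how the rapid-rise grouping (INR increase of at least 1.0 within 24 hours) might be flagged from raw INR draws; the file name, table layout, and column names are assumptions, not the study's data model.

```python
import pandas as pd

# Hypothetical INR measurements: one row per patient per blood draw.
inr = pd.read_csv("postop_inr.csv", parse_dates=["draw_time"])
inr = inr.sort_values(["patient_id", "draw_time"])

def has_rapid_rise(draws: pd.DataFrame) -> bool:
    """True if any pair of draws within 24 hours shows an INR rise >= 1.0."""
    times = draws["draw_time"].to_numpy()
    values = draws["inr"].to_numpy()
    for i in range(len(values)):
        for j in range(i + 1, len(values)):
            within_24h = times[j] - times[i] <= pd.Timedelta(hours=24)
            if within_24h and values[j] - values[i] >= 1.0:
                return True
    return False

flags = inr.groupby("patient_id").apply(has_rapid_rise).rename("rapid_rise")
print(flags.value_counts())  # patients in the rapid-rise vs. non-rapid-rise group
```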


2020
Author(s): Anthony Morgan, Alexandra Gannoni

This study explores the relationship between methamphetamine dependence and domestic violence among male police detainees interviewed as part of the Drug Use Monitoring in Australia program. Detainees who were dependent on methamphetamine reported high rates of domestic violence. They were significantly more likely to have been violent towards an intimate partner in the previous 12 months than detainees who used methamphetamine but were not dependent. Similar patterns were observed for detainees who reported cannabis dependence. Attitudes minimising the impact of violence were also associated with an increased likelihood of domestic violence. The results illustrate the importance of integrated responses that address the co-occurrence of substance use disorders and domestic violence, and the underlying risk factors for both harmful behaviours.


2015, Vol 35 (3), pp. 351-359
Author(s): Shang-Feng Yang, Chia-Jen Liu, Wu-Chang Yang, Chao-Fu Chang, Chih-Yu Yang, ...

Objectives: There is a lack of consensus on the risk factors for hernia formation, and the impact on peritoneal dialysis (PD) survival has seldom been studied. Methods: This was a population-based study and all collected data were retrieved from the National Health Insurance Research Database of Taiwan. Patients who commenced PD between January 1998 and December 2006 were screened for inclusion. Multiple logistic regression and Cox proportional hazards models were applied to estimate the predictors of hernia formation and to determine the predictors of PD withdrawal. Results: A total of 6,928 PD patients were enrolled and followed until December 2009, with 631 hernia events and 391 hernioplasties registered in 530 patients (7.7%). The incidence rate was 0.04 hernias/patient/year. Longer PD duration (per 1-month increase, hazard ratio (HR) 1.019) and history of mitral valve prolapse (MVP) (HR 1.584) were independent risk factors for hernia formation during PD, and female gender (HR 0.617) was a protective factor. On the other hand, there were 4,468 PD withdrawals, with cumulative incidence rates of 41% at 1 year, 66% at 3 years, and 82% at 5 years. Independent determinants of cumulative PD withdrawal included hernia formation during PD (HR 1.154), age (per 1-year increase, HR 1.014), larger dialysate volume (per 1-liter increase, HR 0.496), female gender (HR 0.763), heart failure (HR 1.092), hypertension (HR 1.207), myocardial infarction (HR 1.292), chronic obstructive pulmonary disease (COPD) (HR 1.227), cerebrovascular accident (CVA) (HR 1.364), and history of MVP (HR 0.712). Conclusions: Prolonged PD duration was a risk factor for hernia formation, and female gender was protective. Hernia formation during PD therapy may increase the risk of PD withdrawal.
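A minimal sketch of the Cox proportional hazards step for predictors of PD withdrawal, using the lifelines package; the file name, covariate list, and column names are invented for illustration and do not reproduce the registry analysis.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical per-patient data: time on PD (months), a withdrawal event flag,
# and baseline covariates. Column names are illustrative only.
df = pd.read_csv("pd_cohort.csv")

cols = ["months_on_pd", "pd_withdrawal", "hernia_during_pd", "age",
        "female", "dialysate_volume_l", "heart_failure", "hypertension"]

cph = CoxPHFitter()
cph.fit(df[cols], duration_col="months_on_pd", event_col="pd_withdrawal")
cph.print_summary()  # hazard ratios (exp(coef)) with 95% confidence intervals
```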


2014, Vol 21 (11), pp. 904-912
Author(s): Tanja Nordström, Tuula Hurtig, Alina Rodriguez, Jukka Savolainen, Arja Rautio, ...

Objective: To examine different risk factors for disruptive behavior disorders (DBD), ADHD, and combined DBD and ADHD. Method: The study population was derived from the Northern Finland Birth Cohort 1986. Psychiatric diagnoses were defined from the Schedule for Affective Disorders and Schizophrenia for School-Age Children–Present and Lifetime Version (K-SADS-PL) interview. The study sample was divided into four groups—people with DBD (n = 44), with ADHD (n = 91), with both (n = 72), and without either (n = 250)—to evaluate the different risk factors behind these disorders. Results: After adjusting for possible confounding factors, female gender and paternal admission to inpatient psychiatric care increased the odds of an adolescent having DBD. Childhood hyperactivity symptoms increased the odds of having ADHD, and childhood hyperactivity symptoms and scholastic impairment increased the odds of having both disorders. Conclusion: Our study indicates that DBD and ADHD have clearly different risk factors, and the impact of paternal factors on DBD should be given more attention than it has received previously.
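One way to compare risk factors across the four groups described above is a multinomial logistic model with the no-diagnosis group as the reference; the sketch below uses statsmodels with invented column names and integer group codes, and is not the cohort's actual analysis.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical adolescent-level data: diagnostic group coded 0 = neither,
# 1 = DBD, 2 = ADHD, 3 = both, plus candidate risk factors (all illustrative).
df = pd.read_csv("nfbc1986_subsample.csv")

exog = sm.add_constant(df[["female", "paternal_psychiatric_admission",
                           "childhood_hyperactivity", "scholastic_impairment"]])
model = sm.MNLogit(df["group"], exog)
result = model.fit()
print(result.summary())  # coefficients are log-odds relative to group 0
```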

