ADL Recovery Trajectory after Discharge and its Predictors among Baseline-Independent Older Inpatients

2019 ◽  
Author(s):  
Xiuyue Li ◽  
Tingting Zheng ◽  
Yaqi Guan ◽  
Hui Li ◽  
Kexin Zhu ◽  
...  

Abstract Background Among previous studies of ADL recovery and its predictors, little research has addressed protecting baseline-independent older patients from becoming permanently ADL-dependent. We aimed to describe the level of activities of daily living (ADL) at discharge and ADL change within 6 months after discharge in older patients who were ADL-independent before admission but became dependent because of acute illness, and to identify predictors of early rehabilitation, so as to provide a basis for early intervention. Methods Stratified cluster sampling was used to recruit 520 hospitalised older patients who were ADL-independent before admission from departments of internal medicine at two tertiary hospitals from August 2017 to May 2018. Demographics, clinical data, and ADL status at 1, 3, and 6 months after discharge were collected. Data were analysed using descriptive statistics, Student’s t-test, Pearson’s chi-square test, Spearman’s correlation analysis, binary logistic regression analysis, and receiver operating characteristic (ROC) curve analysis. Results Of the 520 patients, 403 completed the 6-month follow-up, and 229 (56.8%) regained independence at 6 months after discharge. ADL showed an overall increasing trend over time. The recovery rate was highest within the first month after discharge, declined gradually after 1 month, and changed little from 3 to 6 months after discharge (P<0.001). ADL score at discharge (OR=1.034, P<0.001), age (OR=0.269, P=0.001), post-discharge residence (OR=0.390, P<0.05), and cognitive status at discharge (OR=1.685, P<0.05) were predictors of ADL recovery. The area under the curve for the four predictors combined was 0.763 (P<0.001). Conclusion Studying the ADL recovery rate and its predictors in baseline-independent inpatients at different time points provides a theoretical reference for the formulation of nursing plans and the allocation of care resources.
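As a minimal sketch of the modelling step described above (logistic regression on four predictors, then an ROC AUC for the combined model), the Python snippet below shows one way such an analysis is typically run. The file name and column names (adl_discharge, age_group, residence, cognition, recovered) are hypothetical placeholders, not the study's actual variable coding.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.metrics import roc_auc_score

df = pd.read_csv("adl_followup.csv")  # hypothetical file name
X = sm.add_constant(df[["adl_discharge", "age_group", "residence", "cognition"]])
y = df["recovered"]  # 1 = regained ADL independence at 6 months, 0 = not

model = sm.Logit(y, X).fit()
print(np.exp(model.params))  # odds ratios, cf. OR = 1.034 for discharge ADL score
print(model.pvalues)

# AUC of the four predictors combined (reported above as 0.763)
print(f"AUC = {roc_auc_score(y, model.predict(X)):.3f}")
```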


2021 ◽  
Vol 8 (12) ◽  
pp. 706-710
Author(s):  
Kemal Göçer ◽  
Ahmet Çağrı Aykan ◽  
Bayram Öztürk ◽  
Alihan Erdoğan

Objective: This study aimed to evaluate whether the neutrophil/lymphocyte (N/L) ratio assists in the diagnosis of coronary artery disease (CAD) in patients with suspected diaphragmatic attenuation artifact (DAA) on myocardial perfusion SPECT (MP-SPECT). Material and Methods: A total of 255 patients undergoing coronary angiography between 2015 and 2020 due to unclear DAA of the inferior wall on MP-SPECT were included in this retrospective study. Patients were divided into two groups (CAD and non-CAD) according to angiographic images. Significant CAD was defined as ≥50% stenosis of the coronary arteries feeding the inferior wall. White blood cell count, biochemical parameters, and risk factors for CAD were compared between the two groups. Results: There was no statistically significant difference between the two groups in terms of age (p = 0.055), gender (p = 0.482), or body mass index (p = 0.305). In multivariate binary logistic regression analysis, N/L ratio (OR = 1.397, 95% CI = 1.128-1.732, p = 0.002) and left ventricular ejection fraction (OR = 0.896, 95% CI = 0.815-0.985, p = 0.023) were independent risk factors for CAD. Receiver operating characteristic (ROC) curve analysis showed that a cut-off value of ≥2 for the N/L ratio predicted the presence of CAD (sensitivity = 63.5%, specificity = 60.7%, AUC = 0.668, 95% CI = 0.596-0.740, p < 0.001). Conclusion: The N/L ratio is a simple and accessible test and may increase the diagnostic accuracy of MP-SPECT for CAD in patients with suspected diaphragmatic attenuation on MP-SPECT.
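To illustrate how a cut-off such as N/L ≥ 2 falls out of ROC curve analysis, the sketch below picks the threshold maximising Youden's J (sensitivity + specificity − 1). The group sizes and distributions are simulated for illustration, not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
cad = np.r_[np.ones(120), np.zeros(135)]           # 1 = significant CAD
nl = np.where(cad == 1,
              rng.normal(2.4, 0.8, cad.size),      # simulated N/L, CAD group
              rng.normal(1.8, 0.6, cad.size))      # simulated N/L, non-CAD group

fpr, tpr, thr = roc_curve(cad, nl)
j = np.argmax(tpr - fpr)                           # index maximising Youden's J
print(f"AUC = {roc_auc_score(cad, nl):.3f}")
print(f"cut-off = {thr[j]:.2f}, "
      f"sensitivity = {tpr[j]:.1%}, specificity = {1 - fpr[j]:.1%}")
```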


2019 ◽  
Vol 105 (3) ◽  
pp. 898-907 ◽  
Author(s):  
Sylvie Job ◽  
Adrien Georges ◽  
Nelly Burnichon ◽  
Alexandre Buffet ◽  
Laurence Amar ◽  
...  

Abstract Context Pheochromocytomas and paragangliomas (PPGLs) are neuroendocrine tumors explained by germline or somatic mutations in about 70% of cases. Patients with SDHB mutations are at high risk of developing metastatic disease, yet no reliable tumor biomarkers are available to predict tumor aggressiveness. Objective We aimed to identify long noncoding RNAs (lncRNAs) specific for PPGL molecular groups and metastatic progression. Design and Methods To analyze the expression of lncRNAs, we used a mining approach on transcriptome data from a well-characterized series of 187 tumor tissues. Consensus clustering analysis was performed to determine an lncRNA-based classification, and informative transcripts were validated in an independent series of 51 PPGLs. The expression of metastasis-related lncRNAs was confirmed by RT-qPCR. Receiver operating characteristic (ROC) curve analysis was used to estimate the predictive accuracy of potential markers. Main Outcome Measure Univariate/multivariate and metastasis-free survival (MFS) analyses were carried out for the assessment of risk factors and clinical outcomes. Results Four lncRNA-based subtypes strongly correlated with mRNA expression clusters (chi-square P-values from 1.38 × 10^-32 to 1.07 × 10^-67). We identified one putative lncRNA (GenBank: BC063866) that accurately discriminates metastatic from benign tumors in patients with SDHx mutations (area under the curve 0.95; P = 4.59 × 10^-5). Moreover, this transcript appeared to be an independent risk factor associated with poor clinical outcome in SDHx carriers (log-rank test P = 2.29 × 10^-5). Conclusion Our findings extend the spectrum of transcriptional dysregulation in PPGL to lncRNAs and provide a novel biomarker that could be useful to identify potentially metastatic tumors in patients carrying SDHx mutations.
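For the survival component, below is a minimal sketch of a metastasis-free-survival comparison by biomarker status using the lifelines package's log-rank test. The biomarker-high/low grouping and follow-up times are simulated for illustration, not taken from the series above.

```python
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(1)
# simulated months to metastasis (or censoring), biomarker-high vs -low SDHx carriers
t_high, t_low = rng.exponential(30, 40), rng.exponential(90, 60)
e_high = rng.random(40) < 0.8  # True = metastasis observed, False = censored
e_low = rng.random(60) < 0.3

res = logrank_test(t_high, t_low, event_observed_A=e_high, event_observed_B=e_low)
print(f"log-rank p = {res.p_value:.2e}")
```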


2019 ◽  
Vol 10 (1) ◽  
pp. 63-68 ◽  
Author(s):  
Christopher G. Varlotta ◽  
David H. Ge ◽  
Nicholas Stekas ◽  
Nicholas J. Frangella ◽  
Jordan H. Manning ◽  
...  

Study Design: Retrospective cohort study. Objective: To investigate radiological differences in lumbar disc herniations (herniated nucleus pulposus [HNP]) between patients receiving microscopic lumbar discectomy (MLD) and nonoperative patients. Methods: Patients receiving primary treatment for an HNP at a single academic institution between November 2012 and March 2017 were divided into MLD and nonoperative treatment groups. Using magnetic resonance imaging (MRI), axial HNP area, axial canal area, HNP canal compromise, HNP cephalad/caudal migration, and HNP MRI signal (black, gray, or mixed) were measured. t tests and chi-square analyses compared differences between the groups, binary logistic regression analysis determined odds ratios (ORs), and decision tree analysis identified cutoff values for risk factors. Results: A total of 285 patients (78 MLD, 207 nonoperative) were included. Risk factors for MLD treatment included larger axial HNP area (P < .01, OR = 1.01); caudal migration and migration magnitude (P < .05, OR = 1.90; P < .01, OR = 1.14); and gray HNP MRI signal (P < .01, OR = 5.42). Cutoff values for risk included axial HNP area (70.52 mm², OR = 2.66, P < .01), HNP canal compromise (20.0%, OR = 3.29, P < .01), and cephalad/caudal migration (6.8 mm, OR = 2.43, P < .01). MLD risk for those with gray HNP MRI signal (67.6% alone) increased when combined with the axial HNP area >70.52 mm² (75.5%, P = .01) and HNP canal compromise >20.0% (71.1%, P = .05) cutoffs. MLD risk in patients with cephalad/caudal migration >6.8 mm (40.5% alone) increased when combined with axial HNP area and HNP canal compromise (52.4%, 50%; P < .01). Conclusion: Patients who underwent MLD had significantly different axial HNP area, frequency of caudal migration, magnitude of cephalad/caudal migration, and disc herniation MRI signal compared with nonoperatively treated patients.
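The cutoff values above come from decision tree analysis; a depth-1 tree (a stump) fitted to a single continuous predictor returns exactly this kind of threshold. The sketch below shows the idea; the file and column names are hypothetical placeholders.

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

df = pd.read_csv("hnp_cohort.csv")     # hypothetical file name
X = df[["axial_hnp_area_mm2"]]         # hypothetical column name
y = df["mld"]                          # 1 = underwent MLD, 0 = nonoperative

# a depth-1 tree splits the predictor at the single best threshold
stump = DecisionTreeClassifier(max_depth=1).fit(X, y)
# node 0 is the root split; its threshold is the learned cut-off
print(f"axial HNP area cut-off = {stump.tree_.threshold[0]:.2f} mm²")
```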


2017 ◽  
Vol 38 (3) ◽  
pp. 291-301 ◽  
Author(s):  
Haribondhu Sarma ◽  
Jahidur Rahman Khan ◽  
Mohammad Asaduzzaman ◽  
Fakhar Uddin ◽  
Sayeeda Tarannum ◽  
...  

Background: Poor nutrition during childhood impedes the physical and mental development of children, propagating the vicious cycle of intergenerational undernutrition. This paper aims to understand the determinants of stunting among children aged 0 to 59 months in Bangladesh. Methods: The study used Bangladesh Demographic and Health Survey 2011 data, which employed a multistage stratified cluster-sampling design. Anthropometric data (height and weight) were collected, and analysis was limited to 7647 children. Multiple binary logistic regression analysis was performed to assess the association of stunting with potential socioeconomic and demographic factors. Results: The prevalence of stunting was about 41% among children aged less than 60 months and was higher in rural settings than in urban areas (43% vs 36%). The adjusted model revealed several influencing factors. Children living in moderately food-insecure households had higher odds of becoming stunted (odds ratio [OR] = 1.27, 95% confidence interval [CI]: 1.05-1.54, P = .01) than children living in food-secure households. The ORs of stunting for children delivered at institutions run by the public (OR = 0.80, 95% CI: 0.67-0.96; P = .02) or private (OR = 0.81, 95% CI: 0.67-0.97; P = .02) sectors were lower than for children delivered at home. Similarly, wealth index, the mother's exposure to mass media, age of child, size of child at birth, and parents' education were significantly associated with stunting. Conclusions: Demographic characteristics and other indicators appear to have a significant influence on the prevalence of stunting. Public health programs are needed to avert the risk factors for stunting among children in Bangladesh.
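As a sketch of how adjusted odds ratios with 95% CIs, like those reported above, are typically tabulated from a fitted logistic model in Python, consider the snippet below. The file and variable names are hypothetical stand-ins for the BDHS 2011 fields.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("bdhs_2011_children.csv")   # hypothetical file name
m = smf.logit("stunted ~ food_insecure + facility_birth + wealth_index"
              " + mother_media + child_age + birth_size + parent_edu",
              data=df).fit()

# exponentiate coefficients and CI bounds to get ORs with 95% CIs
or_table = pd.DataFrame({
    "OR": np.exp(m.params),
    "CI_low": np.exp(m.conf_int()[0]),
    "CI_high": np.exp(m.conf_int()[1]),
    "p": m.pvalues,
})
print(or_table.round(2))  # e.g. food_insecure: OR 1.27 (1.05-1.54), P = .01
```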


PLoS ONE ◽  
2021 ◽  
Vol 16 (1) ◽  
pp. e0244257
Author(s):  
John W. Francis ◽  
Alun J. Owen ◽  
Derek M. Peters

The purposes of this study were to (i) develop a field-goal shooting performance analysis template and (ii) explore the impact of each identified variable upon the likely outcome of a field-goal attempt using binary logistic regression modelling in elite men's wheelchair basketball. First, a field-goal shooting performance analysis template was developed that included 71 Action Variables (AV) grouped within 22 Categorical Predictor Variables (CPV) representing offensive, defensive, and game context variables. Second, footage of all 5,105 field-goal attempts from 12 teams during the men's 2016 Rio de Janeiro Paralympic Games wheelchair basketball competition was analysed using the template. Pearson's chi-square analyses found that 18 of the CPV were significantly associated with field-goal attempt outcome (p < 0.05), with seven of them reaching moderate association (Cramér's V: 0.1-0.3). Third, using 70% of the dataset (3,574 field-goal attempts), binary logistic regression analyses identified that five offensive variables (classification category of the player, the action leading up to the field-goal attempt, the time left on the clock, the location of the shot, and the movement of the player), two defensive variables (the pressure being exerted by the defence, and the number of defenders within a 1-meter radius), and one context variable (the finishing position of the team in the competition) affected the probability of a successful field-goal attempt. The quality of the developed model was deemed acceptable (greater than 65%), producing an area under the curve of 68.5% when the model was run against the remaining 30% of the dataset (1,531 field-goal attempts). The development of the model from such a large sample of objective data is unique. As such, it offers robust empirical evidence to enable coaches, performance analysts, and players to move beyond anecdote and appreciate the potential effect of various and varying offensive, defensive, and contextual variables on field-goal success.
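To illustrate the chi-square/Cramér's V association step, the snippet below computes both for one contingency table of a categorical predictor against shot outcome. The predictor levels and counts are invented for illustration.

```python
import numpy as np
from scipy.stats import chi2_contingency

# rows: defensive pressure (none / moderate / high); cols: (missed, made)
table = np.array([[420, 610],
                  [980, 870],
                  [1450, 775]])
chi2, p, dof, _ = chi2_contingency(table)

# Cramér's V = sqrt(chi2 / (n * (min(rows, cols) - 1)))
n = table.sum()
v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))
print(f"chi2 = {chi2:.1f}, p = {p:.3g}, Cramér's V = {v:.2f}")
```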


2020 ◽  
Author(s):  
Obasanjo Afolabi Bolarinwa

Abstract Evidence has shown that the lockdowns and physical distancing prescribed in response to the novel coronavirus disease 2019 (COVID-19) have made accessing essential health care services much more difficult in low- and middle-income countries (LMICs). Access to contraception is essential and should not be denied, even in a global crisis, because it is associated with several health benefits; it is paramount to maintain timely access to contraception without unnecessary barriers. Hence, this study examines the factors contributing to limited access to condoms, and the sources of condoms, during the COVID-19 pandemic in South Africa (SA). The first secondary dataset on coronavirus from the National Income Dynamics Study (NIDS), conducted in SA during the pandemic, was employed in this study, covering 4,517 respondents. Data were analysed using frequency analysis, the chi-square test, and binary logistic regression analysis. Almost one-quarter of South Africans could not access condoms, and 7 in 10 South Africans preferred public or government hospitals as a source of condoms. Female South Africans (aOR=0.86; 95% CI=0.69-1.08), those aged 35-45 (aOR=0.96; 95% CI=0.73-1.28), and those residing in KwaZulu-Natal province (aOR=0.30; 95% CI=0.17-0.52) were 14%, 4%, and 70% less likely, respectively, to have access to condoms during the COVID-19 lockdown. The findings suggest that strategies and interventions should be tailored to avoid obstructing access to contraception during the ongoing COVID-19 pandemic or any future pandemic. Moreover, special consideration should be given to certain provinces, the uneducated, and those in the 4th wealth-income quintile in South Africa.
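The "14%, 4% and 70% less likely" phrasing is the standard conversion of an adjusted odds ratio below 1 into a percentage reduction in odds, (1 − aOR) × 100. The snippet below simply applies that conversion to the aORs reported above.

```python
# converting the reported adjusted odds ratios into "% lower odds"
for label, aor in [("female", 0.86), ("aged 35-45", 0.96), ("KwaZulu-Natal", 0.30)]:
    print(f"{label}: aOR = {aor} -> {(1 - aor) * 100:.0f}% lower odds of condom access")
```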


2020 ◽  
Vol 11 (3) ◽  
pp. 251-263
Author(s):  
José Francisco dos Reis Neto ◽  
Michelle Da Rosa Lopes ◽  
Celso Correia de Souza ◽  
Marlucy Ferreira Machado Xavier

Urban green parks, considered natural resources, should be seen as strategic elements for the conservation of nature, socialization, health, and well-being, since they also significantly influence the local economy. This research sought to quantify Willingness to Pay (WTP) for environmental conservation and its influencing factors, as indicated by users of three urban parks in Campo Grande, MS, Brazil: Parque das Nações Indígenas (PNI), Parque Ayrton Senna (PAS), and Parque Matas do Segredo (PMS). A quantitative approach was used, interviewing 824 users of the three parks, focusing on those who know and frequent them. A structured questionnaire was applied, divided into groups of variables capturing socioeconomic characteristics, perceptions of park maintenance and preservation, satisfaction with the equipment offered, and satisfaction with the parks' environmental preservation and maintenance. Descriptive statistics and the chi-square test were used to compare the proportions obtained for each park, with two-tailed significance set at p < 0.05. A binary logistic regression analysis of discrete variables was also performed to identify factors that may influence a park user's WTP for environmental conservation. Users reported satisfaction with the parks as places for social interaction and with their appearance, attractiveness, and calm, cozy, beautiful, and clean environments. Regarding WTP, people who frequent the parks proved willing to pay for their protection and environmental conservation. The article's academic contribution lies in advancing knowledge of the importance of urban green parks and in providing directions for further socio-environmental research. For practice, it provides data and conclusions to guide public policies that enlist the population and entrepreneurs in strengthening preservation and maintenance.


Author(s):  
Lucy Maina ◽  
Elishiba Kimani

Retirees’ income security is a key concern for nations seeking to secure their ageing populations. Kenya has a growing retirement sector, with about 252,000 retired civil servants on pension and a significant number of private-sector retirees who receive a gratuity at retirement. Though formally retired workers may receive a pension, studies consistently report low pension uptake and inadequate incomes for retirees, as well as an increasing national and societal burden. This paper explores the key determinants of income security among 978 retired persons who were receiving dues on their retirement savings. Guided by life cycle and third age theory, the study investigated whether retirees’ socio-economic attributes, pre-retirement financial status, benefit package, utilization of retirement savings and investments, and pre-retirement preparation correlated with income security. A mixed-method study design was used, combining survey and case study approaches. Cluster, purposive, and random sampling methods were employed to select retirees under the four categories of retirement schemes in Kenya across 18 selected counties. Hypotheses were tested using the chi-square test of significance and comparison of means (t-test), specifically to illustrate the relationship between socio-economic indicators, pre-retirement factors, and income security at retirement. A logistic regression procedure was employed to isolate the significant factors that predict income security in retirement. The binary logistic regression analysis confirmed that retirees with higher education had 26% higher chances of enjoying income security, those who earned a higher pre-retirement salary had 25% higher chances, those knowledgeable about pensions had 35% higher chances, and those who had planned for their retirement had 14% higher chances of achieving income security. The study recommends crafting a robust retirement planning package, together with financial and health plans, for retirees’ income security and sustainable livelihoods.

