Sickness presenteeism in Norway and Sweden

2013 ◽  
Vol 3 ◽  
Author(s):  
Vegard Johansen

Introduction: Sickness presenteeism (SP) refers to the practice of going to work despite illness. This article describes the distribution of SP in Norway and Sweden. It also discusses relations between SP and various work characteristics and personal factors in the two countries. Methods: More than 2500 Norwegian and Swedish workers between 20 and 60 years of age answered a postal questionnaire. The Norwegian and Swedish samples are weighted to be representative with regard to regional background and demographic variables, but the response rate was low. The distribution of SP is measured by frequency (episodes in the previous year) and by length (total days of SP in the previous year). This study employed binary and multinomial logistic regression to detect which factors influence the frequency of SP. Results: Fifty-five per cent of the respondents in Norway and Sweden practised SP in the previous year. The frequency of SP episodes is similar in the two countries. Further, respondents with low/medium income, physical work, and managerial responsibilities report SP more often in both countries. Non-western immigrants, the less educated, and those employed by others are overrepresented with SP in Norway. Neither gender nor age had any particular influence. Discussion: In accordance with previous studies, this study among Norwegian and Swedish workers suggests that some SP during a working year may be more common than no SP. Our analyses of determinants of SP reveal some previously undocumented differences. Divisions between sedentary and physical work, and between management and non-management, were important for SP in Norway and Sweden. Moreover, non-western immigrants are overrepresented with SP in Norway, but this pattern does not prevail in Sweden. Some possible causes for non-western immigrants to report more SP are suggested in the article, but more research is needed to follow up on the absence of a correlation between ethnic background and SP in Sweden.
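The binary logistic approach used in the Methods above can be sketched in a few lines of pure Python. This is a minimal illustration only: the predictors (physical work, managerial responsibility), effect sizes, and simulated data below are assumptions for demonstration, not the study's actual data or model.

```python
import math
import random

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Fit a binary logistic regression by batch gradient descent.
    Returns weights [b0, b1, ..., bk], where b0 is the intercept."""
    k = len(X[0])
    w = [0.0] * (k + 1)
    n = len(X)
    for _ in range(epochs):
        grad = [0.0] * (k + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability of SP
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

# Hypothetical data: x = [physical_work, manager] flags; y = any SP episode.
random.seed(1)
X, y = [], []
for _ in range(400):
    phys, mgr = random.randint(0, 1), random.randint(0, 1)
    z = -0.3 + 1.2 * phys + 0.8 * mgr   # assumed log-odds, not from the paper
    p = 1.0 / (1.0 + math.exp(-z))
    X.append([phys, mgr])
    y.append(1 if random.random() < p else 0)

w = fit_logistic(X, y)
odds_ratios = [math.exp(b) for b in w[1:]]  # OR > 1: factor raises SP odds
```

Exponentiating the fitted coefficients yields odds ratios, the scale on which results such as "physical work predicts more SP" are typically reported.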

2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Ruibin Deng ◽  
Xian Tang ◽  
Jiaxiu Liu ◽  
Yuwen Gao ◽  
Xiaoni Zhong

Abstract Background A high rate of cesarean delivery has become a cause of global concern. Although the rate of cesarean delivery has declined over recent years, it remains at a high level, largely because of cesarean delivery on maternal request (CDMR). Unnecessary cesarean delivery offers limited benefit to maternal and infant physical health and in some ways may even pose potential risks. With the implementation of the “Two-child Policy” in China, an increasing number of women plan to have a second child. Accordingly, how to control the CDMR rate in China remains an important issue. Methods Data were collected from a longitudinal follow-up study conducted in Chongqing, China, from 2018 to 2019. A structured questionnaire was administered to subjects for data collection. Basic information, including demographic characteristics, living habits, medical history, and follow-up data of pregnant women, as well as their families and society, was collected. Delivery outcomes were also recorded. Logistic regression was performed to analyze the factors influencing CDMR. Results The rate of cesarean delivery in Chongqing, China was 36.01%, and the CDMR rate was 8.42%. Maternal request (23.38%), fetal distress (22.73%), and pregnancy complications (9.96%) were the top three indications for cesarean delivery. Logistic regression analysis showed that older age (OR = 4.292, 95% CI: 1.984–9.283) and primiparity (OR = 6.792, 95% CI: 3.230–14.281) were risk factors for CDMR. In addition, CDMR was also associated with a tendency to choose cesarean delivery during late pregnancy (OR = 5.525, 95% CI: 2.116–14.431), frequent contact with mothers who had undergone vaginal deliveries (OR = 0.547, 95% CI: 0.311–0.961), and the recommendation of cesarean delivery by doctors (OR = 4.071, 95% CI: 1.007–16.455). Conclusions “Maternal request” has become the primary indication for cesarean delivery.
The occurrence of CDMR is related both to the personal factors of pregnant women and to external influences. Medical institutions and obstetricians should continue to popularize delivery knowledge among pregnant women, enhance their own professional knowledge about delivery, adhere to the standard indications for cesarean delivery, and provide pregnant women with adequate opportunities to attempt vaginal delivery.
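A note on reading odds ratios such as those above: the point estimate and its 95% confidence interval are linked through the log-odds scale, where the interval is symmetric. The sketch below illustrates that relationship by reconstructing the standard error implied by the reported older-age estimate (OR = 4.292, 95% CI 1.984–9.283); it is an arithmetic check, not part of the study's analysis.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a log-odds coefficient and its standard error into an
    odds ratio with a 95% confidence interval (symmetric on log scale)."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Back out the implied standard error from the reported CI bounds:
# the CI spans beta +/- 1.96*se on the log scale.
beta = math.log(4.292)
se = (math.log(9.283) - math.log(1.984)) / (2 * 1.96)

or_, lo, hi = odds_ratio_ci(beta, se)
```

Recovering the published bounds (1.98, 9.28) from the reconstructed standard error confirms the interval is internally consistent.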


2021 ◽  
Author(s):  
Kelly J McGorm ◽  
James David Brown ◽  
Rebecca Louise Thomson ◽  
Helena Oakey ◽  
Belinda Moore ◽  
...  

BACKGROUND Recruitment and retention of research participants is challenging. Social media, particularly Facebook, has emerged as a tool for connecting with participants due to its high uptake in the community. The Environmental Determinants of Islet Autoimmunity (ENDIA) study is an Australia-wide prospective pregnancy-birth cohort following children who have a first-degree relative with type 1 diabetes (ACTRN1261300794707). A dedicated Facebook page was established for the ENDIA study in 2013 with the aim of enhancing recruitment and supporting participant retention. OBJECTIVE The purpose of this investigation was to evaluate the long-term impact of Facebook as a recruitment and retention tool. The hypotheses were that (1) Facebook was an important source of referral to the ENDIA study, (2) the sociodemographic characteristics of participants recruited by Facebook would be different from those of participants recruited by other means (i.e., ‘conventional recruits’), and (3) recruitment by Facebook would be associated with long-term retention. We also evaluated the most effective types of Facebook content based on post engagement. METHODS Recruitment of 1511 ENDIA participants was completed in December 2019. Characteristics of participants recruited through Facebook were compared to conventional recruits using linear, logistic, and multinomial logistic regression models. A logistic regression model was used to determine the risk of study withdrawal. Data pertaining to 794 Facebook posts over 7.5 years from June 2013 until December 2020 were extracted using the Facebook ‘Insights’ function for thematic analysis. RESULTS Facebook was the third largest source of referral to the ENDIA study (300/1511, 19.9%) behind in-person clinics (500/1511, 33.1%) and healthcare professional referrals (347/1511, 23.0%). The ENDIA Facebook page had 2337 followers at the close of recruitment. Approximately 20% of these could be identified as participating parents.
Facebook recruits were more frequently Australian-born (P<.001), enrolled postnatally more often (P=.01), and withdrew from the study at a significantly lower rate than conventional recruits (4.7% vs 12.3%; P<.001) after a median follow-up of 3.3 years. CONCLUSIONS Facebook was a valuable recruitment tool for the ENDIA study, and participants recruited through Facebook were three times less likely to withdraw during long-term follow-up. The sociodemographic characteristics of Facebook recruits differed from those of conventional recruits, but perhaps in unintended ways. Facebook content featuring stories and images of participants received the highest engagement, despite the fact that most Facebook followers were not enrolled in the study. These findings should inform social media strategies for future cohort studies involving pregnant women and young families, and for type 1 diabetes risk studies. CLINICALTRIAL Australia New Zealand Clinical Trials Registry: ACTRN1261300794707 INTERNATIONAL REGISTERED REPORT RR2-https://doi.org/10.1186/1471-2431-13-124


2021 ◽  
Author(s):  
Jui-Hung Hsu ◽  
Li-Ju Lai ◽  
Tao-Hsin Tung ◽  
Wei-Hsiu Hsu

Abstract Purpose: This study evaluated the incidence rate and risk factors for developing myopia in elementary school students in Chiayi, Taiwan. Methods: This prospective cohort study comprised 1816 students without myopia (grades 1 to 5 in Chiayi County). The students underwent noncycloplegic ocular alignment examinations using an autorefractometer and completed a questionnaire at baseline and at a 1-year follow-up. Univariate logistic regression was used to assess the effects of the categorical variables on new cases of myopia, followed by multinomial logistic regression. A chi-squared test was used to compare new cases of myopia in terms of ocular alignment. A Cox hazard ratio model was then used to validate factors associated with changes in ocular alignment. A P value of <.05 was considered significant. Results: Of the 1816 participants, 370 developed myopia, with a spherical error of −1.51 ± 0.6 diopters at follow-up. The baseline ocular alignment was not a significant risk factor for developing myopia (exophoria vs orthophoria: OR 1.26, 95% CI 0.97-1.62; other vs orthophoria: OR 1.15, 95% CI 0.73-1.82). However, new cases of myopia (HR 1.36, 95% CI 1.14-1.61) and baseline ocular alignment (exophoria vs orthophoria: HR 3.76, 95% CI 3.20-4.42; other vs orthophoria: HR 3.02, 95% CI 2.05-4.45) were associated with exophoria at follow-up. Conclusions: This study provided epidemiological data on the incidence of myopia in elementary school students in Chiayi, Taiwan. It also demonstrated that physiological exophoria does not predispose patients to developing myopia.


2019 ◽  
Vol 33 (9) ◽  
pp. 1124-1131 ◽  
Author(s):  
Natasha E Wade ◽  
Kara S Bagot ◽  
Claudia I Cota ◽  
Aryandokht Fotros ◽  
Lindsay M Squeglia ◽  
...  

Background: Identifying neural characteristics that predict cannabis initiation is important for prevention efforts. The orbitofrontal cortex is critical for reward response and may be vulnerable to substance-induced alterations. Aims: We measured orbitofrontal cortex thickness, surface area, and volume prior to the onset of use to predict cannabis involvement during an average nine-year follow-up. Methods: Adolescents (n=118) aged 12–15 years completed baseline behavioral assessment and magnetic resonance imaging scans, then were followed for up to 13 years with annual substance use interviews. Logistic regression examined baseline (pre-substance use) bilateral medial and lateral orbitofrontal cortex characteristics (volume, surface area, or cortex thickness) as predictors of regular cannabis use by follow-up. Post-hoc multinomial logistic regression assessed whether orbitofrontal cortex characteristics significantly predicted either alcohol use alone or cannabis+alcohol co-use. Brain-behavior relationships were assessed through follow-up correlations of baseline relationships between orbitofrontal cortex and executive functioning, reward responsiveness, and behavioral approach traits. Results: Larger left lateral orbitofrontal cortex volume predicted classification as a cannabis user by follow-up (p=0.025, odds ratio=1.808). Lateral orbitofrontal cortex volume also predicted cannabis+alcohol co-user status (p=0.008, odds ratio=2.588), but not alcohol-only status. Larger lateral orbitofrontal cortex volume positively correlated with greater baseline reward responsiveness (p=0.030, r=0.348). There were no significant results for surface area or cortex thickness (ps>0.05). Conclusions: Larger left lateral orbitofrontal cortex volume measured at ages 12–15 years, prior to initiation of substance use, was related to greater reward responsiveness at baseline and predicted classification as a cannabis user and cannabis+alcohol co-user by final follow-up.
Larger lateral orbitofrontal cortex volume may represent aberrant orbitofrontal cortex maturation and increasing vulnerability for later substance use.


2020 ◽  
Vol 35 (Supplement_3) ◽  
Author(s):  
Yang Xu ◽  
Marie Evans ◽  
Franz Peter Barany ◽  
Glen James ◽  
Katarina Hedman ◽  
...  

Abstract Background and Aims Attaining the narrow hemoglobin range (10-12 g/dL) recommended by current ERBP renal anemia guidelines may be difficult, and whether attainment leads to better outcomes is not well known. This study aimed to identify patient and clinical factors associated with difficulties in maintaining hemoglobin target ranges in routine non-dialysis nephrologist care. We also evaluated whether adherence to ERBP hemoglobin recommendations during pre-dialysis care predicted early post-dialysis outcomes. Method Observational study from the Swedish Renal Registry including all patients with non-dialysis-dependent CKD stages 3b-5 developing renal anemia or initiating treatment (iron, ESA, or both) between 2012-2016. Through multinomial logistic regression with clustered variance, we identified clinical conditions associated with serum hemoglobin values outside the ERBP recommended range (<10 and >12 g/dL) throughout all recorded patient visits until death, dialysis, or end of follow-up. For those who initiated dialysis, we calculated the proportion of patient-time in which hemoglobin was maintained within range (time in range [TIR]). We then explored associations between TIR and subsequent one-year risk of death or MACE (a composite of death caused by CVD and non-fatal MI, stroke, or heart failure) with Cox proportional hazards regression. Results A total of 8106 patients with CKD 3b-5 developed incident anemia in Sweden during 2012-2016, contributing 37,422 nephrology visits during a median 2 years of follow-up. In multinomial logistic regression, being a man and having received iron or higher ESA doses were associated with hemoglobin values outside the target range. Patients with CKD 3b and 4, an ongoing transplant, a history of CVD, or higher serum calcium and albumin levels had higher odds of maintaining hemoglobin values above range.
Conversely, recent bleeding or transfusions, nephrosclerosis, inflammation (CRP >5 mg/dL), and higher phosphate levels increased the odds of hemoglobin values below range. A total of 2435 patients initiated maintenance dialysis during the study period. Of those, 327 died and 701 developed MACE during the subsequent year. Their median TIR during the pre-dialysis period was 44% (IQR: 34-50). On a continuous scale (FIGURE), we observed worse outcomes for patients with poor guideline recommendation adherence (low percentage TIR), although the association was judged weak. On a categorical scale, patients who spent more than 44% of their pre-dialysis time in range had lower hazards of death (0.57, 95% CI 0.41-0.80) and MACE (0.67, 95% CI 0.54-0.84) compared with those with <44% TIR. Conclusion This nationwide study reports that greater adherence to ERBP anemia guidelines during pre-dialysis care, using existing conventional therapeutic approaches, is associated with better post-dialysis outcomes. Whether active interaction by healthcare practitioners affected the observed relationship needs to be further explored.
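The time-in-range metric at the heart of this analysis can be made concrete with a small sketch. The carry-forward weighting (each hemoglobin value held until the next visit) and the example trajectory below are assumptions for illustration, not the registry's documented method.

```python
def time_in_range(hb_days, lo=10.0, hi=12.0):
    """Fraction of patient-time with hemoglobin inside [lo, hi] g/dL.
    hb_days: list of (hemoglobin in g/dL, days that value is carried
    forward until the next measurement)."""
    total = sum(days for _, days in hb_days)
    in_range = sum(days for hb, days in hb_days if lo <= hb <= hi)
    return in_range / total

# Hypothetical pre-dialysis trajectory: (Hb value, days to next visit).
visits = [(9.4, 90), (10.8, 120), (11.5, 60), (12.6, 90)]
tir = time_in_range(visits)   # 180 of 360 days in range -> 0.5
```

A patient like this one, with TIR of 50%, would fall in the >44% group that the study associated with lower hazards of death and MACE.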


2010 ◽  
Vol 92 (3) ◽  
pp. 236-239 ◽  
Author(s):  
Rhidian Jones ◽  
Sadie Burdett ◽  
Matthew Jefferies ◽  
Abhijit R Guha

INTRODUCTION There is no standardised treatment for fifth metacarpal neck fractures. Treatment of this common fracture can vary from immediate mobilisation to immobilisation in a plaster cast for 3 weeks. There is no literature identifying current practice amongst surgeons. SUBJECTS AND METHODS This survey aimed to reveal current practice in Wales by means of a postal questionnaire sent to all Welsh orthopaedic consultants. RESULTS The questionnaire had a 60% response rate. Results demonstrated varied opinion regarding the degree of displacement warranting reduction. Overall, 10% of surgeons reduce the fracture at 30° of displacement, 29% at 40°, 18% at 50° and 20% at 60° of displacement. The treatment was also very varied. Most surgeons preferred to treat these fractures with neighbour strapping (43%), while others preferred plaster immobilisation (39%) or immediate mobilisation (10%). Only 22% of surgeons discharge these patients back to the community after their first visit to out-patients, while 13% offer two follow-up appointments. CONCLUSIONS The treatment offered for this common fracture in Wales is inconsistent. There is a need to develop evidence-based best practice guidelines to standardise the treatment of this common injury. A large multicentre outcome study may enable such guidelines to be drawn up in the future.


2020 ◽  
Vol 9 (1) ◽  
Author(s):  
Adel Elfeky ◽  
Katie Gillies ◽  
Heidi Gardner ◽  
Cynthia Fraser ◽  
Timothy Ishaku ◽  
...  

Abstract Background Retention of participants is essential to ensure the statistical power and internal validity of clinical trials. Poor participant retention reduces power and can bias the estimates of intervention effect. There is sparse evidence from randomised comparisons of effective strategies to retain participants in randomised trials. Currently, non-randomised evaluations of trial retention interventions embedded in host clinical trials are excluded from the Cochrane review of strategies to improve retention because it includes only randomised evaluations. However, the systematic assessment of non-randomised evaluations may inform trialists’ decision-making about retention methods that have been evaluated in a trial context. Therefore, we performed a systematic review to synthesise evidence from non-randomised evaluations of retention strategies in order to supplement existing randomised trial evidence. Methods We searched MEDLINE, EMBASE, and Cochrane CENTRAL from 2007 to October 2017. Two reviewers independently screened abstracts and full-text articles for non-randomised studies that compared two or more strategies to increase participant retention in randomised trials. The retention studies had to be nested in real ‘host’ trials (including feasibility studies), not hypothetical trials. Two investigators independently rated the risk of bias of included studies using the ROBINS-I tool and determined the certainty of evidence using the GRADE (Grading of Recommendations Assessment, Development and Evaluation) framework. Results Fourteen non-randomised studies of retention were included in this review. Most retention strategies (in 10 studies) aimed to increase questionnaire response rate.
Favourable strategies for increasing questionnaire response rate were telephone follow-up compared with postal questionnaire completion, online questionnaire follow-up compared with postal questionnaires, shortened versions of questionnaires versus longer questionnaires, electronically transferred monetary incentives compared with cash incentives, cash compared with no incentive, and reminders to non-responders (telephone or text messaging). However, each retention strategy was evaluated in a single observational study. This, together with risk-of-bias concerns, meant that the overall GRADE certainty was low or very low for all included studies. Conclusions This systematic review provides low- or very-low-certainty evidence on the effectiveness of retention strategies evaluated in non-randomised studies. Some strategies need further evaluation to provide confidence about the size and direction of the underlying effect.


2021 ◽  
pp. 016402752110051
Author(s):  
Erika Kobayashi ◽  
Ikuko Sugawara ◽  
Taro Fukaya ◽  
Shohei Okamoto ◽  
Jersey Liang

Although retirement age is increasing in aging societies, the impact of retirement on individuals and communities is unclear. This study examined how age moderates the linkage between the transition into retirement and participation in productive and non-productive social activities after retirement, using a nationwide longitudinal survey with a probability sample of Japanese adults aged 60 and over (n = 3,493). Multinomial logistic regression analyses were performed to predict changes in volunteering and hobbies/learning during 3–5 years of follow-up and the level of participation at follow-up. Significant interactions between change in work status (remained working as reference, full/partial retirement, remained not working) and age at baseline showed that fully retired persons were more likely than those who remained working to increase these activities only when they retired by their early seventies. Thus, it is important to encourage engagement in social activities before retirement and to remove psychological and environmental barriers that hinder starting new activities in old age.


2019 ◽  
Vol 50 (13) ◽  
pp. 2230-2239 ◽  
Author(s):  
Joyce Y. Guo ◽  
Tara A. Niendam ◽  
Andrea M. Auther ◽  
Ricardo E. Carrión ◽  
Barbara A. Cornblatt ◽  
...  

Abstract Background Identifying risk factors for individuals in a clinical high-risk (CHR) state for psychosis is vital to prevention and early intervention efforts. Among prodromal abnormalities, cognitive functioning has shown intermediate levels of impairment in CHR relative to first-episode psychosis and healthy controls, highlighting a potential role as a risk factor for transition to psychosis and other negative clinical outcomes. The current study used the AX-CPT, a brief 15-min computerized task, to determine whether cognitive control impairments in CHR at baseline could predict clinical status at 12-month follow-up. Methods Baseline AX-CPT data were obtained from 117 CHR individuals participating in two studies, the Early Detection, Intervention, and Prevention of Psychosis Program (EDIPPP) and the Understanding Early Psychosis Programs (EP), and used to predict clinical status at 12-month follow-up. At 12 months, 19 individuals had converted to a first episode of psychosis (CHR-C), 52 had remitted (CHR-R), and 46 had persistent sub-threshold symptoms (CHR-P). Binary logistic regression and multinomial logistic regression were used to test prediction models. Results Baseline AX-CPT performance (d-prime context) was less impaired in CHR-R compared with the CHR-P and CHR-C patient groups. AX-CPT predictive validity was robust (0.723) for discriminating converters vs. non-converters, and even greater (0.771) when predicting the three CHR subgroups. Conclusions These longitudinal outcome data indicate that cognitive control deficits as measured by AX-CPT d-prime context are a strong predictor of clinical outcome in CHR individuals. The AX-CPT is a brief, easily implemented, and cost-effective measure that may be valuable for large-scale prediction efforts.
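The predictive validity values reported above (0.723, 0.771) read like areas under the ROC curve. If so, the quantity has a simple rank-based interpretation: the probability that a randomly chosen converter scores higher on the predictor than a randomly chosen non-converter. The sketch below computes it via the Mann-Whitney U statistic; the risk scores are hypothetical, not the study's data.

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive case scores higher
    than a randomly chosen negative case (ties count half)."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical risk scores (e.g., derived from d-prime context deficits).
converters = [0.9, 0.8, 0.75, 0.6]
non_converters = [0.7, 0.5, 0.4, 0.3, 0.2]
value = auc(converters, non_converters)
```

An AUC of 0.5 means chance-level discrimination, so the study's 0.723 indicates a meaningfully better-than-chance separation of converters from non-converters.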


PLoS ONE ◽  
2021 ◽  
Vol 16 (1) ◽  
pp. e0238662
Author(s):  
Dana O. Sarnak ◽  
Shannon N. Wood ◽  
Linnea A. Zimmerman ◽  
Celia Karp ◽  
Fredrick Makumbi ◽  
...  

Background Understanding contraceptive use dynamics is critical to addressing unmet need for contraception. Despite evidence that male partners may influence contraceptive decision-making, few studies have prospectively examined the supportive ways that men influence women’s contraceptive use and continuation. Objective This study sought to understand the predictive effect of partner influence, defined as the partner’s fertility intentions and support for contraception, and of discussions about avoiding pregnancy prior to contraceptive use, on contraceptive use dynamics (continuation, discontinuation, switching, adoption) over a one-year period. Methods This study uses nationally representative longitudinal data of Ugandan women aged 15–49 collected in 2018–2019 (n = 4,288 women at baseline; n = 2,755 women at one-year follow-up). Two analytic sub-samples of women in union and in need of contraception at baseline were used (n = 618 contraceptive users at baseline for the discontinuation/switching analysis; n = 687 contraceptive non-users at baseline for the adoption analysis). Primary dependent variables encompassed contraceptive use dynamics (continuation, discontinuation, switching, and adoption); three independent variables assessed partner influence. For each sub-sample, bivariate associations explored differences in sociodemographic and partner influences by contraceptive dynamics. Multinomial regression models were used to examine discontinuation and switching for contraceptive users at baseline; logistic regression identified predictors of contraceptive adoption among non-users at baseline. Results Among users at baseline, 26.3% of women switched methods and 31.5% discontinued contraceptive use by follow-up. Multinomial logistic regression, adjusting for women’s characteristics, indicated that the relative risk of contraceptive discontinuation doubled when women did not discuss pregnancy avoidance with their partner prior to contraceptive use.
Partner influence was not related to method switching. Among non-users at baseline, partner support for future contraceptive use was associated with nearly three-fold increased odds of contraceptive adoption. Significance These results highlight the potentially supportive role of male partners in contraceptive adoption. Future research is encouraged to elucidate the complex pathways between couple-based decision-making and contraceptive dynamics through further prospective studies.

