annual risk
Recently Published Documents


TOTAL DOCUMENTS

158
(FIVE YEARS 44)

H-INDEX

21
(FIVE YEARS 4)

2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Eun Hye Lee ◽  
Nak-Hoon Son ◽  
Se Hyun Kwak ◽  
Ji Soo Choi ◽  
Min Chul Kim ◽  
...  

Background: Tuberculosis (TB) has been a major public health problem in South Korea. Although the TB notification rate in Korea is gradually decreasing, it is still the highest among the member countries of the Organization for Economic Cooperation and Development. To control TB effectively, understanding TB epidemiology, including the prevalence of latent tuberculosis infection (LTBI) and the annual risk of TB infection (ARI), is important. This study aimed to identify the prevalence of LTBI and the ARI among South Korean health care workers (HCWs) based on their interferon-gamma release assays (IGRA). Methods: This was a single-center, cross-sectional, retrospective study in a tertiary hospital in South Korea. We performed IGRA on HCWs between May 2017 and March 2018 and estimated the ARI from the IGRA results. A logistic regression model was used to identify factors affecting IGRA positivity. Results: A total of 3233 HCWs were analyzed. The median age of participants was 38.0 years, and most were female (72.6%). The overall IGRA positivity rate was 24.1%; by age group, positivity rates were 6.6% in the 20s, 14.4% in the 30s, 34.3% in the 40s, and around 50% in the 50s and 60s. The ARI was 0.26–1.35% between 1986 and 2005; the rate of TB infection has gradually decreased over the last two decades. Multivariable analysis indicated that older age, healed TB lesions on chest X-ray, and male gender were risk factors for IGRA positivity, whereas working in high-risk TB departments was not. Conclusions: The ARI in South Korean HCWs gradually decreased over two decades, although LTBI remained prevalent. Our results suggest that the LTBI test results of HCWs may be affected more by age than by occupational exposure in countries with an intermediate TB burden. Thus, careful interpretation considering the age structure is required.
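To make the ARI back-calculation concrete, here is a minimal sketch using the classical catalytic relation P = 1 − (1 − ARI)^age, which assumes a constant annual risk of infection. The abstract does not state the study's exact estimation method, and the mean ages per group below are assumptions for illustration.

```python
# Sketch of ARI back-calculation from age-specific LTBI prevalence, assuming a
# constant annual risk of infection: P = 1 - (1 - ARI)**age, so
# ARI = 1 - (1 - P)**(1/age). The mean ages below are illustrative assumptions.

def annual_risk_of_infection(prevalence: float, mean_age: float) -> float:
    """Constant annual risk of infection implied by prevalence at a given age."""
    return 1.0 - (1.0 - prevalence) ** (1.0 / mean_age)

# Age-group IGRA positivity rates from the abstract, paired with assumed mean ages.
for mean_age, prevalence in [(25, 0.066), (35, 0.144), (45, 0.343), (55, 0.50)]:
    ari = annual_risk_of_infection(prevalence, mean_age)
    print(f"age ~{mean_age}: prevalence {prevalence:.1%} -> implied ARI ~{ari:.2%}")
```

Under these assumptions the implied values run from roughly 0.27% to 1.25% per year, consistent with the 0.26–1.35% range reported in the abstract.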


Food Research ◽  
2021 ◽  
Vol 5 (3) ◽  
pp. 385-392
Author(s):  
J.X. Wong ◽  
C.H. Kuan ◽  
S.H. Saw ◽  
S.N. Chen ◽  
C.W. Tan ◽  
...  

Outbreaks of Salmonella enterica serovar Enteritidis linked to table eggs have been reported at high rates worldwide over the past two decades. Consumption of hard-boiled and half-boiled eggs is popular among Malaysians; however, there is a lack of studies assessing the risk of salmonellosis associated with different egg consumption patterns. The purpose of this study was to determine the survival rate of S. enterica ser. Enteritidis under different egg cooking methods (hard-boiled, half-boiled, and a minimally cooked egg in a hot cocoa drink), using a simulation model of consumer eating habits, and the risk associated with different egg consumption patterns. In this study, S. enterica ser. Enteritidis was not detected in the hard-boiled egg samples. However, the surviving populations of S. enterica ser. Enteritidis in the half-boiled and raw egg samples were 3.15 log CFU/mL and 7.01 log CFU/mL, respectively. A Monte Carlo simulation applying quantitative microbial risk assessment (QMRA) was carried out using 10,000 iterations to assess the risk of acquiring salmonellosis from eggs cooked under the different heat treatments. The total dose of S. enterica ser. Enteritidis ingested per serving was 0.00 CFU/g for hard-boiled eggs, 7.526 × 10⁴ CFU/mL for half-boiled eggs, and 5.433 × 10⁸ CFU/mL for minimally cooked eggs. The doses from half-boiled and minimally cooked eggs were above the infectious dose level (10² to 10⁴ CFU/mL). The annual risks for the three cooking methods were 0.00, 1.00, and 1.00, respectively. These results indicate a high probability of acquiring salmonellosis through the consumption of half-boiled and minimally cooked eggs. Thus, fully cooked eggs should be eaten instead of undercooked eggs to avoid ingesting S. enterica ser. Enteritidis.
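As a rough illustration of how such a QMRA compounds a per-serving dose into an annual risk, the sketch below runs a 10,000-iteration Monte Carlo with a beta-Poisson dose-response model. The dose-response parameters, the lognormal dose spread, and the 52 servings per year are illustrative assumptions, not values from the study.

```python
# Illustrative QMRA sketch: 10,000-iteration Monte Carlo of annual salmonellosis
# risk per cooking method. The beta-Poisson parameters, lognormal dose spread,
# and servings/year are assumptions for illustration, not values from the study.
import numpy as np

rng = np.random.default_rng(42)
N_ITER = 10_000               # iterations, as in the abstract
ALPHA, N50 = 0.3126, 23_600   # assumed beta-Poisson fit for non-typhoidal Salmonella
BETA = N50 / (2 ** (1 / ALPHA) - 1)
SERVINGS_PER_YEAR = 52        # assumed weekly egg consumption

def mean_annual_risk(mean_dose_cfu: float) -> float:
    """Mean annual infection risk from a per-serving dose via beta-Poisson dose-response."""
    if mean_dose_cfu <= 0.0:
        return 0.0  # no surviving cells ingested, as for the hard-boiled samples
    # Lognormal variability around the mean ingested dose (assumed spread).
    doses = rng.lognormal(mean=np.log(mean_dose_cfu), sigma=0.5, size=N_ITER)
    p_serving = 1.0 - (1.0 + doses / BETA) ** (-ALPHA)       # per-serving infection probability
    p_annual = 1.0 - (1.0 - p_serving) ** SERVINGS_PER_YEAR  # compounded over one year
    return float(p_annual.mean())

for method, dose in [("hard-boiled", 0.0), ("half-boiled", 7.526e4), ("minimally cooked", 5.433e8)]:
    print(f"{method}: mean annual risk ~{mean_annual_risk(dose):.2f}")
```

Even under these assumed parameters the pattern reproduces the abstract's 0.00/1.00/1.00 split: doses far above the infectious range saturate the annual risk, while a zero surviving dose yields no risk.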


BJPsych Open ◽  
2021 ◽  
Vol 7 (S1) ◽  
pp. S97-S97
Author(s):  
Victor Ohize ◽  
Deval Bagalkote

Aims: To determine the proportion of women of child-bearing age prescribed sodium valproate (SV) who have the SV annual risk acknowledgement form (ARF) completed. Background: In 2018, the Medicines and Healthcare products Regulatory Agency (MHRA) issued guidance on SV prescription, acknowledging the significant risk of birth defects and developmental disorders when SV is prescribed to women of child-bearing age. Consequently, the MHRA recommends that SV must not be used in females of child-bearing age unless the conditions of a pregnancy prevention programme are met, other treatments are ineffective or not tolerated, and discussion of the risks with the patient or carer and an annual review of the risks are documented. Evidence of these criteria is expected to be recorded on an Annual Risk Acknowledgement Form (ARF). Method: A retrospective study involving a systematic search of the Trust database to identify women with intellectual disability (ID), aged 16–50 years, prescribed SV from 2018 to 2019. Result: 18 of 28 patients had the ARF completed, a compliance of 64%. The main indications for SV prescription were epilepsy, challenging behaviour, and mood stabilization. Prescription initiation was equally distributed between neurology-led and psychiatrist-led (50% each). ARF compliance was higher in the neurology-led group (93%) than in the psychiatrist-led group (36%). A review across the five ID teams of the Trust (A, B, C, D and E) showed variable ARF compliance (17%, 81%, 100%, 60% and 0%, respectively); teams with a higher proportion of neurology-led SV prescription initiation (0%, 55%, 80%, 80% and 0%, respectively) also had higher ARF completion compliance. Conclusion/Recommendation: ARF compliance is below standard at 64%. Although SV prescription was equally distributed between neurology-led and psychiatry-led initiation, patients whose SV prescription was neurology-led (indication: epilepsy) had better ARF compliance (93%) than patients whose prescription was psychiatry-led (indication: challenging behaviour or mood stabilization; 36%). An organizational difference, the dedicated epilepsy nurse in the ID service, means patients with epilepsy had reviews of their medication and of compliance with the MHRA guidance on completing the ARF. There is a need to increase doctors' awareness so that ARF status is reviewed during patients' appointments; an information technology design that flags out-of-date ARFs may be helpful. Reviewing the ARF may also prompt consideration of alternatives: behavioural, psychological, functional and environmental interventions, as well as alternative medications such as risperidone for challenging behaviour and other mood-stabilizing options. This would minimize SV prescription, which is the original goal of the MHRA guidance.


2021 ◽  
Vol 12 ◽  
Author(s):  
Julia K. Denissen ◽  
Brandon Reyneke ◽  
Monique Waso ◽  
Sehaam Khan ◽  
Wesaal Khan

Roof-harvested rainwater (RHRW) was investigated for the presence of the human pathogenic bacteria Mycobacterium tuberculosis (M. tuberculosis), Yersinia spp. and Listeria monocytogenes (L. monocytogenes). While Yersinia spp. were detected in 92% (n = 25) of the RHRW samples, and L. monocytogenes and M. tuberculosis were detected in 100% (n = 25) of the samples, a significantly higher mean concentration (1.4 × 10³ cells/100 mL) was recorded for L. monocytogenes over the sampling period. As the identification of appropriate water quality indicators is crucial to ensure access to safe water sources, correlation of the pathogens with traditional indicator organisms [Escherichia coli (E. coli) and Enterococcus spp.] and microbial source tracking (MST) markers (Bacteroides HF183, adenovirus and Lachnospiraceae) was conducted. A significant positive correlation was recorded for E. coli versus L. monocytogenes (r = 0.6738; p = 0.000) and for Enterococcus spp. versus the Bacteroides HF183 marker (r = 0.4071; p = 0.043), while a significant negative correlation was observed for M. tuberculosis versus the Bacteroides HF183 marker (r = −0.4558; p = 0.022). Quantitative microbial risk assessment (QMRA) indicated that the mean annual risk of infection posed by L. monocytogenes in the RHRW samples exceeded the annual infection risk benchmark limit (1 × 10⁻⁴ infections per person per year) for intentional drinking (∼10⁻⁴). In comparison, the mean annual risk of infection posed by E. coli exceeded the benchmark for intentional drinking (∼10⁻¹), accidental consumption (∼10⁻³) and cleaning of the home (∼10⁻³). However, while the risk posed by M. tuberculosis for the two relevant exposure scenarios [garden hosing (∼10⁻⁵) and washing laundry by hand (∼10⁻⁵)] was below the benchmark limit, the risk posed by adenovirus for garden hosing (∼10⁻³) and washing laundry by hand (∼10⁻³) exceeded it. Thus, while the correlation analysis confirms that traditional indicators and MST markers should be used in combination to accurately monitor the pathogen-associated risk linked to the utilisation of RHRW, the integration of QMRA offers a more site-specific approach to monitor and estimate the human health risks associated with the use of RHRW.
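A minimal sketch of this kind of scenario comparison follows, using an exponential dose-response model compounded over per-scenario exposure frequencies and checked against the 1 × 10⁻⁴ benchmark. The ingestion volumes, event frequencies, and the infectivity constant r are illustrative assumptions; only the L. monocytogenes concentration comes from the abstract.

```python
# Illustrative QMRA scenario comparison against the 1e-4 infections/person/year
# benchmark. Volumes, frequencies, and the infectivity constant r are assumed;
# the pathogen concentration is the mean reported in the abstract.
import math
from dataclasses import dataclass

BENCHMARK = 1e-4  # infections per person per year

@dataclass
class Scenario:
    name: str
    volume_ml: float       # water ingested per exposure event (assumed)
    events_per_year: int   # exposure frequency (assumed)

def annual_risk(conc_per_100ml: float, r: float, scenario: Scenario) -> float:
    """Exponential dose-response, P(event) = 1 - exp(-r * dose), compounded over a year."""
    dose = conc_per_100ml * (scenario.volume_ml / 100.0)  # organisms ingested per event
    p_event = 1.0 - math.exp(-r * dose)
    return 1.0 - (1.0 - p_event) ** scenario.events_per_year

scenarios = [
    Scenario("intentional drinking", 1000.0, 365),
    Scenario("garden hosing", 1.0, 52),
    Scenario("washing laundry by hand", 0.5, 104),
]

# 1.4e3 cells/100 mL: the mean L. monocytogenes concentration from the abstract;
# r = 1e-10 is a purely illustrative infectivity constant.
for s in scenarios:
    risk = annual_risk(1.4e3, 1e-10, s)
    verdict = "exceeds" if risk > BENCHMARK else "below"
    print(f"{s.name}: annual risk ~{risk:.1e} ({verdict} benchmark)")
```

The structure, not the assumed constants, is the point: high-volume, high-frequency scenarios such as drinking can exceed the benchmark at concentrations that leave low-volume scenarios well below it, mirroring the pattern reported in the abstract.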


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Przemysław Kunert ◽  
Katarzyna Wójtowicz ◽  
Jarosław Żyłkowski ◽  
Maciej Jaworski ◽  
Daniel Rabczenko ◽  
...  

A shift toward the endovascular treatment of ophthalmic segment aneurysms is noticeable. However, it is not clear whether long-term treatment results improve with the development of endovascular methods. The aim of this study was to present the outcomes of treating unruptured ophthalmic aneurysms using flow-diverting devices (FDD), with or without coiling. This retrospective study included 52 patients with 65 unruptured intracranial aneurysms (UIAs) treated in 2009–2016. The mean aneurysm size was 8.8 mm, and eight aneurysms were symptomatic. Therapeutic procedures included 5 failed attempts, 55 first sessions with FDD deployment (bilateral procedures in 3) and 3 retreatment procedures. To cover 55 ICAs, 25 Silk, 26 Pipeline, 9 Fred and 1 Surpass FDD were used. FDD with coiling was applied in 19 cases (29.2%), mainly for symptomatic and larger aneurysms. Mean radiological and clinical follow-up was 12 and 61 months, respectively. Postprocedural deterioration was noted in 3 (5.8%) patients, but in the long term, modified Rankin Scale grades 0–2 were achieved in 98.1% of patients. One patient died from rupture of the treated aneurysm (annual risk, 0.07%). Raymond–Roy occlusion classification class I or II was achieved in 98.5% in the long term, with similar results in both groups. Complications occurred in 40.4% of patients; the most frequent were imperfect FDD deployment (15%), failed attempts at FDD deployment (9.6%) and late FDD stenosis (9.6%). Flow-diverting devices, with additional coiling in selected cases, may offer a very high proportion of satisfactory outcomes. However, in our experience, a high risk of complications remains.


2021 ◽  
Author(s):  
Tibor Krisko ◽  
György Baffy ◽  
Alexander S. Vogel

Alcohol-associated liver disease encompasses all forms of liver injury related to the consumption of alcohol, one of the most common hepatotoxic agents in the world. The spectrum of this disease ranges from steatosis, which is present in everyone who drinks alcohol in excess, to cirrhosis, which occurs in approximately 10 to 15% of individuals with alcohol abuse and conveys an annual risk of 1 to 2% for the development of hepatocellular carcinoma. Despite the prevalence of alcohol-associated liver disease and its profound impact on health, questions remain surrounding its pathogenesis and management. This review of alcohol-associated liver disease addresses the epidemiology, etiology and genetics, pathophysiology and pathogenesis, diagnosis, differential diagnosis and comorbidities, treatment, complications, measures of quality of care, and prognosis and outcome measurements. This review contains 6 highly rendered figures, 6 tables, and 50 references. Keywords: Alcohol-associated Liver Disease (ALD), alcohol, cirrhosis, liver injury, Alcohol Use Disorder (AUD), alcohol-associated hepatitis, substance abuse
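For context on the cited figure, a constant annual risk of 1 to 2% compounds over time. The worked example below converts it into a 10-year cumulative risk under a constant-hazard assumption; this is arithmetic for orientation, not a model from the review.

```python
# Worked example (assumes a constant annual hazard, not a model from the review):
# converting a 1-2% annual HCC risk in cirrhosis into a 10-year cumulative risk.
def cumulative_risk(annual_risk: float, years: int) -> float:
    """Cumulative probability of at least one event over `years` at a fixed annual risk."""
    return 1.0 - (1.0 - annual_risk) ** years

for annual in (0.01, 0.02):
    print(f"annual {annual:.0%} -> 10-year cumulative ~{cumulative_risk(annual, 10):.1%}")
# annual 1% -> ~9.6%; annual 2% -> ~18.3%
```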


2021 ◽  
Vol 10 (1) ◽  
Author(s):  
Keon‐Joo Lee ◽  
Seong‐Eun Kim ◽  
Jun Yup Kim ◽  
Jihoon Kang ◽  
Beom Joon Kim ◽  
...  

Background: The long-term incidence of acute myocardial infarction (AMI) in patients with acute ischemic stroke (AIS) has not been well defined in large cohort studies of various race-ethnic groups. Methods and Results: A prospective cohort of patients with AIS who were registered in a multicenter nationwide stroke registry (the CRCS-K [Clinical Research Collaboration for Stroke in Korea] registry) was followed up for the occurrence of AMI through linkage with the National Health Insurance Service claims database. The 5-year cumulative incidence and annual risk were estimated according to predefined demographic subgroups, stroke subtypes, history of coronary heart disease (CHD), and known risk factors for CHD. A total of 11 720 patients with AIS were studied. The 5-year cumulative incidence of AMI was 2.0%. The annual risk was highest in the first year after the index event (1.1%), followed by a much lower annual risk in the second to fifth years (between 0.16% and 0.27%). Among subgroups, the annual risk in the first year was highest in those with a history of CHD (4.1%) compared with those without (0.8%). The small-vessel occlusion subtype had a much lower incidence (0.8%) than the large-vessel occlusion (2.2%) or cardioembolism (2.4%) subtypes. In multivariable analysis, history of CHD (hazard ratio, 2.84; 95% CI, 2.01–3.93) was the strongest independent predictor of AMI after AIS. Conclusions: The incidence of AMI after AIS in South Korea was relatively low and, unexpectedly, was highest during the first year after stroke. CHD was the most substantial risk factor for AMI after stroke and conferred an approximately 5-fold greater risk.
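As a sanity check, the reported year-by-year annual risks compound to approximately the reported 5-year cumulative incidence. The sketch below assumes the midpoint (0.22%) of the reported 0.16–0.27% range for years two through five.

```python
# Sanity-check sketch: compounding the reported year-by-year annual risks into the
# 5-year cumulative incidence of AMI. Years 2-5 use an assumed midpoint (0.22%)
# of the reported 0.16-0.27% range.
yearly_risks = [0.011, 0.0022, 0.0022, 0.0022, 0.0022]  # year 1, then years 2-5

survival = 1.0
for risk in yearly_risks:
    survival *= (1.0 - risk)  # probability of remaining AMI-free through each year

print(f"5-year cumulative incidence ~{1.0 - survival:.1%}")  # ~2.0%, matching the abstract
```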


Author(s):  
Matthew R D’Costa ◽  
Annamaria T Kausz ◽  
Kevin J Carroll ◽  
Jóhann P Ingimarsson ◽  
Felicity T Enders ◽  
...  

Background: Data directly demonstrating the relationship between urinary oxalate (UOx) excretion and stone events in those with enteric hyperoxaluria (EH) are limited. Methods: We assessed the relationship between UOx excretion and the risk of kidney stone events in a retrospective population-based EH cohort. In all, 297 patients from Olmsted County, Minnesota were identified with EH based upon having a 24-h UOx ≥40 mg/24 h preceded by a diagnosis or procedure associated with malabsorption. Diagnostic codes and urologic procedures consistent with kidney stones during follow-up after the baseline UOx were considered a new stone event. Logistic regression and accelerated failure time modeling were performed as a function of UOx excretion to predict the probability of a new stone event and the annual rate of stone events, respectively, with adjustment for urine calcium and citrate. Results: Mean ± standard deviation age was 51.4 ± 11.4 years and 68% were female. Median (interquartile range) UOx was 55.4 (46.6–73.0) mg/24 h, and 81 patients had one or more stone events during a median follow-up of 4.9 (2.8–7.8) years. Higher UOx was associated with a higher probability of developing a stone event (P < 0.01) and predicted an increased annual risk of kidney stones (P = 0.001). Estimates derived from these analyses suggest that a 20% decrease in UOx is associated with a 25% reduction in the annual odds of a future stone event. Conclusions: These data demonstrate an association between baseline UOx and stone events in EH patients and highlight the potential benefit of strategies to reduce UOx in this patient group.
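The headline estimate implies a simple log-odds relationship: if the annual stone-event odds scale log-linearly with UOx, a 20% UOx decrease producing a 25% odds reduction fixes the slope at ln(0.75)/ln(0.80). The sketch below back-calculates that slope and applies it to other UOx changes; it is an inference from the abstract's numbers, not the authors' fitted model.

```python
# Back-calculating the log-odds slope implied by the abstract: a 20% decrease in
# UOx (factor 0.80) maps to a 25% reduction in annual stone-event odds (odds
# ratio 0.75). Assumes log-odds are linear in log(UOx); this is an inference,
# not the authors' fitted model.
import math

beta = math.log(0.75) / math.log(0.80)  # implied slope on log(UOx), ~1.29
print(f"implied slope on log(UOx): {beta:.2f}")

def odds_ratio(uox_factor: float) -> float:
    """Odds ratio for scaling UOx by `uox_factor` (e.g. 0.80 = a 20% decrease)."""
    return uox_factor ** beta

print(f"UOx -20% -> odds ratio {odds_ratio(0.80):.2f}")  # 0.75, reproducing the abstract
print(f"UOx -50% -> odds ratio {odds_ratio(0.50):.2f}")  # ~0.41 under the same assumption
```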


2020 ◽  
Vol 71 (4) ◽  
pp. 54-60
Author(s):  
V.S. Volkov ◽  
D.Ju. Kastyrin ◽  
E.G. Lebedev ◽  
...  

The factors that determine the possibility of a traffic accident at a signal-controlled crossing are systematized. Risk estimates for road accidents are calculated from the results of the annual cycle and daily, in hourly intervals. A risk factor for a conflict point of a controlled crossing is proposed, defined as the ratio of the road accident risk indicator in an hour-long interval of real-time operation to the average annual road accident risk estimate reduced to an hour-long interval.
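A minimal sketch of the proposed risk factor, read as the ratio of the real-time hourly accident-risk estimate to the average annual risk reduced to a one-hour interval; the variable names and sample values are illustrative, not taken from the paper.

```python
# Sketch of the proposed conflict-point risk factor: hourly road-accident risk
# divided by the average annual risk reduced to a one-hour interval. Names and
# sample values are illustrative, not taken from the paper.
HOURS_PER_YEAR = 8760

def conflict_point_risk_factor(hourly_risk: float, annual_risk: float) -> float:
    """Ratio of the real-time hourly risk to the hour-equivalent annual average."""
    average_hourly_risk = annual_risk / HOURS_PER_YEAR
    return hourly_risk / average_hourly_risk

# Example: annual accident risk of 0.35 at a crossing, observed hourly risk 1e-4.
k = conflict_point_risk_factor(hourly_risk=1e-4, annual_risk=0.35)
print(f"risk factor ~{k:.1f}")  # values > 1 flag hours riskier than the annual average
```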


Author(s):  
Stanislav Polzer ◽  
Jan Kracík ◽  
Tomáš Novotný ◽  
Luboš Kubíček ◽  
Robert Staffa ◽  
...  
