A Filtering Approach to Ergonomics Surveillance

Author(s):  
J. Murray Gibson ◽  
David C. Alexander

There is a need in industry for practical surveillance methods to identify ergonomics problems. Most conventional surveillance methods have the following characteristics:
• They require completion of a multi-page checklist for every job in the facility.
• They identify, in a “single-pass” survey, all jobs presenting a moderate to low level of ergonomic-related risk, resulting in an “unmanageable” list of problems.
• They provide job risk scores used to prioritize every ergonomics problem in the facility.
The authors present an alternative surveillance methodology that identifies and prioritizes high-, moderate-, and low-risk jobs using a “filtering approach”. This filtering approach consists of three separate checklists, each identifying (or filtering for) jobs of a different risk level: a High Risk Survey, a Moderate Risk Survey, and a Low Risk Survey. Each checklist draws on data from three sources: ergonomic risk factors, loss information, and employee turnover/complaints.
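For illustration only, the filtering idea can be sketched as a sequence of checks over the three data sources. The thresholds, field names, and scoring below are invented assumptions, not the authors' actual checklist items.

```python
# Hypothetical sketch of the three-checklist "filtering" idea described above.
# Thresholds and fields are illustrative assumptions, not the authors' surveys.
from dataclasses import dataclass

@dataclass
class JobData:
    ergonomic_risk_flags: int   # count of flagged ergonomic risk factors
    annual_loss_cost: float     # injury/illness loss information
    turnover_complaints: int    # employee turnover events plus complaints

def classify_job(job: JobData) -> str:
    """Pass a job through the high-, then moderate-, then low-risk filters."""
    # High Risk Survey: a strong signal on any of the three data sources.
    if job.ergonomic_risk_flags >= 5 or job.annual_loss_cost > 50_000 or job.turnover_complaints >= 10:
        return "high"
    # Moderate Risk Survey: a weaker signal on at least one source.
    if job.ergonomic_risk_flags >= 2 or job.annual_loss_cost > 10_000 or job.turnover_complaints >= 3:
        return "moderate"
    # Low Risk Survey: everything else.
    return "low"

print(classify_job(JobData(ergonomic_risk_flags=3, annual_loss_cost=2_000, turnover_complaints=1)))
# -> "moderate"
```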

2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Sandra Chamat-Hedemand ◽  
Niels Eske Bruun ◽  
Lauge Østergaard ◽  
Magnus Arpi ◽  
Emil Fosbøl ◽  
...  

Abstract
Background: Infective endocarditis (IE) is diagnosed in 7–8% of streptococcal bloodstream infections (BSIs), yet it is unclear when to perform transthoracic (TTE) and transoesophageal echocardiography (TOE) for the different streptococcal species. The aim of this sub-study was to propose a flowchart for the use of echocardiography in streptococcal BSIs.
Methods: In a population-based setup, we investigated all patients admitted with streptococcal BSIs and cross-linked data with nationwide registries to identify comorbidities and concomitant hospitalization with IE. Streptococcal species were divided into four groups based on the crude risk of being diagnosed with IE (low risk < 3%, moderate risk 3–10%, high risk 10–30%, and very high risk > 30%). Based on the number of positive blood culture (BC) bottles and IE risk factors (prosthetic valve, previous IE, native valve disease, and cardiac device), we further stratified cases according to the probability of a concomitant IE diagnosis to create a flowchart suggesting TTE plus TOE (IE > 10%), TTE (IE 3–10%), or “wait & see” (IE < 3%).
Results: We included 6393 cases with streptococcal BSIs (mean age 68.1 years [SD 16.2], 52.8% men). For BSIs with low-risk streptococci (S. pneumoniae, S. pyogenes, S. intermedius), echocardiography is not initially recommended unless the patient has ≥3 positive BC bottles and an IE risk factor. For moderate-risk streptococci (S. agalactiae, S. anginosus, S. constellatus, S. dysgalactiae, S. salivarius, S. thermophilus), a “wait & see” strategy is suggested if the patient has neither a risk factor nor ≥3 positive BC bottles, a TTE is recommended if either ≥3 positive BC bottles or a risk factor is present, and both TTE and TOE are recommended if both are present. For high-risk streptococci (S. mitis/oralis, S. parasanguinis, G. adiacens), a TTE is recommended if the patient has neither a risk factor nor ≥3 positive BC bottles, and TTE plus TOE if either ≥3 positive BC bottles or a risk factor is present. Very high-risk streptococci (S. gordonii, S. gallolyticus, S. mutans, S. sanguinis) are guided directly to TTE and TOE due to a high baseline IE prevalence.
Conclusion: In addition to the clinical picture, this flowchart based on streptococcal species, number of positive blood culture bottles, and IE risk factors can help guide the use of echocardiography in streptococcal bloodstream infections. Since systematic echocardiography results were not available in the present data, the findings should be confirmed prospectively with the use of systematic echocardiography.
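The decision rules above translate almost directly into code. The sketch below encodes the species groupings and the ≥3-bottle / risk-factor logic as stated in the abstract; only the handling of low-risk species that meet both criteria is left generic, since the abstract recommends echocardiography there without specifying the modality.

```python
# Sketch of the decision flow described in the abstract. Species groupings and
# rules are taken from the text; the recommendation for low-risk species with
# both >=3 positive bottles and a risk factor is left generic.

LOW = {"S. pneumoniae", "S. pyogenes", "S. intermedius"}
MODERATE = {"S. agalactiae", "S. anginosus", "S. constellatus",
            "S. dysgalactiae", "S. salivarius", "S. thermophilus"}
HIGH = {"S. mitis/oralis", "S. parasanguinis", "G. adiacens"}
VERY_HIGH = {"S. gordonii", "S. gallolyticus", "S. mutans", "S. sanguinis"}

def echo_strategy(species: str, positive_bottles: int, has_ie_risk_factor: bool) -> str:
    # Count how many of the two escalation signals are present (0, 1, or 2).
    signals = int(positive_bottles >= 3) + int(has_ie_risk_factor)
    if species in VERY_HIGH:
        return "TTE + TOE"
    if species in HIGH:
        return "TTE + TOE" if signals >= 1 else "TTE"
    if species in MODERATE:
        return ["wait & see", "TTE", "TTE + TOE"][signals]
    if species in LOW:
        return "echocardiography" if signals == 2 else "wait & see"
    raise ValueError("species not covered by this sketch")

print(echo_strategy("S. agalactiae", positive_bottles=3, has_ie_risk_factor=False))  # -> "TTE"
```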


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Yuanyuan Chen ◽  
Dongru Chen ◽  
Huancai Lin

Abstract
Background: Infiltration and sealing are micro-invasive treatments for arresting proximal non-cavitated caries lesions; however, their efficacies under different conditions remain unknown. This systematic review and meta-analysis aimed to evaluate the caries-arresting effectiveness of infiltration and sealing and to further analyse their efficacies across different dentition types and caries risk levels.
Methods: Six electronic databases were searched for published literature, and references were manually searched. Split-mouth randomised controlled trials (RCTs) comparing the effectiveness of infiltration/sealing and non-invasive treatments in proximal lesions were included. The primary outcome was obtained from radiographical readings.
Results: In total, 1033 citations were identified, and 17 RCTs (22 articles) were included. Infiltration and sealing reduced the odds of lesion progression (infiltration vs. non-invasive: OR = 0.21, 95% CI 0.15–0.30; sealing vs. placebo: OR = 0.27, 95% CI 0.18–0.42). For both the primary and permanent dentitions, infiltration and sealing were more effective than non-invasive treatments (primary dentition: OR = 0.30, 95% CI 0.20–0.45; permanent dentition: OR = 0.20, 95% CI 0.14–0.28). The overall effects of infiltration and sealing were significantly different from the control effects based on different caries risk levels (OR = 0.20, 95% CI 0.14–0.28). Except for caries risk at moderate levels (moderate risk: OR = 0.32, 95% CI 0.01–8.27), there were significant differences between micro-invasive and non-invasive treatments (low risk: OR = 0.24, 95% CI 0.08–0.72; low to moderate risk: OR = 0.38, 95% CI 0.18–0.81; moderate to high risk: OR = 0.17, 95% CI 0.10–0.29; and high risk: OR = 0.14, 95% CI 0.07–0.28). Except for caries risk at moderate levels (moderate risk: OR = 0.32, 95% CI 0.01–8.27), infiltration was superior (low risk: OR = 0.24, 95% CI 0.08–0.72; low to moderate risk: OR = 0.38, 95% CI 0.18–0.81; moderate to high risk: OR = 0.20, 95% CI 0.10–0.39; and high risk: OR = 0.14, 95% CI 0.05–0.37).
Conclusion: Infiltration and sealing were more efficacious than non-invasive treatments for halting non-cavitated proximal lesions.
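For readers unfamiliar with how study-level odds ratios are combined, the following is a generic inverse-variance (fixed-effect) pooling sketch. It is not the review's own analysis (which may well have used a random-effects model and dedicated software), and the input numbers are placeholders rather than data from the included trials.

```python
# Generic inverse-variance (fixed-effect) pooling of odds ratios.
# Placeholder inputs only; not data from the included trials.
import math

def pooled_or(ors_and_cis):
    """ors_and_cis: list of (OR, lower 95% CI, upper 95% CI) per study."""
    weights, weighted_logs = [], []
    for or_, lo, hi in ors_and_cis:
        log_or = math.log(or_)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE recovered from CI width
        w = 1.0 / se**2                                    # inverse-variance weight
        weights.append(w)
        weighted_logs.append(w * log_or)
    pooled_log = sum(weighted_logs) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    ci = (math.exp(pooled_log - 1.96 * pooled_se), math.exp(pooled_log + 1.96 * pooled_se))
    return math.exp(pooled_log), ci

print(pooled_or([(0.25, 0.10, 0.62), (0.30, 0.15, 0.60), (0.18, 0.08, 0.40)]))
```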


Author(s):  
Muhamad Bob Anthony

PT. RK is a major international steel-producing company. This study aims to identify the potential hazards and the risk levels that are likely to occur in PT. RK's new plant, i.e. the gas cleaning system area, whose construction has currently reached about 95% progress. The study uses the Hazard & Operability Study (HAZOPs) method to analyse risks in the gas cleaning system area of PT. RK; this method was chosen because it is well suited to a new plant that is about to be put into operation. Based on the hazard identification and risk analysis carried out in the gas cleaning system area using the HAZOPs method, 11 deviations that might occur were found across all existing nodes: 1 (one) deviation (9%) at the extreme risk level, 2 (two) deviations (18%) at the high risk level, 6 (six) deviations (55%) at the moderate risk level, and 2 (two) deviations (18%) at the low risk level.
Keywords: Gas Cleaning System, HAZOPs, Potential Hazards, Risk Levels
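As a quick check on the breakdown reported above, the percentages follow directly from the deviation counts per risk level (a minimal sketch):

```python
# Reproduces the percentage breakdown of the 11 identified deviations by risk level.
deviations = {"extreme": 1, "high": 2, "moderate": 6, "low": 2}
total = sum(deviations.values())   # 11

for level, count in deviations.items():
    print(f"{level}: {count} deviation(s) = {count / total:.0%}")
# extreme: 1 deviation(s) = 9%
# high: 2 deviation(s) = 18%
# moderate: 6 deviation(s) = 55%
# low: 2 deviation(s) = 18%
```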


2020 ◽  
Author(s):  
Adnan I Qureshi

Background and Purpose: There is increasing recognition of a relatively high burden of pre-existing cardiovascular disease in Coronavirus Disease 2019 (COVID-19) infected patients. We determined the burden of pre-existing cardiovascular disease in persons residing in the United States (US) who are at risk for severe COVID-19 infection.
Methods: Age (60 years or greater) and the presence of chronic obstructive pulmonary disease, diabetes mellitus, hypertension, and/or malignancy were used to identify persons at risk for admission to an intensive care unit, invasive ventilation, or death with COVID-19 infection. Persons were classified as low risk (no risk factors), moderate risk (1 risk factor), or high risk (two or more risk factors) using a nationally representative sample of US adults from the National Health and Nutrition Examination Survey 2017–2018.
Results: Among a total of 5856 participants, 2386 (40.7%) were considered low risk, 1325 (22.6%) moderate risk, and 2145 (36.6%) high risk for severe COVID-19 infection. The proportion of persons with pre-existing stroke increased from 0.6% in low risk patients to 10.5% in high risk patients (odds ratio [OR] 19.9, 95% confidence interval [CI] 11.6-34.3). The proportion with a pre-existing myocardial infarction (MI) increased from 0.4% in low risk patients to 10.4% in high risk patients (OR 30.6, 95% CI 15.7-59.8).
Conclusions: A large proportion of persons in the US who are at risk for developing severe COVID-19 infection are expected to have pre-existing cardiovascular disease. Further studies need to identify whether targeted strategies directed at cardiovascular disease can reduce mortality in COVID-19 infected patients.
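The risk classification described in the Methods reduces to counting risk factors. The sketch below reproduces that rule; parameter names are illustrative rather than the NHANES variable names used in the study.

```python
# Sketch of the classification used in the abstract: grade persons by how many
# of the listed risk factors they carry (0 = low, 1 = moderate, >=2 = high).
def covid_severity_risk(age: int, copd: bool, diabetes: bool,
                        hypertension: bool, malignancy: bool) -> str:
    n = sum([age >= 60, copd, diabetes, hypertension, malignancy])
    if n == 0:
        return "low"
    if n == 1:
        return "moderate"
    return "high"

print(covid_severity_risk(age=72, copd=False, diabetes=True,
                          hypertension=False, malignancy=False))  # -> "high"
```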


2021 ◽  
Vol 11 ◽  
Author(s):  
Ling-Feng Liu ◽  
Qing-Song Li ◽  
Yin-Xiang Hu ◽  
Wen-Gang Yang ◽  
Xia-Xia Chen ◽  
...  

Purpose: The role of radiotherapy, in addition to chemotherapy, has not been thoroughly determined in metastatic non-small cell lung cancer (NSCLC). The purpose of the study was to investigate the prognostic factors and to establish a model for the prediction of overall survival (OS) in metastatic NSCLC patients who received chemotherapy combined with radiation therapy to the primary tumor.
Methods: The study retrospectively reviewed 243 patients with metastatic NSCLC from two prospective studies. A prognostic model was established based on the results of the Cox regression analysis.
Results: Multivariate analysis showed that male sex, Karnofsky Performance Status score < 80, fewer than 4 chemotherapy cycles, hemoglobin level ≤ 120 g/L, neutrophil count greater than 5.8 × 10⁹/L, and platelet count greater than 220 × 10⁹/L independently predicted worse OS. According to the number of risk factors, patients were divided into three risk groups: those with ≤ 2 risk factors formed the low-risk group, those with 3 risk factors the moderate-risk group, and those with ≥ 4 risk factors the high-risk group. In the low-risk group, 1-, 2-, and 3-year OS were 67.7%, 32.1%, and 19.3%; in the moderate-risk group, 59.6%, 18.0%, and 7.9%; the corresponding OS rates for the high-risk group were 26.2%, 7.9%, and 0% (P < 0.001).
Conclusion: Metastatic NSCLC patients treated with chemotherapy in combination with thoracic radiation may be classified into low-, moderate-, or high-risk groups using six independent prognostic factors. This prognostic model may help in designing studies and developing individualized treatment plans.
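The grouping rule from the Results section can be written compactly as follows; parameter names are illustrative, and the cut-offs are those reported in the abstract.

```python
# Sketch of the prognostic grouping: count the six adverse factors and assign
# a risk group (<=2 low, 3 moderate, >=4 high).
def nsclc_risk_group(male: bool, kps: int, chemo_cycles: int,
                     hemoglobin_g_per_l: float, neutrophils_e9_per_l: float,
                     platelets_e9_per_l: float) -> str:
    factors = sum([
        male,
        kps < 80,
        chemo_cycles < 4,
        hemoglobin_g_per_l <= 120,
        neutrophils_e9_per_l > 5.8,
        platelets_e9_per_l > 220,
    ])
    if factors <= 2:
        return "low"
    if factors == 3:
        return "moderate"
    return "high"

print(nsclc_risk_group(male=True, kps=70, chemo_cycles=3,
                       hemoglobin_g_per_l=118, neutrophils_e9_per_l=4.2,
                       platelets_e9_per_l=200))  # 4 factors -> "high"
```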


2020 ◽  
Vol 30 (Supplement_5) ◽  
Author(s):  
I Piras ◽  
G Murenu ◽  
G Piras ◽  
G Pia ◽  
A Azara ◽  
...  

Abstract
Background: Falls in hospital are adverse events with serious consequences for the patient. Fall risk assessment requires easy tools that are suitable for the specific clinical context; this is important to quickly identify preventive measures. The aim of the study is to identify an appropriate scale for assessing fall risk in patients in an emergency department.
Methods: Three scales for fall risk assessment in the emergency department were identified in the literature: Kinder 1, MEDFRAT, and Morse. MEDFRAT and Morse classify the patient as high, moderate, or low risk; Kinder 1 splits patients into “at risk” (when at least one item is positive) and “not at risk” (when all items are negative). The study was carried out in July 2019 in an Italian emergency department. Patients who arrived in triage were assessed for fall risk using the three scales.
Results: On a sample of 318 patients, the scales show different levels of fall risk. For Kinder 1, 83.02% are at risk and 16.98% are not at risk; for MEDFRAT, 14.78% are at high risk, 15.09% at moderate risk, and 70.13% at low risk; for Morse, 8.81% are at high risk, 35.53% at moderate risk, and 56.66% at low risk. Since Kinder 1 would imply “high risk” only when all items of the questionnaire are positive, to compare Kinder 1 with the other three-level scales we treated at least one positive response as “moderate risk” and all negative responses as “low risk”. On this basis, Kinder 1 shows no cases at high risk, 83.02% at moderate risk, and 16.98% at low risk. All the scales show that moderate-to-high risk increases with age. MEDFRAT and Morse have concordant percentages for young (13.6%), elderly (61.2%), and long-lived (66.6%) people; the corresponding Kinder 1 percentages are 59%, 96.7%, and 100%.
Conclusions: The comparison between scales shows inhomogeneity in identifying the level of risk. MEDFRAT and Morse appear more reliable and consistent.
Key messages: An appropriate assessment scale is important to identify the fall risk level. Identifying accurate fall risk levels allows specific prevention actions to be implemented.
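The remapping used to compare Kinder 1 with the three-level scales, as interpreted here, can be sketched as below. This assumes at least one positive item defines moderate risk and all items positive would define high risk, which matches the reported percentages but is an interpretation of the abstract rather than the authors' exact wording.

```python
# Assumed remapping of the binary Kinder 1 scale onto three risk levels:
# no positive items -> low, some positive items -> moderate, all positive -> high.
def kinder1_three_level(item_responses: list[bool]) -> str:
    positives = sum(item_responses)
    if positives == 0:
        return "low"
    if positives == len(item_responses):
        return "high"
    return "moderate"

print(kinder1_three_level([False, True, False, False]))  # -> "moderate"
```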


Blood ◽  
2014 ◽  
Vol 124 (21) ◽  
pp. 4545-4545
Author(s):  
Massimo Breccia ◽  
Matteo Molica ◽  
Irene Zacheo ◽  
Giuliana Alimena

Abstract
Nilotinib is currently approved for the treatment of chronic myeloid leukemia (CML) in chronic phase (CP) and accelerated phase (AP) after failure of imatinib, and in newly diagnosed patients. Atherosclerotic events have been retrospectively reported during nilotinib treatment in patients with baseline cardiovascular risk factors. We estimated the risk of developing atherosclerotic events in patients treated with second- or first-line nilotinib, with a median follow-up of 48 months, by retrospectively applying the SCORE chart proposed by the European Society of Cardiology (ESC) and evaluating risk factors at baseline (diabetes, obesity, smoking and hypertension). Overall, we enrolled 82 CP patients treated frontline (42 patients, at the dose of 300 mg BID) or after failure of other tyrosine kinase inhibitors (40 patients, treated with 400 mg BID). The SCORE chart is based on stratification by sex (male vs female), age (from 40 to 65 years), smoking status (smoker vs non-smoker), systolic pressure (from 120 to 180 mm Hg), and cholesterol (from 150 to 300 mg/dl, approximately 4 to 8 mmol/l). For statistical purposes we considered patients subdivided into low, moderate, high and very high risk. There were 48 males and 34 females, with a median age of 51 years (range 22-84). According to the WHO classification, 42 patients were classified as normal weight (BMI < 25), 26 patients were overweight (BMI 26 to < 30), and 14 were obese (BMI > 30). Retrospective classification according to the SCORE chart revealed that 27 patients (33%) were in the low risk category, 30 patients (36%) in the moderate risk category, and 24 patients (29%) in the high risk category. As regards risk factors, 17 patients (20.7%) had concomitant controlled type II diabetes (without organ damage), 23 patients (28%) were smokers, 29 patients (35%) were receiving concomitant drugs for hypertension, and 15 patients (18%) had concomitant dyslipidaemia. Overall, the cumulative incidence of atherosclerotic events at 48 months was 8.5% (95% CI: 4.55-14.07): none of the low-risk patients according to the SCORE chart experienced atherosclerotic events, compared to 10% in the moderate risk and 29% in the high risk category (p=0.002). Atherosclerotic-free survival was 100%, 89% and 69% in the low, moderate and high-risk populations, respectively (p=0.001). SCORE chart evaluation at disease baseline could be a valid tool to identify patients at high risk of atherosclerotic events during nilotinib treatment.
Disclosures: Breccia: Novartis: Consultancy; BMS: Consultancy; Celgene: Consultancy.
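For context, the sketch below maps a computed 10-year SCORE risk percentage to the ESC categories. The <1% / 1–4% / 5–9% / ≥10% cut-offs are the commonly cited ESC thresholds and are stated here as an assumption; the abstract itself only reports the resulting categories.

```python
# Illustrative mapping from a 10-year SCORE risk (%) to ESC risk categories;
# the cut-offs are an assumption based on commonly cited ESC thresholds,
# not taken from this abstract.
def score_category(ten_year_risk_percent: float) -> str:
    if ten_year_risk_percent < 1:
        return "low"
    if ten_year_risk_percent < 5:
        return "moderate"
    if ten_year_risk_percent < 10:
        return "high"
    return "very high"

print(score_category(6.0))  # -> "high"
```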


2021 ◽  
Vol 9 (12) ◽  
pp. 403-407
Author(s):  
Owais Ahmed Wani ◽  
Nasir Ali ◽  
Ouber Qayoom ◽  
Rajveer Beniwal ◽  
...  

Background: The Thrombolysis in Myocardial Infarction (TIMI) risk score is considered an important tool for predicting mortality risk in fibrinolysis-eligible STEMI patients. An attempt was made to assess this by comparing risk stratification based on the TIMI score with the in-hospital outcome of such individuals.
Methods: 145 STEMI patients were included in this study; TIMI risk scores were calculated and analysed vis-à-vis various relevant parameters. Based on their TIMI scores, the patients were placed into three risk groups: low-risk, moderate-risk, and high-risk. All patients received standard anti-ischemic medication, were thrombolyzed, monitored in the ICCU, and followed throughout their hospital stay for post-MI sequelae.
Results: According to the TIMI risk score, 79 patients (54.5%) belonged to the low-risk group, 48 (33.1%) to the moderate-risk group, and 18 (12.4%) to the high-risk group. The highest mortality rate (17 deaths in total) was found in the high-risk group (55.6%), followed by the moderate-risk (12.2%) and low-risk (1.28%) groups. Of the seven variables evaluated, Killip class 2-4 had the highest relative risk (RR 15.85), followed by systolic BP < 100 mmHg (RR 10.48), diabetes mellitus (RR 2.79), and age > 65 years (RR 2.59).
Conclusions: In patients with STEMI, the TIMI risk scoring system appears to be a straightforward, valid, and practical bedside tool for quantitative risk classification and short-term prognosis prediction.
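The relative risks quoted above are of the standard form: risk in the exposed group divided by risk in the unexposed group. A minimal sketch, with placeholder counts that are not the study's data:

```python
# Minimal relative-risk calculation of the kind reported above (e.g. mortality
# for Killip class 2-4 vs class 1). Placeholder counts only.
def relative_risk(events_exposed: int, n_exposed: int,
                  events_unexposed: int, n_unexposed: int) -> float:
    risk_exposed = events_exposed / n_exposed
    risk_unexposed = events_unexposed / n_unexposed
    return risk_exposed / risk_unexposed

print(round(relative_risk(events_exposed=9, n_exposed=30,
                          events_unexposed=8, n_unexposed=115), 2))  # -> 4.31
```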


10.2196/16069 ◽  
2020 ◽  
Vol 8 (4) ◽  
pp. e16069
Author(s):  
Kenneth B Chapman ◽  
Martijn M Pas ◽  
Diana Abrar ◽  
Wesley Day ◽  
Kris C Vissers ◽  
...  

Background: Several pain management guidelines recommend regular urine drug testing (UDT) in patients who are being treated with chronic opioid analgesic therapy (COAT) to monitor compliance and improve safety. Guidelines also recommend more frequent testing in patients who are at high risk of adverse events related to COAT; however, there is no consensus on how to identify high-risk patients or on the testing frequency that should be used. Using previously described clinical risk factors for UDT results that are inconsistent with the prescribed COAT, we developed a web-based tool to adjust drug testing frequency in patients treated with COAT.
Objective: The objective of this study was to evaluate a risk stratification tool, the UDT Randomizer, to adjust UDT frequency in patients treated with COAT.
Methods: Patients were stratified, using an algorithm based on readily available clinical risk factors, into categories of presumed low, moderate, high, and high+ risk of presenting with UDT results inconsistent with the prescribed COAT. The algorithm was integrated in a website to facilitate adoption across practice sites. To test the performance of this algorithm, we performed a retrospective analysis of patients treated with COAT between June 2016 and June 2017. The primary outcome was compliance with the prescribed COAT, as defined by UDT results consistent with the prescribed COAT.
Results: 979 drug tests (867 UDT, 88.6%; 112 oral fluid tests, 11.4%) were performed in 320 patients. An inconsistent drug test result was registered in 76/979 tests (7.8%). The incidences of inconsistent test results across the risk tool categories were 7/160 (4.4%) in the low risk category, 32/349 (9.2%) in the moderate risk category, 28/338 (8.3%) in the high risk category, and 9/132 (6.8%) in the high+ risk category. Generalized estimating equation analysis demonstrated that the moderate risk (odds ratio (OR) 2.1, 95% CI 0.9-5.0; P=.10), high risk (OR 2.0, 95% CI 0.8-5.0; P=.14), and high+ risk (OR 2.0, 95% CI 0.7-5.6; P=.20) categories were associated with a nonsignificantly increased risk of inconsistency vs the low risk category.
Conclusions: The developed tool stratified patients during individual visits into risk categories of presenting with drug testing results inconsistent with the prescribed COAT; the higher risk categories showed nonsignificantly higher risk compared to the low risk category. Further development of the tool with additional risk factors in a larger cohort may further clarify and enhance its performance.
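As a rough cross-check, crude (unadjusted) odds ratios can be computed from the counts reported in the Results; they land close to the GEE estimates, which additionally account for repeated tests per patient.

```python
# Crude odds ratios of an inconsistent result per risk category vs the low risk
# category, from the counts reported above. The paper's ORs come from a GEE
# model accounting for repeated tests per patient, so they differ slightly.
counts = {          # (inconsistent results, total tests)
    "low":      (7, 160),
    "moderate": (32, 349),
    "high":     (28, 338),
    "high+":    (9, 132),
}

def odds(events: int, total: int) -> float:
    return events / (total - events)

low_odds = odds(*counts["low"])
for category, (events, total) in counts.items():
    if category == "low":
        continue
    print(f"{category}: crude OR = {odds(events, total) / low_odds:.2f}")
```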


2019 ◽  
Vol 3 ◽  
pp. 98-113
Author(s):  
R.K.A. Bhalaji ◽  
S. Bathrinath ◽  
S.G. Ponnambalam ◽  
S. Saravanasankar

This study presents a ranked structure of patient-, surgery-, and hospital-related risk factors and the associated unfavorable outcomes for an orthopedic hospital. The paper proposes a methodical surgical site infection (SSI) risk assessment approach that assesses the level of each risk factor using three quantifying elements: outcome, time, and likelihood of exposure. To transform the linguistic data into numeric risk scores, an enhanced decision-making technique based on fuzzy set theory is employed. The ‘centre of area’ technique for generalized triangular fuzzy numbers (TFNs) is used to convert the fuzzy ratings into crisp scores that measure the ‘extent of risk’. Lastly, a structure for classifying risk factors into risk levels is built on differentiated ranges of the assessed crisp risk scores. An activity necessity plan is then proposed, which could give direction to officials for effectively controlling risk factors in the orthopedic hospital context.
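A minimal sketch of the centre-of-area (centroid) defuzzification for a triangular fuzzy number, assuming the standard textbook form; the paper's formulation for generalized TFNs and its linguistic scale may differ in detail, and the example rating below is invented.

```python
# Centre-of-area (centroid) defuzzification of a triangular fuzzy number
# (a, m, b): the usual way to turn a fuzzy risk rating into a crisp score.
def tfn_centre_of_area(a: float, m: float, b: float) -> float:
    """Centroid of a triangular fuzzy number with support [a, b] and peak at m."""
    return (a + m + b) / 3.0

# Example: a hypothetical 'moderate-to-high' linguistic rating encoded as (4, 6, 8).
print(tfn_centre_of_area(4, 6, 8))  # -> 6.0
```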

