The role of cell-derived inflammation in metal fume fever – blood count changes after exposure to zinc- and copper-containing welding fumes

Author(s):  
Mara Reisgen ◽  
Katharina Thomas ◽  
Viktor Beilmann ◽  
Agnieszka Markert ◽  
Benjamin Gerhards ◽  
...

Author(s):  
Sarah McCarrick ◽  
Valentin Romanovski ◽  
Zheng Wei ◽  
Elin M. Westin ◽  
Kjell-Arne Persson ◽  
...  

Abstract: Welders are exposed daily to varying levels of welding fumes containing several metals. This exposure can increase the risk of a range of health effects, which drives the development of new methods that generate less toxic fumes. The aim of this study was to explore the role of released metals in welding particle-induced toxicity and to test the hypothesis that reducing Cr(VI) in welding fumes results in lower toxicity, by comparing welding fume particles from optimized Cr(VI)-reduced flux-cored wires (FCWs) with those from standard FCWs. The welding particles were thoroughly characterized, and toxicity (cell viability, DNA damage and inflammation) was assessed following exposure to the welding particles as well as to their released metal fraction, using cultured human bronchial epithelial cells (HBEC-3kt, 5–100 µg/mL) and human monocyte-derived macrophages (THP-1, 10–50 µg/mL). The results showed that all Cr was released as Cr(VI) from welding particles generated with standard FCWs, whereas only minor levels (< 3% of total Cr) were released from the newly developed FCWs. Furthermore, the new FCWs were considerably less cytotoxic and did not cause any DNA damage at the doses tested. For the standard FCWs, the Cr(VI) released into the cell media appeared to explain a large part of the cytotoxicity and DNA damage. In contrast, all particles caused rather similar inflammatory effects, suggesting different underlying mechanisms. Taken together, this study suggests a potential benefit of substituting standard FCWs with Cr(VI)-reduced wires to achieve less toxic welding fumes and thus reduced risks for welders.


Author(s):  
Thomas G Morris ◽  
Sushmita Lamba ◽  
Thomas Fitzgerald ◽  
Gary Roulston ◽  
Helen Johnstone ◽  
...  

Background: Differentiating between true and pseudohyperkalaemia is essential for patient management. Common causes of pseudohyperkalaemia include haemolysis, blood cell dyscrasias and EDTA contamination. One approach to differentiating between them is to check renal function, as true hyperkalaemia is believed to be rare when renal function is normal. This is logical, but there is limited published evidence to support it. The aim of this study was to investigate the potential role of the estimated glomerular filtration rate (eGFR) in differentiating true from pseudohyperkalaemia.
Methods: GP serum potassium results >6.0 mmol/L from 1 January 2017 to 31 December 2017, with a repeat within seven days, were included. Entries were retrospectively classified as true or pseudohyperkalaemia based on the potassium reference change value and reference interval. If the initial sample had a full blood count, it was classified as normal or abnormal to exclude blood cell dyscrasias. Different eGFR cut-points were assessed for their ability to differentiate true from pseudohyperkalaemia.
Results: A total of 272 patients with potassium results >6.0 mmol/L were included, of whom 145 were classified as pseudohyperkalaemia. At an eGFR cut-point of 90 mL/min/1.73 m², the negative predictive value was 81% (95% CI: 67–90%); this increased to 86% (95% CI: 66–95%) when patients with abnormal full blood counts were removed. When only patients with an initial potassium ≥6.5 mmol/L were included (regardless of full blood count), the negative predictive value at an eGFR cut-point of 90 mL/min/1.73 m² was 100%. Lower negative predictive values were seen with decreasing eGFR cut-points.
Conclusion: Normal renal function was not associated with true hyperkalaemia, making the eGFR a useful tool for differentiating true from pseudohyperkalaemia, especially for potassium results ≥6.5 mmol/L.
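For readers who want to reproduce this kind of cut-point analysis, the sketch below (plain Python, with hypothetical field names and example records, not the authors' code) computes the negative predictive value of an eGFR cut-point together with a Wilson 95% confidence interval, mirroring the NPV figures quoted above.

```python
# Minimal sketch (not the study's code): NPV of an eGFR cut-point for true
# hyperkalaemia, with a Wilson score 95% CI. Record fields are hypothetical.
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a proportion."""
    if n == 0:
        return (0.0, 0.0)
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return (centre - half, centre + half)

def npv_at_cutpoint(patients, egfr_cutpoint=90):
    """NPV = proportion of 'test-negative' patients (eGFR >= cut-point)
    whose repeat potassium classified them as pseudohyperkalaemia."""
    negatives = [p for p in patients if p["egfr"] >= egfr_cutpoint]
    true_negatives = sum(1 for p in negatives if not p["true_hyperkalaemia"])
    npv = true_negatives / len(negatives) if negatives else float("nan")
    return npv, wilson_ci(true_negatives, len(negatives))

# Hypothetical example records (eGFR in mL/min/1.73 m^2)
patients = [
    {"egfr": 95, "true_hyperkalaemia": False},
    {"egfr": 92, "true_hyperkalaemia": False},
    {"egfr": 40, "true_hyperkalaemia": True},
]
print(npv_at_cutpoint(patients, 90))
```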


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Tanja Belcic Mikic ◽  
Tadej Pajic ◽  
Matjaz Sever

Abstract: Suspicion of myeloproliferative neoplasms (MPNs), and especially essential thrombocythemia (ET), in primary care is often based solely on blood counts, with patients referred to a haematologist without a thorough evaluation. We retrospectively assessed the role of calreticulin gene (CALR) mutations in the diagnosis of MPN in this population. We studied CALR mutations in 524 JAK2 V617F-negative patients with suspected MPN. Uncommon CALR mutations were confirmed by Sanger sequencing and searched for in the COSMIC or HGMD databases. Mutations were classified as frameshift or non-frameshift. CALR mutations were detected in 23 patients (23/524 = 4.4%). Four of the mutations detected in our study were newly identified. Non-frameshift mutations were detected in two patients. Most patients (380/524 = 72.5%) were diagnosed with secondary conditions leading to blood count abnormalities, such as iron deficiency, inflammatory and infectious diseases, malignancy and hyposplenism. Nine patients (9/23 = 39%) were retrospectively diagnosed with ET based on CALR mutation confirmation. The two patients with non-frameshift CALR mutations were diagnosed with reactive thrombocytosis and unclassifiable MPN, respectively. Our study showed that CALR mutations are important, non-invasive diagnostic indicators of ET and can aid in its diagnosis. Moreover, the type of CALR mutation must be accurately defined, as non-frameshift mutations may not be associated with ET. Finally, CALR mutation detection should be reserved for patients with a high suspicion of clonal haematological disease.
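The frameshift versus non-frameshift distinction drawn here rests on whether an indel changes the coding length by a multiple of three. The Python sketch below illustrates that rule only; it is not the study's pipeline, and apart from the well-known type 1 (52 bp deletion) and type 2 (5 bp insertion) CALR mutations, the example variants are hypothetical.

```python
# Illustrative sketch: classify a CALR exon 9 indel as frameshift or
# non-frameshift from VCF-style REF/ALT alleles (not the study's method).
def classify_calr_indel(ref: str, alt: str) -> str:
    """Frameshift if the net change in coding length is not a multiple of 3."""
    net_change = abs(len(alt) - len(ref))
    if net_change == 0:
        return "substitution (not an indel)"
    return "frameshift" if net_change % 3 else "non-frameshift"

print(classify_calr_indel("C" * 53, "C"))   # 52 bp deletion (type 1) -> frameshift
print(classify_calr_indel("A", "ATTGTC"))   # 5 bp insertion (type 2) -> frameshift
print(classify_calr_indel("AGGT", "A"))     # 3 bp deletion -> non-frameshift
```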


2012 ◽  
Vol 78 (4) ◽  
pp. 493-495
Author(s):  
Armin Kamyab ◽  
Jonathan Cook ◽  
Sameer Sawhney ◽  
Michael Milshteyn ◽  
Vijay Mittal

Blood ◽  
2006 ◽  
Vol 108 (11) ◽  
pp. 3351-3351
Author(s):  
Maria E. Montoya ◽  
Peter R. Van Delden ◽  
M. Tarek Elghetany ◽  
J. David Bessman

Abstract: Detection of iron deficiency remains poorly understood and costly because of inappropriate screening. A low ferritin is definitive for the diagnosis of iron deficiency, but ferritin itself is not allowed as a screening test. Surrogates in the blood count have therefore been used to justify obtaining a serum ferritin. The purpose of this study was to analyze the role of hemoglobin (Hgb), mean corpuscular volume (MCV), and RBC distribution width (RDW) as surrogates in screening for iron deficiency. All 2,563 patients with serum ferritin levels obtained over 12 months were reviewed. The relative utility of Hgb, MCV, and RDW in screening for low ferritin was assessed with receiver operating characteristic (ROC) curves. A total of 264 patients had a ferritin below 10 ng/mL and 210 had a ferritin of 11 to 20 ng/mL. When viewed independently, MCV correlated most closely with low ferritin, as shown in Figure 1. RDW and Hgb, in both males and females, showed a weaker association but remained of value. Table 1 lists the values at which the three screening variables were 95% and 100% sensitive for detecting ferritin levels of 10 ng/mL and below. In contrast, for ferritin levels of 11 to 20 ng/mL, all three screening variables had poor sensitivity and specificity, as demonstrated in Figure 2. The data suggest that the most severe iron deficiency (ferritin under 10 ng/mL) can be well predicted by abnormalities in the blood count; however, less severe iron deficiency (ferritin 10 to 20 ng/mL) cannot be anticipated from the blood count. The blood count does not appear to be a practical alternative to ferritin for screening for iron deficiency.

Table 1: Screening variable sensitivities (cut-off values for ferritin less than 11 ng/mL)
Variable        100% sensitivity    95% sensitivity
MCV             >98.2               >90.0
RDW             <12.2               <13.1
Hgb, males      >15.0               >13.7
Hgb, females    >14.2               >12.6

Figure 1. Figure 2.
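As an illustration of the kind of operating-point calculation that underlies the ROC analysis above, the following Python sketch computes the sensitivity and specificity of an MCV cut-off against a ferritin reference standard. The records and cut-offs are hypothetical; this is not the study's code or data.

```python
# Minimal sketch (hypothetical data): sensitivity/specificity of an MCV
# cut-off for detecting ferritin below a reference threshold, i.e. one
# operating point of the ROC curves described in the abstract.
def screen_performance(records, mcv_cutoff, ferritin_cutoff=10.0):
    """Flag iron deficiency when MCV is below mcv_cutoff; ferritin below
    ferritin_cutoff (ng/mL) is treated as the reference standard."""
    tp = fp = tn = fn = 0
    for mcv, ferritin in records:
        deficient = ferritin < ferritin_cutoff
        flagged = mcv < mcv_cutoff
        if deficient and flagged:
            tp += 1
        elif deficient:
            fn += 1
        elif flagged:
            fp += 1
        else:
            tn += 1
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity

# Hypothetical records: (MCV in fL, ferritin in ng/mL)
records = [(72, 6), (78, 9), (95, 40), (88, 15), (101, 120)]
for cutoff in (80, 90, 98):  # sweep cut-offs, as an ROC analysis would
    print(cutoff, screen_performance(records, cutoff))
```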


2015 ◽  
Vol 6 ◽  
pp. 1 ◽  
Author(s):  
Alida Harahap ◽  
Dewi Megawati ◽  
Ita M. Nainggolan ◽  
Maria Swastika ◽  
Iswari Setianingsih ◽  
...  

2004 ◽  
Vol 67 (3) ◽  
pp. 233-249 ◽  
Author(s):  
James M. Antonini ◽  
Michael D. Taylor ◽  
Anthony T. Zimmer ◽  
Jenny R. Roberts

2011 ◽  
Vol 286 (22) ◽  
pp. 19533-19540 ◽  
Author(s):  
Erik R. Anderson ◽  
Xiang Xue ◽  
Yatrik M. Shah

Erythropoiesis is the coordinated process by which RBCs are produced. Erythropoietin, a kidney-derived hormone, and iron are critical for the production of oxygen-carrying mature RBCs. To meet the high demand for iron during erythropoiesis, small intestinal iron absorption is increased through a previously undefined mechanism. In this study, the erythropoietic induction of iron absorption was further investigated. Hypoxia-inducible factor-2α (HIF-2α) signaling was activated in the small intestine during erythropoiesis. Genetic disruption of HIF-2α in the intestine abolished the increase in the expression of iron absorption genes, as assessed by quantitative real-time reverse transcription-PCR and Western blot analyses. Moreover, the increase in serum iron following induction of erythropoiesis was entirely dependent on intestinal HIF-2α expression. Complete blood count analysis demonstrated that disruption of intestinal HIF-2α inhibited efficient erythropoiesis; mice lacking intestinal HIF-2α had lower hematocrit, RBC counts, and Hb than wild-type mice. These data further cement the essential role of HIF-2α in regulating iron absorption and also demonstrate that hypoxia sensing in the intestine, as well as in the kidney, is essential for the regulation of erythropoiesis by HIF-2α.

