Malaria Incidence Does Not Differ with Immediate Compared to 28-day Delayed Iron Treatment in Children with Severe Malaria and Iron Deficiency (OR10-04-19)

2019
Vol 3 (Supplement_1)
Author(s):  
Sarah Cusick ◽  
Robert Opoka ◽  
Andrew Ssemata ◽  
Michael Georgieff ◽  
Chandy John

Abstract Objectives We aimed to determine if delaying iron until 28 days after antimalarial treatment in children with severe malaria and iron deficiency leads to fewer subsequent clinical malaria episodes as compared to concurrent iron therapy. Methods The randomized controlled trial was conducted in Ugandan children 18 mo-5 y of age with severe malaria [cerebral malaria (CM), n = 79; severe malarial anemia (SMA), n = 77] and healthy community children (CC, n = 83) at Mulago Hospital in Kampala, Uganda. All children with malaria received antimalarial treatment. Children with iron deficiency [defined by zinc protoporphyrin (ZPP) ≥80 µmol/mol heme] were randomized to start a 90-day course of ferrous sulfate (2 mg/kg/day) concurrently with antimalarial treatment on Day 0 (immediate group, I) or on Day 28 (delayed group, D). Incidence of malaria episodes over the 12-month follow-up period was assessed by sick-child visits to the study clinic. Malaria was defined as measured fever (T >37.5°C) plus Plasmodium falciparum on blood smear. Negative binomial regression was used to model counts of malaria episodes as a function of treatment group (I or D), controlling for age. Hazard ratios compared time to event between the I and D groups. Results All children with CM and SMA and 35 CC had high ZPP and were randomized to I or D iron. There were no differences in malaria incidence (defined with either measured fever or history of fever) with I vs. D treatment in any study group. The incidence of inpatient malaria episodes defined with history of fever was marginally significantly lower with D iron in the SMA group [incidence rate ratio (IRR) D/I (95% CI) = 0.38 (0.14, 1.1), P = 0.07]. In the SMA group, children who received D iron tended to have a longer time to first inpatient event than children in the I group [hazard ratio (95% CI) D/I: 0.37 (0.13, 1.1), P = 0.07].
Conclusions Delaying iron in children with severe malaria conferred no clear risk or benefit, as compared with immediate treatment, with respect to subsequent malaria incidence or time to first episode. Given that a previous analysis showed improved iron status with delayed iron among children with SMA, the lack of difference in malaria incidence suggests that delaying iron therapy may be a safe way to improve iron status in this group. Funding Sources NIH/NICHD.

2020
Vol 111 (5)
pp. 1059-1067
Author(s):  
Sarah E Cusick ◽  
Robert O Opoka ◽  
Andrew S Ssemata ◽  
Michael K Georgieff ◽  
Chandy C John

ABSTRACT Background WHO guidelines recommend concurrent iron and antimalarial treatment in children with malaria and iron deficiency, but iron may not be well absorbed or utilized during a malaria episode. Objectives We aimed to determine whether starting iron 28 d after antimalarial treatment in children with severe malaria and iron deficiency would improve iron status and lower malaria risk. Methods We conducted a randomized clinical trial on the effect of immediate compared with delayed iron treatment in Ugandan children 18 mo–5 y of age with 2 forms of severe malaria: cerebral malaria (CM; n = 79) or severe malarial anemia (SMA; n = 77). Asymptomatic community children (CC; n = 83) were enrolled as a comparison group. Children with iron deficiency, defined as zinc protoporphyrin (ZPP) ≥ 80 µmol/mol heme, were randomly assigned to receive a 3-mo course of daily oral ferrous sulfate (2 mg · kg⁻¹ · d⁻¹) either concurrently with antimalarial treatment (immediate arm) or 28 d after receiving antimalarial treatment (delayed arm). Children were followed for 12 mo. Results All children with CM or SMA, and 35 (42.2%) CC, were iron-deficient and were randomly assigned to immediate or delayed iron treatment. Immediate compared with delayed iron had no effect in any of the 3 study groups on the primary study outcomes (hemoglobin concentration and prevalence of ZPP ≥ 80 µmol/mol heme at 6 mo, malaria incidence over 12 mo). However, after 12 mo, children with SMA in the delayed compared with the immediate arm had a lower prevalence of iron deficiency defined by ZPP (29.4% compared with 65.6%, P = 0.006), a lower mean concentration of soluble transferrin receptor (6.1 compared with 7.8 mg/L, P = 0.03), and showed a trend toward fewer episodes of severe malaria (incidence rate ratio: 0.39; 95% CI: 0.14, 1.12).
Conclusions In children with SMA, delayed iron treatment did not increase hemoglobin concentration, but did improve long-term iron status over 12 mo without affecting malaria incidence. This trial was registered at clinicaltrials.gov as NCT01093989.
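For readers unfamiliar with incidence rate ratios, here is a bare-bones sketch of how an IRR and its Wald 95% CI are computed from event counts and person-time. The counts below are hypothetical, chosen only so the result lands near the reported 0.39 (95% CI: 0.14, 1.12); they are not the trial's data.

```python
# IRR with a Wald 95% CI on the log scale, from counts and person-time.
import math

def irr_with_ci(events_a, time_a, events_b, time_b, z=1.96):
    """IRR of group A vs. group B; CI via normal approximation on log(IRR)."""
    irr = (events_a / time_a) / (events_b / time_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(irr) - z * se_log)
    hi = math.exp(math.log(irr) + z * se_log)
    return irr, lo, hi

# e.g. 5 severe episodes over 40 child-years vs. 13 over 41 child-years
irr, lo, hi = irr_with_ci(5, 40, 13, 41)
print(f"IRR = {irr:.2f} (95% CI: {lo:.2f}, {hi:.2f})")  # IRR = 0.39 (0.14, 1.11)
```

A CI that crosses 1 (as here and in the abstract) is what "a trend toward fewer episodes" refers to.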


2019
Vol 149 (3)
pp. 513-521
Author(s):  
Brietta M Oaks ◽  
Josh M Jorgensen ◽  
Lacey M Baldiviez ◽  
Seth Adu-Afarwuah ◽  
Ken Maleta ◽  
...  

ABSTRACT Background Previous literature suggests a U-shaped relation between hemoglobin concentration and adverse birth outcomes. There is less evidence on associations between iron status and birth outcomes. Objective Our objective was to determine the associations of maternal hemoglobin concentration and iron status with birth outcomes. Methods We conducted a secondary analysis of data from 2 cohorts of pregnant women receiving iron-containing nutritional supplements (20–60 mg ferrous sulfate) in Ghana (n = 1137) and Malawi (n = 1243). Hemoglobin concentration and 2 markers of iron status [zinc protoporphyrin and soluble transferrin receptor (sTfR)] were measured at ≤20 weeks and 36 weeks of gestation. We used linear and Poisson regression models; birth outcomes included preterm birth (PTB), newborn stunting, low birth weight (LBW), and small-for-gestational-age birth. Results Prevalence of iron deficiency (sTfR >6.0 mg/L) at enrollment was 9% in Ghana and 20% in Malawi. In early pregnancy, iron deficiency was associated with PTB (9% compared with 17%, adjusted RR: 1.63; 95% CI: 1.14, 2.33) and stunting (15% compared with 23%, adjusted RR: 1.44; 95% CI: 1.09, 1.94) in Malawi but not Ghana, and was not associated with LBW in either country; replete iron status (sTfR <10th percentile) was associated with stunting (9% compared with 15%, adjusted RR: 1.71; 95% CI: 1.06, 2.77) in Ghana, but not with PTB or LBW, and was not associated with any birth outcome in Malawi. In late pregnancy, iron deficiency was not related to birth outcomes in either country; iron-replete status was associated with higher risk of LBW (8% compared with 16%, adjusted RR: 1.90; 95% CI: 1.17, 3.09) and stunting (6% compared with 13%, adjusted RR: 2.14; 95% CI: 1.21, 3.77) in Ghana, but was not associated with birth outcomes in Malawi. Conclusions The associations of low or replete iron status with birth outcomes are population specific.
Research to replicate and extend these findings would be beneficial. These trials were registered at clinicaltrials.gov as NCT00970866 (Ghana) and NCT01239693 (Malawi).


2019
Vol 73 (1)
pp. 1-9
Author(s):  
Radisa Prodanovic ◽  
Sreten Nedic ◽  
Oliver Radanovic ◽  
Vesna Milicevic ◽  
Ivan Vujanac ◽  
...  

Introduction. Neonatal calves are often deficient in iron. Accumulating evidence indicates that iron status is associated with disease pathologies including diarrhea. Our objective was to examine the association between iron status and gut function in neonatal calves with and without a history of calf diarrhea. Materials and Methods. Calves were divided into two groups based on their history of diarrhea: the first group comprised diarrheic calves (n=6) and the second group non-diarrheic healthy calves (n=6). Blood samples (n=12) were collected at day 12 of age, and erythrogram determination and measurements of serum iron and total iron binding capacity were performed. Hematological values were measured using an automatic analyzer, and biochemical properties were determined spectrophotometrically. Fecal samples were obtained from all calves; pH was measured using semi-quantitative test strips, and the samples were examined by bacterial cultivation for enterotoxigenic Escherichia coli, Salmonella spp. and Clostridium perfringens, by RT-PCR for the presence of bovine rotavirus, bovine coronavirus and bovine viral diarrhea virus, and by microscopy for the presence of Cryptosporidium parvum. Results and Conclusions. Most hematological indices showed significant iron-related changes in diarrheic calves, and iron (Fe) deficiency with microcytic, hypochromic anemia was diagnosed. The pH of the feces was significantly higher in diarrheic calves than in the non-diarrheic healthy group (P<0.01). All fecal samples were negative for the analyzed enteric pathogens. According to the results obtained, calves experiencing iron deficiency anemia exhibit changes in gut function leading to diarrhea as compared with a matched group of healthy calves.


Author(s):  
Amrita S Kumar ◽  
A Geetha ◽  
Jim Joe ◽  
Arun Mathew Chacko

Introduction: Blood donation is one of the most significant contributions that a person can make towards society. A donor generally donates a maximum of 450 mL of blood at the time of donation. If 450 mL of blood is taken in a donation, men lose 242±17 mg and women lose 217±11 mg of iron. Hence, adequate iron stores are very important in maintaining the donor's health. Aim: To assess the influence of frequency of blood donation on iron levels of blood donors by estimating Haemoglobin (Hb) and other blood indices, which reflect the iron status of blood, and serum ferritin, which reflects body iron stores. Materials and Methods: The present study was a cross-sectional analytical study conducted on 150 blood donors, 18-40 years of age, presenting to the Blood Bank in Government Medical College, Kottayam, Kerala, India, between December 2016 and December 2017. The 150 donors were divided into four groups according to the number of donations per year: Group I were first-time donors with no previous history of blood donation, Group II included those with a history of donation once in the previous year, Group III those with a history of donation twice in the previous year, and Group IV those with a history of donation thrice in the previous year. Six mL of whole blood was collected from each donor; two mL was used for estimating Haemoglobin (Hb), Packed Cell Volume (PCV), Mean Corpuscular Volume (MCV), Mean Corpuscular Hb (MCH), and Mean Corpuscular Haemoglobin Concentration (MCHC) in a haematology analyser. Serum separated from the remaining four mL of blood underwent ferritin analysis by the Chemiluminescence Immunoassay (CLIA) method. Iron stores were considered normal at serum ferritin values of 23.9-336 ng/mL in males and 11-307 ng/mL in females. Statistical analysis was performed in Statistical Package for the Social Sciences (SPSS) version 16.0.
Analysis of Variance (ANOVA) and Pearson correlation tests were used to assess associations between the parameters studied. A p-value <0.05 was considered statistically significant. Results: There was no significant correlation between serum ferritin level and frequency of blood donation. MCH and MCHC showed significant associations (p-value 0.039 and 0.007, respectively) with frequency of blood donation. A low positive correlation was seen between serum ferritin levels and both Hb and PCV (r=0.381, p-value <0.001 and r=0.354, p-value <0.001, respectively). Conclusion: There is no significant association between frequency of blood donation and serum ferritin levels.
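The iron losses quoted in the Introduction (242±17 mg for men, 217±11 mg for women per 450-mL donation) can be approximated from hemoglobin alone, since one gram of hemoglobin contains roughly 3.47 mg of iron. A back-of-the-envelope sketch; the Hb concentrations assumed here are typical population values, not study data.

```python
# Approximate iron lost in a whole-blood donation from donated volume
# and hemoglobin concentration (~3.47 mg iron per gram of hemoglobin).
def iron_lost_mg(volume_ml, hb_g_per_dl, mg_fe_per_g_hb=3.47):
    grams_hb = volume_ml / 100 * hb_g_per_dl   # dL donated x g Hb per dL
    return grams_hb * mg_fe_per_g_hb

# Assumed typical Hb: ~15.5 g/dL for men, ~14.0 g/dL for women.
print(f"man   (Hb 15.5 g/dL): {iron_lost_mg(450, 15.5):.0f} mg")  # 242 mg
print(f"woman (Hb 14.0 g/dL): {iron_lost_mg(450, 14.0):.0f} mg")
```

The male figure reproduces the abstract's 242 mg exactly; the female figure lands within the quoted ±11 mg band.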


Blood
2010
Vol 116 (21)
pp. 5149-5149
Author(s):  
John Adamson ◽  
Zhu Li ◽  
Paul Miller ◽  
Annamaria Kausz

Abstract 5149 BACKGROUND Iron deficiency anemia (IDA) is associated with reduced physical functioning, cardiovascular disease, and poor quality of life. The measurement of body iron stores is essential to the management of IDA, and the indices most commonly used to assess iron status are transferrin saturation (TSAT) and serum ferritin. Unfortunately, serum ferritin is not a reliable indicator of iron status, particularly in patients with chronic kidney disease (CKD), because it is an acute phase reactant and may be elevated in patients with iron deficiency in the presence of inflammation. Recent clinical trials have shown that patients with iron indices above a strict definition of iron deficiency (TSAT >15%, serum ferritin >100 ng/mL) do have a significant increase in hemoglobin (Hgb) when treated with iron. These results are consistent with recent changes to the National Comprehensive Cancer Network (NCCN) guidelines, which have expanded the definition of functional iron deficiency (relative iron deficiency) to include a serum ferritin <800 ng/mL; previously, the serum ferritin threshold was <300 ng/mL. Additionally, for patients who meet this expanded definition of functional iron deficiency (TSAT <20%, ferritin <800 ng/mL), it is now recommended that iron replacement therapy be considered in addition to erythropoiesis-stimulating agent (ESA) therapy. Ferumoxytol (Feraheme®) Injection, a novel IV iron therapeutic agent, is indicated for the treatment of IDA in adult patients with CKD. Ferumoxytol is composed of an iron oxide with a unique carbohydrate coating (polyglucose sorbitol carboxymethylether), is isotonic, has a neutral pH, and shows evidence of lower free iron than other IV iron products. Ferumoxytol is administered as two IV injections of 510 mg (17 mL) 3 to 8 days apart for a total cumulative dose of 1.02 g; each IV injection can be administered at a rate of up to 1 mL/sec, allowing for administration of a 510 mg dose in less than 1 minute.
METHODS Data were combined from 2 identically designed and executed Phase III randomized, active-controlled, open-label studies conducted in 606 patients with CKD stages 1–5 not on dialysis. Patients were randomly assigned in a 3:1 ratio to receive a course of either 1.02 g IV ferumoxytol (n=453), administered as 2 doses of 510 mg each within 5±3 days, or 200 mg of oral elemental iron (n=153) daily for 21 days. The main IDA inclusion criteria were Hgb ≤11.0 g/dL, TSAT ≤30%, and serum ferritin ≤600 ng/mL. The mean baseline Hgb was approximately 10 g/dL, and ESAs were used by approximately 40% of patients. To further evaluate the relationship between baseline markers of iron stores and response to iron therapy, data from these trials were summarized by baseline TSAT and serum ferritin levels. RESULTS Overall, results from these two pooled trials show that ferumoxytol produced a significantly greater mean increase in Hgb relative to oral iron. When evaluated across the baseline iron indices examined, statistically significant (p<0.05) increases in Hgb at Day 35 were observed following ferumoxytol administration, even for subjects with baseline iron indices above levels traditionally used to define iron deficiency. Additionally, at each level of baseline iron indices, ferumoxytol produced a larger change in Hgb relative to oral iron. These data suggest that patients with CKD not on dialysis with a wide range of baseline iron indices respond to IV iron therapy with an increase in Hgb. Additionally, ferumoxytol consistently resulted in larger increases in Hgb relative to oral iron across all levels of baseline iron indices examined. Disclosures: Adamson: VA Medical Center MC 111E: Honoraria, Membership on an entity's Board of Directors or advisory committees. Li: AMAG Pharmaceuticals, Inc.: Employment. Miller: AMAG Pharmaceuticals, Inc.: Employment. Kausz: AMAG Pharmaceuticals, Inc.: Employment.
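A quick arithmetic check of the dosing figures quoted above (510 mg in 17 mL per injection, two injections per course, infusion rate up to 1 mL/sec):

```python
# Consistency check of the ferumoxytol dosing figures from the abstract.
dose_mg, dose_ml = 510, 17.0
concentration = dose_mg / dose_ml        # mg of iron per mL
seconds_at_max_rate = dose_ml / 1.0      # time for one dose at 1 mL/sec
total_course_g = 2 * dose_mg / 1000      # two injections, in grams

print(concentration, seconds_at_max_rate, total_course_g)
```

This confirms the quoted totals: 30 mg/mL, a 17-second injection (well under 1 minute), and a 1.02 g cumulative course.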


Blood
2015
Vol 126 (23)
pp. 955-955
Author(s):  
Justin CL Ho ◽  
Ivan Stevic ◽  
Anthony Chan ◽  
Keith KH Lau ◽  
Howard H.W. Chan

Introduction: Following the seminal study by Guyatt et al., serum ferritin has been widely accepted as the most accurate surrogate marker for iron deficiency, particularly if ferritin levels are <45 µg/L. However, as an acute-phase reactant, ferritin levels rise with a number of conditions, including obesity, advanced age, liver disorders, and inflammation. Elevated ferritin levels due to these concomitant clinical conditions may mask underlying iron deficiency, thus rendering serum ferritin an unreliable marker of iron status. Therefore, the aim of this study was to evaluate the sensitivity and specificity of ferritin for the diagnosis of iron deficiency in patients presenting with normocytic anemia, when response to iron replacement was used as the gold standard for the diagnosis of iron deficiency. Methods: This study is a retrospective case review involving patients referred to an academic hematology clinic from 2003 to 2015 for further evaluation of chronic normocytic anemia without other cell line abnormalities. Following initial workup to ensure the absence of 1) mixed microcytic-macrocytic anemia, 2) reticulocytosis suggesting acute blood loss or hemolysis, and 3) suboptimally low erythropoietin level, 59 patients received a therapeutic trial of oral ferrous gluconate. Intravenous iron sucrose was provided if patients could not tolerate or were refractory to oral iron therapy. All 59 patients (median age: 71 years, range: 24-93, male:female 23:36) underwent a complete review of records before and after iron therapy for changes in haematological parameters and iron indices. An increase in Hb of ≥10.0 g/L from baseline was defined as a response to iron therapy, according to the WHO criteria. Results: The mean pre-treatment ferritin level of the cohort was 110 μg/L (median: 61 μg/L), which was higher than the generally accepted cut-off for iron deficiency.
Following iron replacement therapy, the mean ferritin concentration of the cohort rose to 257 μg/L, confirming that iron repletion was achieved. Overall, 33 patients (56%) responded to iron therapy, experiencing an increase in Hb of ≥10.0 g/L. Interestingly, 19 (58%) of these 33 patients had a pre-treatment ferritin value >45 μg/L. Receiver operating characteristic (ROC) analysis of response rates to iron therapy and pre-treatment ferritin levels revealed an area under the curve (AUC) of only 0.492, indicating poor performance of ferritin in predicting the response to iron therapy. As such, serum ferritin is inadequate for predicting response to iron therapy in patients presenting with normocytic anemia. Conclusion: Despite the prevailing notion that low ferritin levels are diagnostic of iron deficiency, this retrospective case study demonstrated the shortcomings of using ferritin as the sole determinant of iron status. Consequently, patients with normocytic anemia and a normal or high ferritin should not be excluded from a therapeutic trial of iron therapy. Disclosures No relevant conflicts of interest to declare.
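The ROC analysis described above can be reproduced in miniature: the AUC equals the probability that a randomly chosen responder has a more "deficient-looking" (lower) ferritin than a randomly chosen non-responder (the Mann-Whitney interpretation of AUC). The ferritin values below are hypothetical, chosen only to illustrate an AUC near chance like the reported 0.492.

```python
# Mann-Whitney AUC computed by hand: fraction of (positive, negative)
# score pairs where the positive scores higher (ties count half).
def auc_from_scores(pos_scores, neg_scores):
    wins = sum((p > n) + 0.5 * (p == n)
               for p in pos_scores for n in neg_scores)
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical pre-treatment ferritin (µg/L) by response to iron therapy.
responders    = [30, 80, 150, 60, 300, 35, 70]
nonresponders = [25, 40, 200, 55, 90]

# Lower ferritin is the classic marker of deficiency, so score = -ferritin.
auc = auc_from_scores([-f for f in responders], [-f for f in nonresponders])
print(f"AUC = {auc:.3f}")  # AUC = 0.429, close to chance (0.5)
```

An AUC near 0.5, as in the abstract, means ferritin carries essentially no information about who will respond.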


1992
Vol 38 (11)
pp. 2184-2189
Author(s):  
J Hastka ◽  
J J Lasserre ◽  
A Schwarzbeck ◽  
M Strauch ◽  
R Hehlmann

Abstract Zinc protoporphyrin (ZPP) is determined by hematofluorometry of whole blood to detect iron deficiency in blood donors. In hospitalized patients, ZPP did not correlate with established markers of iron status. We performed 4500 ZPP measurements with the Aviv front-face hematofluorometer in samples from 475 patients and measured ferritin, transferrin saturation, hemoglobin, and erythrocyte indices. We found that the fluorometric determination is affected by substances dissolved in plasma but that this interference can be eliminated by using washed erythrocytes. In validation tests the within-day variation was <3.5%; the day-to-day variation was <6.8%. In 130 healthy persons without iron deficiency, ZPP was ≤40 µmol/mol heme, which we consider a normal value. Mean ZPP in 46 iron-deficient patients was 256 (SD 105) µmol/mol heme (correlation with ferritin: -0.73; with hemoglobin: -0.85; P <0.001). When washed erythrocytes are used, the hematofluorometric determination of ZPP is sensitive and specific for detecting iron deficiency in otherwise healthy individuals and hospitalized patients.
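The within-day and day-to-day variation figures quoted above are coefficients of variation (CV) across replicate measurements. A minimal sketch with hypothetical replicate ZPP readings (µmol/mol heme) from a single washed-erythrocyte sample:

```python
# Coefficient of variation (%) of replicate measurements: the metric
# behind the abstract's "<3.5% within-day variation" claim.
from statistics import mean, stdev

def cv_percent(values):
    return stdev(values) / mean(values) * 100

# Hypothetical same-day replicate ZPP readings on one sample.
replicates = [38.0, 39.2, 38.6, 37.9, 38.8]
print(f"within-day CV: {cv_percent(replicates):.1f}%")
```

A CV comfortably under the 3.5% bound is what one would expect from replicates as tight as these.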

