The Donation Interval of 56 Days Requires Extension to 180 Days for Whole Blood Donors to Recover from Disturbances in Iron Homeostasis

Blood ◽  
2015 ◽  
Vol 126 (23) ◽  
pp. 774-774
Author(s):  
Nienke N. Schotten ◽  
Pieternel C Pasker-de Jong ◽  
Diego Moretti ◽  
Michael Bruce Zimmermann ◽  
Marian G Kraaij ◽  
...  

Abstract Background: To protect whole blood donors from developing iron deficiency and anemia, many blood banks require a minimum interval of 56 days between two donations. We aimed to assess whether the donation interval of 56 days is adequate for both new and regular donors to recover from changes in iron homeostasis. Methods: Fifty male whole blood donors (25 new and 25 regular donors) were followed for 180 days after donating 500 mL of blood. Recovery of Hb and iron parameters (Hb, mean corpuscular volume (MCV), mean corpuscular Hb (MCH), MCH concentration (MCHC), red cell distribution width (RDW), reticulocytes, reticulocyte Hb content (CHr), iron, total iron-binding capacity (TIBC), transferrin saturation (TSAT), ferritin, soluble transferrin receptor (sTfR), sTfR-F index (sTfR/log ferritin), erythropoietin (EPO) and hepcidin) was investigated and tested for differences between new and regular donors in blood drawn at baseline (before donation) and at nine time points after blood donation. Differences in iron absorption and erythrocyte iron incorporation between new and regular donors were investigated with stable iron isotopes, administered at day 8 and measured at day 29. Results: At baseline, levels of Hb and iron parameters differed between new and regular donors: in regular donors, Hb, ferritin and hepcidin were lower and EPO was higher compared to new donors. However, patterns of change over time after whole blood donation were similar for new and regular donors, with an increase in EPO and a decrease in Hb and hepcidin after day 2, followed by a decrease in ferritin and an increase in reticulocytes, sTfR and TIBC after day 4. At day 57, TIBC was the only parameter for which all (100%) regular donors were back at pre-donation levels (Figure 1). Percentages for other parameters ranged from 16.0% (regular donors) and 20.8% (new donors) for ferritin to above 80.0% for MCV, reticulocytes, CHr, hepcidin and zinc protoporphyrin (ZPP) in regular donors, and for Hb, MCH, MCHC, reticulocytes, CHr, iron, TIBC, hepcidin and ZPP in new donors. At days 85 and 180, >90% of donors were back at their pre-donation levels for all parameters except i) RDW and EPO in both donor groups, and ii) ferritin at day 85 in new (26.1%) and regular donors (48.0%) and at day 180 in new donors (78.3%). Importantly, for ferritin, the percentage of new donors that reached pre-donation levels increased more slowly over time and remained lower at 180 days than in regular donors. Iron absorption (17.0% in new and 21.9% in regular donors) and incorporation into erythrocytes (81.6% and 83.7%, respectively) were not statistically significantly different between the groups, but appeared to be higher in regular donors. Conclusion: For the majority of blood donors, the current interval of 56 days is too short to fully recover from the changes in Hb and iron parameters after blood donation. Regular donation results in lower baseline ferritin levels compared to new donors. Our data suggest that ferritin is a promising candidate parameter for personalizing donation intervals. Moreover, they imply that prolongation of the donation interval from 56 to 180 days would i) prevent a further decrease in the ferritin levels of regular donors, and ii) reduce the risk of iron deficiency-associated symptoms upon subsequent donations. Figure 1. Percentage of donors back at baseline levels (before blood donation) at days 57, 85 and 180 after blood donation. Solid lines: regular donors; dashed lines: new donors. Disclosures: No relevant conflicts of interest to declare.
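For readers who want to reproduce a Figure 1-style summary from their own donor follow-up data, the quantity plotted is simply, per parameter and follow-up day, the share of donors whose value has returned to their individual pre-donation baseline. The sketch below is illustrative only (it is not the authors' analysis code) and assumes a hypothetical long-format table with columns donor_id, donor_type, day, parameter and value.

```python
# Illustrative sketch, not the authors' analysis code.
# Assumes a hypothetical long-format table with columns:
#   donor_id, donor_type ("new"/"regular"), day (0 = pre-donation), parameter, value
import pandas as pd

def percent_back_at_baseline(df: pd.DataFrame, days=(57, 85, 180)) -> pd.DataFrame:
    # Each donor's own pre-donation measurement serves as the reference.
    baseline = (df[df["day"] == 0]
                .set_index(["donor_id", "parameter"])["value"]
                .rename("baseline"))
    follow_up = df[df["day"].isin(days)].join(baseline, on=["donor_id", "parameter"])
    # "Recovered" is taken here as value >= own baseline, which fits parameters
    # that fall after donation (Hb, ferritin, hepcidin). Parameters that rise
    # after donation (sTfR, TIBC, reticulocytes) would need value <= baseline.
    follow_up["recovered"] = follow_up["value"] >= follow_up["baseline"]
    return (follow_up.groupby(["parameter", "donor_type", "day"])["recovered"]
            .mean().mul(100).rename("pct_back_at_baseline").reset_index())
```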

Blood ◽  
2007 ◽  
Vol 110 (11) ◽  
pp. 2887-2887
Author(s):  
Alan E. Mast ◽  
Tisha Foster ◽  
Holly L. Pinder ◽  
Craig A. Beczkiewicz ◽  
Daniel B. Bellissimo ◽  
...  

Abstract Examination of low hematocrit (HCT) deferral rates in whole blood donors, stratified by gender/menstrual status and donation intensity, unexpectedly revealed that low HCT deferral rates level off and even begin to decrease in frequent donors (>8 donations in 2 years), suggesting that frequent blood donors are a self-selected population possessing behavioral or biochemical characteristics that allow greater iron absorption than in the general population. To define these characteristics, 138 donors (101 male, 37 female, 136 Caucasian) who had donated 13 times in a 2-year period ("superdonors") completed a questionnaire and had a blood sample analyzed for ferritin, hepcidin, and HFE and JAK-2 genotypes. Ferritin was 31.0±20.3 µg/L for males and 25.2±14.8 µg/L for females. Two-thirds of both men and women had ferritin below 30 µg/L, indicating that most have reduced iron stores. Average ferritin was ∼15 µg/L higher in donors taking multiple vitamins with iron or iron supplements than in those who did not take them. Hepcidin is an iron-regulatory hormone that negatively regulates intestinal iron absorption. Serum hepcidin levels were determined using a liquid chromatography tandem mass spectrometry assay; the normal serum hepcidin concentration with this assay is 8–11 ng/mL [Blood 110:1048 (2007)]. Serum hepcidin was greatly decreased in superdonors (males 2.9±5.4 ng/mL; females 2.8±2.7 ng/mL), and 55 donors had no detectable hepcidin (<1 ng/mL), suggesting that superdonors absorb maximal amounts of intestinal iron. The C282Y mutation in the HFE gene has been linked to unregulated iron absorption and the development of hemochromatosis. This mutation was analyzed to determine whether heterozygosity is present at greater than expected frequency in superdonors. It was present in 21 (15.2%) of the superdonors; this is higher than the reported frequency of 10–12% in Caucasians but did not reach statistical significance. The JAK-2 mutation is strongly associated with polycythemia vera and other myeloproliferative disorders; none of the superdonors carried it. In summary, superdonors are able to donate whole blood frequently with a lower than expected rate of low HCT deferral despite having very low iron stores. Many, but not all, superdonors take either multiple vitamins with iron or an iron supplement, which partially accounts for their ability to repeatedly meet the HCT requirement for whole blood donation. A key biochemical characteristic of superdonors that also contributes to their ability to repeatedly donate whole blood is a greatly reduced serum hepcidin concentration, which allows maximal intestinal iron absorption. Genetic analyses revealed a trend towards an increased prevalence of heterozygosity for the C282Y mutation, which may allow some superdonors to efficiently absorb intestinal iron and donate frequently without low HCT deferral, but no evidence of the JAK-2 mutation, indicating that undiagnosed polycythemia vera is not a common cause of successful repeated blood donation by superdonors.
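The abstract reports that the excess of C282Y heterozygotes (21 of 138, 15.2%, versus an expected 10–12%) did not reach statistical significance. The test the authors used is not stated; the sketch below is a hedged illustration of one reasonable way to check this, using a two-sided exact binomial test.

```python
# Illustrative only: a two-sided exact binomial test comparing the observed
# number of HFE C282Y heterozygotes among superdonors (21 of 138) with an
# assumed Caucasian carrier frequency of 10% and 12%. The abstract does not
# state which test was used; this is one reasonable choice, not the authors' method.
from scipy.stats import binomtest

observed_carriers = 21
total_donors = 138

for expected_freq in (0.10, 0.12):
    result = binomtest(observed_carriers, total_donors, expected_freq,
                       alternative="two-sided")
    print(f"expected frequency {expected_freq:.0%}: "
          f"observed {observed_carriers / total_donors:.1%}, "
          f"p = {result.pvalue:.3f}")
```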


Blood ◽  
2007 ◽  
Vol 110 (11) ◽  
pp. 2888-2888
Author(s):  
Alan E. Mast ◽  
Karen S. Schlumpf ◽  
Brian Custer ◽  

Abstract Introduction: The most common cause of deferral of whole blood donors is HCT below 38%. Fifteen percent of previously successful donors deferred for low HCT do not return within a 5-year period (Transfusion 2007;47:1514). Identification of the causes of low HCT deferral is important to define new donor management strategies and maintain an adequate blood supply. We sought to define risk factors for low HCT deferral among previously successful whole blood donors. Methods: The REDS-II database tracked >750,000 whole blood donors at the 6 REDS-II blood centers between January 2006 and March 2007. Donors were stratified by whole blood donation intensity and then by gender, age, race, and education level. The rate of low HCT deferral was determined by dividing the number of donors with HCT deferral by the total number of donors in each donation intensity and demographic stratum. Plots of donation intensity (x-axis) vs. HCT deferral rate (y-axis) were used to generate low HCT deferral curves. For donation frequencies of 1 to 5 during this time period, the slopes of the deferral curves are linear (r²>0.98) and provide a measure for comparing the sensitivity/resistance of each group to low HCT deferral. Results: Women less than 51 years old have the steepest deferral slope (0.068, meaning a 6.8% increase in low HCT deferral with each increase in donation frequency). The slope is identical in women of all race/ethnicity and education groups, suggesting that menstrual status is the single most important predictor of low HCT deferral in frequent female donors. The deferral slope decreases in post-menopausal women between 51 and 70 years old (0.046) but never approaches that of men (0.011), suggesting that women do not readily replenish iron stores following cessation of menses. The deferral curve slope in men increases with each decade of life beginning at age 50. An increase is also observed in women >70 years old, demonstrating that donors >50 progressively become less able to respond to the stress of blood donation. Studies of race/ethnicity in male donors demonstrated somewhat unexpected, yet distinct, differences. Men of Asian descent have a much lower deferral slope (0.0035) than white (0.011) or Hispanic (0.012) donors, while African-American donors (0.021) have a much higher deferral slope. There are several possible explanations for these data, including hemoglobinopathies or a genetic set point for HCT. However, it is also plausible that there are genetically regulated differences in iron absorption in response to the stress of repeated whole blood donation. Studies of education level in males also demonstrated differences. The deferral curve is steepest for high school graduates (0.015) and decreases with increased education: some college (0.012), bachelor's degree (0.0089), and master's degree or higher (0.0082). Conclusions: Rates of low HCT deferral in well-defined groups of whole blood donors were examined as a function of donation intensity. Gender-, age-, race-, and education-based differences in the ability to repeatedly donate whole blood were identified. These data lay the groundwork for devising personalized blood donation intervals that may reduce the incidence of low HCT deferral among previously successful donors. They also provide important new epidemiological information that may help us better understand diverse issues such as the anemia of aging and genetically regulated controls of iron absorption.
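The "deferral slope" referred to throughout is simply the slope of a straight line fitted to deferral rate versus donation frequency (1 to 5 donations) within a stratum. A minimal sketch follows, using made-up per-stratum counts purely to show the arithmetic; it is not REDS-II code.

```python
# Illustrative sketch, not REDS-II code: estimating the low-HCT deferral slope
# for one demographic stratum as the linear-regression slope of deferral rate
# against donation frequency (1 to 5 donations in the study period).
# `counts` maps donation frequency -> (donors deferred for low HCT, total donors);
# the numbers below are invented purely to demonstrate the calculation.
import numpy as np

counts = {1: (450, 10000), 2: (380, 6000), 3: (310, 4000),
          4: (260, 2800), 5: (220, 2000)}

freq = np.array(sorted(counts))
rate = np.array([counts[f][0] / counts[f][1] for f in freq])

slope, intercept = np.polyfit(freq, rate, 1)      # least-squares line
r_squared = np.corrcoef(freq, rate)[0, 1] ** 2    # linearity check (r^2)

print(f"deferral slope = {slope:.4f} per additional donation, r^2 = {r_squared:.3f}")
```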


2008 ◽  
Vol 132 (6) ◽  
pp. 947-951 ◽  
Author(s):  
Lindsay A. Alaishuski ◽  
Rodney D. Grim ◽  
Ronald E. Domen

Abstract Context.—Informed consent in transfusion medicine is an area lacking significant research, and it is unknown whether donors fully comprehend the risks associated with whole blood donation. Objective.—To assess the adequacy of the informed consent process in whole blood donation. Design.—A brief questionnaire was constructed and distributed to whole blood donors visiting various fixed and mobile donor sites of the Central Pennsylvania Blood Bank. Questions covered demographic information; donor opinions of information content, length, and comprehension; and a short quiz pertaining to donor risks and eligibility. Results.—Analysis of 849 surveys demonstrated that donors comprehended a mean of 73.5% of the donor eligibility criteria and risks that were surveyed. Female and younger donors scored statistically significantly higher on comprehension questions than their male and older counterparts. Donors were most aware of (1) donor eligibility requirements related to acquired immunodeficiency syndrome, (2) the risk of dizziness postdonation, and (3) having lived in a certain country (93.7%–95.6% comprehension, respectively). Donors were least aware of (1) the risk of a possible referral to a physician for outstanding medical conditions or positive test results, (2) the risk of a positive test result, and (3) West Nile virus testing information (22.4%–49.3% comprehension, respectively). Conclusions.—Whole blood donors believed that they were giving informed consent, but a significant percentage of donors were unaware of several of the risks associated with blood donation, including participation in West Nile virus research testing. Our data suggest that donors do not fully comprehend the risks of whole blood donation and that repeating information to the donor, in multiple formats, strengthens comprehension and thus the informed consent process.


Transfusion ◽  
2019 ◽  
Vol 59 (10) ◽  
pp. 3275-3287 ◽  
Author(s):  
Saurabh Zalpuri ◽  
Nienke Schotten ◽  
A. Mireille Baart ◽  
Leo M. Watering ◽  
Katja Hurk ◽  
...  

Transfusion ◽  
2012 ◽  
Vol 53 (8) ◽  
pp. 1670-1677 ◽  
Author(s):  
A. Mireille Baart ◽  
Paulus A.H. van Noord ◽  
Yvonne Vergouwe ◽  
Karel G.M. Moons ◽  
Dorine W. Swinkels ◽  
...  

Blood ◽  
2010 ◽  
Vol 116 (21) ◽  
pp. 2056-2056
Author(s):  
Sant-Rayn S Pasricha ◽  
Zoe McQuilten ◽  
Mark Westerman ◽  
Anthony Keller ◽  
Elizabeta Nemeth ◽  
...  

Abstract Introduction: Iron deficiency remains the commonest blood disorder worldwide. Hepcidin is a key regulator of iron homeostasis; in iron depletion, decreased hepcidin facilitates increased iron absorption and recycling. Hepcidin is detectable in whole blood, serum and urine, and although assays have been developed, the utility and clinically appropriate cutoffs for diagnosis of iron deficiency remain to be established. Blood donors are at particular risk of iron deficiency, yet early diagnosis remains challenging in this setting; thus donors are an ideal population in which to evaluate a new diagnostic test of iron deficiency. We evaluated hepcidin as a diagnostic test of iron deficiency in female blood donors. Methods: Subjects: Premenopausal, non-anemic females accepted for whole blood donation by the Australian Red Cross Blood Service, not taking iron supplements and with no history of hemochromatosis. Iron status assessment: serum ferritin (chemiluminescence), soluble transferrin receptor (sTfR) (immunoturbidimetry) and serum hepcidin (competitive ELISA). Analysis: The diagnostic utility of hepcidin, compared with the "gold standards" ferritin, sTfR and sTfR/log(ferritin) index, was evaluated by the area under the receiver operating characteristic curve (AUROC). Potential hepcidin cutoffs were identified, and their sensitivities and specificities evaluated. Results: We recruited 261 donors: 22.6% had ferritin <15 ng/mL, 10.3% had sTfR >4.4 mg/L, and 20.3% had sTfR/log(ferritin) index >3.2. The 95% range of hepcidin values was <5.4–175.0 ng/mL (overall); 9.3–203.0 ng/mL (if ferritin ≥15 ng/mL); and 8.1–198.5 ng/mL (if sTfR/log(ferritin) index ≤3.2). By linear regression, log(hepcidin) was associated with log(ferritin) (coefficient +1.08, P<0.001), log(sTfR) (coefficient -2.02, P<0.001) and log(sTfR/ferritin index) (coefficient -1.58, P<0.001). The AUROC for hepcidin was 0.89 compared with sTfR/log(ferritin) index >3.2, 0.87 compared with ferritin <15 ng/mL, and 0.81 compared with sTfR >4.4 mg/L. An undetectable hepcidin (<5.4 ng/mL) had sensitivity and specificity of 32.2% and 99.9%, respectively, for identification of sTfR/log(ferritin) index >3.2; hepcidin <8.1 ng/mL had sensitivity and specificity of 41.5% and 97.5%, respectively; and hepcidin <20 ng/mL had sensitivity and specificity of 74.6% and 83.2%, respectively. Conclusions: Hepcidin shows promise as a diagnostic test for iron deficiency. Further work is needed to select suitable cutoffs for this assay; however, a cutoff of <8.1 ng/mL appears to accurately identify normal subjects, whilst <20 ng/mL offers a balance between appropriate identification of cases and normal subjects. Hepcidin may become a valuable clinical index of iron status. Rapid diagnosis of iron deficiency with point-of-care whole blood or urine hepcidin assays may be achievable and useful in various settings, including blood donation. Prevention of donor iron deficiency is a high priority for the Australian Red Cross Blood Service and is being addressed through a comprehensive strategy. Disclosures: Westerman: Intrinsic Life Sciences: Employment, Membership on an entity's Board of Directors or advisory committees. Nemeth: Intrinsic Life Sciences: Employment, Membership on an entity's Board of Directors or advisory committees. Ganz: Intrinsic Life Sciences: Employment, Membership on an entity's Board of Directors or advisory committees.
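For readers evaluating hepcidin (or another marker) as a screening test in their own data, the analysis described here reduces to an AUROC against a reference definition of iron deficiency plus sensitivity and specificity at candidate cutoffs. The sketch below is a hedged illustration, not the study's code; it assumes hepcidin values in ng/mL and boolean reference labels (e.g. sTfR/log(ferritin) index >3.2).

```python
# Illustrative sketch with hypothetical inputs, not the study dataset:
# evaluating a hepcidin cutoff against a reference definition of iron deficiency,
# reporting AUROC, sensitivity and specificity.
import numpy as np
from sklearn.metrics import roc_auc_score

def evaluate_cutoff(hepcidin, iron_deficient, cutoff):
    """hepcidin: ng/mL values; iron_deficient: boolean reference labels."""
    hepcidin = np.asarray(hepcidin, dtype=float)
    iron_deficient = np.asarray(iron_deficient, dtype=bool)
    # Lower hepcidin indicates deficiency, so score with the negated value.
    auroc = roc_auc_score(iron_deficient, -hepcidin)
    test_positive = hepcidin < cutoff
    sensitivity = (test_positive & iron_deficient).sum() / iron_deficient.sum()
    specificity = (~test_positive & ~iron_deficient).sum() / (~iron_deficient).sum()
    return auroc, sensitivity, specificity
```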


2020 ◽  
Author(s):  
Steven Bell ◽  
Michael Sweeting ◽  
Anna Ramond ◽  
Ryan Chung ◽  
Stephen Kaptoge ◽  
...  

Summary Objective: To compare four haemoglobin measurement methods in whole blood donors. Background: To safeguard donors, blood services measure haemoglobin concentration in advance of each donation. NHS Blood and Transplant's (NHSBT) usual method has been capillary gravimetry (copper sulphate), followed by venous HemoCue® (spectrophotometry) for donors failing gravimetry. However, gravimetry/venous HemoCue® results in 10% of donors being inappropriately bled (i.e., with haemoglobin values below the regulatory threshold). Methods: The following were compared in 21,840 blood donors (aged ≥18 years) recruited from 10 mobile centres of NHSBT in England, with each method compared with the Sysmex XN-2000 haematology analyser, the reference standard: 1) gravimetry/venous HemoCue®; 2) a "post-donation" approach, i.e., estimating current haemoglobin concentration from that measured by a haematology analyser at a donor's most recent prior donation; 3) capillary HemoCue®; and 4) non-invasive spectrometry (MBR Haemospect® or OrSense NMB200®). We assessed each method for sensitivity; specificity; the proportion of donors who would have been inappropriately bled, or rejected from donation ("deferred") incorrectly; and test preference. Results: Compared with the reference standard, the methods ranged in test sensitivity from 17.0% (MBR Haemospect®) to 79.0% (HemoCue®) in men, and from 19.0% (MBR Haemospect®) to 82.8% (HemoCue®) in women. For specificity, the methods ranged from 87.2% (MBR Haemospect®) to 99.9% (gravimetry/venous HemoCue®) in men, and from 74.1% (OrSense NMB200®) to 99.8% (gravimetry/venous HemoCue®) in women. The proportion of donors who would have been inappropriately bled ranged from 2.2% in men for HemoCue® to 18.9% in women for MBR Haemospect®. The proportion of donors who would have been deferred incorrectly with haemoglobin concentration above the minimum threshold ranged from 0.1% in men for gravimetry/venous HemoCue® to 20.3% in women for OrSense®. Most donors preferred non-invasive spectrometry. Conclusion: In the largest study reporting head-to-head comparisons of four methods to measure haemoglobin prior to blood donation, our results support replacement of venous HemoCue® with capillary HemoCue® when donors fail gravimetry. These results have had direct translational implications for NHS Blood and Transplant in England.
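The four performance measures reported for each screening method can be computed directly from each donor's screening outcome and reference-standard haemoglobin. The sketch below is illustrative only; the 135 g/L (men) and 125 g/L (women) thresholds are assumptions supplied for the example rather than values quoted in the summary.

```python
# Illustrative sketch with hypothetical inputs, not NHSBT data: the four
# quantities reported for each screening method, relative to the
# reference-standard haemoglobin and an assumed regulatory minimum
# (e.g. 135 g/L for men, 125 g/L for women).
import numpy as np

def screening_performance(passed_screen, reference_hb, threshold):
    """passed_screen: bool, donor passed the screening method.
    reference_hb: g/L from the reference analyser; threshold: g/L minimum."""
    passed_screen = np.asarray(passed_screen, dtype=bool)
    below = np.asarray(reference_hb, dtype=float) < threshold
    sensitivity = (~passed_screen & below).sum() / below.sum()     # low Hb correctly caught
    specificity = (passed_screen & ~below).sum() / (~below).sum()  # adequate Hb correctly passed
    inappropriately_bled = (passed_screen & below).mean()          # bled despite low Hb
    incorrectly_deferred = (~passed_screen & ~below).mean()        # deferred despite adequate Hb
    return sensitivity, specificity, inappropriately_bled, incorrectly_deferred
```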

