Appreciation of the emergence of a negative iron balance in blood donors, particularly regular donors, has a long history.[1] Nevertheless, publication of the experience at blood centers taking part in the Retrovirus Epidemiology Donor Study (REDS)-II Donor Iron Status Evaluation has brought concern for iron deficiency in blood donors and its medical significance into sharper focus.[2] Given the study's confirmation of absent iron stores and iron-deficient erythropoiesis in a significant percentage of frequent donors, the authors urged more attention to donor safety, consideration of restrictions on how often volunteers could donate, and routine iron supplementation.

Many of the REDS-II investigators' concerns were addressed when the AABB Interorganizational Task Force on Donor Hemoglobin Deferrals developed Bulletin #12-03, titled “Strategies to Monitor, Limit, or Prevent Iron Deficiency in Blood Donors.”[3] The task force felt that there was enough evidence to conclude that “iron deficiency in the absence of anemia is also increasingly being recognized as a problem.” Their Bulletin included detailed recommendations that addressed prolonging the interdonation interval, the measurement of donor ferritin, and the role of iron replacement therapy. The authors of the Bulletin pointed out that their recommendations could apply to the management of donors in general or, alternatively, could be directed specifically at donor groups who are vulnerable to iron depletion.

With regard to the first of the Bulletin authors' considerations, prolonging the interdonation interval, it was appropriate to remind readers that such prolongation could significantly affect inventories. For example, at our own blood program, doubling the interval to 112 days would decrease the annual group O, D– inventory by more than 10%.[4]

While the Bulletin emphasized considering a reassessment of the length of interdonation interval for regular donors, it was silent on what the interval might be for donors temporarily deferred with a low hemoglobin (Hb). The silence was understandable given that, without any generally accepted policies, there are wide variations in the advice that different blood programs offer. Temporary deferral in Australia is for 6 months, an interval significantly longer than that in the United States where donors can return much sooner. Responses to a survey sent to members of America's Blood Centers revealed that while 70% of centers deferred donors for 1 day, the deferral period recommended elsewhere ranged between 3 days and 3 months. At 10% of the blood centers the recommended deferral period was not fixed, but varied, increasing in length with lower deferral Hb levels.[5]

At our center we decided on a different approach to crafting a message for deferred donors. Our intentions were twofold: we wanted to be able to tell deferred donors the likelihood of a future successful donation, and we also wanted to modify donor call lists so that recruitment messages did not encourage premature returns with only slim possibilities of successful donations. To do this we reviewed our experience between 2005 and 2013, when 78,489 female and 13,642 male donors were deferred with Hb levels of less than 12.5 g/dL. The 47,764 females who returned (60.9% of the original group) were divided into groups by the range of their earlier deferral Hb levels. We then further divided the groups according to when they returned and plotted what percentage in each group made a successful donation on that return. Outcomes for deferred female donors at their subsequent visit are shown in Fig. 1. Not surprisingly, we confirmed that a deferred female donor has a greater likelihood of a successful subsequent donation the closer her deferral Hb level is to the 12.5 g/dL threshold and the longer she waits before returning. The analysis enabled us to give a modestly more informed answer to those deferred donors asking “when should I try again?” For example, we could say that donors deferred with Hb levels less than 10.9 g/dL, despite waiting 21 to 24 weeks after deferral, do not achieve a 50% likelihood of a successful donation. Our analysis also confirmed that for some deferred donors, prolongation of the interdonation interval alone will not be enough to ensure a subsequent successful donation. Of 1681 females originally deferred with Hb levels between 12.2 and 12.4 g/dL, 440 (26.2%) were deferred again despite delaying their return by 21 to 24 weeks. This redeferral rate, in a group at most 0.3 g/dL below cutoff, served as a reminder that questions about iron balance are not restricted to the overtly anemic female donor.
This group's repeat deferral makes a case for the Bulletin's second recommendation concerning ferritin assays. If these assays are adopted selectively, then donors deferred with a low Hb who subsequently return only to be deferred again would be a candidate group. However, donors deferred significantly below the threshold might include individuals whose anemia is not due to iron deficiency. This possibility argues for storage iron screening of all deferred donors to ensure that those needing clinical evaluation, such as anemic individuals with normal stores, are not overlooked.
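The retrospective grouping analysis described above, binning returning donors by their deferral Hb range and by the length of time before return, then computing the percentage of successful donations in each bin, can be sketched in a few lines of Python. The bin cutpoints and toy records below are illustrative assumptions, not the actual study data or code:

```python
# Minimal sketch of the grouping analysis (assumed cutpoints, toy data).
from collections import defaultdict

# Each record: (deferral Hb in g/dL, weeks until return, successful donation?)
# Toy rows standing in for the 47,764 returning female donors.
records = [
    (12.3, 22, True), (12.3, 4, False), (11.5, 10, False),
    (11.5, 24, True), (10.5, 22, False), (12.0, 16, True),
]

def hb_bin(hb):
    """Assign a deferral-Hb range label (illustrative cutpoints)."""
    if hb < 10.9:
        return "<10.9"
    elif hb < 12.2:
        return "10.9-12.1"
    else:
        return "12.2-12.4"

def weeks_bin(weeks):
    """Assign a return-interval bucket in weeks (illustrative cutpoints)."""
    if weeks < 9:
        return "<9"
    elif weeks < 21:
        return "9-20"
    else:
        return "21-24"

def success_rates(rows):
    """Percentage of successful return donations per (Hb bin, interval bin)."""
    counts = defaultdict(lambda: [0, 0])  # group -> [successes, total]
    for hb, weeks, ok in rows:
        key = (hb_bin(hb), weeks_bin(weeks))
        counts[key][1] += 1
        if ok:
            counts[key][0] += 1
    return {k: 100.0 * s / n for k, (s, n) in counts.items()}

rates = success_rates(records)
```

Applied to the full donor data set, a cross-tabulation of this kind is the sort of summary plotted in Fig. 1.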


Figure 1. Percentages of female donors, previously deferred with Hb level of less than 12.5 g/dL, successfully donating at their subsequent visits. The donors are grouped by Hb intervals at deferral.


While deciding which donors deserve storage iron evaluation is an important consideration, other considerations relate to the ferritin assay itself. Some investigators caution against reliance on ferritin as the exclusive measure of iron stores.[6] In their experience, the addition of soluble transferrin receptor levels adds precision and makes up for the relative nonspecificity of the ferritin determination. Currently available ferritin assays also have another potential disadvantage. If research confirms that reduced iron stores, in the absence of anemia, do contribute to safety consequences for the donor, then a ferritin assay should be part of the predonation eligibility screen. This would ensure that would-be donors with depleted stores, but Hb levels above cutoff, are not drawn. A point-of-care ferritin test, performed at the same time and with the same turnaround as the Hb, would then be an appropriate addition to donor screening, but, as the Bulletin concedes, no such assay exists. By comparison, a different measure of iron stores, the zinc erythrocyte protoporphyrin assay, does have a suitable turnaround time. The assay relies on the fact that when iron is not available for incorporation into the protoporphyrin ring, zinc substitutes for it. The test is rapid and inexpensive and was first promoted as a predonation screen more than 30 years ago.[7] Recent published experience suggests that zinc erythrocyte protoporphyrin assays are valuable in the predonation setting to identify donors with iron-deficient erythropoiesis.[8]

Regardless of the method chosen to determine iron stores, greater attention will need to be paid to the messages given to donors. The message, for deferred donors with absent iron stores, might simply promote the essentials of good iron nutrition; however, repeat donors with dwindling stores might be better served if their message includes the advice to reduce their donation frequency.

The Bulletin's third strategy addressed iron replacement and presented a number of options ranging from providing iron tablets to merely making recommendations to donors about oral iron therapy. Whatever the choice in this regard, the blood program assumes a new level of responsibility when donors become candidates for treatment once testing reveals diminished or absent iron stores. While a cynical interpretation might be that the blood center is trying to ensure the donor's continued eligibility, a more charitable view would hold that iron supplementation promotes donor safety. The challenge to the latter opinion, however, is that the evidence of dysfunction among individuals with ferritin values at the lower end of the reference range is inconsistent. Certainly fatigue has earned attention, and investigation of a role for iron therapy in its management has a long history. Some 50 years ago, Beutler and his colleagues[9] conducted a double-blind, placebo-controlled study of ferrous sulfate in the management of “otherwise healthy” nonanemic women who complained of chronic fatigue. They concluded that since some study participants improved, a role for iron treatment deserved further study. Further study tends to support a beneficial role for iron both by the oral[10] and the parenteral route.[11] Elsewhere, however, it is disappointing, especially from a blood banker's perspective, that further study has not provided clear answers to questions about any association between storage iron deficiency and pathology in the absence of anemia. There are two areas in particular, however, where further study, depending on the results, could profoundly influence opinions about donor safety and the treatment of iron depletion. The first concerns the possibility that iron deficiency might impair neurocognitive function.
To explore this link, the REDS-III group will investigate whether blood donation by 16- to 18-year-olds plays a contributory role.[12] The other area of study relates to understanding if blood donation and consequent iron depletion are associated with an increased risk for premature delivery and low-birthweight neonates. A group of investigators from Héma-Québec plans to investigate this possibility by scrutinizing birth registries for adverse outcomes of pregnancies in female blood donors with at least one delivery between 2001 and 2011 (G. Delage and M. Germain, personal communication, January 2014).

The decision to measure iron stores in blood donors, either across the board or in selected groups, will have significant ramifications especially if results are used to develop messages, guide treatment for storage iron deficiency, and identify candidates for referral to physicians. While Bulletin #12-03 did suggest that the approach could be selective, it could be argued that if reduced stores need to be corrected then they should be corrected for all comers. Blood programs cannot ignore iron deficiency among first-time donors or those who have not donated in many years. If these individuals are ignored, programs risk the inevitable criticism that they are indifferent to a health and safety issue, namely, iron deficiency, in the general population and only interested in remedying the condition when blood donation had a role in causing it. Results from the REDS-II Donor Iron Status Evaluation are a reminder that iron deficiency is common even among the healthy population. Ferritin was less than 12 ng/mL in 6.4% of women who had never donated or had only donated in the remote past.[13]

Against this background, an iron deficiency diagnosis and treatment policy at blood centers has the potential to contribute to a broader public health intervention. The importance of this intervention can be appreciated from reports in the US National Health and Nutrition Examination Surveys. The prevalence of iron deficiency in, for example, 20- to 49-year-old women ranges between 8 and 19% depending on ethnicity.[14] It is hardly surprising, then, that one of the US Department of Health and Human Services' National Health Objectives is to reduce iron deficiency in vulnerable populations.[15] Recommendations in Bulletin #12-03 provide a pathway for blood programs to contribute to this important goal.

Conflict of Interest


The authors have disclosed no conflicts of interest.

