Iron deficiency is defined by three stages of increasing severity: depletion of iron stores (stage 1), iron deficiency without anemia (ID) (stage 2), and iron deficiency anemia (IDA) (stage 3). Changes in erythropoiesis are a late manifestation of ID, evidenced by an abnormally low hemoglobin concentration or hematocrit. As with anemia, the risk of ID is elevated during early childhood, in women of reproductive age, and during pregnancy. ID arises when the intake of absorbable iron is inadequate to meet an increased need for iron, e.g., rapid growth during the first 2 years of life, loss of iron in menstrual blood in women, and the expansion of the red cell mass and growth of the fetus during pregnancy.
Many women of childbearing age have a dietary intake of absorbable iron that is too low to offset losses from menstruation and the increased requirement associated with gestation. Data from the National Health and Nutrition Examination Survey (NHANES) III (1988–1994) show a median dietary iron intake of 14.7 mg/day for pregnant women, suggesting that approximately 90% were below the estimated average requirement for pregnancy (22 mg/day).4 Recent participants from the Camden Study (2001–2007 [unpublished data]) had a similar iron intake from food (median 15 mg/day), with 83% below the estimated average requirement for pregnancy (Table 1).
Dietary intakes of iron and other micronutrients are associated with ID during pregnancy. While energy-adjusted intakes of protein and carbohydrate were similar among groups in the Camden Study, pregnant women with 3rd-trimester ID, defined by two abnormal tests (serum ferritin <12 ng/mL and transferrin saturation <15%), had diets (averaged over three 24-h recalls) that were significantly lower in iron, vitamin B12, vitamin B6, riboflavin, and folate (P = 0.07) and higher in fat than gravidae who were not iron deficient (Table 2). In addition to a poor-quality diet, other important causes of ID in women include poor iron absorption and blood loss from menstruation, at labor and delivery, or with rapid repeat pregnancy.4–6
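To make the two-test criterion used above concrete, the following is a minimal sketch in Python; the ferritin and transferrin-saturation cutoffs are taken from the text, while the function name, units handling, and example values are illustrative assumptions only.

```python
def iron_deficient(ferritin_ng_ml: float, transferrin_sat_pct: float) -> bool:
    """Flag iron deficiency (ID) by the two-test criterion cited in the text:
    serum ferritin <12 ng/mL AND transferrin saturation <15%.
    Only the cutoffs come from the text; the rest is illustrative."""
    return ferritin_ng_ml < 12.0 and transferrin_sat_pct < 15.0

# Hypothetical example: a gravida with ferritin 9 ng/mL and 11% saturation
print(iron_deficient(9.0, 11.0))  # True -> both tests abnormal, classified as ID
```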
During pregnancy, the maternal requirement for iron increases to approximately 1,000 mg, on average.4,6 This amount covers 350 mg associated with fetal and placental growth, 500 mg associated with expansion of the maternal red cell mass, and 250 mg associated with blood loss at delivery. The increased requirement must be met by higher maternal iron intakes, rising from 6 mg/day in the 1st trimester to 19 mg/day in the 2nd trimester and 22 mg/day in the 3rd trimester.4,6 To meet these increased requirements, gravidae must draw upon iron stores, consequently increasing the risk of ID and IDA.

Among non-pregnant women aged 16–49 years in NHANES III, the prevalence of IDA, estimated from a low hemoglobin plus two of three additional laboratory tests for ID, was 2–5%, while ID occurred in 11–16%.7,8 A body iron model based on soluble transferrin receptor and serum ferritin gave a lower prevalence of ID, 9.2% among women of reproductive age (20–49 years).8 Among gravidae in the Camden Study (2001–2007), there was a substantial rise in maternal anemia between trimesters 1 and 3. Likewise, ID (based upon a low serum ferritin, <12 ng/mL, and transferrin saturation <15%) rose by trimester from 5% to 14.4% to 40%, and IDA (defined using the CDC criteria for anemia by trimester together with ID)6 increased from 0.9% to 3% to 17% during trimesters 1, 2, and 3, respectively (Figure 1). Prevalence, and hence the predictive value of screening tests, would, of course, be higher in developing countries where IDA is more common. While supplementation with iron or iron–folic acid improves maternal hemoglobin levels during pregnancy and reduces the risk of maternal anemia, daily supplementation may increase the risk of a maternal hemoglobin concentration exceeding 130 g/L; the clinical significance of this is uncertain.9
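The trimester-specific definitions above can be combined into a single classification, sketched below under stated assumptions: the ID cutoffs are those given in the text, but the trimester-specific hemoglobin cutoffs (11.0, 10.5, and 11.0 g/dL for trimesters 1, 2, and 3) are assumed here as the usual CDC values and are not stated in the text, so they should be verified against the cited criteria.6

```python
# Assumed CDC anemia cutoffs by trimester (g/dL); not given in the text above.
CDC_HB_CUTOFF_G_DL = {1: 11.0, 2: 10.5, 3: 11.0}

def classify_iron_status(trimester: int, hemoglobin_g_dl: float,
                         ferritin_ng_ml: float, transferrin_sat_pct: float) -> str:
    """Return 'IDA', 'ID', 'anemia (other)', or 'normal' for a single gravida.
    ID = ferritin <12 ng/mL and transferrin saturation <15% (from the text);
    anemia = hemoglobin below the trimester-specific cutoff (assumed CDC values)."""
    is_id = ferritin_ng_ml < 12.0 and transferrin_sat_pct < 15.0
    is_anemic = hemoglobin_g_dl < CDC_HB_CUTOFF_G_DL[trimester]
    if is_id and is_anemic:
        return "IDA"
    if is_id:
        return "ID"
    if is_anemic:
        return "anemia (other)"
    return "normal"

# Hypothetical example: 3rd trimester, Hb 10.2 g/dL, ferritin 8 ng/mL, saturation 10%
print(classify_iron_status(3, 10.2, 8.0, 10.0))  # -> 'IDA'
```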