Screening is a form of secondary prevention, detecting and treating the disorder early before serious problems occur. If screening is timed at the age of peak incidence (18 months), many children will already have been anaemic for some months. If it is introduced earlier, e.g. at 13 months at the time of MMR immunization, a number of children who are not anaemic then become so some months later. In Bristol a quarter of children found to be anaemic at the age of 2 years had not been so at 13 months (James et al, 1993). An ideal age for screening is therefore not apparent. Alternatively, all children in particular groups might be screened if the prevalence of IDA in the group is high, e.g. in inner city areas, children of immigrant or refugee families, exclusively breast-fed 10-month-olds, and toddlers in whom cows' milk was the main drink before 12 months of age.
Overall preventative strategy
Various bodies have suggested strategies: government-funded bodies such as the Centers for Disease Control (1998) in Atlanta and the Department of Health (1994) in the U.K.; non-government organizations such as the British and Swedish Nutrition Foundations; and many individual nutritionists and paediatricians (Wharton, 1999a; Ziegler & Fomon, 1996).
Blood loss should be avoided. Effective umbilical clamping with devices which tighten as the cord withers usually prevents cord haemorrhage. The time of clamping may also affect subsequent ID: in Guatemala, infants in whom the cord was not clamped until pulsation had stopped had a higher haematocrit at 2 months of age (Grajeda et al, 1997), whereas the same manoeuvre had no effect on serum ferritin at 3 months of age in Indian children (Geethanath et al, 1997). Since three-quarters of the iron ‘stored’ at birth is in haemoglobin, perinatal blood loss is a potent cause of anaemia in early and later infancy. Generally the remaining stores in newborns show little relationship to the mothers' iron status, although some studies in both the developed and developing world have shown one. Two papers have shown that poorer maternal iron status in pregnancy is associated with poorer iron status in the infants at 1 year of age (Strauss, 1996; Colomer et al, 1990). This could reflect a longer-term effect of reduced iron stores, or that both mother and child have received an iron-deficient diet.
Suckling period (0–4 months)
For normal-sized babies there is little concern because total body iron does not increase during this time. If ID occurs then abnormal blood loss should be considered. This may occur in the perinatal period (e.g. feto-maternal transfusion, cord accident) or later (e.g. reflux oesophagitis, bleeding from ectopic gastric mucosa in a Meckel's diverticulum, rarely allergic colitis of infancy). Breast feeding is encouraged, or failing that a modern infant formula is used.
Weaning: continuation of the suckling's food
From the age of 6 months a dietary source of iron is necessary. For bottle-fed babies this is easily provided by continued use of an iron-fortified infant formula or by introduction of a follow-on formula, all of which are iron fortified. Breast milk alone will not supply the extra iron, although absorption of the small amount of iron present is high.
Weaning: introduction of weaning foods containing available iron
This is less critical in bottle-fed babies receiving iron-fortified formula or a follow-on milk, and bottle-fed babies tend to receive weaning foods from an earlier age than breast-fed ones (White et al, 1990). Although there is evidence that too early an introduction of solid foods interferes with iron absorption from breast milk (Pisacane et al, 1995), they should nevertheless be introduced from 6 months of age. Meat, because of its haem iron, is an excellent choice, providing zinc as well, which may also become a limiting nutrient in prolonged breast feeding. A Danish study showed that an intake of 27 g of meat a day from the age of 8 months (compared with an intake of 10 g daily) led to smaller falls in haemoglobin in later infancy, although there were no effects on serum ferritin or transferrin receptor concentrations (Engelmann et al, 1998). Unfortunately many mothers who continue to breast feed their older infants choose vegetarian weaning foods from which iron is less available. Convenience weaning foods are widely available in the Western world, and some are fortified with iron. Wide use of these foods has provided a more satisfactory diet (more iron; less protein, salt and sugar) than a home-made diet alone (Mills & Tyler, 1992; Stordy et al, 1995).
Work in Honduras raised the possibility of giving iron supplements to breast-fed children from about 4 months rather than running the risk of introducing microbe-contaminated feeds, from which iron availability is in any case low (Dewey et al, 1998).
Family foods containing available iron
In Britain 12–15% of the total iron intake of children 1–15 years old is provided by meat (i.e. 4–5% as haem iron), 20–30% by fortified cereal products such as breakfast cereals and bread, and 5–10% each by vegetables, biscuits and chips (french-fried potatoes). This diet meets the ‘reference nutrient intake’ (RNI) for most ages but not for toddlers aged 1.5–2.5 years (mean intake was 73% of RNI; 100% is desirable), nor for girls aged 10–15 years (63% of RNI) (Department of Health, 1989; Gregory et al, 1995). However, the total intake is only part of the story. The absorption of iron is determined by the overall composition of the meal, the integrity of the gastrointestinal tract, and systemic factors. Absorption of haem iron increases when anaemia is present but is little affected by other components of the meal. Absorption of non-haem iron is enhanced by vitamin C, by other organic acids present in fruit and vegetables such as citric and malic acid, and by animal protein; it is inhibited by phytate, calcium and polyphenols (in tea). For detailed reviews see Lynch (1997) and, specifically for children, Lonnerdal (1990) and Fomon (1993). The extent of iron absorption is also affected by body stores, the rate of erythropoiesis, and hypoxia. The mechanisms whereby enterocytes receive information from these factors to alter absorption are not clear.
In developing countries, fortified foods are less available at affordable prices, and fibre and phytate intakes are higher (Tatala et al, 1998). On the other hand, some foods are cooked in iron pots leading to better iron status than if aluminium pots are used (Borigato & Martinez, 1998). Grape molasses is used as a reasonable source of iron in Turkey (Aslan et al, 1997).
Avoidance of blood loss
The role of pasteurized cows' milk in intestinal blood loss has been referred to above. In many parts of the world hookworm infestation is the most common cause of blood loss. There are recommended control programmes for schoolchildren and women (intermittent anthelminthic medication at least twice yearly, control of faecal contamination of soil, use of simple shoes). These should be applied to preschool children as well, perhaps combined with iron supplementation (Stoltzfus et al, 1997, 1998; Hopkins et al, 1997). Bilharzia is less important as a cause of anaemia; Trichuris and Giardia have similarly been regarded as less frequent causes of iron deficiency as a population problem (De Morais et al, 1996).
Methods to achieve the strategy
With a clear strategy, implementation should be a simple matter of informing mothers what is best, but modification of eating customs and traditions is difficult.
Small-scale health education interventions have been successful, e.g. in a family practice in Bristol the prevalence of microcytic anaemia at 13 months fell from 25% to 8% during a 2-year period, but enthusiasm waned and the prevalence had risen to 13% a further 2 years later (James et al, 1993). Larger programmes have not been successful. In a project reaching about 500 children in a health district of Birmingham, about 30% of children in both the intervention and control groups were anaemic at 18 months of age (Childs et al, 1997).
Consumption of suitable foods may be encouraged if their price to the consumer is reduced (sometimes free of charge) by subsidies from a government or a charity.
The outstanding example is the Women, Infants and Children Program (WIC) in the U.S.A. Iron-fortified infant formulas and weaning cereals are supplied free of charge to about a quarter of all infants. Since the programme was introduced the prevalence of iron deficiency anaemia has fallen considerably and is less than in many other countries (e.g. the prevalence of IDA among 1–2-year-olds is 3% in the U.S.A. compared with 12% in Britain; Looker et al, 1997; Gregory et al, 1995). A recent evaluation showed that those within the programme had less anaemia and a better iron status than those who were not (Owen & Owen, 1997). The British scheme, open to families on income support or jobseeker's allowance, enables the mother to choose an infant formula (all of which are fortified in Britain) or whole cows' milk, which contains little iron. Follow-on formulas, although fortified, are not included, nor are ‘solid’ weaning foods.
Many other countries have subsidy schemes but many are based on milk alone, which, although excellent for energy, protein, calcium and riboflavine, does little to promote iron nutrition.
Food fortification: infant formulas
The main issue in iron fortification of infant formulas is quantity. Most formulas in the U.S.A. contain about 12 mg/l (1.8 mg/100 kcal) and most in Europe up to 7 mg/l (1.0 mg/100 kcal). In both continents formulas without added iron are allowed by the current regulations, and in Scandinavia levels are around 4 mg/l (0.6 mg/100 kcal). There are proposals to add little or no iron to formulas consumed in the first 4–6 months of life (Wharton, 1989, 1996). (a) Total body iron increases little during this time, breast milk contains only small amounts of iron, and iron may have adverse effects on the faecal flora (Balmer & Wharton, 1991; Mevissen-Verhage et al, 1985). (b) With higher fortification the absolute amount absorbed is only a little greater, leaving more unabsorbed iron in the gut lumen. (c) Young infants receiving an infant formula with no or small amounts of added iron in the first 4 months of life do not develop ID (Haschke et al, 1993; Hernell & Lonnerdal, 1996). Nevertheless, almost all infant formulas used at this age do contain added iron.
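The two fortification units quoted above (mg/l and mg/100 kcal) can be interconverted once an energy density for the formula is assumed. A minimal sketch of the arithmetic, assuming a typical energy density of about 67 kcal/100 ml (an assumed illustrative value, not stated in the text; actual products vary):

```python
# Convert iron fortification levels between mg per litre and mg per 100 kcal.
# ENERGY_DENSITY is an assumed typical value for infant formula (~67 kcal/100 ml).

ENERGY_DENSITY_KCAL_PER_100ML = 67.0

def mg_per_l_to_mg_per_100kcal(mg_per_l: float) -> float:
    """Convert an iron concentration from mg/l to mg/100 kcal."""
    kcal_per_litre = ENERGY_DENSITY_KCAL_PER_100ML * 10  # 1 litre = 10 x 100 ml
    return mg_per_l / kcal_per_litre * 100

# The three levels mentioned in the text:
for mg_l in (12, 7, 4):
    print(f"{mg_l} mg/l ~= {mg_per_l_to_mg_per_100kcal(mg_l):.1f} mg/100 kcal")
```

Rounded to one decimal place this reproduces the paired figures in the text (12 mg/l ≈ 1.8, 7 mg/l ≈ 1.0 and 4 mg/l ≈ 0.6 mg/100 kcal), which is consistent with the assumed energy density.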
From about 6 months of age more dietary iron becomes essential and a ‘safety net of fortified foods’ ensures a satisfactory intake (Wharton, 1986). The safety net usually includes a fortified formula, but the appropriate level of fortification is unclear, e.g. continued use of a European infant formula (1 mg/100 kcal), an American infant formula (1.8 mg/100 kcal), or introduction of a European-style ‘follow-on formula’ (about 1.8 mg/100 kcal). Any of these options is preferable to the early introduction of ordinary cows' milk. Evidence is accumulating that even in formulas consumed in later infancy a level of fortification lower than used previously can be effective in maintaining adequate absorption and/or iron status, e.g. 8 mg/l (Fomon et al, 1997), 3 mg/l (Haschke et al, 1993) and 2 mg/l (Walter et al, 1998), but the periods of surveillance were only 3–6 months and did not follow infants into the second year of life, when anaemia is most common. Using infant formulas labelled with radioiron in adults, the Chilean group suggested fortification at 7 mg/l to provide 1 mg of absorbed iron (Hertrampf et al, 1998).
A second issue is which qualities of an infant formula enhance iron status. Many studies have shown the positive effect on iron status of using iron-fortified formulas instead of cows' milk in infants >6 months old and in toddlers in the second year of life. It is not certain, however, to what extent this reflects solely the effect of a greater intake of iron, or is due also to ‘other qualities’ of an infant/follow-on formula such as greater absorption of iron because of the higher vitamin C content, less inhibition of absorption because of the lower concentrations of protein, calcium and phosphorus (although one study found that the addition of calcium glycerophosphate did not adversely affect iron status; Dalton et al, 1997), or less immunologically induced milk enteropathy and iron loss. Probably both the iron fortification and the ‘other qualities’ are operating. Three British studies of the use of follow-on formulas or cows' milk from the age of 6 months support that conclusion: ID was least in those receiving an iron-fortified formula, more common in those receiving an unfortified formula, and most common in those on pasteurized cows' milk (Daly et al, 1996; Gill et al, 1997; Stevens & Nelson, 1995).
Food fortification: children's foods
Iron is added to a variety of weaning foods, particularly cereals. The issues are the availability of the fortificant iron and its reactivity with other nutrients. Ferrous sulphate is relatively well absorbed, but it catalyses the oxidation of unsaturated fats leading to rancidity, discoloration and flavour changes unless access to oxygen is limited (vacuum or nitrogen packing, rapid turnover from fortification to consumption). Nevertheless fortification of wheat and maize flour with ferrous fumarate was acceptable to Venezuelan schoolchildren and reduced the prevalence of iron deficiency from 19% to 10% (Layrisse et al, 1996). Powders of elemental iron, if finely ground, are well absorbed but similarly have a pro-oxidant effect; if less finely ground the oxidant effect is smaller but so is the absorption (Hurrell, 1997).
An alternative is to use haem iron as the fortificant, in effect adding blood to the food vehicle. This resulted in less ID and higher haemoglobin in Chile in older infants and schoolchildren (Hertrampf et al, 1990; Walter et al, 1993). A recent study in a small number of British 6-month-olds showed that iron retention from a meat-and-cereal or a meat-and-vegetable dish was no higher when haem iron was added (retention 1–8 mg) than when ferrous sulphate was the fortificant (−1 to 6 mg) (Martinez et al, 1998).
Food fortification: adult foods and other vehicles
Bread and breakfast cereals fortified with elemental iron are commonly consumed, and breakfast cereals are also fortified with vitamin C. Iron absorption from sodium iron EDTA is not inhibited by phytate, and the compound does not oxidize fats and affects flavour less than ferrous sulphate, although discolouration remains a problem. Evaluation continues and it is not yet licensed for general use. For general reviews of iron fortification see Hurrell (1997) and Gibson (1997).
Other foods which have been fortified with iron for consumption by children include water in Brazil (de Oliveira et al, 1996), chocolate-flavoured milk also fortified with vitamin C in Jamaica (Davidsson et al, 1998), and soft drinks in China (Cheng et al, 1992).
Supplementation
Supplementation is generally not favoured for prevention because a potentially toxic medicine is introduced into the household and compliance is in any case poor.
Regimens of oral iron every 5–7 days have been promoted. Compliance is better, and in field studies in developing countries they have been as effective as daily dosing, probably because a daily dose saturates the enterocyte, blocking absorption of subsequent doses for a few days. Viteri (1997, 1998) and Solomons (1997) have reviewed this approach. Most evidence relates to animal studies or adults, particularly during pregnancy, but some experiences in preschool children are described, e.g. in Bolivia (Berger et al, 1997), China (Liu & Liu, 1996), Indonesia (Schultink et al, 1995) and Vietnam (Thu et al, 1999), and also in Indonesian schoolchildren, in whom an improvement in growth as well as iron status was noted (Angeles-Agadeppa et al, 1997; Soemantri et al, 1997). Controversy remains about this intermittent approach. Cook & Reddy (1995) found that the absorption of radiolabelled iron by adult volunteers was similar whether daily or intermittent doses were given, and they questioned the desirability or need for intermittent regimens compared with the well-tried daily method. More detailed discussion can be found in the correspondence following the publication of the paper (Viteri, 1996; Cook, 1996). Coffee drinking reduced the effectiveness of supplementation (measured as the increase in serum ferritin) in Guatemalan 1–2-year-olds (Dewey et al, 1997). Presumably tea could have a similar effect; a third of British toddlers drink tea. Parenteral iron given at 2 months of age increases the risk of malarial parasitaemia, but oral iron was used safely in Tanzania, reducing the prevalence of severe anaemia without increasing the frequency of malaria (Menendez et al, 1997).
The use of micronutrient supplements in developing countries is being actively considered by UNICEF and USAID. There is a possibility of interaction between different minerals in multimicronutrient supplements. A zinc supplement given in water depresses the absorption of iron, but not when both minerals are presented in food, e.g. a hamburger (Rossander-Hulten et al, 1991). This interaction may be a further reason for giving iron in a food vehicle (fortification), or singly if given as a medicine.
Caution with prevention programmes
Ideally only those requiring extra iron would receive it. There is some association in adults between raised measures of iron nutrition and heart disease and cancer, but no firm conclusions can be reached, and the raised measurements may reflect inflammation rather than a high intake (see the major reviews referred to earlier: British Nutrition Foundation, 1995; Hallberg & Asp, 1996).
Nevertheless the possibility of adverse effects of iron programmes in children should be considered. If an individual receives more dietary iron than is needed, the control mechanisms make excessive iron absorption unlikely unless there is increased erythropoiesis or dyserythropoiesis (e.g. in thalassaemia) or as yet undiagnosed haemochromatosis. What is the effect of any unnecessary unabsorbed iron? Lactoferrin in breast milk is an important antibacterial agent in the intestine (at least in vitro; the evidence in vivo is unclear), but if it is saturated with iron the effect is lost. Despite the lack of in vivo evidence, it is therefore preferable to avoid iron supplements in babies receiving breast milk in the first 6 months of life. The addition of iron to an infant formula, which of course does not contain lactoferrin, is not apparently an important factor in the pathogenesis of intestinal infection in bottle-fed infants, nor in systemic infection (Walter et al, 1997).