Keywords:

  • brain development;
  • cognitive decline;
  • cognitive reserve;
  • dementia;
  • neurodegeneration;
  • neurodevelopment;
  • risk factors

Abstract

Rather than being an inevitable consequence of age, cognitive decline varies markedly among individuals. In this context, nutrition is one factor believed to be influential. When considering the potential role of diet, two factors need to be considered. First, cognitive or brain reserve is said to decrease the incidence of dementia; that is, it has been suggested that those with larger brains and better intellectual functioning have a greater capacity to resist the effects of the biological changes that define dementia. As such, the adequacy of nutrition before birth and in the early formative years may have long-term consequences. Second, shrinkage of the brain begins in young adulthood, suggesting that any insidious influence of diet will take place from that time onward over a period of many decades. The marked decline in the weight of the brain associated with advanced dementia suggests it will be easier to slow that decline than to repair the brain. If this model is accurate, diet is influential throughout the entire lifespan, and this has substantial methodological implications for the study of the topic.


INTRODUCTION

The largest risk factor for dementia is age, but dementia is not inevitable and can occur at relatively young ages. A nationally representative sample of the US population found that 13.9% of those aged 71 years and older had dementia, and the figure increased from 5% in those aged 71–79 years to 37.4% in those older than 90 years.1 Among those aged 71 years and older, 9.7% had Alzheimer's disease (AD), the most common cause. A decline in cognitive functioning is one of the strongest predictors of impending mortality.2 At one time, it was held that cognitive aging was unavoidable because it reflected an age-related loss of brain cells. More recently, attention has been directed to those who, to a large extent, retain their faculties, in the hope that factors that prevent mental decline can be established. Wilson et al.3 found that cognitive decline was not associated with age unless cognition was sampled within a few years of death, and, in fact, not everybody develops problems of cognition. A study of people older than 100 years found that 89% had still been living independently at the age of 93 years4; that is, they did not have dementia.

It appears that the rate of cognitive decline reflects individual differences rather than being an inevitable consequence of age, and that nutrition may be an influential factor. As such, if we understand the role played by nutrition, we may be able to influence the incidence of dementia, or the age at which decline begins and the speed at which it proceeds.

CHANGES WITH AGE

A review5 of the changes in cognition that occur with age concluded that, in particular, there is a decline in episodic memory, i.e., the ability to recall events in time and place. If circumstances allow only one measure to be taken, then it is this type of test that should be used, e.g., the ability to recall a story or a list of words. Nilsson6 stated "cross-sectional research shows that there is a linear, decreasing memory performance as a function of age for episodic memory." In contrast, other aspects of memory are relatively constant throughout adult life. Nilsson reported that the decline in episodic memory began in the youngest group he considered (age range: 35–40 years), although other findings suggest that the decline begins as early as 20 years of age. Thus, if nutrition influences the rate of cognitive decline, its influence needs to be studied from an earlier age than has usually been considered.

COGNITIVE RESERVE AND BRAIN RESERVE

An observation reported in 1988 greatly influenced the way the aging process has been viewed subsequently. Katzman et al.7 examined the brains of 137 older adults who had died and unexpectedly found that the degree of pathology of the brain did not necessarily relate to cognitive functioning assessed before death. The brains of some individuals were found at post-mortem examination to have extensive biological changes of the type that define AD, yet before death these individuals had shown few manifestations of the disorder. In contrast, others had serious memory problems during life, yet post-mortem examination found limited biological pathology. Those who had not displayed symptoms while living, despite post-mortem evidence of extensive brain pathology, tended to have heavier brains and more neurones. The idea of "cognitive reserve" was thus born: those with larger brains were suggested to have a greater capacity to resist the effect of the biological changes that define dementia. There are potentially two aspects of this concept: a role for a bigger brain (sometimes called brain reserve) and a role for how it has been programmed, a reflection of education and other experiences (sometimes termed cognitive reserve).

In this area of research, there are passive and active models. Brain reserve is a passive model, in which the reserve is said to result from the size of the brain or the number of neurones. Because a given insult leaves more healthy tissue when the brain is larger, it is better able to maintain functional capacity. This view assumes that similar brain damage will have a similar effect in different individuals but that repeated damage will summate until a threshold is reached at which clinical symptoms appear.8 It will take longer for this threshold to be reached in those with larger brains.
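
As a concrete illustration, the passive threshold model can be expressed directly in a few lines of code: an identical insult summates year after year until the remaining reserve crosses a symptom threshold, and a larger initial reserve simply delays that crossing. The following Python sketch does only this; all of the numbers are illustrative assumptions, not values from the literature.

```python
# Minimal sketch of the passive threshold model of brain reserve.
# Initial reserve, insult size, and symptom threshold are illustrative
# assumptions, not values drawn from the literature.

def years_until_symptoms(initial_reserve, annual_insult, threshold):
    """Count the years of identical, summating insults a brain can absorb
    before its remaining reserve falls below the threshold at which
    clinical symptoms appear."""
    reserve, years = initial_reserve, 0
    while reserve - annual_insult >= threshold:
        reserve -= annual_insult
        years += 1
    return years

# A larger brain (greater reserve) takes longer to reach the same threshold.
print(years_until_symptoms(initial_reserve=100.0, annual_insult=1.0, threshold=70.0))  # 30
print(years_until_symptoms(initial_reserve=110.0, annual_insult=1.0, threshold=70.0))  # 40
```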

In contrast, cognitive reserve is an active model. A response to brain damage is said to involve existing cognitive processes and the development of additional compensatory mechanisms. From this point of view, it is not a threshold that needs to be reached; rather, it is the ability to deal with brain damage that is critical. However, such a distinction may be arbitrary because the ability to respond cognitively must have a physiological basis, and if a larger brain is helpful, that must be because it enhances cognitive processing.

As an illustration of this concept, a greater incidence of late-onset dementia was associated with lower mental ability in childhood,9 although early-onset dementia was not, thus suggesting a different etiology. Perneczky et al.10 examined 270 patients with AD, using brain imaging to measure the degree of brain atrophy and head circumference as a proxy measure of original brain size because larger skulls tend to house bigger brains. With a comparable level of brain shrinkage, cognition was better in those with a greater head circumference, that is, those who initially had a larger brain. The size of the brain reflects to a large extent how people develop during early childhood. It was pointed out that, by the age of 6 years, the brain is 93% of its final size. As such, head circumference mainly reflects brain growth during the first years of life. As well as genetic and environmental influences, nutrition has an important role in this early growth.

The Nun Study, which followed 678 School Sisters of Notre Dame aged 75 years and older, demonstrated the importance of both basic biology and experience.11 The incidence of dementia was lower in those with larger brains, but in those with smaller brains, the incidence of dementia was lower if the level of education was higher. Nuns are a useful sample because they have similar lifestyles, social support, and diet. A likely explanation is that both the size of the brain and education-induced increases in neuronal connectivity provided a buffer against the effect of neuronal death. This line of research indicates that, when considering the effect of nutrition on the aging brain, we should attempt to develop brains with as much surplus capacity as possible.

EARLY INFLUENCES

If nutrition during the formative years can increase intellectual capacity, then the incidence of dementia may be reduced. In addition to trying to slow intellectual decline in later life, we should consider neurodevelopment and in this way facilitate "brain reserve."

The rapid rate of growth of the brain during the last third of gestation and the early postnatal stage places great demands on the diet to provide the nutrients required for brain development.12 A review concluded that inadequate intake of protein and energy was associated with problems of behavior and cognition in both the short and long term.13 Thus, in developing countries, food supplementation before the age of 2 years has tended to result in benefits over longer terms, while the effects of supplementation at later ages have tended to be less lasting. More specifically, iodine is critical for brain development; if there is a shortage during the end of the first trimester and the early part of the second trimester of gestation, cretinism may result, with an associated lower level of intelligence.12

In industrialized countries, iron shortage is perhaps the most widespread nutritional problem. Lozoff14 concluded that “there is compelling evidence that 6- to 24-month-old infants with iron-deficiency anemia are at risk for poorer cognitive, motor, social-emotional, and neurophysiologic development in the short- and long-term.” Iron deficiency before the age of 2 years results in long-term problems that do not respond to a subsequently adequate intake of iron. Although a low level of iron intake at a later age adversely influences cognitive functioning, this can be reversed by a subsequently adequate intake.

The dry weight of the brain is approximately 60% lipid, with the omega-3 (n-3) fatty acid docosahexaenoic acid accounting for approximately 20% of the brain's fatty acids. There has therefore been particular interest in the nature of the fat provided by the diet, especially because n-3 and n-6 fatty acids are essential fatty acids, that is, the body cannot make them. Infants have been randomized to receive formula with or without added long-chain polyunsaturated fatty acids. A Cochrane review15 concluded that most studies of this nature found that short-term supplementation improved vision, but there was no evidence of a long-term benefit. The stage of development may be important, and subtle effects may depend on the dose, the composition of the supplement, and the age at which it was supplied.16

Because poor diets are more common in developing countries, the influence of diet is inevitably greater there, although more subtle effects may occur in industrialized societies. The only long-term randomized trial of the influence of diet performed to date illustrated that the adequacy of early diet can influence cognitive development, with consequent implications for the development of brain reserve. Using a nasogastric tube, premature infants in an incubator were fed either a standard cow-milk–based formula or one enriched with additional protein and micronutrients. Although fed in this way for a median of only 4 weeks, at 18 months those who received the enriched formula were more advanced socially and showed better motor coordination.17 At 8 years of age, the boys, but not the girls, had better scores on an intelligence test.18 At the age of 16 years, this difference still existed, and imaging studies found that the caudate nucleus was larger in those who had consumed the enriched formula at an early age.19 This finding illustrates that, even in industrialized societies, short-term differences in nutrition can have long-term consequences. Because the infants in this study were premature, however, and their brains were thus at an earlier stage of development, the findings may not generalize to full-term infants. In addition, it is unclear which nutritional aspects of the enriched formula were important. Although the study established the principle that minor changes in the nature of the diet can have long-term consequences, specific recommendations cannot yet be offered.

METHODOLOGICAL IMPLICATIONS

This analysis of the nature of cognitive aging has various implications for the study of the possible effect of nutrition. Ideally, the mother's nutrition should be monitored prior to the date of delivery, and the offspring's diet should then be considered throughout life. To the extent that diet influences initial cognitive development, and thus helps to develop "brain reserve", nutrition prior to birth and during childhood might influence the incidence of dementia in later life. Cognitive decline6 and shrinking of the brain begin in an individual's 20s,20 so if diet is influential, it may be important throughout adult life. Imaging studies have consistently shown a decrease in brain volume with age, even in those without cognitive symptoms. A loss of white matter, particularly in the prefrontal cortex, may relate to cognitive decline.20

Traditionally, the greatest weight has been placed on information from randomized, double-blind, placebo-controlled studies because this approach allows a causal relationship to be established. Public health recommendations should ideally be based on the systematic review of a number of such studies. Nobody would, in principle, question the desirability of such an approach, but concern arises about the extent to which it is possible in the present context. It is clearly not possible to randomly allocate individuals to a dietary pattern that they will follow for a lifetime. To establish the influence of dietary style over many decades, there is likely to be little alternative to the use of epidemiological methods. This is not necessarily a problem, because causal relationships are generally accepted for similar health-related topics that cannot be studied using randomized trials. The influence of smoking on health is an obvious example; people have never been randomly allocated to smoke or not to smoke cigarettes over a period of many years. The problem has been solved by requiring that data satisfy a range of criteria.21 Causality can be established only if exposure preceded the outcome; this is the sole absolutely essential criterion. In addition, causality is more likely if the relationship is strong, there is a dose–response relationship, results have been replicated in different settings using different methods, and the suggested effects are plausible and agree with accepted understandings of pathology; finally, alternative hypotheses need to have been ruled out. These criteria are more difficult to satisfy than those associated with randomized trials.
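
The criteria just listed amount to a weighted checklist, and expressing them as one makes the weighting explicit. The Python sketch below follows the text in treating temporality as essential and the remaining criteria as supportive; the criterion names and the example appraisal of a diet–dementia association are hypothetical, invented for illustration.

```python
# Sketch of the causal criteria described above as a checklist.
# Temporality is treated as essential and the rest as supportive,
# following the text; the example evidence values are hypothetical.

ESSENTIAL = {"exposure_precedes_outcome"}
SUPPORTIVE = {
    "strong_association",
    "dose_response",
    "replicated_across_settings_and_methods",
    "biologically_plausible",
    "alternative_hypotheses_ruled_out",
}

def causal_support(evidence):
    """Summarize how well a body of epidemiological evidence satisfies the criteria."""
    if not all(evidence.get(criterion, False) for criterion in ESSENTIAL):
        return "causality cannot be established: exposure must precede outcome"
    met = sum(bool(evidence.get(criterion, False)) for criterion in SUPPORTIVE)
    return f"temporality satisfied; {met} of {len(SUPPORTIVE)} supportive criteria met"

# Hypothetical appraisal of a diet-dementia association:
print(causal_support({
    "exposure_precedes_outcome": True,
    "strong_association": True,
    "dose_response": False,
    "replicated_across_settings_and_methods": True,
    "biologically_plausible": True,
    "alternative_hypotheses_ruled_out": False,
}))
```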

Given the obvious advantages of randomized trials, there is a natural desire to use this approach, although inevitably any intervention is likely to occur over a relatively short time period that may not reflect the prolonged period over which diet is influential. One approach might be to consider not cognitive decline as such, but rather any biomarker that could be established to be reliably associated with it. Although future evidence will be required to firmly establish such associations, these might involve the monitoring of any of the biological mechanisms that are hypothesized to be associated with aging. Among others, these include oxidative stress,22 inflammation,23 homocysteine,24 advanced glycation end products,25 and the provision of fatty acids.26 These mechanisms have in common that they reflect the nature of the existing diet or respond to changes in what we eat. Rather than looking for a rapid change from normal cognition to dementia, we might wish to consider changes in specific brain structures that take place slowly over many decades. Alternatively, changes such as general brain shrinkage, reduced cerebral blood flow, glucose tolerance, or related measures may prove to be useful endpoints.
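
To make the biomarker-endpoint idea concrete, the following Python sketch compares the annual rate of change of one such structural measure, brain volume, between two dietary groups, using per-subject least-squares slopes as the endpoint rather than conversion to dementia. The group labels, effect sizes, and noise levels are invented for illustration; no real data or established dietary effect is implied.

```python
# Sketch of a biomarker-slope endpoint: instead of waiting for conversion
# to dementia, compare the annual rate of change of a biomarker (here,
# simulated brain volume in mL) between two dietary groups. All values
# are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)

def subject_slope(ages, volumes):
    """Per-subject rate of change (mL/year) from an ordinary least-squares fit."""
    slope, _intercept = np.polyfit(ages, volumes, deg=1)
    return slope

def simulate_group(n_subjects, annual_loss):
    """Simulate subjects scanned yearly from age 60 to 64, losing
    `annual_loss` mL/year plus measurement noise."""
    slopes = []
    for _ in range(n_subjects):
        ages = np.arange(60, 65)
        volumes = 1200.0 - annual_loss * (ages - 60) + rng.normal(0.0, 5.0, ages.size)
        slopes.append(subject_slope(ages, volumes))
    return np.array(slopes)

control = simulate_group(50, annual_loss=4.0)  # hypothetical usual diet
treated = simulate_group(50, annual_loss=3.0)  # hypothetical enriched diet
print(f"mean slope, control: {control.mean():+.2f} mL/year")
print(f"mean slope, treated: {treated.mean():+.2f} mL/year")
```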

Should it not be possible to establish suitable biomarkers, then change in cognition may, by default, become the endpoint of interest; however, other problems then arise. Dementia is not a disease but a group of symptoms, such as problems of thinking, memory, and reasoning, that are common to many diseases. Although AD accounts for between 50% and 70% of all dementia cases, and vascular dementia accounts for 10–15%, there are as many as 50 other causes. These diseases have different etiologies, so nutrition will have an impact in different ways or, in particular instances, may not have any effect at all. For example, the facts that Wernicke-Korsakoff syndrome responds to thiamine supplementation and that dementia due to vitamin B12 deficiency responds to supplementation with that vitamin do not imply that other forms of dementia would benefit similarly.

Many epidemiological studies have used a single measure of cognition, such as the Mini-Mental State Examination, to measure cognitive decline or dementia. The use of such a brief test, which adds several unrelated aspects of cognition together to produce a single score, has been subject to extensive criticism.5 Even if more precise and reliable tests are used, such an approach inevitably means that only a single dependent variable is considered; for example, memory decline may reflect a range of etiologies that are not influenced by nutrition in a similar manner. Mathias and Burke27 reviewed cognitive functioning in AD and vascular dementia and concluded that cognitive testing had a limited ability to discriminate between these two forms of dementia. Thus, although cognitive assessment plays an important role in diagnosis, it should be used only in conjunction with other information, such as imaging and a medical history, in an attempt to establish the particular type of dementia.
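
The problem with summed composite scores can be shown in a few lines. In the Python sketch below, the subtest names and scores are hypothetical: the composite total is identical at two assessments even though the memory subscore has halved, which is exactly the kind of selective episodic-memory decline a single score can mask.

```python
# Sketch of how a composite cognitive score can mask domain-specific
# decline. Subtest names and scores are hypothetical.

baseline  = {"orientation": 10, "memory": 6, "language": 8, "attention": 5}
follow_up = {"orientation": 10, "memory": 3, "language": 9, "attention": 7}

print(f"composite: {sum(baseline.values())} -> {sum(follow_up.values())} (no apparent change)")
for domain in baseline:
    before, after = baseline[domain], follow_up[domain]
    print(f"{domain:>11}: {before} -> {after} ({after - before:+d})")
# The composite is unchanged at 29, yet the memory score has halved:
# a single summed score hides the decline that matters most.
```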

CONCLUSION

The question investigated here was whether, at particular stages, nutrition influences the development or degeneration of brain functioning. Although there is much we do not understand, nutrition can be influential throughout life, even prior to birth, during gestation. To the extent that nutrition influences cognitive development and hence offers "brain reserve," it will decrease the incidence of dementia in later life. The shrinking of the brain begins in young adulthood, suggesting that any insidious influence of diet will act on the structure of the brain from that time onward. Although the brain can, in the short term, adapt to the death of brain cells, the marked decline in the weight of the brain associated with advanced dementia suggests it will be easier to slow that decline than to repair the brain after damage has occurred. If this model of diet being influential throughout the entire lifespan is accurate, then it has substantial methodological implications for the study of the topic.

Acknowledgments

Declaration of interest.  Dr. Benton has no conflict of interest to declare. He received a small honorarium for writing this article. This work was commissioned by the Nutrition and Mental Performance Task Force of the European branch of the International Life Sciences Institute (ILSI Europe). Industry members of this task force are Abbott Nutrition, Barilla G. & R. Fratelli, Coca-Cola Europe, Danone, Dr. Willmar Schwabe, DSM, FrieslandCampina, Kellogg Europe, Kraft Foods, Martek Biosciences Corporation, Naturex, Nestlé, PepsiCo International, Pfizer, Roquette, Soremartec – Ferrero Group, Südzucker/BENEO Group, Unilever. For further information about ILSI Europe, please call +32-2-771-00-14 or email: info@ilsieurope.be. The opinions expressed herein are those of the authors and do not necessarily represent the views of ILSI Europe. The coordinator for this supplement was Ms Agnes Meheust, ILSI Europe.

REFERENCES
