Jan F. Gogarten is a graduate student in the Department of Biology, McGill University, Montreal, Quebec. His research interests are in the area of behavioral ecology. He is conducting his doctoral dissertation research on the environmental and social predictors of primate parasites. The impetus for this article was developed in a graduate course on human evolution at Stony Brook University. Email: email@example.com
Frederick E. Grine is a Professor in the Departments of Anthropology and Anatomical Sciences, Stony Brook University, Stony Brook, New York. His research is broadly concerned with interpreting the hominin fossil record, particularly with regard to the dietary ecology and trophic adaptations of our extinct relatives. Email: firstname.lastname@example.org
The microscopic traces of use wear on teeth have been studied extensively to help elucidate the dietary habits of extinct hominin species.[1-13] It has been amply documented that dental microwear provides dietary information for living animals, among which there is a strong and consistent association between microwear patterns and the types of foods that are chewed. The details of occlusal surface wear patterns are capable of distinguishing among diets when the constituent food items differ in their fracture properties.[14-20] For example, the microwear traces left on the teeth of mammals that crush hard, brittle foods such as nuts are generally dominated by pits, whereas traces left on the teeth of mammals that shear tough items such as leaves tend to be characterized by scratches. These microwear features result from and thus record actual chewing events. As such, microwear patterns are expected to be variably ephemeral, as individual features are worn away and replaced or overprinted by others as the tooth wears down in subsequent bouts of mastication. Indeed, it has been demonstrated, both in the laboratory and the wild, that short-term dietary variation can result in the turnover of microwear.[17, 21-23] Because occlusal microwear potentially reflects an individual's diet for a short time (days, weeks, or months, depending on the nature of the foods being masticated), tooth surfaces sampled at different times will display differences that relate to temporal (for example, seasonal) differences in diet.
As a result of its potential turnover, microwear will preserve information pertaining to those items consumed just before an individual's death, a phenomenon referred to as the “Last Supper effect.” The amount of time represented by this effect is a direct function of the types of foods consumed. Thus, chewing soft foods and making tooth-to-tooth contact will result in the slow removal of enamel and, therefore, the slow transformation of any features that constitute the fabric of a microwear pattern. On the other hand, processing hard foods and/or exogenous grit will result in much more rapid overprinting or removal of wear features. Such turnover means that microwear traces of diet may be ephemeral, but to different degrees. Although some workers[26-29] have seen this as problematic for dietary inference, the “Last Supper effect” potentially allows one to decipher the actual dietary habits of an individual at a given point in time or over a given span of time. As such, the interpretive strength of occlusal microwear is that it constitutes direct evidence left by the foods that were consumed at a given point in an animal's lifetime, rather than the range of potential foods that might be inferred from the morphological attributes of the species.
By the same token, however, microwear turnover may potentially confound the analysis of fossil assemblages. These assemblages may accumulate over a prolonged period or may preferentially sample certain seasons, geographic locales, and/or climatic conditions in unequal proportions. In other words, the vagaries of the fossilization process mean that the representation of individuals in a paleontological assemblage may be taphonomically biased; that is, sampled in unequal proportions vis-à-vis the parent populations from which they derived. Vertebrate taphonomists have long been concerned with the ways in which the accumulation and preservation of fossils reflect paleoenvironments,[31, 32] the composition and abundance of source faunas, and possible social behaviors.[34-36] The relationship between fossil (death) assemblages and seasonality has been a matter of particular concern.[37-40]
SEASONAL VARIATION AND FALLBACK FOODS
Seasonal variation in diet, coupled with differential seasonal mortality, might lead to unexpected and perhaps uncharacteristic patterns of microwear being preferentially preserved for some species. If individuals are more prone to die during certain periods, particularly those during which preferred food items are less prevalent, the dental microwear that is preserved in fossil assemblages may not accurately represent the most commonly ingested items in an organism's diet. For example, molar microwear indicates that the diets of Australopithecus anamensis, A. afarensis, and especially Paranthropus boisei were not dominated by the hard foods predicted by their commonly perceived masticatory capabilities.[8, 9, 11-13] Although these species may have had trophic morphologies capable of processing a range of foods, including hard, brittle items such as nuts, seeds and hard fruits, their molar microwear patterns suggest that they did not always do so. In particular, those individuals that have been sampled do not appear to have masticated hard foods during the periods in which their microwear was formed. On the other hand, microwear studies of Paranthropus robustus molars have revealed more heavily pitted and complex occlusal surfaces, consistent with the processing of more hard and brittle items.[1, 2, 5, 7]
In an attempt to reconcile observed microwear patterns with the inferred masticatory capabilities of some Late Pliocene-Early Pleistocene hominin taxa, it has been proposed that their trophic morphologies may represent adaptations to fallback resources.[8, 9, 12] Indeed, the microwear displayed by P. robustus, and particularly its individual variation, has suggested comparison with extant primate species such as Lophocebus albigena and Sapajus apella, which consume hard objects as fallback foods when softer, more preferred foods are unavailable.[41-45] Thus, Scott and coworkers suspected that the microwear attributes of their P. robustus sample might well indicate that at least some individuals consumed fracture-resistant fallback foods.
Although fallback resources are not preferred food items, and may be consumed for relatively short periods, they might be critical for a population's survival. They commonly present significant structural obstacles to comminution. As Kinzey[47:378] aptly observed, “when a food item is critical for survival, even though not part of the primary specialization, it will influence the selection of dental features.” Although such fallback food items may be consumed only rarely, they are precisely the kinds of items that require dental specialization because they are likely to have more significant mechanical defenses and lower energy yields. On the other hand, there remains considerable controversy surrounding the general applicability of the concept of fallback foods to mechanically protected items in the diet. Indeed, some species may actually prefer mechanically protected foods, even when they seem to lack the expected morphological adaptations to process them; for example, Cercocebus atys in the Taï Forest regularly consumes Sacoglottis gabonensis seeds.[48, 49]
The discord among morphology, diet, and feeding behavior observed in some extant taxa suggests that the question of whether a dental or other morphological feature in a species reflects an adaptation to preferred foods or to less commonly eaten but critical fallback items is not a trivial one.
With regard to the fallback food hypothesis put forward by Grine and colleagues as a potential mechanism to reconcile the molar microwear patterns and craniodental features of A. afarensis, Kimbel and Delezene[50:29] considered that “if hard-object foods were consumed in high stress periods with (presumably) high mortality, then at least some fossils should be expected to show evidence of hard-object feeding; it would be a taphonomic anomaly that none do.” This sentiment goes to the heart of the matter of potential death assemblage bias and the interpretation of diet from the microwear preserved by the fossils that comprise such assemblages.
The potential over-representation of fallback resources in the hominin fossil record is an important issue. However, some workers have argued that because complex microwear patterns will likely have a longer “half-life” than noncomplex ones, the consumption of hard foods will tend to be overestimated. On the other hand, others[26, 51] have bemoaned microwear turnover because it results in the traces of rarely eaten hard fallback foods not being preserved. The problem is determining whether the patterns of microwear observed in the hominin fossil record are more likely to reflect overall dietary habits or to overrepresent a portion of the dietary range, as exemplified by fallback foods that were consumed during periods of seasonal resource stress (Fig. 1).
SEASONAL MORTALITY AND FALLBACK FOODS
At the root of this question is the assumption that fallback foods should be “overrepresented” in the paleontological record. This is based on the notion that animals, including primates, most often die during times of food resource stress.[51, 52] In support of this intuitively appealing notion, Ungar and Sponheimer have cited studies by Gould, Sussman, and Sauther and by Nakagawa, Ohsawa, and Muroyama showing population declines in two primate species in relation to several years of extreme hardship, such as drought. Along these same lines, Hamilton documented 22 baboon deaths during a period of intense drought in the Namib Desert of Namibia. Pictures of animal mortality during periods of drought and nutritional deprivation have been vividly etched into our vision of the African savanna under duress (Fig. 2).
Unfortunately, Hamilton's study spanned only one seven-month period of extreme hardship and was conducted in an area from which the top baboon predator had been extirpated. In fact, subsequent studies on the same population under less duress revealed that the majority of infant deaths were caused by kidnapping and especially parasitism (tick infestations) and that adult male mortality was primarily caused by injuries from agonistic interactions.[56, 58] With regard to the 15-year period recorded by Gould, Sussman, and Sauther, there appears to have been a virtually complete turnover of the study population, which would indicate significant mortality even in the years that did not include extreme drought. Indeed, since neither Gould, Sussman, and Sauther nor Nakagawa, Ohsawa, and Muroyama identified the causes of death or the diet of the animals, it is possible that the observed mortality patterns were driven by rainfall, disease such as increased parasitism, or predation rather than by dietary factors.
In this regard, increased rather than decreased rainfall has been linked to increased parasitism[58, 59] and leucocyte loads, which serve as a measure of immune system response to infectious agents. Studies of chimpanzees and gorillas have shown that most deaths are disease- and parasite-related.[61, 62] Even in those instances in which mortality is tied to predation, this may be enmeshed with disease and/or diet. Disease may predispose individuals to capture,[63, 64] while serious dietary insufficiencies that lead to a decline in overall health could increase susceptibility to disease and predation.
Thus, while there is some evidence from field studies of modern primates to support differential mortality being tied to food abundance,[53, 54, 65] other studies[58, 59, 61, 62, 66, 67] have found little or no association between food availability and mortality. In addition to food deprivation,[66, 68, 69] many factors contribute to primate mortality, including disease,[62, 66, 70-72] predation,[73-76] and injury during agonistic interactions.[62, 77] Environmental variables may influence the relative importance of these sources of mortality.[66, 78] Furthermore, many different factors may interact to drive mortality patterns. Determining the ultimate causes of death is extremely difficult, especially for wild primates. For example, temperature and rainfall may affect the prevalence of vector-borne diseases because they affect arthropod distribution and abundance as well as the development and transmission rates of parasites. Thus, elevated mortality during rainy seasons may be related to increased disease risk.[66, 71] Alternatively, dietary stress induced by seasonal declines in resource availability, often associated with decreased rainfall, may depress the immune system, thus increasing susceptibility to disease.[59, 80] Given this somewhat mixed evidence, it is uncertain whether fallback foods would be overrepresented in the microwear of taphonomic assemblages of extinct hominin species regardless of the ultimate causes of death.
SEASONAL MORTALITY IN PRIMATES
The question that needs to be addressed directly is whether there is any evidence to suggest that primates die preferentially during periods of high dietary stress, when individuals rely more heavily, if not entirely, on fallback foods. Although considerable attention has been paid to the use of fallback foods by various primate species,[47, 81-86] there are surprisingly few data on mortality patterns in relation to the periods when these resources are most important.
Causes of death are expected to differ in the degree of seasonality they impose on mortality patterns. In addition, the relative contributions and interactions of these causative factors to produce mortality patterns are expected to differ across species. Seasonal patterns of resource availability and disease susceptibility suggest that species living in more seasonal environments will exhibit higher seasonal mortality. Moreover, because it has been argued that leaves are an abundant and ubiquitously distributed resource over which primates will not have to compete (but see Sterck and Steenbeek and Snaith and Chapman), and because foliage is likely to be more abundant than fruit during periods of resource scarcity,[89, 90] there is an a priori expectation that folivores and frugivores will differ in patterns of seasonal mortality. All other factors being equal, more frugivorous species should exhibit more seasonality in mortality than more folivorous taxa.
Although numerous reports of the causes of death in primate populations have been recorded in the literature, many of these are anecdotal. Patterns of mortality have rarely been analyzed over broad spatial or temporal scales; they also have not been compared across multiple species.
AN ANALYSIS OF DATA FOR TEN PRIMATE POPULATIONS
To gain insight into the question of whether mortality is seasonally driven and, in particular, whether it is correlated with times of food resource scarcity in extant nonhuman primates, Gogarten and coworkers compiled adult and juvenile mortality data for ten wild populations of nine species distributed across a variety of environments ranging from dense tropical rain forest to open savanna. They combined original mortality data from seven field sites with comparable data for three other species reported by Milton, Watts, and Cheney and coworkers to determine how seasonal variation in rainfall and resource availability may influence mortality.
The species included in the study by Gogarten and colleagues (Table 1) represent most of the major primate clades, and thus provide a broad taxonomic sampling of the order. Primates are particularly well-suited for investigating the role of seasonality in mortality patterns for four important reasons. First, it is possible to conduct comparative analyses such as that by Gogarten and associates because the phylogenetic relationships among crown groups are reasonably well-resolved. To account for the non-independence of data related to phylogenetic propinquity, Gogarten and colleagues employed phylogenetic generalized least squares (PGLS) regression[92-94] using the consensus tree for the taxa of interest from the 10kTrees project (Fig. 3). Second, conclusive rather than inferred mortality data are available for various populations that have been followed in the field for prolonged periods. Third, primate dietary habits, which vary across species, have been extensively documented. Although many primate species consume a variety of items, most show distinct preferences, permitting them to be characterized in general terms as being primarily frugivorous or folivorous. Moreover, many display seasonal differences, coming to rely on fallback foods during periods of preferred resource scarcity.[97, 98] Finally, rainfall seasonality varies across the geographic range of primate habitats from marked seasonality to more evenly distributed precipitation throughout the year.
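The logic of PGLS is that species values cannot be treated as independent samples; the regression is instead weighted by a covariance matrix reflecting shared evolutionary history. The following minimal sketch illustrates the estimator only; it is not the authors' analysis, the data and covariance matrix are hypothetical, and in practice the matrix would be derived from branch lengths on the 10kTrees consensus tree.

```python
import numpy as np

def pgls(X, y, V):
    """Phylogenetic generalized least squares (PGLS) estimator.

    X : (n, p) design matrix (include a column of ones for the intercept)
    y : (n,) response (e.g., an index of mortality seasonality)
    V : (n, n) phylogenetic covariance matrix (shared branch lengths);
        here supplied directly as an illustrative, hypothetical matrix

    Returns the GLS coefficients b = (X' V^-1 X)^-1 X' V^-1 y.
    """
    Vinv = np.linalg.inv(V)
    XtVinv = X.T @ Vinv
    return np.linalg.solve(XtVinv @ X, XtVinv @ y)

# Hypothetical example: four species, intercept plus one predictor
# (e.g., degree of folivory); V encodes two pairs of close relatives.
X = np.column_stack([np.ones(4), [0.1, 0.2, 0.7, 0.8]])
y = np.array([0.2, 0.3, 0.6, 0.7])
V = np.array([[1.0, 0.5, 0.0, 0.0],
              [0.5, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.6],
              [0.0, 0.0, 0.6, 1.0]])
beta = pgls(X, y, V)
```

When V is the identity matrix (no phylogenetic signal), the estimator reduces to ordinary least squares, which is a convenient sanity check on any implementation.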
Table 1. Taxonomic and Geographic Distribution of the Primate Population Groups Included in the Study by Gogarten et al.
Principal dietary category classification follows Nunn and van Schaik.
[Table rows not fully preserved; recoverable entries include Barro Colorado Island, Panama; Lomas Barbudal, Costa Rica; Santa Rosa, Costa Rica; the blue (diademed) monkey; Moremi Reserve, Botswana; Gunung Leuser, Indonesia; and the Tana River red colobus.]
Gogarten and coworkers addressed five non-mutually exclusive questions regarding primate mortality patterns: Is there a seasonal pattern to mortality across primate species? Is greater environmental seasonality related to increased seasonality in mortality? If mortality is seasonal, is it higher during wet than dry seasons? If mortality is seasonal, is it higher during periods of preferred food scarcity or fallback food consumption? If mortality is seasonal, do folivores show less seasonality than frugivores?
Gogarten and coworkers found that about half (five of nine) of the primate species in their study exhibited seasonal mortality. However, even in the most seasonal populations, deaths occurred throughout the year, likely reflecting multiple causes of mortality such as those related to predation, disease, and injury, each of which may act independently of and/or be differentially influenced by environmental seasonality (Table 2, Fig. 4). Mortality was more seasonal in more seasonal environments and was strongly tied to rainfall. However, mortality was most often elevated during wet periods rather than during times of food resource stress. This relationship is possibly driven by disease, as increased rainfall has been associated with increased parasite loads[58, 59, 79, 100, 101] and white blood cell counts.
Table 2. Mortality Data for the Primate Population Groups Included in the Study by Gogarten et al.
[Table columns include: n deaths (study yrs); mortality during the wet season; mortality during period of food scarcity. Data rows not preserved.]
Principal dietary category classification for each population is recorded in Table 1.
Cebus capucinus (LB) = study population at Lomas Barbudal; Cebus capucinus (SR) = study population at Santa Rosa.
Details on the statistical analyses are presented in Gogarten et al.; r-statistic values with asterisks indicate significant results with ***(p < 0.001), **(p < 0.01), and *(p < 0.05); ns = no significant pattern.
Mortality was seasonal but this seems to be driven by predation during the period of flooding rather than seasonal patterns in food availability.
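Seasonality in mortality of the kind summarized by the r-statistics in Table 2 is commonly assessed with circular statistics: each death date is mapped to an angle on the annual cycle, and the mean resultant length r measures how tightly the angles cluster. The sketch below illustrates this general approach rather than the authors' exact computations; the death-month samples are hypothetical, and the p-value uses the simple first-order Rayleigh approximation rather than the exact distribution.

```python
import math

def rayleigh_r(death_months):
    """Mean resultant length r for a sample of death dates.

    Each death month (1-12) is mapped to an angle on the annual
    circle; r near 1 indicates deaths clustered in one season,
    r near 0 indicates deaths spread evenly through the year.
    """
    n = len(death_months)
    angles = [2 * math.pi * (m - 1) / 12 for m in death_months]
    c = sum(math.cos(a) for a in angles) / n
    s = sum(math.sin(a) for a in angles) / n
    return math.hypot(c, s)

def rayleigh_p(death_months):
    """First-order approximation to the Rayleigh test p-value,
    p ~ exp(-n * r^2); adequate for moderate-to-large samples."""
    n = len(death_months)
    return math.exp(-n * rayleigh_r(death_months) ** 2)

# Hypothetical samples: deaths concentrated in months 5-8 versus
# deaths spread across ten months of the year.
clustered = [5, 5, 6, 6, 6, 7, 7, 7, 8, 8]
spread = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
```

A clustered sample yields a high r and a small p, rejecting the hypothesis of uniform mortality through the year, whereas an evenly spread sample does not.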
Contrary to expectation, this study yielded only limited support for the hypothesis that mortality is higher during periods of food scarcity. Of the five species that exhibited a seasonal pattern of mortality, two (Alouatta palliata and Propithecus edwardsi) showed elevated mortality during the food-scarce period, while three (Papio ursinus, Piliocolobus rufomitratus, and Gorilla beringei) had elevated deaths during periods of food abundance. Moreover, and also contrary to expectation, Gogarten and associates found that when controlling for phylogeny, folivorous species exhibited more seasonal mortality than did frugivorous taxa. This, too, may be related to disease factors, with the larger volumes of food eaten by folivores leading to the ingestion of more parasites, the infectious stages of which contaminate leaf materials. Comparative studies have indicated that the degree of folivory in primates is positively correlated with parasitic helminth diversity and infestation.[102, 103]
Thus, the results of the analysis by Gogarten and colleagues indicate that microwear patterns observed in the fossil record are unlikely to overrepresent periods in which fallback foods were eaten under conditions of resource scarcity. In only five of the nine primate species examined was there any relationship between mortality and resource availability, and in only two was mortality elevated during food-scarce periods. While additional field studies of differential mortality in other populations of primate (and other mammalian) species are required to fully address this issue, the results of the study by Gogarten and colleagues indicate that, notwithstanding the “Last Supper effect,” fallback foods should not necessarily be expected to be overrepresented in microwear studies.
Of course the vagaries of taphonomy and fossilization may lead to the preservation of unusual patterns of mortality, such as those represented in catastrophic assemblages.[40, 104, 105] These assemblages may comprise individuals who fed on uncommon types of foods or ingested unusual amounts of exogenous grit and, as such, may unduly affect the perceived microwear signature for a population or a species (Fig. 5). In addition, a degree of interpretive caution is also warranted by the possibility that the global patterns of seasonality seen today may not necessarily equate to those in the Pliocene and Early Pleistocene of Africa.
MODERN PRIMATE MODELS FOR PLIOCENE HOMININS
Is there any reason to suspect that microwear patterns in the hominin fossil record are likely to overrepresent periods in which fallback foods were eaten under conditions of resource scarcity? As our closest living relatives, chimpanzees have been suggested as a model for the dietary habits and mortality patterns of the earliest hominins.[106, 107] Interestingly, however, the chimpanzees of Gombe demonstrate the least seasonal mortality of all the primate species examined by Gogarten and coworkers. Given this, there would seem to be little reason to postulate seasonal over-representation of fallback foods in a species such as Ardipithecus ramidus, whose reconstructed habitat is a woodland or woodland-grassland biotope (see Grine and colleagues for a summary of the paleoecological evidence), consistent with the habitat used by chimpanzees today.
Many workers envision later hominin taxa as having inhabited more open savanna environments than are currently occupied by most chimpanzees.[108-112] As such, chacma baboons may be a better model for a primate inhabiting the type of environment faced by Pliocene hominins. The chacma baboons of the Okavango deltaic savanna exhibit seasonally elevated death rates owing to predation pressure rather than food scarcity (Fig. 6). The predation pressure during this period seems to be driven by predators focusing on the predictably shallowest and shortest water routes between islands used by the baboons. In addition, the migration of other potential prey species to the flood margins in search of grass at this time potentially causes an increased predatory focus on the baboons (D. Cheney, personal communication). The wet and wooded grassland that characterizes the Okavango endorheic delta is the sort of environment that has been envisioned for a variety of hominin taxa throughout the Plio-Pleistocene[112-114] and accords particularly well with the preferred habitats envisaged for Paranthropus boisei.
Similarly, while the chacma baboons that inhabit the more arid Namib Desert exhibit increased mortality during intense drought, in years of less severe drought deaths were primarily due to parasitism, infanticide, and injury, with no indication of starvation or malnutrition as the root cause.[56, 57] These observations appear to provide additional support for the notion that microwear will not necessarily over-represent periods of resource stress (or the use of fallback foods) in the Plio-Pleistocene hominin fossil record.
Because of a wide range of potential taphonomic biases in their mode of accumulation,[30, 31, 39] fossil assemblages may not reflect underlying seasonal patterns of mortality even if these existed. Regardless of the season in which animals may tend to die, preservation is strongly affected by where death occurs and the subsequent potential for burial. For example, remains are more likely to be preserved if individuals die on riverine margins just before a period of overbank flooding. Even if there was a seasonal pattern of mortality in a hominin population, the multifarious processes involved in fossilization could easily overprint those patterns and influence the kind of microwear preserved. We suggest that these taphonomic processes may be more universally important than seasonal mortality in influencing the hominin fossil record and its interpretation.
Dental microwear has been used to elucidate the dietary habits of extinct species. Because microwear may exhibit turnover, it potentially preserves information pertaining only to those items consumed just before an individual's death. Seasonal variation in diet, and especially seasonal reliance on fallback foods that present significant structural obstacles to comminution, when coupled with differential seasonal mortality, might lead to the preservation in a fossil assemblage of microwear patterns that do not reflect the broader diet of the species. In an attempt to reconcile observed microwear with the inferred masticatory capabilities of some hominin taxa, it has been suggested that their trophic features may represent adaptations to fallback resources. However, it also has been argued that fallback foods should be very visible in the paleontological record because of the expectation that mortality should be higher during times of nutritional resource stress. We review the available evidence on mortality patterns in primates to inform this discussion. Although many species exhibit seasonal mortality, deaths are generally not more common during periods of preferred food scarcity. Rather, the observed mortality trends seem more likely to be related to disease or predation. Thus, current evidence suggests that fallback foods are unlikely to be overrepresented in microwear studies of extinct taxa, especially studies that incorporate teeth from more than a few individuals. Even if extinct hominin populations exhibited distinctly seasonal patterns of mortality, the complex taphonomic processes that affect fossil accumulations may be more important in influencing the interpretation of microwear in the paleontological record.
We thank John Fleagle for the invitation to submit this essay. We are grateful to him, Anna Kay Behrensmeyer, and Dorothy Cheney for their cogent comments on different versions of this manuscript. We thank John Fleagle for his editorial skill and Travis Pickering and three anonymous reviewers for their keen observations and thoughtful suggestions on the manuscript; all have greatly improved its clarity. We thank Roman Wittig for permission to use the photograph of the Okavango baboons that formed the core of Dorothy Cheney's remarkable 16-year-long study. Conrad Brain and Alan Root graciously permitted use of their photographs of some unfortunate African mammals. We thank Peter Ungar for providing the images of early hominin molar microwear. We thank Leone Brown and Colin Chapman for fruitful discussions and Caitlin Friesen for her careful reading of the manuscript in its final stages. Jan F. Gogarten was supported by a Graduate Research Fellowship from the National Science Foundation (Grant Number: DGE-1142336), the Explorers Club - Eddie Bauer Youth Grant, and the Canadian Institutes of Health Research's Strategic Training Initiative in Health Research's Systems Biology Training Program.