Seasonal Mortality Patterns in Primates: Implications for the Interpretation of Dental Microwear

Authors

  • Jan F. Gogarten is a graduate student in the Department of Biology, McGill University, Montreal, Quebec. His research interests are in the area of behavioral ecology. He is conducting his doctoral dissertation research on the environmental and social predictors of primate parasites. The impetus for this article was developed in a graduate course on human evolution at Stony Brook University. Email: jan.gogarten@mail.mcgill.ca

  • Frederick E. Grine is a Professor in the Departments of Anthropology and Anatomical Sciences, Stony Brook University, Stony Brook, New York. His research is broadly concerned with interpreting the hominin fossil record, particularly with regard to the dietary ecology and trophic adaptations of our extinct relatives. Email: frederick.grine@stonybrook.edu


Abstract

The microscopic traces of use wear on teeth have been extensively studied to provide information that will assist in elucidating the dietary habits of extinct hominin species.[1-13] It has been amply documented that dental microwear provides information pertaining to diet for living animals, where there is a strong and consistent association between dental microwear patterns and different types of foods that are chewed. The details of occlusal surface wear patterns are capable of distinguishing among diets when the constituent food items differ in their fracture properties.[14-20] For example, the microwear traces left on the teeth of mammals that crush hard, brittle foods such as nuts are generally dominated by pits, whereas traces left on the teeth of mammals that shear tough items such as leaves tend to be characterized by scratches. These microwear features result from and thus record actual chewing events. As such, microwear patterns are expected to be variably ephemeral, as individual features are worn away and replaced or overprinted by others as the tooth wears down in subsequent bouts of mastication. Indeed, it has been demonstrated, both in the laboratory and the wild, that short-term dietary variation can result in the turnover of microwear.[17, 21-23] Because occlusal microwear potentially reflects an individual's diet for a short time (days, weeks, or months, depending on the nature of the foods being masticated), tooth surfaces sampled at different times will display differences that relate to temporal (for example, seasonal) differences in diet.[24]

As a result of its potential turnover, microwear will preserve information pertaining to those items consumed just before an individual's death, a phenomenon referred to as the “Last Supper effect.”[2] The amount of time represented by this effect is a direct function of the types of foods consumed.[25] Thus, chewing soft foods and making tooth-to-tooth contact will result in the slow removal of enamel and, therefore, the slow transformation of any features that constitute the fabric of a microwear pattern. On the other hand, processing hard foods and/or exogenous grit will result in much more rapid overprinting or removal of wear features. Such turnover means that microwear traces of diet may be ephemeral, but to different degrees. Although some workers[26-29] have seen this as problematic for dietary inference, the “Last Supper effect” potentially allows one to decipher the actual dietary habits of an individual at a given point in time or over a given span of time. As such, the interpretive strength of occlusal microwear is that it is direct evidence left by the foods that were consumed at a given point in an animal's lifetime rather than the range of potential foods that might be inferred from the morphological attributes of the species.

By the same token, however, microwear turnover may potentially confound the analysis of fossil assemblages. These assemblages may accumulate over a prolonged period or may preferentially sample certain seasons, geographic locales, and/or climatic conditions in unequal proportions. In other words, the vagaries of the fossilization process mean that the representation of individuals in a paleontological assemblage may be taphonomically biased; that is, sampled in unequal proportions vis-à-vis the parent populations from which they derived.[30] Vertebrate taphonomists have long been concerned with the ways in which the accumulation and preservation of fossils reflect paleoenvironments,[31, 32] the composition and abundance of source faunas,[33] and possible social behaviors.[34-36] The relationship between fossil (death) assemblages and seasonality has been a matter of particular concern.[37-40]

SEASONAL VARIATION AND FALLBACK FOODS

Seasonal variation in diet, coupled with differential seasonal mortality, might lead to unexpected and perhaps uncharacteristic patterns of microwear being preferentially preserved for some species. If individuals are more prone to die during certain periods, particularly those during which preferred food items are less prevalent, the dental microwear that is preserved in fossil assemblages may not accurately represent the most commonly ingested items in an organism's diet. For example, molar microwear indicates that the diets of Australopithecus anamensis, A. afarensis, and especially Paranthropus boisei were not dominated by the hard foods predicted by their commonly perceived masticatory capabilities.[8, 9, 11-13] Although these species may have had trophic morphologies capable of processing a range of foods, including hard, brittle items such as nuts, seeds and hard fruits, their molar microwear patterns suggest that they did not always do so. In particular, those individuals that have been sampled do not appear to have masticated hard foods during the periods in which their microwear was formed. On the other hand, microwear studies of Paranthropus robustus molars have revealed more heavily pitted and complex occlusal surfaces, consistent with the processing of more hard and brittle items.[1, 2, 5, 7]

In an attempt to reconcile observed microwear patterns with the inferred masticatory capabilities of some Late Pliocene-Early Pleistocene hominin taxa, it has been proposed that their trophic morphologies may represent adaptations to fallback resources.[8, 9, 12] Indeed, the microwear displayed by P. robustus, and particularly its individual variation, has suggested comparison with extant primate species such as Lophocebus albigena and Sapajus apella, which consume hard objects as fallback foods when softer, more preferred foods are unavailable.[41-45] Thus, Scott and coworkers[7] suspected that the microwear attributes of their P. robustus sample might well indicate that at least some individuals consumed fracture-resistant fallback foods.

Although fallback resources are not preferred food items, and may be consumed for relatively short periods, they might be critical for a population's survival. They commonly present significant structural obstacles to comminution.[46] As Kinzey[47] (p. 378) aptly observed, “when a food item is critical for survival, even though not part of the primary specialization, it will influence the selection of dental features.” Although such fallback food items may be consumed only rarely, they are precisely the kinds of items that require dental specialization because they are likely to have more significant mechanical defenses and lower energy yields. On the other hand, there remains considerable controversy surrounding the general applicability of the concept of fallback foods to those items in the diet that are mechanically protected. Indeed, some species may prefer mechanically protected foods, including species that seem to lack the expected morphological adaptations to foods that they consume on a regular basis (for example, Sacoglottis gabonensis seeds, which are consumed by Cercocebus atys in the Taï Forest).[48, 49]

The discord among morphology, diet, and feeding behavior observed in some extant taxa suggests that the question of whether a dental or other morphological feature in a species reflects an adaptation to preferred foods or to less commonly eaten but critical fallback items is not a trivial one.

With regard to the fallback food hypothesis put forward by Grine and colleagues[9] as a potential mechanism to reconcile the molar microwear patterns and craniodental features of A. afarensis, Kimbel and Delezene[50] (p. 29) considered that “if hard-object foods were consumed in high stress periods with (presumably) high mortality, then at least some fossils should be expected to show evidence of hard-object feeding; it would be a taphonomic anomaly that none do.” This sentiment goes to the heart of the matter of potential death assemblage bias and the interpretation of diet from the microwear preserved by the fossils that comprise such assemblages.

The potential over-representation of fallback resources in the hominin fossil record is an important issue. However, some workers[28] have argued that because complex microwear patterns will likely have a longer “half-life” than noncomplex ones, this will lead to overestimation of the consumption of hard foods. On the other hand, others[26, 51] have bemoaned microwear turnover because it results in the traces of rarely eaten hard fallback foods not being preserved. The problem is determining whether the patterns of microwear observed in the hominin fossil record are more likely to reflect overall dietary habits or to overrepresent a portion of the diet range, as exemplified by fallback foods that were consumed during periods of seasonal resource stress (Fig. 1).

Figure 1.

Permanent molar microwear textures in the Early Pleistocene hominins Paranthropus robustus and P. boisei recorded by tandem scanning confocal microscopy. The upper right corner provides 3D photosimulations and simulated mesh axonometrics of digital elevation models of two molar facets each of P. boisei (above; scale = 3.2 µm) and P. robustus (below; scale = 8.0 µm). The graph compares values for surface complexity versus feature anisotropy for eight specimens of P. boisei (red circles) and nine specimens of P. robustus (blue diamonds). The data from which this plot was constructed are recorded in Grine and coworkers[40] (Table 4). The anisotropy variable (Y-axis), the exact-proportion length-scale anisotropy of relief (epLsar), quantifies the degree of directionality in surface roughness at a fine scale. Higher epLsar values indicate increased directionality of wear features (for example, more parallel scratches). The complexity variable (X-axis), the area-scale fractal complexity (Asfc), is a scale-sensitive measure of surface roughness. It quantifies the change in roughness with scale, so that more complex surfaces have areas that increase at faster rates with resolution. Higher Asfc values indicate increased surface roughness (for example, a higher incidence of pitting, which is characteristic of hard-object consumers). The photosimulations and simulated mesh axonometrics are courtesy of Peter Ungar.
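For readers who wish to see how such texture biplots are built, the following is a minimal plotting sketch. The (Asfc, epLsar) pairs below are hypothetical values chosen only to echo the qualitative pattern described in the caption (low complexity and higher anisotropy in P. boisei; higher complexity in P. robustus); the measured values are in Grine and coworkers[40] (Table 4).

```python
# Sketch of a complexity-versus-anisotropy biplot like Figure 1.
# The (Asfc, epLsar) pairs are hypothetical, not the published measurements.
import matplotlib.pyplot as plt

boisei = [(0.6, 0.0052), (0.9, 0.0046), (0.7, 0.0060), (0.5, 0.0055)]
robustus = [(2.8, 0.0021), (3.5, 0.0018), (1.9, 0.0030), (4.2, 0.0015)]

for data, label, marker, color in [
    (boisei, "P. boisei", "o", "red"),
    (robustus, "P. robustus", "D", "blue"),
]:
    xs, ys = zip(*data)  # unpack pairs into x and y sequences
    plt.scatter(xs, ys, marker=marker, color=color, label=label)

plt.xlabel("Complexity (Asfc)")
plt.ylabel("Anisotropy (epLsar)")
plt.legend()
plt.show()
```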

SEASONAL MORTALITY AND FALLBACK FOODS

At the root of this question is the assumption that fallback foods should be “overrepresented” in the paleontological record. This is based on the notion that animals, including primates, most often die during times of food resource stress.[51, 52] In support of this intuitively appealing notion, Ungar and Sponheimer[52] have cited studies by Gould, Sussman, and Sauther[53] and Nakagawa, Ohsawa, and Muroyama[54] showing population declines in two primate species in relation to several years of extreme hardship, such as drought. Along these same lines, Hamilton[55] documented 22 baboon deaths during a period of intense drought in the Namib Desert of Namibia. Pictures of animal mortality during periods of drought and nutritional deprivation have been vividly etched into our vision of the African savanna under duress (Fig. 2).

Figure 2.

Death scene on the African savanna. An elephant carcass at a desiccated water hole during the dry season (July-September) at Etosha National Park, Namibia. Having been partially devoured by hyenas, the carcass is now the focus of a cast of vultures. Photograph courtesy of Conrad Brain.

Unfortunately, Hamilton's[55] study spanned only one seven-month period of extreme hardship and was conducted in an area from which the top baboon predator had been extirpated. In fact, subsequent studies on the same population under less duress revealed that the majority of infant deaths were caused by kidnapping and especially parasitism (tick infestations), and that adult male mortality was primarily caused by injuries from agonistic interactions.[56, 58] With regard to the 15-year period recorded by Gould, Sussman, and Sauther,[53] there appears to have been a virtually complete turnover of the study population, which would indicate significant mortality even in the years that did not include extreme drought. Indeed, since neither Gould, Sussman, and Sauther[53] nor Nakagawa, Ohsawa, and Muroyama[54] identified the causes of death or the diet of the animals, it is possible that the observed mortality patterns were driven by rainfall, disease (such as increased parasitism), or predation rather than by dietary factors.

In this regard, increased rather than decreased rainfall has been linked to increased parasitism[58, 59] and leucocyte loads, which serve as a measure of immune system response to infectious agents.[60] Studies of chimpanzees and gorillas have shown that most deaths are disease- and parasite-related.[61, 62] Even in those instances in which mortality is tied to predation, this may be enmeshed with disease and/or diet. Disease may predispose individuals to capture,[63, 64] while serious dietary insufficiencies that lead to a decline in overall health could increase susceptibility to disease and predation.

Thus, while there is some evidence from field studies of modern primates to support differential mortality being tied to food abundance,[53, 54, 65] other studies[58, 59, 61, 62, 66, 67] have found little or no association between food availability and mortality. In addition to food deprivation,[66, 68, 69] many factors contribute to primate mortality, including disease,[62, 66, 70-72] predation,[73-76] and injury during agonistic interactions.[62, 77] Environmental variables may influence the relative importance of these sources of mortality.[66, 78] Furthermore, many different factors may interact to drive mortality patterns. Determining the ultimate causes of death is extremely difficult, especially for wild primates. For example, temperature and rainfall may affect the prevalence of vector-borne diseases because they affect arthropod distribution and abundance as well as the development and transmission rates of parasites.[79] Thus, elevated mortality during rainy seasons may be related to increased disease risk.[66, 71] Alternatively, dietary stress induced by seasonal declines in resource availability, often associated with decreased rainfall, may depress the immune system, thus increasing susceptibility to disease.[59, 80] Given this somewhat mixed evidence, it is uncertain whether fallback foods would be overrepresented in the microwear of taphonomic assemblages of extinct hominin species regardless of the ultimate causes of death.

SEASONAL MORTALITY IN PRIMATES

The question that needs to be addressed directly is whether there is any evidence to suggest that primates die preferentially during periods of high dietary stress, when individuals rely more heavily, if not entirely, on fallback foods. Although considerable attention has been paid to the use of fallback foods by various primate species,[47, 81-86] there are surprisingly few data on mortality patterns in relation to the periods when these resources are most important.

Causes of death are expected to differ in the degree of seasonality they impose on mortality patterns. In addition, the relative contributions of these causative factors, and the ways in which they interact to produce mortality patterns, are expected to differ across species. Seasonal patterns of resource availability and disease susceptibility suggest that species living in more seasonal environments will exhibit more seasonal mortality. Moreover, because it has been argued that leaves are an abundant and ubiquitously distributed resource over which primates will not have to compete[87] (but see Sterck and Steenbeck[88] and Snaith and Chapman[89]), and because foliage is likely to be more abundant than fruit during periods of resource scarcity,[89, 90] there is an a priori expectation that folivores and frugivores will differ in patterns of seasonal mortality. All other factors being equal, more frugivorous species should exhibit more seasonality in mortality than more folivorous taxa.

Although numerous reports of the causes of death in primate populations have been recorded in the literature, many of these are anecdotal. Patterns of mortality have rarely been analyzed over broad spatial or temporal scales; they also have not been compared across multiple species.

AN ANALYSIS OF DATA FOR TEN PRIMATE POPULATIONS

To gain insight into the question of whether mortality is seasonally driven and, in particular, whether it is correlated with times of food resource scarcity in extant nonhuman primates, Gogarten and coworkers[91] compiled adult and juvenile mortality data for ten wild populations of nine species distributed across a variety of environments ranging from dense tropical rain forest to open savanna. They combined original mortality data from seven field sites with comparable data for three other species reported by Milton,[71] Watts,[61] and Cheney and coworkers[67] to determine how seasonal variation in rainfall and resource availability may influence mortality.

The species included in the study by Gogarten and colleagues[91] (Table 1) represent most of the major primate clades, and thus provide a broad taxonomic sampling of the order. Primates are particularly well-suited for investigating the role of seasonality in mortality patterns for four important reasons. First, it is possible to conduct comparative analyses such as that by Gogarten and associates[91] because the phylogenetic relationships among crown groups are reasonably well-resolved. To account for the non-independence of data related to phylogenetic propinquity, Gogarten and colleagues[91] employed phylogenetic generalized least squares (PGLS) regression[92-94] using the consensus tree for the taxa of interest from the 10kTrees project (Fig. 3).[95] Second, conclusive rather than inferred mortality data are available for various populations that have been followed in the field for prolonged periods. Third, primate dietary habits, which vary across species, have been extensively documented. Although many primate species consume a variety of items, most show distinct preferences, permitting them to be characterized in general terms as being primarily frugivorous or folivorous.[96] Moreover, many display seasonal differences, coming to rely on fallback foods during periods of preferred resource scarcity.[97, 98] Finally, rainfall seasonality varies across the geographic range of primate habitats from marked seasonality to more evenly distributed precipitation throughout the year.[99]
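In outline, PGLS is a regression in which the error covariance is structured by the phylogeny: under a Brownian-motion model of trait evolution, the expected covariance between two taxa is proportional to the branch length they share from the root. The following is a minimal sketch using generalized least squares with a toy four-taxon covariance matrix and hypothetical trait values; the published analysis used the 10kTrees consensus phylogeny and the data summarized in Tables 1 and 2, not these numbers.

```python
# Minimal PGLS sketch: GLS with a phylogenetic covariance matrix.
# All values below are hypothetical placeholders for illustration.
import numpy as np
import statsmodels.api as sm

# Brownian-motion variance-covariance matrix for a toy four-taxon tree:
# V[i, j] is the branch length shared by taxa i and j from the root.
V = np.array([
    [1.0, 0.6, 0.2, 0.2],
    [0.6, 1.0, 0.2, 0.2],
    [0.2, 0.2, 1.0, 0.5],
    [0.2, 0.2, 0.5, 1.0],
])

# Hypothetical traits: seasonality of mortality (an r-statistic) and a
# 0/1 diet code (0 = frugivore, 1 = folivore).
r_mortality = np.array([0.41, 0.30, 0.20, 0.10])
diet = np.array([1, 1, 0, 0])

X = sm.add_constant(diet)                 # intercept + diet predictor
fit = sm.GLS(r_mortality, X, sigma=V).fit()  # GLS with phylogenetic covariance = PGLS
print(fit.params, fit.pvalues)
```

Setting V to the identity matrix recovers ordinary least squares, which is what makes the phylogenetic correction explicit in this formulation.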

Figure 3.

Phylogeny of the primate populations included in the analysis by Gogarten and colleagues.[91] Modified from the consensus tree of the 10kTrees project.[95] For those species for which genetic data were not available, well-established sister taxa that excluded all others in the analysis were employed (for example, Gorilla gorilla was used in place of Gorilla beringei). To incorporate the two populations of Cebus capucinus, a short branch of 10,000 years was added to the tip of this species lineage. The scale is in millions of years. Reproduced with modification from Gogarten and coworkers[91] (Fig. 1).
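The tip-splitting device described in the caption can be written directly in Newick notation. The sketch below is schematic: the branch lengths (in millions of years) and the LB/SR population labels for Lomas Barbudal and Santa Rosa are illustrative assumptions, not the actual 10kTrees values.

```python
# Schematic of splitting a species tip so two study populations can be
# included in the phylogeny; branch lengths are illustrative only.
original_tip = "Cebus_capucinus:20.0"

# Replace the tip with a cherry joined by a 10,000-year (0.01 Myr) split;
# the stem is shortened so the total root-to-tip depth is unchanged.
split_tip = "(Cebus_capucinus_LB:0.01,Cebus_capucinus_SR:0.01):19.99"

newick = f"({split_tip},Cercopithecus_mitis:20.0);"
print(newick)
```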

Table 1. Taxonomic and Geographic Distribution of the Primate Population Groups Included in the Study by Gogarten et al.[91]
Species | Study site | Diet | Common name
Strepsirrhini
  Propithecus edwardsi | Ranomafana, Madagascar | Folivore | Milne-Edwards’ sifaka
Haplorhini
  Platyrrhini
    Atelidae
      Alouatta palliata | Barro Colorado Island, Panama | Folivore | mantled howler
    Cebidae
      Cebus capucinus | Lomas Barbudal, Costa Rica | Frugivore | white-faced capuchin
      Cebus capucinus | Santa Rosa, Costa Rica | Frugivore | white-faced capuchin
  Catarrhini
    Cercopithecidae
      Cercopithecinae
        Cercopithecini
          Cercopithecus mitis | Kakamega, Kenya | Frugivore | blue (diademed) monkey
        Papionini
          Papio ursinus | Moremi Reserve, Botswana | Frugivore | chacma baboon
      Colobinae
        Presbytis thomasi | Gunung Leuser, Indonesia | Folivore | Thomas’ langur
        Piliocolobus rufomitratus | Kibale, Uganda | Folivore | Tana River red colobus
    Hominidae
      Homininae
        Gorilla beringei | Karisoke, Rwanda | Folivore | mountain gorilla
        Pan troglodytes | Gombe, Tanzania | Frugivore | eastern chimpanzee

  1. Principal dietary category classification follows Nunn and van Schaik.[96]

Gogarten and coworkers[91] addressed five non-mutually exclusive questions regarding primate mortality patterns: Is there a seasonal pattern to mortality across primate species? Is greater environmental seasonality related to increased seasonality in mortality? If mortality is seasonal, is it higher during wet than dry seasons? If mortality is seasonal, is it higher during periods of preferred food scarcity or fallback food consumption? If mortality is seasonal, do folivores show less seasonality than frugivores?

Gogarten and coworkers[91] found that about half (five of nine) of the primate species in their study exhibited seasonal mortality. However, even in the most seasonal populations, deaths occurred throughout the year, likely reflecting multiple causes of mortality, such as those related to predation, disease, and injury, each of which may act independently of and/or be differentially influenced by environmental seasonality (Table 2, Fig. 4). Mortality was more seasonal in more seasonal environments and was strongly tied to rainfall. However, mortality was most often elevated during wet periods rather than during times of food resource stress. This relationship is possibly driven by disease, as increased rainfall has been associated with increased parasite loads[58, 59, 79, 100, 101] and white blood cell counts.[60]

Figure 4.

Number of deaths in each month for the ten populations included in the study by Gogarten and coworkers.[91] Bars indicate the number of deaths, with the axis on the left. The solid line indicates the monthly rainfall in mm, with the axis on the right. The food-scarce period is indicated by the dashed line at the top of each species' graph; where possible, this is based on site-specific plant phenology data; in other cases, it is based on feeding or weight-loss data and observations from researchers working with the species at the site (see the methods section of Gogarten and coworkers[91] for specific details). For Presbytis thomasi, the topmost dashed line indicates the period of fruit scarcity, while the lower dashed line indicates the period of young leaf scarcity. For Pan troglodytes, the topmost dashed line indicates the period of weight loss, while the lower line indicates the period of fruit scarcity. Reproduced with modification from Gogarten and associates[91] (Fig. 2).

Table 2. Mortality Data for the Primate Population Groups Included in the Study by Gogarten et al.[91]
Species | n deaths (study yrs) | Seasonality of mortality (r-statistic) | Rainfall seasonality (r-statistic) | Mortality during the wet season | Mortality during period of food scarcity
Propithecus edwardsi | 23 (24) | 0.400* | 0.262 | Decreased | Elevated
Alouatta palliata | 179 (7) | 0.413*** | 0.320 | Elevated | Elevated
Cebus capucinus (LB) | 31 (11) | 0.198 | 0.536 | ns | ns
Cebus capucinus (SR) | 20 (15) | 0.258 | 0.540 | ns | ns
Cercopithecus mitis | 46 (13) | 0.096 | 0.198 | ns | ns
Papio ursinus | 38 (11) | 0.409** | 0.775 | Elevated | Decreased (a)
Presbytis thomasi | 42 (15) | 0.189 | 0.088 | ns | ns
Piliocolobus rufomitratus | 11 (10) | 0.445 | 0.122 | Elevated | Decreased
Gorilla beringei | 38 (25) | 0.297* | 0.115 | Elevated | Decreased
Pan troglodytes | 112 (50) | 0.077 | 0.456 | ns | ns

  1. Principal dietary category classification for each population is recorded in Table 1.

  2. Cebus capucinus (LB) = study population at Lomas Barbudal; Cebus capucinus (SR) = study population at Santa Rosa.

  3. Details on the statistical analyses are presented in Gogarten et al.[91]; r-statistic values with asterisks indicate significant results, with ***(p < 0.001), **(p < 0.01), and *(p < 0.05); ns = no significant pattern.

  (a) Mortality was seasonal, but this appears to be driven by predation during the period of flooding rather than by seasonal patterns in food availability.
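The r-statistics in Table 2 come from circular statistics: each death (or unit of rainfall) is mapped to an angle on the annual circle, and the r-statistic is the length of the mean resultant vector, with departure from uniformity commonly assessed by a Rayleigh-type test. The following is a minimal sketch with hypothetical monthly death counts, not the published data; see Gogarten et al.[91] for the methods actually used.

```python
# Sketch of a circular seasonality statistic from monthly death counts.
# Counts below are hypothetical; the r-statistic is the mean resultant
# length of death dates mapped onto the annual circle.
import numpy as np

deaths_per_month = np.array([1, 0, 2, 5, 6, 4, 2, 1, 0, 1, 0, 1])  # Jan..Dec
angles = 2 * np.pi * (np.arange(12) + 0.5) / 12  # month midpoints as angles

n = deaths_per_month.sum()
C = np.sum(deaths_per_month * np.cos(angles))
S = np.sum(deaths_per_month * np.sin(angles))
r = np.sqrt(C**2 + S**2) / n                     # mean resultant length

# Rayleigh test of uniformity: Z = n * r^2, with a standard approximate p-value.
Z = n * r**2
p = np.exp(-Z) * (1 + (2 * Z - Z**2) / (4 * n))
print(f"r = {r:.3f}, Rayleigh Z = {Z:.2f}, p ≈ {p:.3f}")
```

An r near 0 indicates deaths spread evenly through the year; an r near 1 indicates deaths concentrated in a single season.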

MORTALITY AND RESOURCE SCARCITY

Contrary to expectation, this study yielded only limited support for the hypothesis that mortality is higher during periods of food scarcity. Of the five species that exhibited a seasonal pattern of mortality, two (Alouatta palliata and Propithecus edwardsi) showed elevated mortality during the food-scarce period, while three (Papio ursinus, Piliocolobus rufomitratus, and Gorilla beringei) had elevated deaths during periods of food abundance. Moreover, and also contrary to expectation, Gogarten and associates[91] found that when controlling for phylogeny, folivorous species exhibited more seasonal mortality than did frugivorous taxa. This, too, may be related to disease factors, with the larger volumes of food eaten by folivores leading to the ingestion of more parasites, the infectious stages of which contaminate leaf materials. Comparative studies have indicated that the degree of folivory in primates is positively correlated with parasitic helminth diversity and infestation.[102, 103]

Thus, the results of the analysis by Gogarten and colleagues[91] indicate that microwear patterns observed in the fossil record are unlikely to overrepresent periods in which fallback foods were eaten under conditions of resource scarcity. In only five of the nine primate species examined was there any relationship between mortality and resource availability, and in only two was mortality elevated during food-scarce periods. While additional field studies of differential mortality in other populations of primate (and other mammalian) species are required to fully address this issue, the results of the study by Gogarten and colleagues[91] indicate that, notwithstanding the “Last Supper effect,”[2] fallback foods should not necessarily be expected to be overrepresented in microwear studies.

Of course, the vagaries of taphonomy and fossilization may lead to the preservation of unusual patterns of mortality, such as those represented in catastrophic assemblages.[40, 104, 105] These assemblages may comprise individuals that fed on uncommon types of foods or ingested unusual amounts of exogenous grit and, as such, may unduly affect the perceived microwear signature of a population or a species (Fig. 5). In addition, a degree of interpretive caution is warranted by the possibility that the global patterns of seasonality seen today may not necessarily equate to those in the Pliocene and Early Pleistocene of Africa.

Figure 5.

Fossils in the making? A large herd of hippopotamus congregates in the quickly disappearing remnants of a river in Katavi National Park, Tanzania, during the severe drought of 1971. More than 100 animals have amassed in this 45-m section of the river bed to stave off the effects of the sun. These individuals are at risk of becoming part of a catastrophic death assemblage. Their recent diets will bear little resemblance to that which is typical of the species in this region under normal conditions. Photograph courtesy of Alan Root.

MODERN PRIMATE MODELS FOR PLIOCENE HOMININS

Is there any reason to suspect that microwear patterns in the hominin fossil record are likely to overrepresent periods in which fallback foods were eaten under conditions of resource scarcity? As our closest living relatives, chimpanzees have been suggested as a model for the dietary habits and mortality patterns of the earliest hominins.[106, 107] Interestingly, however, the chimpanzees of Gombe demonstrate the least seasonal mortality of all the primate species examined by Gogarten and coworkers.[91] Given this, there would seem to be little reason to postulate seasonal over-representation of fallback foods in a species such as Ardipithecus ramidus, whose reconstructed habitat is a woodland or woodland-grassland biotope (see Grine and colleagues[40] for a summary of the paleoecological evidence), consistent with the habitat used by chimpanzees today.

Many workers envision later hominin taxa as having inhabited more open savanna environments than are currently occupied by most chimpanzees.[108-112] As such, chacma baboons may be a better model for a primate inhabiting the type of environment faced by Pliocene hominins. The chacma baboons of the Okavango deltaic savanna exhibit seasonally elevated death rates owing to predation pressure rather than food scarcity (Fig. 6). The predation pressure during the annual flood seems to be driven by predators focusing on the predictably shallowest and shortest water routes between islands used by the baboons.[67] In addition, the migration of other potential prey species to the flood margins in search of grass at this time potentially increases the predatory focus on the baboons (D. Cheney, personal communication). The wet and wooded grassland that characterizes the Okavango endorheic delta is the sort of environment that has been envisioned for a variety of hominin taxa throughout the Plio-Pleistocene[112-114] and accords particularly well with the preferred habitats envisaged for Paranthropus boisei.[115]

Figure 6.

The Okavango deltaic savanna in February 2010. Photograph by F.E. Grine. Inset: a female chacma baboon feeds on a blue water lily (Nymphaea nouchali) in the Okavango. Photograph courtesy of Roman Wittig. Lilies are favored not only by the baboons, but also by the local human inhabitants, with both primate species viewing them less as a fall-back resource than a “fall-on” food (D. Cheney, personal communication). Is this possibly the type of item that may have played a critical role in the dietary repertoire of Paranthropus boisei?

Similarly, while the chacma baboons that inhabit the more arid Namib Desert exhibit increased mortality during intense drought,[55] in years of less severe drought deaths were primarily due to parasitism, infanticide, and injury, with no indication of starvation or malnutrition as the root cause of death.[56, 57] These observations appear to provide additional support for the notion that microwear will not necessarily overrepresent periods of resource stress (or the use of fallback foods) in the Plio-Pleistocene hominin fossil record.

Because of a wide range of potential taphonomic biases in their mode of accumulation,[30, 31, 39] fossil assemblages may not reflect underlying seasonal patterns of mortality even if these existed. Regardless of the season in which animals may tend to die, preservation is strongly affected by where death occurs and the subsequent potential for burial. For example, remains are more likely to be preserved if individuals die on riverine margins just before a period of overbank flooding. Even if there was a seasonal pattern of mortality in a hominin population, the multifarious processes involved in fossilization could easily overprint those patterns and influence the kind of microwear preserved. We suggest that these taphonomic processes may be more universally important than seasonal mortality in influencing the hominin fossil record and its interpretation.

CONCLUSIONS

Dental microwear has been used to elucidate the dietary habits of extinct species. Because microwear may exhibit turnover, it potentially preserves information pertaining only to those items consumed just before an individual's death. Seasonal variation in diet, and especially seasonal reliance on fallback foods that present significant structural obstacles to comminution, when coupled with differential seasonal mortality, might lead to the preservation in a fossil assemblage of microwear patterns that do not reflect the broader diet of the species. In an attempt to reconcile observed microwear with the inferred masticatory capabilities of some hominin taxa, it has been suggested that their trophic features may represent adaptations to fallback resources. However, it also has been argued that fallback foods should be very visible in the paleontological record because of the expectation that mortality should be higher during times of nutritional resource stress. We review the available evidence on mortality patterns in primates to inform this discussion. Although many species exhibit seasonal mortality, deaths are generally not more common during periods of preferred food scarcity. Rather, the observed mortality trends seem more likely to be related to disease or predation. Thus, current evidence suggests that fallback foods are unlikely to be overrepresented in microwear studies of extinct taxa, especially studies that incorporate teeth from more than a few individuals. Even if extinct hominin populations exhibited distinctly seasonal patterns of mortality, the complex taphonomic processes that affect fossil accumulations may be more important in influencing the interpretation of microwear in the paleontological record.

ACKNOWLEDGMENTS

We thank John Fleagle for the invitation to submit this essay. We are grateful to him, Anna Kay Behrensmeyer, and Dorothy Cheney for their cogent comments on different versions of this manuscript. We thank John Fleagle for his editorial skill and Travis Pickering and three anonymous reviewers for their keen observations and thoughtful suggestions on the manuscript; all have greatly improved its clarity. We thank Roman Wittig for permission to use the photograph of the Okavango baboons that formed the core of Dorothy Cheney's remarkable 16-year-long study. Conrad Brain and Alan Root graciously permitted use of their photographs of some unfortunate African mammals. We thank Peter Ungar for providing the images of early hominin molar microwear. We thank Leone Brown and Colin Chapman for fruitful discussions and Caitlin Friesen for her careful reading of the manuscript in its final stages. Jan F. Gogarten was supported by a Graduate Research Fellowship from the National Science Foundation (Grant Number: DGE-1142336), the Explorers Club - Eddie Bauer Youth Grant, and the Canadian Institutes of Health Research's Strategic Training Initiative in Health Research's Systems Biology Training Program.