The role of diet in human evolution is of interest to paleoanthropologists and laypersons alike. Most people are introduced to the subject in popular diet books. Atkins' (2002) The New Diet Revolution, for example, described the early hominin, “eating the fish and animals that scampered and swam around him, and the fruits and vegetables and berries that grew nearby.” While such assertions offer little insight to the target audience of the Yearbook of Physical Anthropology, The New Diet Revolution sold more than 15 million copies. And the chapter on hominin diets in Sears' (1995) New York Times #1 best seller The Zone has undoubtedly been read by millions more than even the best-cited academic paper on the same subject. The basic idea is that many chronic degenerative diseases result from a discordance between our biology and the foods we eat because human evolution has not kept pace with changes in diet and other aspects of lifestyle (Eaton and Konner, 1985). While paleoanthropologists are unlikely to solve obesity and other healthcare problems any time soon, it is not difficult to understand public interest in the question, “what did early hominins eat?”
But what explains interest in hominin diets among academics? Brillat-Savarin (1825) wrote nearly two centuries ago, “dis-moi ce que tu manges, je te dirai qui tu es.” “Tell me what you eat, I will tell you who you are.” Diet defines us, not just as individuals, but as a species. And changing diets are surely an important key to understanding hominin evolution. Our food choices dictate our fundamental interactions with the environment and, as Fleagle (1999) has written, diet is “the single most important parameter underlying the behavioral and ecological differences among living primates.” Reconstructions of diet are therefore crucial to paleoanthropology; they hold the potential to yield substantial insights into early hominin ecology and evolution.
We will likely never get at all the details of hominin paleonutrition; there are intractable, intrinsic limits to our knowledge (Ungar, 2007b). Because individuals have different food preferences and access to different resources in different places and at different times, there was no single menu for a fossil hominin species, no “nutritional contents” labels for us to decipher. But there are aspects of diet we can infer. And I am convinced that we can do better than Begun's (2004) pessimistic lamentation that, “while it is frustrating to be unable to describe a fossil hominoid's behavior with sufficient detail to be able to distinguish it from an edentate, that is probably as good as it gets.”
Most research on early hominin diets has focused on fossil teeth. Teeth are among the most commonly preserved elements in fossil assemblages, and they are the only durable parts of the digestive system that contact food. Teeth are especially important from an ecological perspective, as they are positioned at an interface fundamental to the interaction between vertebrates and their environments. As the early naturalist Georges Cuvier is reputed to have said, “montrez-moi vos dents et je vous dirai qui vous êtes.” “Show me your teeth and I will tell you who you are.”
Teeth offer vertebrate paleontologists many different lines of evidence for reconstructing diet, both those related to species-level adaptations, and traces of actual use by individuals in life (see Ungar, 2010 for review). The size, shape, internal architecture, and microstructure of a tooth reflect natural selection for efficient acquisition and processing of foods with specific physical properties. Stable isotopes in a tooth, and use-related damage and wear tell us about chemical and structural properties of foods eaten by the very animal whose remains are being studied. In this article, I will review approaches that paleoanthropologists take to reconstructing diets of Plio-Pleistocene hominins using tooth size, shape, structure, and wear (see Lee-Thorp and Sponheimer, 2006 for a review of tooth chemistry).
Academic interest in Plio-Pleistocene hominin diets has grown hand-in-hand with new theoretical perspectives and methods of analysis over the past couple of decades. Today we address questions I could not have imagined asking in my first published review of the subject (Ungar, 1992). Were early hominins dietary specialists or generalists? How versatile were their palates? Do adaptations reflect selection for favored foods or less preferred fallback items? These questions are made possible in large part by the development of increasingly sophisticated ways of teasing more and more information from teeth. Each line of evidence can provide important insights as we colligate the genetic and nongenetic clues.
Researchers have long considered tooth size an important line of evidence of early hominin diets. The lengths and breadths of teeth are easy to measure, species differ from one another in these measurements, and logic dictates that these differences should relate to variation in food acquisition and processing.
Early work on incisor–molar size ratios
Robinson (1954) noted in his seminal paper on South African hominin dentitions that “Australopithecinae” in general, and Paranthropus robustus in particular, had small anterior teeth compared with their cheek teeth. He argued that Paranthropus, with its disparity in size between the front and back teeth, flattened premolar and molar occlusal surfaces, and thick mandibular corpora, was well suited to crushing and grinding vegetation such as shoots and leaves, berries, tough wild fruits, and grit-laden roots and bulbs. Australopithecus on the other hand, with its relatively larger front teeth and smaller premolars and molars, at least compared with Paranthropus, “would probably have had a more nearly omnivorous diet, which may have included a fair proportion of flesh.” Robinson also recognized a fundamental contrast in cheek tooth size between these hominins and the “Euhomininae” (now Homo), observing that “Telanthropus” and “Sinanthropus” had smaller molars than even Australopithecus. This fundamental contrast was echoed by Leakey et al. (1964), when these authors included small molars relative to Australopithecus (including Paranthropus) as part of their revised diagnosis of the genus Homo in their initial description of H. habilis.
Subsequent researchers attempted to better understand the functional implications of these differences in tooth size using living primates as analogs. Groves and Napier (1968) for example, associated relatively large incisors compared with molars in chimpanzees with the consumption of fruit with “a hard exterior but soft interior, conditions which demand a strong incisor bite but relatively little chewing.” They noted that, in contrast, gorillas have a much lower incisor–molar size index, presumably reflecting “a diet low in fruit but high in coarse vegetable matter.” These results were consistent with Robinson's original interpretations, as Australopithecus had a similar incisor–molar index to gorillas, and Paranthropus had even smaller incisors and larger molars. Interestingly, Homo habilis had an incisor–molar index in the range of chimpanzees and orangutans.
Soon after, Jolly (1970) developed his seed-eater hypothesis based on analogy with the gelada, a large-bodied, terrestrial savanna dweller with precision grip, relatively small incisors, and large molars. He suggested that Paranthropus dental proportions related to an adaptation to consume small, tough seeds. Smaller incisors were attributed to a somatic-budget effect, wherein selection favors the smallest size consistent with function, and the Oppenheimer effect, wherein stress limits alveolus development and room for anterior teeth (see Jungers, 1978). Paranthropus was said to have achieved a stable adaptive plateau (his Phase I), whereas Australopithecus and Homo had increased incisor–molar size ratios, and evolved into a Phase II omnivorous dietary adaptation.
Work on tooth size that has followed has considered the front and back teeth separately, as ratios of the two cannot tell us whether selection is acting on the incisors, molars, or both. Hylander's (1975) study on anthropoid incisors provides a case in point. He found that residuals from a regression line of incisor row width plotted against body mass reflect diet such that frugivorous cercopithecines have relatively larger front teeth than do folivorous colobines (see also Goldstein et al., 1978). Further, frugivorous squirrel monkeys, lar gibbons, and chimpanzees have relatively larger front teeth than do more folivorous howlers, siamangs, and gorillas respectively.
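Hylander's residual approach is straightforward to reproduce. The sketch below uses made-up body masses and incisor row widths (none of these numbers come from the studies cited); it fits a least-squares line in log–log space and reports each species' deviation from the common trend, with positive residuals indicating relatively large incisors for body size and negative residuals relatively small ones.

```python
import numpy as np

# Hypothetical (illustrative, not measured) body masses (kg) and
# incisor row widths (mm) for six primate "species".
body_mass = np.array([0.8, 5.5, 6.5, 9.0, 45.0, 130.0])
incisor_width = np.array([8.0, 22.0, 18.0, 30.0, 42.0, 55.0])

logm = np.log10(body_mass)
logw = np.log10(incisor_width)

# Least-squares regression of log(incisor width) on log(body mass).
slope, intercept = np.polyfit(logm, logw, 1)

# Residuals from the regression line: positive values suggest relatively
# large incisors (as in frugivores that husk large fruits), negative
# values relatively small incisors (as in many folivores).
residuals = logw - (slope * logm + intercept)
for mass, r in zip(body_mass, residuals):
    print(f"{mass:6.1f} kg -> residual {r:+.3f}")
```

With an intercept in the model, the residuals sum to zero by construction; it is their sign and magnitude per species that carry the dietary signal.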
The take-home message is that higher primates feeding on large-husked fruits likely benefit from larger incisors to process them, whereas those that feed on smaller objects (e.g., berries, leaves) do not. Larger incisors have also been argued to increase functional life given wear associated with increased use, though this inference has recently been called into question (see McCollum, 2007). And there are caveats, such as the importance of comparing closely related species; platyrrhines as a group have smaller incisors than do catarrhines, independent of diet (Eaglen, 1984; see also Ungar, 1996). Further, the relationship between diet and incisor size in strepsirhines, which have specialized tooth combs, is not nearly as clear (Eaglen, 1986). Nevertheless, because relative incisor size does track diet to an extent when comparing closely related higher primates, there is likely value in examining this attribute in early hominins.
Relative to body size, it appears that neither Australopithecus nor Paranthropus has especially large incisors when compared with living apes (Ungar and Grine, 1991). In fact, A. anamensis, A. afarensis, and A. africanus all fall on the regression line for a plot of I1 breadth against body weight (in log–log space) comparing extant catarrhines (see Fig. 1). This line connects gibbons and gorillas, suggesting that Australopithecus spp. were intermediate between more folivorous colobines and more frugivorous cercopithecoids and hominoids in their propensity for anterior tooth use (Teaford and Ungar, 2000). Paranthropus robustus evidently had smaller incisors and presumably less incisor use in ingestion, with a residual value similar to those of many living colobines and modern humans. Interestingly, Homo habilis and H. rudolfensis may have had larger incisors, more like those of chimpanzees and orangutans, whereas H. erectus incisors were apparently similar in size to those of Australopithecus (Teaford et al., 2002).
These results at first glance might suggest an increase in incisor use in the earliest members of our genus, followed by a decrease, perhaps related to increasing tool use for food acquisition and processing, in African Homo erectus. On the other hand, interpretations of incisor allometry in hominins should probably be approached with caution, if not skepticism, given extremely small samples and uncertain body weight estimates (Ungar et al., 2006a). We can count the number of I1s reported for some hominin taxa with the fingers of one hand, which presents a formidable challenge given typical size variation of about ±20% for hominoids (Plavcan, 1990). An even greater challenge is the paucity of associated craniodental and postcranial remains upon which to base body weight estimates and the huge confidence intervals of those estimates published for most early hominin taxa (Smith, 1996).
Studies of molar allometry are not in much better shape. In fact, not only are samples small for some taxa and body weight estimates questionable, but the relationship between relative cheek-tooth size and diet is less clear. Nevertheless, molar size continues to be considered by researchers an important proxy for adaptive zone (e.g., Wood and Collard, 1999; Leakey et al., 2001), and many have suggested a trend over time, with an increase in the australopiths followed by a decrease in the genus Homo (e.g., see Brace et al., 1991; McHenry and Coffing, 2000; Teaford and Ungar, 2000; Fig. 1). Enlarged, “megadont” cheek teeth in Australopithecus and especially Paranthropus have been said to provide more surface area to process larger quantities of low-quality, mechanically challenging foods. While there has been debate regarding the species of Homo that began the trend toward smaller cheek tooth size (compare Wood and Collard, 1999; McHenry and Coffing, 2000), dental reduction in the genus is often related to relaxation of selective pressures given increasing extraoral food processing with tools and by cooking. It has also been argued to reflect an adaptation to avoid dental crowding in a smaller jaw or to slow the rate of food processing (e.g., see Brace et al., 1991; Calcagno and Gibson, 1991; Lucas et al., 2009).
Some have suggested that Australopithecus and Paranthropus had similar diets, and that tooth size differences actually relate to differences in body size. Pilbeam and Gould (1974) argued, based on positive allometry of cheek tooth surface area across mammals, that the australopiths were “scaled variants of the ‘same’ animal.” And the fact that larger-cheek-toothed species also likely had larger chewing muscles suggested to some comparable stresses across occlusal surfaces, which could be interpreted as consuming “more of the same” (Walker, 1981; Demes and Creel, 1988). Indeed, some even advocated a single-species hypothesis for the South African australopiths, with variation in tooth size resulting from larger individuals having disproportionately larger masticatory apparatus (Wolpoff, 1974).
That said, cheek tooth areas do not scale with positive allometry for species with similar diets; they scale isometrically with body size (Kay, 1975; Corruccini and Henderson, 1978; Goldstein et al., 1978). In fact, while cheek teeth scale isometrically, smaller animals actually process more food in a given period of time because they chew more quickly, which makes sense given their typically higher metabolic rates (Fortelius, 1988). Discussions of scaling-related metabolic equivalence of australopith cheek teeth are made moot, however, by the fact that there is little evidence for consistent differences in body size between these hominins (e.g., Jungers, 1988; McHenry, 1988).
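The isometric expectation is easy to state quantitatively: tooth area has dimensions of length squared and body mass of length cubed, so if cheek-tooth area tracks body mass isometrically it should scale as mass^(2/3), and a log–log regression should recover an exponent near 0.667. A minimal sketch with synthetic (not empirical) data illustrates the test:

```python
import numpy as np

# Synthetic species data constructed to scale isometrically: if
# cheek-tooth area ~ mass^(2/3), the log-log slope should be 2/3.
mass = np.array([5.0, 10.0, 30.0, 60.0, 120.0])  # kg, illustrative
area = 20.0 * mass ** (2.0 / 3.0)                # mm^2, exact isometry

# Fit log(area) against log(mass); the slope is the scaling exponent.
slope, _ = np.polyfit(np.log10(mass), np.log10(area), 1)
print(f"fitted scaling exponent: {slope:.3f}")   # 0.667 = isometry
```

Positive allometry of the kind Pilbeam and Gould invoked would appear as a fitted exponent clearly above 2/3; the later comparative results cited above recovered exponents consistent with isometry once diet was controlled.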
Cheek tooth size differences among early hominins are therefore likely related to food differences. They might be related to food quantity and quality as mentioned above, or as Lucas (2004) has argued, to external properties of foods, such as the size, shape, or abrasiveness of ingested particles. A diet dominated by smaller items, such as grass seeds or berries, or thinner ones, such as leaves, should select for larger teeth to increase the probability of fracture. Likewise, abrasive foods, such as plant parts rich in phytoliths or adherent grit, should select for larger teeth to increase surface area for wear. Thus, more megadont hominins may have been adapted to consume more small, thin, and/or abrasive foods.
We may be able to gain further insights by considering relationships between occlusal surface area and diet in living primates. Because leaves, especially mature ones, are tough, thin sheets requiring thorough chewing, they should select for larger teeth; and folivores do have longer molars than closely-related frugivores for many primate groups (Kay, 1977; Vinyard and Hanna, 2005). This does not hold for Old World monkeys however; colobines have smaller molars than cercopithecines (Kay, 1977). It is also unclear why for many primate species, males have relatively smaller cheek teeth than females (Harvey et al., 1978). Because relative molar size does not track broad diet category the same way in all groups of extant primates, we are on shaky ground using this attribute to retrodict food preferences for fossil taxa (see Kay and Cartmill, 1977), at least until we can explain differences in patterns between extant higher-level taxa.
So what might explain the unexpected results for cercopithecoids? Perhaps the tendency toward smaller teeth in colobines relates to the need to avoid dental crowding given shorter faces. Of course, there remains then the question of whether tooth size drives jaw length or whether it is the other way around (Brace et al., 1991). Perhaps there is a modular developmental link between jaw length and molar size (see Vinyard and Hanna, 2005). Indeed, Workman et al. (2002) found for mice that many quantitative trait loci affecting tooth size and jaw shape are the same. This may have implications for fossil hominins, for which associations between jaw length and tooth size have been noted for some time (Sofaer, 1973). And as McCollum and Sharpe (2001) have argued, there is likely a developmental link between tooth form and skull form in early hominins. Until we have a better understanding of relationships between cheek tooth size and function in living primates however, we are probably best off looking to other lines of evidence, such as occlusal morphology, to infer the diets of fossil hominins.
Relationships between occlusal morphology and diet have been considered on two distinct levels, corresponding to Butler's (1983) internal and external environment, and reflected in Evans and Sanson's (2006) geometry of occlusion and geometry of function. On one level, the occlusal surface has been thought of as a guide for chewing, as its shape limits masticatory movements when opposing teeth enter and exit centric occlusion (Simpson, 1933; Crompton and Sita-Lumsden, 1970). Kay and Hiiemae (1974) noted, for example, that insectivorous primates have reciprocally concave blades well suited to shearing tough insect chitin between the leading edges of crown crests, whereas frugivores have molars with cusp tips more in line with the occlusal plane for crushing and grinding three-dimensional fruit flesh and seeds (see also Rosenberger and Kinzey, 1976; Seligsohn and Szalay, 1978). Today some dental biomechanists speak of “autocclusal mechanisms” (Mellett, 1985) rather than teeth guiding jaw movements per se; tall cusps that fit into deep basins are likely to prevent much transverse movement between teeth in occlusion (grinding). Another example would be opposing blades with acute angles of bevel, or rake angles, which push foods away from opposing blades as they shear, resulting in forces pressing those blades together in “autoalignment” as they approach one another (Evans and Sanson, 2006).
Researchers have recently begun to think of dental functional morphology on another level though, considering teeth not merely as guides for chewing, but as complex surfaces that interact directly with foods to accomplish fracture (Lucas, 2004). Mammals chew to facilitate the assimilation of stored energy in foods. They rupture protective casings such as plant cell walls and insect exoskeletons to release nutrients, and fragment items to increase surface area for digestive enzymes to act on. As Aristotle noted more than two millennia ago in De Partibus Animalium, “teeth have one invariable office, namely the reduction of food.” Workers who take this approach to dental functional morphology prefer not to think of teeth in terms of shearing, crushing, and grinding but, rather, as tools for generating and propagating cracks through food items (Lucas and Teaford, 1994).
Because different foods have different fracture properties, they require different tools to break them efficiently, and food science offers valuable predictions for tooth form–function relationships (see Lucas, 2004; Lucas et al., 2008a; Ungar and Lucas, 2010 for review). Some foods are protected by stress-limited defenses. They tend to be strong and stiff, demanding substantial force per unit area to initiate a crack in them. These items are also often brittle, requiring little work to spread a crack once it starts. Examples include many nuts and other hard (in the vernacular sense) objects. Such foods should select for blunt, dome-shaped cusps to concentrate force on a small area, but at the same time protect the tooth itself from fracture. These cusps should oppose concave surfaces formed by basins or staggered cusps to prevent energy loss due to spread or movement of food. Other foods are protected by displacement-limited defenses and are tough or ductile. Initiating a crack in such items may be less of a problem than propagating it. Examples include many leaves, insect exoskeletons, and raw vertebrate flesh. Such foods are best divided using offset opposing blades or crests. These serve as wedges to create tension at the tips of advancing cracks; and there is little risk of cracking sharp cusp tips on tough, pliant foods that spread and produce compressive stress on the teeth. Yet other foods are intermediate in their fracture properties, and many are composites with individual parts varying in their mechanical defenses. These can select for teeth intermediate in form, those with two or more distinct functional elements, or differentiation of tooth types along the dental row.
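These two classes of mechanical defense can be summarized with the fragmentation indices used in food materials science (after Lucas, 2004; Lucas et al., 2008a): a stress-limited index, √(ER), that is high for hard, stiff foods, and a displacement-limited index, √(R/E), that is high for tough, compliant ones, where E is Young's modulus and R is toughness. The sketch below uses illustrative, order-of-magnitude property values, not measured ones:

```python
import math

# Fragmentation indices from food materials science (after Lucas, 2004).
# E: Young's modulus (Pa); R: toughness (J/m^2).
def stress_limited_index(E, R):
    """High for hard, stiff, brittle foods (e.g., nut shells)."""
    return math.sqrt(E * R)

def displacement_limited_index(E, R):
    """High for tough, compliant foods (e.g., mature leaves)."""
    return math.sqrt(R / E)

# Illustrative order-of-magnitude values (not measured data):
foods = {
    "nut shell":   (5e9, 1000.0),  # stiff, moderately tough
    "mature leaf": (2e7, 3000.0),  # compliant but tough
}
for name, (E, R) in foods.items():
    print(f"{name}: stress idx = {stress_limited_index(E, R):.3g}, "
          f"displacement idx = {displacement_limited_index(E, R):.3g}")
```

On these numbers the nut shell dominates the stress-limited index and the leaf the displacement-limited index, matching the prediction that the former favors blunt cusps and the latter offset blades.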
Studies of dental morphology confirm these predictions. Folivorous and insectivorous primates generally have longer shearing crests relative to tooth length than do frugivores, and hard-object feeders tend to have very short crests and more bulbous cusps (Rosenberger and Kinzey, 1976; Kay and Covert, 1984; Strait, 1993; Meldrum and Kay, 1997). That said, comparisons should be limited to closely related species as noted above for incisor allometry studies as, for example, cercopithecoids have longer crests than platyrrhines independent of diet (Kay and Ungar, 1997; Ungar, 2005).
The fact that teeth change shape as they wear must also be considered. The gold standard for characterizing primate molar crest or blade lengths has been Kay's (1977) shearing quotient (SQ) method, which involves measurement of mesiodistal crests running up and over cusps. Most SQ studies have been limited to unworn or nearly unworn specimens because the cusp tips used to define crest endpoints are obliterated with wear. This can be a problem for taxa represented mostly by worn teeth. The entire published assemblage of South African australopiths for example, includes fewer than 10 unworn M2s (the tooth most often used in SQ studies). It also gives us only part of the picture. Surely natural selection does not stop when wear starts; teeth should evolve to be worn in a manner that keeps them functionally efficient throughout the reproductive years (see King et al., 2005).
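Kay's SQ is, at heart, another residual measure. A hedged sketch with invented numbers (the baseline sample, crest sums, and the resulting regression are all hypothetical) shows the computation:

```python
import numpy as np

# Sketch of Kay's shearing quotient (SQ), with made-up numbers.
# Baseline: ln(summed crest length) regressed on ln(M2 length) for a
# hypothetical comparative sample.
m2_length = np.array([5.0, 6.2, 7.1, 8.4, 9.9])    # mm, illustrative
crest_sum = np.array([6.1, 7.4, 8.8, 10.2, 12.1])  # mm, illustrative

slope, intercept = np.polyfit(np.log(m2_length), np.log(crest_sum), 1)

def shearing_quotient(tooth_len, observed_shear):
    """SQ = 100 * (observed - expected) / expected, where the expected
    crest length comes from the baseline regression."""
    expected = np.exp(intercept + slope * np.log(tooth_len))
    return 100.0 * (observed_shear - expected) / expected

# A hypothetical folivore with long crests for its tooth length gets a
# positive SQ; a bunodont hard-object feeder would get a negative one.
print(f"SQ = {shearing_quotient(7.0, 10.5):+.1f}")
```

The method's dependence on crest endpoints defined by cusp tips is what limits it to unworn teeth; once the tips are gone, `observed_shear` can no longer be measured this way.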
Dental topographic analysis was developed as a landmark-free approach to characterizing functionally relevant aspects of occlusal morphology in variably worn teeth (Ungar and Williamson, 2000; Ungar and M'Kirera, 2003; Dennis et al., 2004). We use a laser scanner to generate point clouds representing the occlusal table of a tooth and geographic information systems (GIS) software to interpolate and analyze the surface. Cusps are represented by mountains, fissures by valleys, etc. and the tools available for measuring those surfaces are used to generate data on average surface slope, angularity, relief, and other attributes. Teeth are scored for gross wear using Scott's (1979) technique, and taxa are compared by wear stage using a factorial ANOVA model. Results to date have been consistent with expectations. While primate teeth get flatter when worn (Dennis et al., 2004), folivorous monkeys and apes have, at given stages of gross wear, steeper sloping surfaces with greater occlusal relief than do closely related frugivores (M'Kirera and Ungar, 2003; Ungar and M'Kirera, 2003; Ungar and Bunn, 2008; Bunn and Ungar, 2009).
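The core measurements of dental topographic analysis can be illustrated on a synthetic digital elevation model. Real studies work from laser-scanned point clouds interpolated in GIS software, but two of the quantities, average surface slope and relief, reduce to simple operations on a gridded surface; the single Gaussian "cusp" below is purely illustrative.

```python
import numpy as np

# Synthetic occlusal surface: a single Gaussian "cusp" on a square grid.
n = 50
xs = np.linspace(-1.0, 1.0, n)            # mm, illustrative extent
dx = xs[1] - xs[0]                        # grid spacing
x, y = np.meshgrid(xs, xs)
z = 1.5 * np.exp(-3.0 * (x**2 + y**2))    # elevations in mm

# Average surface slope: mean arctangent of the gradient magnitude.
dzdy, dzdx = np.gradient(z, dx)
slope_deg = np.degrees(np.arctan(np.hypot(dzdx, dzdy)))
mean_slope = slope_deg.mean()

# Relief: ratio of 3D surface area to its 2D planar footprint, here
# taken as the mean local area-magnification factor (>= 1, with 1
# meaning a completely flat, i.e. heavily worn, surface).
relief = np.sqrt(1.0 + dzdx**2 + dzdy**2).mean()

print(f"mean slope: {mean_slope:.1f} deg, relief: {relief:.3f}")
```

Because these measures need no landmarks, they can be computed identically on unworn and worn crowns, which is precisely what lets taxa be compared wear stage by wear stage.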
Few studies have considered variation in occlusal functional morphology among early hominins. While these species can be distinguished on the basis of their occlusal form (see Bailey and Wood, 2007 and references therein), it is difficult to measure their SQs even on unworn molars, given bulbous cusps that lack discrete shearing crests. There are nevertheless apparent differences among hominins. Wallace (1975) suggested, for example, that among South African australopiths, Paranthropus had lower cheek tooth cusps than Australopithecus, and according to Grine (1981), the latter have molars with steeper wear facets than do the former. Grine opined based on this that the “gracile” australopiths evinced more shearing, with occlusal surfaces sliding past one another nearly parallel to their planes of contact, whereas the “robust” form had a shallower approach into and out of centric occlusion for more grinding, which includes both perpendicular and parallel components to occlusal contact. Dental topographic analysis of these taxa confirms differences in molar morphology, with P. robustus having lower average occlusal surface slope values than A. africanus at any given stage of wear (Ungar, 2007a; Fig. 2). This is consistent with the idea that Paranthropus was adapted to consume more hard-brittle foods than was Australopithecus.
Differences in occlusal form between the australopiths and early Homo species have also been suggested. While early members of our genus did not have long, sharp crests as seen in extant folivorous primates, their cheek teeth do appear less bunodont than those of their australopith predecessors and contemporaries (Teaford et al., 2002; Wood and Strait, 2004). And this too has been confirmed by dental topographic analysis. Results of comparisons of dental topography of M2s of a combined sample of H. habilis, H. rudolfensis, and H. erectus with Australopithecus afarensis and extant chimpanzees and gorillas indicate that early Homo as a group falls between the two African apes in average surface slope and topographic relief for all but the most worn specimens, and relief is significantly less than that of gorillas. On the other hand, the average occlusal slope for early Homo is significantly greater than that for A. afarensis. These results suggest, with caveats for small sample sizes and combined species samples, that early Homo species, while not specialists by any stretch of the imagination, would have been capable of shearing tough foods more efficiently than could A. afarensis. The australopith molars on the other hand, would probably have been more capable of resisting fracture under heavy stress loads.
Teeth and food are in a perpetual “death match” (Ungar, 2008) as nature selects for resistance to fracture in both; teeth must break foods without themselves being broken. With few exceptions (notably for many primates, fleshy fruits), it does a plant or animal little good to have its parts eaten. Structural defenses not only protect foods, but can make teeth vulnerable to breakage, especially given heavy stresses associated with crushing hard items, or the risk of fatigue failure with repetitive loading of tough ones. Teeth can be protected in the same ways that food items are defended, by hardening (sensu Lucas et al., 2000) to prevent cracks from starting or toughening to stop cracks from spreading (see Teaford, 2007b for review). Researchers have long considered the role of enamel thickness in resisting tooth fracture, and recent studies have focused on relationships between degree of mineralization and hardness, and between microstructure (especially the layout of prisms) and toughness.
Thick tooth enamel has traditionally been considered an important signpost along our evolutionary path marking the transition from an ape-like, arboreal lifestyle to a terrestrial, human-like one (e.g., Robinson, 1956). Simons and Pilbeam (1972) suggested that this trait is an adaptation to lengthen the use life of the dentition given rapid wear with the consumption of tough, grit-laden foods on the ground (see also Macho and Spears, 1999). While this seems intuitive, there is, as Kay (1981) has noted, no tendency among living apes or Old World monkeys for more terrestrial species to have thicker enamel than do more arboreal ones. And the fact that the largely arboreal orangutan not only has thicker enamel but also tends to have less worn teeth than the more terrestrial gorilla, suggests that this need not be a compensatory mechanism for increased wear (Dean et al., 1992).
Thickened enamel in early hominins could instead be an adaptation for structural reinforcement to prevent fracture (Kay, 1981). Indeed, primates that consume hard objects tend to have thicker molar enamel than do closely related species that eat softer foods (Dumont, 1995). This makes sense because tooth crowns are bilayered with stiff enamel overlaying more compliant dentin. Hard-object feeders should have thicker enamel because heavy loads would make thin coats more prone to flex and cause tensile stresses leading to cracks in teeth (Lucas et al., 2008b). An increase in the relative contribution of enamel to a crown should therefore result in less deformation for a given load and less risk of fracture (Popowics et al., 2001).
Despite methodological differences in measurement, there has been recognition by many that not only do Plio-Pleistocene hominins have thicker molar enamel than extant African apes, but that fossil species vary from one another. Wallace (1975) wrote that compared with australopiths, early Homo specimens from Swartkrans and those he examined from Koobi Fora seemingly have thinner enamel (see Fig. 3). Beynon and Wood's (1986) study of naturally fractured specimens support this observation, confirming that Paranthropus boisei has thicker molar enamel than early Homo, especially H. erectus. Further, Grine and Martin (1988) found for a small sample of sectioned teeth that both P. robustus and P. boisei had thicker enamel than Australopithecus africanus, suggesting to them functional implications for countering increased wear and/or occlusal loads (but see Olejniczak et al., 2008a).
But the relationship between enamel thickness and diet is not a simple one. It is more likely that the distribution of enamel across a crown, not just its average thickness, is important to understanding function (e.g., Greaves, 1973; Rensberger, 1973; Macho and Thackeray, 1992; Schwartz, 2000). Dental sculpting provides a case-in-point. Many mammals have teeth with what Fortelius (1985) has referred to as “secondary morphology;” they actually require wear to function properly (see Ungar, 2010 for discussion). Because enamel is harder than dentin, wear can cause a sharp edge to form where the two tissue types meet on the occlusal surface (Shimizu, 2002; Ungar and M'Kirera, 2003; Kono, 2004). Thinner enamel can mean quicker dentin exposure to facilitate fracture of tough foods. Thus, the morphology of the enamel–dentin junction (EDJ) can literally guide wear to sculpt occlusal surfaces.
New studies using X-ray microcomputed tomography (micro-CT) to map the distribution of enamel across tooth crowns show great promise to help us better understand both dental form and function (e.g., Kono, 2004; Gantt et al., 2006; Kono and Suwa, 2008; Olejniczak et al., 2008b; Smith and Tafforeau, 2008). For example, australopiths appear to have especially thick enamel over the cusp tips and relatively short dentin horns, whereas humans have thicker enamel at cusp bases (Olejniczak et al., 2008a). Olejniczak et al. (2008a) suggest that the australopith pattern reflects the consumption of abrasive small objects, or relates to the prevention of cracks at the EDJ in large-object feeders (see Lucas et al., 2008b). They also relate the pattern seen in H. sapiens to distribution of masticatory forces (see Macho and Spears, 1999).
Enamel mineral content
Researchers are beginning to consider the fracture properties of enamel at a finer scale, using nanoindentation to study hardness and examining histology to map prism layout and resistance to crack propagation. Nanoindentation studies indicate that hardness and stiffness vary across primate enamel crowns (Cuy et al., 2002; Braly et al., 2007; Lee et al., 2010). This research has shown that indentation hardness and Young's modulus can decrease by more than 50% from the occlusal surface to the EDJ. These attributes also vary between buccal and lingual sides of a tooth and between teeth (Darnell et al., 2010). According to Braly et al. (2007), changes in enamel properties at this scale relate to local chemistry (levels of mineralization, organic matter, and water content) and varying volume fractions of inorganic crystals and organic matrix. While there has yet to be a study of enamel hardness and mineral content in fossil hominins, researchers are beginning to document variation in indentation hardness and Young's modulus within teeth of different primate species (Darnell et al., 2010). Such studies may in the future provide new insights into relationships between dental form and function.
While enamel histology has little effect on hardness or stiffness at nanoscales (Braly et al., 2007), the internal structure of this tissue can be very effective at limiting the spread of cracks (see Maas and Dumont, 1999 for review). Most mammals bundle thousands of long, thin crystallites, each about 40 nm in diameter, into cylindrical or semicylindrical prisms or rods like bunches of dried spaghetti strands. These prisms, each between about 2–10 μm in diameter, are packed together and run from the EDJ to the surface of the tooth (see Fig. 3). In primates, individual prisms tend to run parallel to one another as they approach the surface; we call this radial enamel. But the angle at which they hit the outer surface of the crown varies which, according to Shimizu et al. (2005), should affect both wear resistance and stiffness of the tissue and therefore have implications for surface wear and strength.
It is also important to consider the paths of prisms as they make their way from the EDJ to the crown surface. Depending on the directions and magnitudes of forces acting during mastication, radial enamel is susceptible to fracture with cleavage along planes of weakness between adjacent rows. Decussation, wherein layers of prisms wiggle about along their long axes, can mitigate this problem; changing prism directions require cracks to change direction as they spread, increasing the work required for fracture. Further, adjacent layers can be interwoven with one another at angles up to about 90° to form Hunter–Schreger (H–S) bands, which can be stacked horizontally, vertically, or in a zigzag fashion. These H–S bands stop cracks by absorbing the energy required for their propagation. Enamel can be further strengthened when different enamel types (e.g., radial, horizontal, vertical, and zigzag) are themselves layered to form complex schmelzmuster patterns (see Maas and Dumont, 1999).
Researchers have recently begun to consider the implications of enamel microstructure for diet in early hominins. Macho and Shimizu (2009) for example, compared the angle at which enamel prisms approach the occlusal surface in Paranthropus robustus and Australopithecus africanus, and argued that the “robust” australopith cheek teeth are stiffer and adapted to more vertical loads, whereas those of the “gracile” australopiths are more wear resistant and adapted to coping with more laterally directed loads. Macho and Shimizu (2010) also considered prism orientation in A. anamensis, which suggested to them that this hominin was adapted to tough foods requiring a significant shear component and a wide range of loading directions.
Functional studies of hominin enamel decussation have thus far been limited. Beynon and Wood (1986) suggested in their study that Paranthropus had little enamel decussation, but that early Homo had more. Grine and Martin (1988) on the other hand, observed well-developed H–S bands in Paranthropus enamel, and argued that prism decussation in these hominins functioned in stopping cracks. Macho et al. (2005) and Macho and Shimizu (2010) also documented decussation in Australopithecus anamensis, especially close to the EDJ where, according to Lucas et al. (2008), cracks are prone to start. But studies to date have been limited to exposed surfaces on naturally broken specimens or a few sectioned teeth and, as Macho et al. (2005) have noted, it is the complex three-dimensional arrangement of prisms across a tooth that is likely to be most informative. In this light, new technologies, such as phase contrast X-ray synchrotron microtomography, which allows whole-tooth imaging with submicron resolution (Tafforeau and Smith, 2008), may help us realize the potential of functional studies of enamel microstructure.
DIRECT EVIDENCE OF TOOTH USE: ANTEMORTEM DAMAGE AND WEAR
While tooth size, shape, and structure likely reflect dietary adaptations, they tell us more about potential than what a specific animal in the past ate on a daily basis. As Kinzey (1978) asked rhetorically, “is it possible that features of the dentition are selected for on bases other than the ‘primary specialization’?” Perhaps, as he speculated, “when a food item is critical for survival, even though not part of the primary specialization, it will influence the selection of dental features.” As Robinson and Wilson (1998) later noted, “some resources are intrinsically easy to use and are widely preferred, while others require specialized phenotypic traits on the part of the consumer. This asymmetry allows optimally foraging consumers to evolve phenotypic specializations on nonpreferred resources without greatly compromising their ability to use preferred resources.” They referred to the notion that animals may actually avoid the foods to which they are adapted when more favored ones are available as Liem's Paradox.
Chimpanzees and gorillas provide a useful example. These apes differ markedly in dental allometry, morphology, and microstructure, to say nothing of diet-related differences in their jaws, chewing muscles, and guts. Despite this, as Wrangham (2007) has noted, the two “have closely similar diets. Both choose ripe fruits when they are available, being almost equally frugivorous.” Gorillas exhibit Liem's Paradox; notwithstanding clear adaptations for tough, fibrous foods, they prefer soft, sugary fruits given the choice (Remis, 2002). The African apes differ in diet mostly at times of resource scarcity when, as Wrangham (2007) notes, “gorillas can survive by eating fibrous foods for 100% of their feeding time. Chimpanzees never do so.” Perhaps then, unique aspects of tooth shape and structure in early hominins likewise reflect nonpreferred foods (Ungar, 2004; Constantino et al., 2009). But how would we know? We can look to nongenetic clues, evidence of actual use of teeth, such as antemortem chipping and dental microwear (Ungar, 2009).
Antemortem damage: Tooth chipping
Robinson (1954) claimed that Paranthropus robustus had more chipped teeth than Australopithecus africanus. He attributed the difference to consumption of grit-laden roots and bulbs by the “robust” australopiths. According to Tobias (1967) however, the two species had dental chips “similar in size, character, and number per jaw”; and all were rather large, suggesting to him bouts of bone chewing. Wallace (1975) later also examined antemortem dental chipping in these hominins and, like Tobias, found no notable differences between A. africanus and P. robustus. He reasoned that this indicates similar amounts of grit in the diets of “gracile” and “robust” australopiths.
Antemortem tooth chipping in early hominins has recently been revisited by paleoanthropologists. Grine et al. (2010) reported similar chip rates between premolars and molars of A. africanus, suggesting to them that chewing stresses did not differ substantially between the two tooth types. Constantino et al. (2010) have taken tooth chipping evidence a step further, with in vitro experiments with human molars and Vickers indentations suggesting that chip dimensions reflect chewing stresses. If these in vitro tests are reasonable proxies for in vivo conditions, this new tool may well allow reconstructions of maximum bite forces by early hominins and lead to insights into very rarely consumed items. Consistent criteria for antemortem chip identification and reporting, and more comparisons with other taxa will hopefully also lead to further insights.
One of the best approaches to reconstructing diets of early hominins is dental microwear analysis. Scratches and pits form on a tooth's surface as the result of its use and so provide direct evidence for diet. Microwear features are like footprints in the sand, traces left by real actions of specific individuals at a moment in time. They are a direct connection to animals that lived in the past. Most studies of dental microwear have focused on incisors and molars.
Incisor microwear tells us something about front tooth use. It has been examined in a broad range of mammals, from kangaroos to moose (Young, 1986; Young et al., 1990). Work on living primates suggests that habitual incisor use in food preparation results in relatively high densities of microwear features on front teeth (e.g., Kelley, 1990; Ungar, 1990, 1994). Some have also argued that microwear feature types can be associated with specific ingestive behaviors, such as horizontally oriented scratches and stripping of foods laterally across the mouth (Walker, 1976; Ryan, 1981, 1994). It may even be that incisor microwear patterning reflects substrate use, with predominant dietary abrasives encountered on or near the ground (exogenous grit) causing different wear feature incidences or sizes than those encountered high in the trees (phytoliths) (Walker, 1976; Ungar, 1994).
A few studies have documented incisor microwear in Plio-Pleistocene hominins. Ryan and Johanson (1989) suggested that Australopithecus afarensis had a mosaic of gorilla-like and baboon-like features reflecting the use of these teeth to strip gritty plant parts such as roots and rhizomes. And Ungar and Grine (1991) noted that A. africanus had a higher average incisor microwear feature density than P. robustus, suggesting the “gracile” australopiths ate more abrasive foods requiring anterior tooth use in ingestion than did the “robust” species.
Molar microwear tells us something about the fracture mechanics of foods, especially as they relate to movements of opposing tooth surfaces relative to one another during mastication. This falls under the domain of what engineers call tribology (Schulz et al., 2010). When stress-limited (e.g., hard-brittle) foods are crushed between molars, they tend to form pits, whereas displacement-limited (e.g., tough-pliable) items sheared between blades or crests are apt to cause scratches as opposing teeth slide past one another, dragging food-borne abrasives between them (see Teaford, 1988, 2007a; Ungar et al., 2007b for review). Smaller pits can also result from the consumption of tough foods as prisms are “plucked” from their surrounding matrix due to friction (Teaford and Runestad, 1992). Thus, microwear feature size can also help inform us of the material properties of foods. And relationships between microwear and diet seem to hold well not just for primates, but across Mammalia, from antelopes to zebras, bats to moles, pigs to sheep, cats to dogs, marsupials to primates, and other taxa (see Ungar, 2010).
There have been several studies of molar microwear in early hominins. Grine (1986) found Australopithecus africanus occlusal surfaces to be dominated by microwear scratches, while those of Paranthropus robustus had more pits (see Fig. 4). He attributed this difference to the consumption of small, hard objects by the “robust” australopiths and softer foods, such as fruits and immature leaves, by the “gracile” hominins. Subsequent work on A. afarensis and A. anamensis showed that these “gracile” australopiths also had microwear surfaces dominated by striations, with remarkable consistency over time and between inferred habitat types (Grine et al., 2006a,b). This suggested that specimens of Australopithecus spp. examined more likely consumed tough foods than hard-brittle ones in the days or weeks before death. Grine et al. (2006a,b) concluded that these species were probably not hard-object specialists despite anatomical traits suggesting potential to consume such items. Early Homo specimens examined have somewhat higher average pit percentages than Australopithecus spp., though their surfaces were still not dominated by pits (Ungar et al., 2006b). This suggested to Ungar et al. (2006b) a preference for less fracture resistant foods, though small pits in H. erectus and Homo from Swartkrans Member 1 hint that these hominins may have consumed more hard or tough items prior to death than did H. habilis and Homo from Sterkfontein Member 5C.
New approaches to microwear analysis
As research into the etiology of microscopic use-wear on teeth has progressed, it has become clear that this approach to hominin diet reconstruction has not yet reached its full potential (see Rose and Ungar, 1998; Teaford, 2007a; Ungar et al., 2007b). We can get well beyond “Species A ate mostly hard-brittle foods whereas Species B ate mostly soft-tough ones.” Indeed, many years of study of wild primates have shown rather subtle variation in microwear patterns between individuals living in different microhabitats and those sampled in different seasons (e.g., Teaford and Robinson, 1989; Teaford and Glander, 1996; Nystrom et al., 2004; Teaford et al., 2006).
The most common technique for quantifying high-resolution microwear patterns has involved imaging by scanning electron microscopy followed by user identification and measurement of individual scratches and pits (Ungar et al., 1991, 1995). This approach has provided remarkable results, especially given intra- and interobserver error in microwear feature measurement (Grine et al., 2002). Still, a newer technique combining white-light confocal microscopy with scale-sensitive fractal analysis offers more automated surface texture characterization, and is becoming increasingly popular (Ungar et al., 2003, 2007b; Scott et al., 2006). Species with more pitting tend to have more complex surfaces (defined by change in roughness with scale of observation), whereas those with more aligned striations tend to have higher surface anisotropy. Other texture attributes, such as fill volume and the scale of maximal complexity, can be related to feature sizes (see Scott et al., 2006 for details). Folivorous primates tend to have low complexity and high anisotropy values, those that consume hard-brittle foods more often have high complexity and low anisotropy, and frugivores often have intermediate or varied texture attributes (Ungar et al., 2007b).
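The intuition behind the complexity measure can be illustrated with a toy computation: triangulate a height map at successively coarser sampling scales, compute the relative area (triangulated 3D area over projected area) at each scale, and take the steepest decline of relative area with scale on log-log axes. This is only a schematic of the area-scale approach; the grid sizes, synthetic surfaces, and the ×1000 scaling below are illustrative assumptions, not the published algorithm of Scott et al. (2006).

```python
import numpy as np

def relative_area(z, step, dx=1.0):
    """Triangulated 3D area / projected area of a height map at a sampling step."""
    zs = z[::step, ::step]
    d = step * dx                                   # lateral spacing at this scale
    z00, z01 = zs[:-1, :-1], zs[:-1, 1:]
    z10, z11 = zs[1:, :-1], zs[1:, 1:]
    # each grid cell splits into two triangles; a triangle with legs
    # (d, 0, a) and (0, d, b) has area (d/2) * sqrt(a^2 + b^2 + d^2)
    t1 = 0.5 * d * np.sqrt((z01 - z00)**2 + (z10 - z00)**2 + d**2)
    t2 = 0.5 * d * np.sqrt((z01 - z11)**2 + (z10 - z11)**2 + d**2)
    projected = (zs.shape[0] - 1) * (zs.shape[1] - 1) * d**2
    return (t1.sum() + t2.sum()) / projected

def complexity(z, steps=(1, 2, 4, 8), dx=1.0):
    """Asfc-like complexity: steepest log-log decline of relative area with scale."""
    ra = [relative_area(z, s, dx) for s in steps]
    slopes = [(np.log(ra[i + 1]) - np.log(ra[i])) /
              (np.log(steps[i + 1]) - np.log(steps[i])) for i in range(len(ra) - 1)]
    return -1000.0 * min(slopes)                    # most negative slope, sign-flipped

rng = np.random.default_rng(0)
rough = rng.random((65, 65))                        # heavily pitted, noisy surface
smooth = np.tile(np.linspace(0.0, 1.0, 65), (65, 1))  # featureless inclined plane
print(complexity(rough) > complexity(smooth))       # rougher surface is more complex
```

A flat or uniformly tilted surface gains no area as sampling gets finer, so its relative area is constant across scales and its complexity is near zero, while a pitted surface keeps revealing new relief at finer scales and scores high.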
While there is general congruence between feature-based and texture-based microwear analyses, the latter promises even more resolving power because it is free from observer measurement error. Reduced noise in measurements gives us more confidence to consider subtleties in our data, such as the distribution of variation within samples. And the ability to document variation within a sample may be among the most valuable contributions microwear has to offer.
Dental microwear reflects diets only days or weeks before death because individual features on these scales are often no more than a few microns deep, and so turn over quickly with further wear (Teaford and Oyen, 1989). This fact, referred to as “the last supper” phenomenon (Grine, 1986), is both a liability and an asset. A record of the last few meals of an individual may tell us little about dietary adaptations of a species; but with a sufficient sample of specimens, we might begin to look at variation in diet within that species, especially if seasonal biases in death and preservation do not overwhelm our signals. We can predict that if hard-brittle foods are preferred, most individuals in a sample should have complex microwear surface textures. If on the other hand, such foods are fallback items taken when softer, weaker foods are unavailable, only a small percentage of specimens in a sample should have high texture complexity. Thus, microwear could in principle help us infer aspects of foraging strategy. When combined with dental functional morphology, microwear might even offer clues as to whether individuals of a past species commonly or rarely consumed the foods to which they were adapted, and so offer insights into the nature of selection (Ungar, 2009).
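The logic of that prediction can be stated compactly. In the hypothetical sketch below, the complexity threshold and the majority cutoff are invented for illustration; a real analysis would calibrate both against extant taxa with known diets.

```python
def hard_food_signal(complexities, high=300.0, majority=0.5):
    """Interpret a sample of microwear complexity values (all cutoffs hypothetical).

    If most individuals show high texture complexity, hard-brittle foods were
    likely preferred; if only a minority do, such foods look more like fallback
    items taken when softer resources were unavailable.
    """
    frac = sum(c > high for c in complexities) / len(complexities)
    if frac >= majority:
        return "preferred"
    if frac > 0.0:
        return "fallback"
    return "absent"

print(hard_food_signal([420, 510, 380, 450, 290]))  # mostly high -> "preferred"
print(hard_food_signal([120, 90, 430, 150, 110]))   # one high   -> "fallback"
```

The fraction of high-complexity individuals, rather than the sample mean, carries the foraging-strategy signal; seasonal biases in death and preservation would distort exactly this fraction, which is why the caveat above matters.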
Dental microwear textures of Australopithecus africanus and Paranthropus robustus provide an example (Scott et al., 2005). The “gracile” australopith has on average lower microwear texture complexity (considered a proxy for food hardness) and more anisotropy (considered a proxy for food toughness) than does the “robust” species. Still, these species both have some individuals with lower anisotropy and lower complexity values, perhaps implying consumption of less mechanically challenging foods. But the distributions for A. africanus and P. robustus differ, with data points for these species spread into the upper range of anisotropy and complexity respectively. This suggested to Scott et al. (2005) overlap in the fracture properties of foods, but differences between the australopiths in critical dietary resources consumed periodically during the year. Indeed, the P. robustus distribution of complexity values is similar to that for a sample of Lophocebus albigena, a species reported to fall back on hard bark and seeds when softer, preferred foods are unavailable (Lambert et al., 2004). Paranthropus robustus may have likewise consumed softer foods much of the time, with craniodental adaptations reflecting less preferred but still critical hard, brittle foods. If so, P. robustus may present an example of Liem's Paradox in the hominin lineage (Ungar, 2007a, 2009).
Eastern African australopiths have also been examined by microwear texture analysis, with some results as expected but others surprising. Australopithecus anamensis, A. afarensis, and Paranthropus boisei all have lower average texture complexity values than extant hard-object feeding primates, but at the same time, have lower average anisotropy values than living folivores (Ungar et al., 2008, 2010, 2011). Complexity results for A. anamensis and A. afarensis were not unexpected given feature-based studies suggesting limited dietary variability and a lack of hard-object feeding. On the other hand, the similarity of complexity results for P. boisei with those for A. anamensis and A. afarensis is surprising, both because the craniodental toolkits of these hominins are so different, and because microwear textures of P. robustus are so different. While it is possible that microwear reflects a hard-object-fallback adaptation for the eastern African australopiths (especially P. boisei), it is hard to imagine that we are completely missing a hard-object complexity signal due to sampling error given data for more than 30 individuals spread over such substantial time and space.
It is also interesting to note that eastern African australopiths lack values at the upper end of both the complexity and anisotropy ranges. Extant taxa sampled to date have at least some high values in one of these two attributes given diets including foods with stress-limited or displacement-limited defenses. The atypical pattern seen in these hominins may relate in part to tooth shape. Ungar et al. (2010, 2011) noted that the high anisotropy values seen in living tough-food eaters are likely related to constraints on tooth-tooth movement in folivores or grass-eaters that have high cusps and substantial occlusal relief. While low complexity is not consistent with hard-object feeding, low anisotropy need not be incompatible with a tough-food diet if items are ground between flat or bulbous cusps. It can also be noted that microwear texture analysis hints at possible dietary variation among eastern African australopiths (e.g., a broader range of scale of maximal complexity values in P. boisei), though not to the degree separating them from their South African congeners.
Microwear textures have also been examined in Homo habilis and H. rudolfensis (Ungar and Scott, 2009; Ungar et al., 2011). Early Homo as a group shows moderate average complexity and relatively low anisotropy. Homo erectus has lower average textural fill volume and scale of maximal complexity than H. habilis, consistent with relatively more small pits and perhaps some tough food consumption (see above). Homo erectus also has a remarkably high dispersion of complexity, matched among the hominins only by Paranthropus robustus. This is consistent with a fairly broad diet, at least in terms of food fracture properties.
Microwear and enamel mineral content and structure
Before we leave the discussion of microwear, a few words about the effects of enamel mineral content and microstructure on the hardness and toughness of enamel are in order, particularly given discussions above. Maas (1991, 1994) found in abrasion experiments with tooth enamel and silicon carbide grits that while abrasive particle size was the primary determinant of microwear feature size, variation in crystallite orientation could affect striation breadths under shearing loads. Further, Macho and Shimizu (2009) speculated that the orientation of prisms might affect microwear patterning, which could complicate interpretations of results comparing taxa with different enamel microstructures. The extent of this effect is probably limited however; while prism orientation is very important for toughening enamel against crack propagation, it has little effect on hardness (Braly et al., 2007). On the other hand, mineral content can have an important effect on enamel, and indentation studies indicate that hardness can decrease by more than 50% from the occlusal surface to the EDJ (Cuy et al., 2002).
While degree of mineralization and prism orientation could in principle affect patterns of microscopic wear, there is little evidence for an effect of sufficient magnitude to affect interpretation, at least not on the scales at which microwear is typically examined (Teaford, 1988). If there were, we would expect to see consistent variation within primate species related to gross wear because prism orientation and mineral content change from the EDJ to the occlusal surface. This does not seem to be the case. More to the point, studies of a very broad range of extant mammals with different enamel properties consistently confirm expected relationships between microwear patterns and diet. Folivorous primates have higher anisotropy and lower complexity averages than frugivores, especially hard-object feeders (Ungar et al., 2007b). Grazing bovids have higher anisotropy and lower complexity values than browsers, especially those that include fruit in their diet (Ungar et al., 2007a). Grazing kangaroos also have higher anisotropy and lower complexity values than browsing wallabies (Prideaux et al., 2009). And among carnivores, durophagous hyenas have higher complexity and lower anisotropy averages than felids, especially tough-flesh-eating cheetahs (Schubert et al., 2010). The proof is in the pudding.
Even a cursory review of recent work on the dental evidence for diets of early hominins makes clear that this is a vibrant, growing research domain. As our understanding improves, we can begin to ask questions like, “was selection driven by preferred foods or fallback resources?” rather than simply, “did they eat hard objects?” But we clearly have a long way to go before we reach the limits of our potential knowledge. Kant's (1783) observation that, “every answer given on principle of experience begets a fresh question, which likewise requires its answer” applies well to reconstructions of early hominin diets. So does Rescher's (1999) remark that, “in a complex world, the natural dynamics of rational inquiry will inevitably exhibit a tropism toward increasing complexity.” The more we learn, the more it seems there is to know. Despite these Sisyphean frustrations however, new methods and theoretical approaches are bringing progress and improving our understandings of the diets of Plio-Pleistocene hominins.
Plio-Pleistocene hominin diets
A convenient way to summarize our knowledge to date is to consider the Plio-Pleistocene hominins genus-by-genus, especially because the number of analyses and types of data available differ between species.
If dental allometry data hold, A. anamensis, A. afarensis, and A. africanus all have incisors intermediate in relative size between those of chimpanzees, orangutans, and many cercopithecines on the one hand, and more folivorous colobines and humans on the other. Their incisors are about the same relative size as those of gorillas and gibbons, taxa that do not focus on foods requiring extensive incisal preparation (Teaford and Ungar, 2000). And incisor microwear on A. africanus is consistent with moderate levels of incisor use (Ungar and Grine, 1991). Still, microwear of the anterior teeth of A. afarensis (Ryan and Johanson, 1989) and extreme gross wear on the front teeth of A. anamensis (Ward et al., 2010) suggest possible variation in degree and types of ingestive behaviors among the “gracile” australopiths.
The molar teeth of “gracile” australopiths are relatively large and bunodont, though their functional morphology may suggest a mosaic of adaptations for fracturing stress-limited and displacement-limited foods. These hominins lack the distinct shearing crests found on the cheek teeth of many extant primates; topographic relief of A. afarensis molars for example, is less than that of living chimpanzees (Ungar, 2004). Further, Australopithecus spp. have thickened molar enamel compared with living African apes, particularly under the cusp tips (Olejniczak et al., 2008a). On the other hand, their molar enamel may not be so thick overall as previously thought judging from microtomographic study of A. africanus (Olejniczak et al., 2008a). And prism orientation in A. anamensis and A. africanus may be consistent with an ability to resist fracture given laterally-directed shearing movements (Macho and Shimizu, 2009, 2010). Dental microwear of Australopithecus spp. suggests that these hominins did not regularly consume hard-brittle foods, though they may have consumed tough items at least occasionally (Scott et al., 2005; Ungar et al., 2010). Microwear also suggests variation within the genus, with the eastern African taxa having lower average microwear complexity values than the South African species.
The “robust” australopiths P. boisei and P. robustus have relatively small front teeth (Kay, 1985; Ungar and Grine, 1991). If incisor allometry data are accurate, the P. robustus I1 is within the range of extant colobines relative to body weight; among the living apes, only humans have smaller front teeth. And the low density of microwear features on P. robustus incisors is consistent with their limited use in ingestion compared with those of extant anthropoids that employ these teeth regularly to husk large fruits (Ungar and Grine, 1991; Ungar, 1998).
Paranthropus spp. have large, bulbous molar teeth that could have served well in hard-object feeding. Their megadontia quotients are extremely high and, at least among South African early hominins, Paranthropus has the least sloping occlusal surfaces at given stages of gross wear and shallowest wear facets (Grine, 1981; Teaford et al., 2002; Ungar, 2007a). The “robust” australopiths also have thick molar enamel, and may have well-developed H–S bands, with prisms oriented to resist vertical loads (Grine and Martin, 1988; Olejniczak et al., 2008a; Macho and Shimizu, 2009). These features suggest an ability to withstand heavy stresses or fatigue failure related to repetitive loading.
One question that has arisen is whether these dietary specializations resulted in stenotopy, with a subsistence base limited to relatively few foods, or whether they actually reflect eurytopy, with a broadened subsistence base including both extremely hard foods and less mechanically challenging items (Wood and Strait, 2004). The microwear evidence is mixed in this regard. Paranthropus robustus does show substantial variance in microwear texture complexity, including some specimens suggesting consumption of foods with stress-limited defenses (Scott et al., 2005). On the other hand, no P. boisei specimens examined to date have the high-complexity microwear texture expected of a hard-object feeder (Ungar et al., 2008, 2011). Rather, P. boisei microwear suggests a softer- or tough-food diet. Differences between the South African and eastern African “robust” australopiths may indicate differences in their feeding strategies, consistent with variation in isotope signatures reported by van der Merwe et al. (2008) and Cerling et al. (2011).
Early Homo incisors are all over the map. If allometry data for early Homo species are correct, both H. habilis and H. rudolfensis have large front teeth, about the same size relative to body weight as living chimpanzees and orangutans, species known to use their front teeth for husking large fruits and other ingestive behaviors. The African H. erectus value is intermediate between those of H. habilis and H. rudolfensis on the one hand, and modern H. sapiens on the other. Relative incisor size in African H. erectus falls on the regression line, along with living gibbons and gorillas, which tend to be more limited in anterior tooth use during ingestion (Teaford et al., 2002). This might suggest the consumption of more foods requiring incisal preparation in H. habilis and H. rudolfensis compared with their australopith predecessors and contemporaries, but a decrease in incisor use in African H. erectus, either because of a change in diet, or perhaps increased extraoral food processing (Ungar et al., 2006a). Dental microwear analysis of early Homo incisors could be helpful in resolving this.
Early Homo cheek teeth also vary. While Homo habilis and H. rudolfensis have higher megadontia quotients than living great apes (albeit lower than their Paranthropus contemporaries), H. erectus cheek teeth are as small or smaller (Teaford et al., 2002). While small sample sizes preclude separate comparisons of occlusal topography between early Homo species, the group as a whole has more sloping molar surfaces than its australopith predecessors, with an average value between those of chimpanzees and gorillas, at least until late in the wear sequence. And while little work has been done on enamel strength in the earliest members of our genus, early Homo (especially H. erectus) seems to have thinner enamel than the australopiths (Beynon and Wood, 1986). Results suggest a relaxation of or change in selective pressures consistent with less hard-object feeding but perhaps a diet including more tough foods, especially for H. erectus. Molar microwear texture data are in accord with this, as neither H. habilis nor H. erectus specimens examined to date show extremely high surface complexity. Further, H. erectus shows a greater range of complexity values, and texture attributes suggesting smaller pits on average compared with H. habilis. This is consistent with a greater range of diets, including perhaps more tough foods in H. erectus.
Other datasets and future directions
The studies described here show clearly that fossil teeth hold the potential to offer important insights into early hominin diets, though this review is hardly complete. Consider the functional implications of differences in the sizes of cheek teeth along the row (Lucas et al., 1986), and variation in differential wear of these teeth (e.g., Teaford, 1983; Deter, 2009). And many new approaches, such as occlusal fingerprint analysis (Ulhaas et al., 2007; Fiorenza et al., 2011) are emerging, with new technologies leading the way. Nanoindentation and phase contrast X-ray synchrotron microtomography, for example, promise to allow researchers to better resolve relationships between enamel microstructure and mechanical properties.
There also remains much to be done. There have been very few functional studies of early hominin permanent premolars and canines, and even fewer of deciduous teeth. And there are other important hominins to consider, especially the newly discovered Mio-Pliocene forms, such as Ardipithecus. Research is progressing to reconstruct aspects of their diets too (e.g., Suwa et al., 2009).
Further, tooth size, shape, structure, and wear should be considered in light of other lines of evidence beyond the scope of this review (see Walker, 2007; Ungar and Sponheimer, 2011). Stable isotopes and craniomandibular biomechanics of the fossils themselves reflect the chemistry of foods eaten, and the stresses generated and dissipated during mastication respectively. Contextual evidence, such as stone tools, and the remains of plants and animals found in hominin-bearing deposits can also provide important clues. And studies of ecological analogs and development of energetics models can help us put together the most complete picture possible. Finally, new approaches on the horizon, such as studies of parasite relationships, microbial ecology, and comparative genomics hold the promise of even better understandings.
The author thanks Bob Sussman for his kind invitation to submit this article to the Yearbook of Physical Anthropology. He also thanks Mark Teaford, Gary Schwartz, Fred Grine, Matt Sponheimer, Paul Constantino, and Dave Strait for comments and discussions related to this review, and Fred Grine and Gary Schwartz for providing the images in Figure 3.