Now listen to me, all of you. You are all condemned men. We keep you alive to serve this ship. So row well, and live.
– Quintus Arrius (Ben-Hur, 1959)
- Chronic activation of the stress axis caused by long-term uncontrollable and unpredictable factors in the environment has been regarded as causing maladaptive and/or pathological effects, both by those studying animals in the laboratory and by those studying animals in nature. While pathology may apply to the former, I argue that it does not apply to the latter.
- Our thinking on the role of chronic stress in animals in nature has been heavily influenced by biomedical research, but much less so by the ecological and evolutionary context within which animals actually function. I argue that when such stressors occur (e.g. periods of high predation risk, food limitation, prolonged severe weather, social conflict, etc.), although the animal may be chronically stressed, its responses are adaptive and continue to promote fitness.
- Chronic stressors in nature can be subdivided into whether they are reactive (direct physiological challenges threatening homeostasis and not requiring cognitive processing – for example, food limitation) or anticipatory (perceived to be threatening and requiring cognitive processing – for example, high predation risk). For anticipatory stressors, their impact on the animal should be judged not by their absolute duration (they may be acute), but by the duration of their physiological consequences.
- The anticipatory stressor of persistent high predation risk does not elicit chronic stress in all prey classes. Cyclic snowshoe hare and arctic ground squirrel populations exhibit evidence of chronic stress when predator numbers are high, but cyclic vole and noncyclic elk populations do not. I suggest that chronic stress has evolved to benefit the fitness of the former and not the latter, with the key factors being lifespan and life history. I propose that chronic stress evolves in a species only if it is adaptive.
An organism is chronically stressed when there is long-term activation of the hypothalamic–pituitary–adrenal (HPA) axis caused by unpredictable or uncontrollable stimuli (stressors) in its environment. There are two views with respect to the role of chronic stress in natural animal populations. The first is that it does not occur because, if it did, it would be fatal, and thus an understanding of chronic stress has contributed little to our understanding of how animals in nature cope with stressors (Wingfield et al. 1998; Wingfield & Ramenofsky 2011). Given the impact of chronic stress under this view, there is limited possibility for adaptation and thus for evolution to act on the mechanisms of coping. A related, but less extreme, position under this view is that if chronic stress does occur, it is associated with an increased risk of stress-related disease and pathology (Romero, Dickens & Cyr 2009). The second view is that chronic stress is more common in the natural world, having both individual and population consequences (e.g. Boonstra et al. 1998a; Romero & Wikelski 2001; Kitaysky et al. 2010; Clinchy et al. 2011), but that animals have evolved mechanisms to cope with it and thus maintain a fitness greater than zero. My primary purpose in this review is to examine the role of chronic stress in natural populations and how the thinking of the biomedical community has misled that of physiological ecologists. My secondary purpose is to question the assumption that a severe, persistent stressor is perceived similarly by all species. As an example, I examine how the response to long-term high predation risk produces chronic stress in some prey species but not others and argue that this is contingent on the evolutionary history of the interactions between the predator and prey and their respective life histories.
The study of stress in vertebrates is proceeding along two fronts – a major one represented by the biomedical science community and a minor one represented by those working in the natural world. The readers of this article will, by and large, be in the latter camp. There is some cross-pollination between these two fronts, such as in the realm of redefining homeostasis to allostasis [stability through constancy (Fink 2009) versus stability through change (McEwen 1998)] and trying to apply it to natural populations (McEwen & Wingfield 2003; Romero, Dickens & Cyr 2009) as well as field endocrinologists drawing on the elegant work and summaries of some biomedical findings through the insightful articles on the stress response (e.g. Sapolsky, Romero & Munck 2000). However, the direction of pollination is primarily one way. A recent four-volume encyclopaedia of what is known about stress and how it functions had 531 review articles in it (Fink 2007). This compendium provides a compelling overview of the depth of knowledge we have acquired in this area and its critical importance to the human experience. However, only c. 1% have any reference to animals from nature. Human surrogates – laboratory rodent models and primates – provide the basis for much of that knowledge. For the most part, the objective of the former front is to understand ourselves, our health, and our society in our present environment and what happens when things go amiss. The objective of the second is to understand how animals in the natural world cope with reality, and if and how the stress axis shapes and responds to that reality. Animals in nature have evolved a wide diversity of solutions to the problems of existence and some of these may be more analogous to the human condition than to that of animals in the laboratory. There is a strong admonition to integrate biomedical findings into research on wild organisms (Orchinik 1998; Romero 2004). 
While I agree that knowledge of this wealth of information is a necessary starting point, discernment and validation are required when assessing its applicability to wild animals, recognizing that much of it may not apply.
A key difference between how stress is studied in the natural world versus in the laboratory is the ecological and evolutionary underpinnings of the former and their general lack in the latter (but this is changing: i.e. see Hales & Barker 2001; Nesse 2005; Bateson, Brilot & Nettle 2011; Gluckman, Hanson & Low 2011). The adaptations animals have in the natural world are solutions to ecological problems. These problems – the agents of selection together with their interactions [i.e. the abiotic environment, scramble competition for resources such as food, the social environment (intra- and interspecific contest competition), and natural predators] – operate through differential fitness amongst phenotypes (MacColl 2011). In contrast, the adaptations that laboratory models have to their environment are a result of our needs – we are the agents of selection. Thus, laboratory rodents are produced by intense selection and are highly inbred. There is likely to have been a pronounced reduction in the amount of phenotypic and genetic variation in laboratory rodents because of truncation selection, as some variants fail to thrive and others are selected against. They are raised in artificial environments with unlimited food, no predators, no disease and benign environmental conditions. The stressors we subject them to are artificial and bear little or no relationship to those experienced by their wild counterparts (Koolhaas et al. 1997). They are often less aggressive, more sedentary, obese, glucose intolerant, more social, more responsive or less responsive to stressors than their wild counterparts, and on a trajectory to premature death in ways that are strain specific (Miller et al. 2002; Künzl et al. 2003; Wolff 2003; Harper 2008; Martin et al. 2010). Even as experimental models for human pathology, they may be inappropriate.
For example, in 2008, cardiovascular disease (heart disease and stroke) was the second most important source of mortality in Canada (Statistics Canada) after cancer. Psychological stress plays a key role in cardiovascular disease (Steptoe & Marmot 2002), but psychogenic chronic stressors in laboratory rat models have failed to reproduce similar effects (Nalivaiko 2011). Laboratory rats are used as models for human stroke (e.g. Arvidsson et al. 2002) under the assumption that we gain insight into human stroke repair mechanisms by studying those repair mechanisms in rats. It is highly unlikely that any rodent suffering a stroke in the wild would survive more than a few days. Hence, natural selection would have nothing to select on to produce such adaptive mechanisms. Shorter et al. (2012; see references therein) highlighted two major genetic issues with laboratory models: first, the unknown consequences of complete homozygosity of alleles that do not exist in such states in nature; and second, the genome-wide combinations of alleles (whether homo- or heterozygous) that do not exist in nature. The alternatives to homozygosity are genetic knockouts in laboratory models, and although these have benefits in permitting increased understanding of the relationships between neuroendocrine mechanisms and behaviour, they also have marked limitations (Smale, Heideman & French 2005). Given these caveats, caution is warranted when extrapolating from laboratory animal models to animals in nature in how the latter respond to and cope with stressors. I do not mean to imply that the entire baby has to be thrown out with the bath water – understanding neural and hormonal pathways and neurotransmitters acting during acute and chronic stressors may be relevant to understanding animals in nature and the in-depth biomedical findings provide a starting point. But it is not a given that these findings apply. 
Thus, although we seek generality for our mechanisms, reality may be quite different and we cannot assume it without suitable assessment and experimentation. Finally, even when we bring wild animals into the laboratory, their behaviour and physiology may diverge markedly from that of their wild counterparts and thus caution is needed here as well in extrapolating from their response in the laboratory to that in nature (Wolff 2003; Calisi & Bentley 2009).
A brief history of stress physiology and chronic stress
The history of how we have come to think of the role of stress in animals is both defining and constraining, so I review it briefly here. Claude Bernard (1859) was the first to discuss how cells and tissues were protected from stress by mechanisms that maintain a steady state, the milieu intérieur, around biologically determined set points independent of changes in the external environment. Walter Cannon (1932) coined two key terms for how organisms deal with threats. First, homeostasis was used for the coordinated physiological processes that maintain a steady state in an animal. This was basically Bernard's idea. One can think of this as an attempt to maintain ‘stability through constancy’ (Fink 2009). Second, the term ‘fight or flight’ described the immediate, nonspecific response to a threat. Cannon's work focused on the sympathetic end of the acute stress response (i.e. the sympathetic nervous system causes the adrenal medulla to release catecholamines – epinephrine and norepinephrine – into general circulation) (McCarty 2007). Finally, Hans Selye, from 1936 onwards, was key in popularizing the all-pervasive, nonspecific nature of stress, particularly as it relates to the human condition. Because of this and of his prodigious output, he has been referred to as the ‘father of stress’. He focused on the second part of the body's stress response – the glucocorticoid secretion end (i.e. the cascade of hormones from the hypothalamus to the pituitary to the adrenal cortex to release glucocorticoids, Fig. 1). It is this second part of the stress response that I will focus on in this review.
Selye was the first to recognize that homeostasis could not by itself ensure stability in response to a stressor. A key factor was its duration. If the stressor was short-term (acute), the response was short lived. If it was long-term (chronic), the body went through a series of nonspecific responses called the General Adaptation Syndrome (GAS). There were three stages in the syndrome that the body went through progressively, contingent on the duration of the stressor: alarm, resistance and exhaustion. Exhaustion was the end point of being chronically stressed, when the body could no longer cope. It was characterized by overproduction of potentially beneficial glucocorticoids that had secondary, detrimental effects called the ‘diseases of adaptation’ (e.g. ulcers, high blood pressure, immunosuppression, thymic involution, etc.). However, Selye was wrong on two counts. First, the stress response is not nonspecific, but rather geared to the nature of the stressor based on prior experience and on innate species-specific responses (Pacak & Palkovits 2001; Ulrich-Lai & Herman 2009; Koolhaas et al. 2011). Second, the three stages of GAS have not withstood the tests of time and GAS has been rejected (Fink 2009). However, from the standpoint of the ecology of stress in natural populations, the one idea of Selye's that continues to influence the thinking of those studying natural populations is that chronic stress results in pathology. I submit that terms such as dysregulation, dysfunction, deleterious and pathology are common in the ecology of stress literature as a direct consequence of Selye's influence. Such terms are clearly applicable to laboratory animal models (e.g. Bartolomucci et al. 2005; Sterlemann et al. 2008; Tamashiro et al. 2011) and to humans (e.g. Schmidt, Sterlemann & Müller 2008; Evans & Schamberg 2009; Brosschot 2010; Juster 2011).
Indeed, much of the focus of stress research is an attempt to mimic in laboratory models the negative effects of chronic stress found in humans. However, when used with reference to animals in nature, these terms imply that natural chronic stressors result in maladaptation (i.e. cause harm). I suggest that, on the contrary, long-term chronic stress is part of the normal experience of many animals in nature and that, although there are fitness costs, their responses are adaptive – contingent on the circumstances – and part of their long-term evolutionary adaptations to particular ecological and habitat pressures. I dissect chronic stress into three related issues below.
The nature of the stress response: reacting versus anticipating
A stressor is any environmental stimulus that either directly threatens an organism's survival and homeostasis or is perceived to do so. Although such stressors are usually unpredictable, in some life histories, they may also be predictable (e.g. semelparous life histories in salmonids and small dasyurid marsupials or the intense conflict associated with reproduction in some species). The latter are essentially internally imposed, life-history-driven stressors and will not be dealt with here; the former are externally imposed by the environment and are the subject of this review. There are two classes of stressors that result in a stress response, but these operate via different neural pathways and may ultimately result in different subsequent brain programming (Fig. 1). Both classes can act either acutely or chronically and may not operate independently. I use the terminology of Herman et al. (2003) as it is directly linked to brain organization. Both classes ultimately impinge on the paraventricular nucleus (PVN) of the hypothalamus, which is responsible for initiating the sequential release of a cascade of hormones culminating in the release of glucocorticoids from the adrenal cortex. The first – called reactive stressors – are direct physiological challenges that cause an immediate threat to homeostasis. These are either physical signals transmitted to the hypothalamus via neural input from peripheral sensory fibres via the spinal cord and hindbrain regions, or blood-borne signals. The former involve signals such as cold, respiratory distress, somatic or visceral pain, and blood loss, and the latter involve cytokine and chemokine factors that signal infection or inflammation (Sapolsky, Romero & Munck 2000; Herman et al. 2003) and signals involving energy balance, including glucocorticoids, insulin and leptin (Dallman & Bhatnagar 2001). These trigger reflexive responses that do not require higher-order cognitive processing (Herman 2010).
Thus, long-term declines in food abundance in natural populations (e.g. Kitaysky et al. 2010) should result in food-related stress through this signalling pathway.
The second class – called anticipatory stressors – are environmental cues perceived by the organism to be either threatening to survival or to homeostatic disruption and have a psychological basis. These cues result in a preparative glucocorticoid response and can occur in the absence of an existing physiological insult and in advance of a direct threat. These cues require the organism to interpret the significance of environmental sensory stimuli. This interpretation may be based on innate responses that are species specific (e.g. instinctive fear by prey of an aerial predator shape) or on memories of prior experience (e.g. a social interaction with a dominant conspecific, social cues from a conspecific that has had prior negative experience or a failed predator attack). These cues are transmitted to the PVN indirectly after initial evaluation and processing by limbic brain structures (i.e. the hippocampus, prefrontal cortex, amygdala, septum and midline thalamus) (Herman et al. 2003). The hippocampus and prefrontal cortex act in a largely inhibitory fashion to regulate HPA axis secretion (Herman et al. 2005), whereas the amygdala acts in a largely stimulatory fashion to activate HPA axis secretion (Fanselow & Ponnusamy 2007; Rodrigues, LeDoux & Sapolsky 2009).
Duration of impact of the stressor: acute versus chronic effects
Stressors can be classified as either acute or chronic, with the former being of short duration (minutes to hours) and the latter long (days to weeks). However, although their impact on the animal's stress physiology may likewise be of short or long duration, certain acute stressors may have a long-term negative impact. It is clear that learning can occur after a single exposure to a novel stressor in fish (Ferrari, Capitania-Kwok & Chivers 2006), reptiles (Thaker et al. 2010), birds (Love et al. 2003) and mammals (Wiedenmayer 2004; Armario, Escorihuela & Nadal 2008). However, key with respect to the subsequent state of the animal is whether it is now chronically alert to the possibility of a repetition of the stressor or returns to normal. We do not know whether animals in nature respond this way, but we can conceive that a single predator attack in some species, or a severe attack by a dominant conspecific, may produce an animal that chronically anticipates more of the same and is thus chronically stressed (Clinchy et al. 2011). This may be analogous to a single stressful event precipitating post-traumatic stress disorder in humans (Yehuda 2006). From an evolutionary perspective, it would pay an animal both to remember the single acute event and to change its behaviour and stress physiology if the fitness costs of not doing so were high (Wiedenmayer 2004). Thus, a stressor should be classified as acute or chronic not on the basis of its own duration, but rather on the duration of its consequences for the physiology of the animal.
Consequences of chronic stressors: adaptation or pathology?
I contend that the impacts of chronic stressors as laid out in the biomedical literature – pathology – and as accepted by the majority in the ecological literature, do not occur in nature, although chronic stress does. First, it is useful to discuss what is meant by the terms used to describe the consequences of chronic stress. Dysregulation and dysfunction are used interchangeably and, when applied to the HPA axis, mean that it is not regulated optimally nor functioning as it should. Negative feedback between the adrenals and the brain fails [clinical manifestations of chronic stress include marked dexamethasone resistance (dexamethasone, a synthetic glucocorticoid, should cause blood glucocorticoid levels to fall rapidly but, in resistant, chronically stressed animals, fails to do so)]. Deleterious responses are detrimental to the animal (reproductive suppression, immunosuppression, energy mobilization and gluconeogenesis, growth suppression and suppression of digestion – see Fig. 1, also see Table 1 in Romero, Dickens & Cyr 2009). The net consequence is that an animal's fitness (survival and reproduction) is reduced. Finally, pathology is usually related to disease, but disease here includes not simply diseases caused by viruses, bacteria and parasites, but a host of systemic problems such as cardiovascular diseases, hypertension, ulcers, insulin resistance, compromised immune function, etc. that ultimately increase the probability of death. It may also include psychological aspects such as increased anxiety and fear.
Acute, short-term stressors are much more common than chronic stressors in the natural history of most species. Wingfield and colleagues (e.g. Wingfield et al. 1998; Wingfield & Ramenofsky 2011; Wingfield, Kelly & Angelier 2011) have developed a comprehensive framework of how animals respond to such acute stressors (called labile perturbation factors) and their short-term coping strategies (called emergency life-history stages). The allostasis model (McEwen & Wingfield 2003) and the reactive scope model (Romero, Dickens & Cyr 2009) are attempts to provide a theoretical link between acute and chronic stressors and I will not discuss them here. However, both assume that when an animal experiences long-term chronic stress (long-term allostatic overload versus homeostatic overload, respectively), pathology results.
The critical test of whether chronic stress results in pathology is whether the stress response and its downstream measures fail to sustain life both reproductively and in terms of survival. I see no evidence for this in any system. The most convincing evidence that severe chronic stress does not result in such pathology comes from the extreme situation snowshoe hares (Lepus americanus) face during the decline phase of the 10-year cycle (discussed later). Virtually all hares die during the 2- to 3-year decline because they are killed by their predators. Nevertheless, their HPA axis continues to function well (slight dexamethasone resistance and increased adrenal response to an ACTH challenge) throughout this time, notwithstanding all the negative downstream effects that are present. Thus, the hares are not inherently nonviable, nor is reproduction totally inhibited (although it is reduced), and there is no evidence for increased disease. The optimal solution for hares during the decline is to trade off reproduction for survival, but not to such an extent that survival itself is at risk. Thus, hares have evolved to make the best of a bad situation to maximize their fitness in an extremely severe environment.
Variable response to persistent high predation risk: acute versus chronic stress
Predation is a major ecological process that can act not only to limit prey populations, but in so doing, affect ecosystem structure and function (e.g. Krebs, Boutin & Boonstra 2001a; Hawlena & Schmitz 2010; Estes et al. 2011). Coping with predators is a central problem for virtually all prey and predators themselves may become prey. Predators affect prey both directly by killing them, and indirectly by affecting their behaviour, foraging patterns, reproduction and stress physiology, and thus affecting fitness (Lima 1998; Preisser, Bolnick & Benard 2005; Creel & Christianson 2008; Peckarsky et al. 2008). Thus, predation is a major evolutionary force shaping the adaptations of the prey. Here, I focus on whether sustained high levels of predation risk are perceived as chronically stressful and use mammalian predator–prey interactions to assess the range of variation in prey response. There are two responses prey can make to chronically high predation risk. The first is to show only a short-term stress response when subject to predator attack (an acute stress response) and then go back to the business of living. The second is to show both a short-term acute stress response and a long-term chronic stress response, with the latter resulting in an anticipatory memory of the attack that enhances vigilance and increases fear.
A predation attempt is unlike any other stressor that an organism is likely to experience, and the consequences of a successful attempt are immediate and irrevocable. Other stressors such as severe weather, low food supply or social conflict may be more graded in terms of their impact, may require short-term responses, and, if of longer duration, result in chronic responses. However, there may be a chance of recovering function and fitness after these stressors. In contrast, a predation attack is an all-or-nothing event in terms of its success. This asymmetry has been referred to as the life-dinner principle (Dawkins & Krebs 1979). Thus, the direct effects of predation are obvious. However, the indirect effects are not invariant, and I will argue below that in certain predator–prey interactions, irrespective of continuous or almost continuous predator presence, prey regard their predators only as acute stressors and thus there are no long-term indirect effects. In other cases, the prey may be chronically stressed by increasing predation risk. In assessing whether prey have been chronically stressed by their predators, one should examine all available manifestations of chronic stress, ranging from changes in various blood indices [i.e. increases in glucocorticoid levels (or their excreted signature – increased faecal glucocorticoid metabolite levels), declines in corticosteroid binding globulin, dexamethasone resistance, increased gluconeogenesis with higher glucose levels and lower free fatty acid levels, and suppression of immune system function] to declines in mass, body condition and reproduction. Even if the former have not been measured, changes in the latter may be convincing if there are suitable comparative or longitudinal data.
Chronic stress effects absent: microtine–weasel interactions
In the laboratory, the chronic effects of predator stress have been examined in rodents. Rats do not habituate to visual exposure to a cat even after 20 days (Blanchard et al. 1998). The rats showed evidence of chronic stress, including higher basal corticosterone concentrations, adrenal hypertrophy and reduced thymus mass. Some also showed an enhanced stress response when challenged with an acute stressor, possibly related to a failure of the HPA feedback system. However, extrapolating from these laboratory studies to the natural world is problematic, both because one is dealing with inbred strains of rats that have been selected to function well in the laboratory, but not in nature, and because the rats have no ability to escape from the presence of the cat.
In the field, the chronic effects of predator stress in microtines (lemmings and voles) appear to be absent. Microtines occur throughout the Northern Hemisphere and some populations go through 3- to 4-year cycles in numbers. They are preyed upon by multiple avian and mammalian predators. Direct predation on microtines is key to generating many of these cycles (Klemola et al. 2002; Krebs 2011). In Fennoscandia, these cycles are caused by delayed numerical responses of their specialist predators (small mustelids), with the vole populations increasing first, followed by an increase in the weasel populations that then causes the vole populations to decline (Henttonen et al. 1987; Korpimäki et al. 2005). It is not that microtines are oblivious to the threat posed by their predators. They engage in avoidance behaviour when confronted by indirect evidence of predators – the odour of mustelids and foxes (either becoming arboreal, entering their underground burrows or avoiding areas with recent predator odour; reviewed in Norrdahl & Korpimäki 2000). In the bank vole (Myodes glareolus), Ylönen (1989) found preliminary laboratory evidence from enclosed pairs that chronic mustelid odour suppressed reproduction. This would be predicted if predator odour acted as a stressor and a reliable signal of predator presence. This observation led to the predator-induced breeding suppression hypothesis, which postulated that it was adaptive for the voles to delay reproduction until after the predator density had declined (Ylönen & Ronkainen 1994). Although corticosterone concentrations were not measured, evidence in favour of this hypothesis came largely from laboratory studies using weasel odour. This odour produced suppression of reproduction in pairs of voles and delayed maturation in young females (Ylönen & Ronkainen 1994).
However, most field studies using mustelid odour have failed to corroborate these findings (Wolff & Davis-Born 1997; Mappes, Koskela & Ylönen 1998; see review in Ylönen 2001) as has a recent laboratory study measuring faecal glucocorticoid metabolites (Ylönen et al. 2006). Thus, predator-induced breeding suppression appears to be an artefact of the laboratory. This conclusion is consistent with a laboratory study that weasel odour did not cause a differential stress response (higher corticosterone levels) in meadow voles relative to control odours (Fletcher & Boonstra 2006). A field study by Fuelling and Halle (2004) found evidence in favour of the hypothesis but methodological considerations (performed only for 1 month in mid-late August and the possibility of a neophobic response of young born prior to the treatment avoiding traps during the treatment) call their conclusion into question. Finally, theoretical modelling indicates that delayed reproduction is only optimal when the number of future offspring produced by not breeding exceeds that of breeding immediately (Kokko & Ranta 1996). For microtines, it may never pay to delay reproduction in the face of high predation risk, given their short life spans and seasonal breeding. Thus, stress-induced suppression of reproduction by weasels is unlikely to have evolved in these small mammals and I conclude that they do not suffer from the effects of chronic stress, irrespective of the critical impact of weasels on their population dynamics.
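The logic behind this conclusion can be illustrated with a toy calculation (a deliberately simplified sketch, not the actual Kokko & Ranta model; the function name and all parameter values below are hypothetical). Delaying breeding pays only if the expected number of future offspring, discounted by the probability of surviving the high-risk period, exceeds the payoff from breeding immediately. For a short-lived, seasonally breeding vole under heavy weasel predation, that survival discount is severe:

```python
# Toy fitness comparison for breeding now versus delaying until predator
# density declines. Hypothetical parameter values chosen for illustration.

def expected_offspring(breed_now, litter_size=5, monthly_survival=0.7,
                       months_until_risk_drops=3):
    """Expected offspring for a female that breeds immediately versus
    one that waits out the high-risk period before breeding."""
    if breed_now:
        return litter_size  # produces a litter right away
    # A delaying female must first survive every month of high risk.
    p_survive_wait = monthly_survival ** months_until_risk_drops
    return p_survive_wait * litter_size

now = expected_offspring(breed_now=True)     # 5 offspring
wait = expected_offspring(breed_now=False)   # 0.7**3 * 5 ≈ 1.7 offspring
print("delay pays" if wait > now else "breed now")
```

With monthly survival as low as it is for voles during a weasel peak, the delayed payoff collapses, which is consistent with the conclusion that predator-induced breeding suppression is unlikely to evolve in such short-lived, seasonal breeders.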
Chronic stress effects present: snowshoe hare–boreal forest predator interactions
In snowshoe hares, the chronic stress effects of their mammalian and avian predators are unmistakably present. Their populations go through 10-year cycles in the boreal forests of North America (Krebs et al. 1995), and these cycles have been occurring for at least the last 250 years (Sinclair et al. 1993). These cycles are driven by both direct and indirect effects of predation (Krebs et al. 1995; Boonstra et al. 1998a; Hodges et al. 2001; Sheriff, Krebs & Boonstra 2011). Alternative explanations (i.e. high density at the peak phase leading to social competition or to lack of overwintering food or declining nutrition) are not tenable (see evidence from factorial experiments and arguments in Krebs et al. 1995, 2001b; Boonstra et al. 1998a; Hodges, Krebs & Boonstra 2006). During the 2- to 3-year decline phase, the hare population plummets. For example, in the southern Yukon, hares went from a peak of 150 km−2 in 1989–90 to a low of 8 km−2 in 1993–94. Thus, almost all hares die during the decline and in most years >90% are killed by their predators (based on radiotelemetry and live-trapping) (Krebs et al. 1995; Hodges et al. 2001). Virtually every predator in this ecosystem kills hares and thus hares simultaneously have to contend with both avian and terrestrial hunting tactics. The dominant terrestrial predators in the Yukon (killing between c. 65 and 75% of the hares, Hodges et al. 2001) were a hare specialist predator – the lynx (Lynx canadensis) (using both stalking and ambush hunting techniques) and a generalist predator – the coyote (Canis latrans) (using cursorial, opportunistic flushing techniques) (Murray et al. 1995; O'Donoghue et al. 1998). The dominant avian raptors (killing between c. 15 and 25% of the hares, Hodges et al. 2001) were great horned owls (Bubo virginianus) and goshawks (Accipiter gentilis).
The key question with respect to chronic stress during the decline is whether the hares are aware that the predation risk and their probability of death have increased dramatically. The answer is yes – they were highly sensitive to this increased predation risk during the decline phase, as indicated by the following evidence. They had greater plasma free cortisol and faecal glucocorticoid metabolite levels, lower corticosteroid binding globulin levels, greater ability to mobilize energy, lower levels of leucocytes and lower body condition indices (Boonstra et al. 1998a; Sheriff, Krebs & Boonstra 2011). They lost weight over winter during decline years of high predation risk, but not in enclosed areas protected from mammalian predators (Hodges, Krebs & Boonstra 2006). During the decline, hares moved into areas of denser cover with higher protection from predators, but poorer quality food (Hik 1995). There was a marked decline in reproduction, with the number of litters per female during the breeding season falling from four in the increase phase to two in the decline phase (Cary & Keith 1979; Stefan & Krebs 2001). Following the predator decline, all of the above physiological indices in the surviving hares improved markedly (Boonstra et al. 1998a; Boonstra, Krebs & Stenseth 1998b; Hodges, Krebs & Boonstra 2006; Sheriff, Krebs & Boonstra 2011).
Finally, hare offspring in the decline phase litters were prenatally stressed (Sheriff, Krebs & Boonstra 2009), and this signature lasted into adulthood (Sheriff, Krebs & Boonstra 2010). Wild hares removed from different phases of the cycle to the benign conditions of the laboratory continued to exhibit the reproductive performance of their field counterparts (Sinclair et al. 2003). Thus, they retained a ‘memory’ of the phase from which they had been removed. This memory in the adults and their offspring is the most plausible explanation for the low phase (2–5 years) that follows the population decline (Boonstra, Krebs & Stenseth 1998b; Sheriff, Krebs & Boonstra 2010). At this time, the programming of the individuals and their offspring through maternal effects makes them temporarily mismatched to this more favourable environment even though predator numbers have collapsed and food supplies are ample (in this issue, see Love, McGowan & Sheriff 2013 for a discussion on the nature of this programming).
Nevertheless, in spite of the evidence that hares were chronically stressed during the decline, adrenal exhaustion, as predicted by Selye's GAS hypothesis, did not occur. The basic stress response continued to function well, as indicated by a strong negative feedback in response to the dexamethasone suppression test and a strong positive response to an ACTH challenge (Boonstra et al. 1998a; Sheriff, Krebs & Boonstra 2011). There was no evidence that during these periods of severe decline, there were increases in pathology related to disease and parasites (Keith, Keith & Cary 1986; Sovell & Holmes 1996; Murray, Cary & Keith 1997). I conclude that snowshoe hares were indeed chronically stressed during the population decline by high predation risk, but their stress response did not become dysfunctional nor did they exhibit pathological symptoms. I would also no longer characterize their response as deleterious (although we did in these studies). It would only have been deleterious if their response had been inappropriate to the situation. However, during the decline, their response was the only viable option to permit their survival. The consequence of this response was that their reproductive output during the decline was reduced. We speculate that the resulting offspring had higher vigilance and anti-predator behaviour that increased their fitness in a high predator environment relative to the highly fecund offspring born during the low predator environment of the increase phase (Sheriff, Krebs & Boonstra 2010). During the 2- to 3-year low phase, however, these hares are temporarily mismatched to their environment, resulting in a lag in population response and a failure to exploit conditions that are now favourable.
Chronic stress effects present: ground squirrel–predator interactions
In arctic ground squirrels (Urocitellus parryii), the chronic stress effects of predators are present in forest habitats. The hormonal evidence is less complete than in snowshoe hares, but the demographic evidence is robust. Arctic ground squirrels evolved in open arctic and alpine tundra habitats (Nadler & Hoffmann 1977), and they have a suite of vocal, visual and behavioural anti-predator tactics that are effective in open habitats (Hubbs, Karels & Byrom 1996; Karels & Boonstra 1999). Ground squirrels generally are sensitive to predator presence, modifying their behaviour in response to both direct evidence (visual, olfactory, auditory and tactile stimuli coming from predators) and indirect evidence that corresponds to the increased likelihood of encountering predators (e.g. increased foraging distance from burrows or trees or increased visual obstructions) (e.g. Arenz & Leger 1997; Mateo 2007). In open environments, food abundance, burrow availability, spacing behaviour and overwinter survival conditions, but not predation, limit or regulate arctic ground squirrel numbers (Carl 1971; Gillis et al. 2005). In contrast, in more closed environments such as meadows surrounded by boreal forest, predation plays a key role in limiting their populations (Byrom et al. 2000; Karels et al. 2000; Gillis et al. 2005).
The demographic evidence for the chronic stress effects of predators comes from the same area as that used by the snowshoe hare research and involves two related sets of studies. First, long-term experimental manipulations excluded mammalian predators from a 1 km² area and partially excluded avian predators with an aerial barrier (Krebs et al. 1995; Karels et al. 2000). Relative to the unprotected control areas, the following were higher in squirrels within the predator exclosure: litter sizes, percentage of females lactating and weaning litters, body condition and juvenile survival rates (Byrom et al. 2000; Karels et al. 2000). The net impact was a doubling of population size within the exclosure. Second, the demography in two adjacent habitats (alpine and boreal forest) was studied. Although both sites had a similar diversity of avian and mammalian predators, predator density was higher in the forest (Hik, McColl & Boonstra 2001). Squirrel populations in the boreal forest went through marked multi-annual declines associated with snowshoe hare cyclic declines (Byrom et al. 2000; Karels et al. 2000); alpine populations remained stable (Gillis et al. 2005). Adult female summer survival was significantly lower in the boreal forest than in the alpine (Gillis et al. 2005). The cause of the poor survival in the forest was determined by radiotelemetry both during the hare decline and the low phase thereafter. Overall, predators accounted for 96% of the squirrel mortalities, with a c. 50:50 ratio between avian and mammalian predators during the decline, but a 77:23 ratio during the low phase. Thus, predators in the boreal forest affected demography directly (through killing squirrels) and indirectly (by reducing reproduction).
The hormonal evidence for the chronic stress effects of predators comes from both the laboratory and the field. In the laboratory, Hubbs, Millar & Wiebe (2000) measured the stress responses of females of a related species (U. columbianus) to a dog (the model predator) over 8 weeks. Predator-challenged females had higher levels of total and free cortisol than controls and a heightened stress response after 1 month of predator exposure. In the field, Hik, McColl & Boonstra (2001) compared the stress responses of nonreproductive arctic ground squirrels living in the predator-rich, low elevation, more closed boreal forest to those living in the nearby predator-poor, high elevation, open alpine area. Relative to the alpine squirrels, boreal forest squirrels exhibited lower basal free cortisol levels, dexamethasone resistance in females, but not males, reduced ability to respond to an ACTH challenge, lower corticosteroid binding globulin levels, and lower haematocrit levels (a condition indicator) (Hik, McColl & Boonstra 2001). These results indicated that the HPA axis of the boreal forest squirrels was under-responding to the hormonal challenge relative to the alpine squirrels as a consequence of being chronically stressed in the boreal forest (see Mateo 2007 for similar results using faecal glucocorticoid metabolites on U. beldingi). Nevertheless, there was no evidence of adrenal exhaustion as predicted by Selye's GAS hypothesis. Thus, both the demographic evidence (the indirect effects on reproduction) and the hormonal evidence indicated that arctic ground squirrels are chronically stressed by their predators in the boreal forest, but they showed no evidence of pathology nor of an HPA axis unable to respond.
Chronic stress effects absent: elk–wolf interactions
Ungulates appear not to suffer from chronic stress effects from their chief predator – the wolf. Wolves are the principal predators of the ungulates throughout the Northern Hemisphere and, with bears, are directly responsible for limiting their numbers (Mech & Peterson 2003; Ripple & Beschta 2012). Wolf predation on ungulates can be largely compensatory (i.e. bottom-up population control with wolves killing prey that would have died anyway – the young, the old, the debilitated and the ill – Mech & Peterson 2003) or additive (i.e. top-down population control with prey being killed that would have lived if wolves were absent) (see Garrott, White & Rotella 2009a for discussion of these concepts as they apply to wolf predation and the evidence). Thus, the direct effect of wolf predation is obvious. The key question is whether the threat of predation – an indirect effect – is perceived by the prey as a chronic stressor or only as an acute one when an attack occurs. Findings from the Yellowstone ecosystem over the last two decades have allowed us to disentangle the direct and indirect effects of predation from the other main explanation for the limitation and/or regulation of ungulate populations – insufficient food. First, the research benefited from the before-and-after aspect of the system relative to wolf predation and thus was a perturbation experiment allowing insight into the direct and indirect impacts of wolf predation on elk populations. Wolves were extirpated from the Yellowstone ecosystem by the mid-1920s and were then reintroduced in 1995 and 1996. Elk had been studied intensively before and after wolf release (Garrott, White & Watson 2009b).
Following wolf introduction, counts of the elk populations on their winter range in the northern portion of Yellowstone National Park and the south-western portion of Montana have decreased >60% (from 25 453 in 1994 to 9625 in 2009), with predation from wolves being primarily responsible in the high snowfall areas, coupled with predation by bears, hunter harvests and drought effects (White, Proffitt & Lemke 2012). Second, the impact of wolves on the stress axis of elk was examined over a range of wolf densities (Creel, Winnie & Christianson 2009). These studies were thus able to examine both an immediate measure of the stress response (faecal glucocorticoid metabolites) and its downstream consequences (changes in reproduction and condition). Some caution has to be exercised in interpreting the results among the different studies and extrapolating from them, for although seven populations of elk occur in the north-western region of the Greater Yellowstone Ecosystem (all reasonably distinct, Hamlin et al. 2009), different studies may have concentrated only on one or a subset of them. However, I am assuming that the conclusions the researchers arrived at apply generally to elk–wolf interactions.
Creel et al. (2009) found no relationship between variation in wolf densities and elk faecal glucocorticoid levels. They concluded that elk were not chronically stressed by the threat of wolf predation, although other mechanisms not related to the stress response still played a role in elk response to wolves. The latter conclusion was reached because they did find a negative relationship between wolf densities and elk progesterone levels (Creel et al. 2007). They argued that these levels reflected changes in pregnancy rates. These rates were proposed to be lower in areas of higher wolf densities as elk reduced their risk of predation by foraging in poorer quality, but less risky, habitat (shifting from preferred grassland to forests). They argued that poorer nutrition was linked to lower pregnancy rates. This explanation appears not to be tenable for four reasons. First, Hamlin et al. (2009) and White et al. (2011) found no evidence that the introduction of wolves caused a reduction in pregnancy rates (i.e. these were related neither to wolf numbers nor to elk/wolf ratios) nor did they find that elk maternal condition declined after wolves were introduced. Second, Gower et al. (2009) found that elk maintained foraging efficiency irrespective of the increase in wolf predation risk following wolf introduction. They argued that large herbivores have evolved to live and forage efficiently in the presence of wolves. Third, Creel et al. (2007) used a progesterone assay to assess pregnancy rates. Subsequently, Garrott, White & Rotella (2009a) reported that the assay gave false negatives for reasons that were not obvious. Thus, estimates of low pregnancies were unreliable. Fourth, poorer nutrition causes plasma corticosteroid levels to increase (laboratory mammals: Dallman et al. 1999; wild birds: Kitaysky et al. 2010).
If elk were consistently foraging in poorer quality, forested habitat to reduce their predation risk, this should be reflected in higher faecal glucocorticoid levels independent of the predator-induced fear that drove them there. However, faecal glucocorticoid levels were not affected (Creel et al. 2009). It is possible that plasma cortisol levels were indeed higher in the elk moving into the forests, but that this was not reflected in faecal levels because of differences in fibre content and gut passage rates in elk from the two habitats (see Goymann 2012 for a discussion of the confounding effects of diet on faecal metabolite levels). However, irrespective of this, I conclude that elk do not respond to the increased threat of wolf predation by a chronic activation of the stress axis nor do they show the downstream measures (suppression of condition and reproduction) of this interaction. I suggest that indeed this is the pattern for all northern hemisphere ungulates that have evolved with wolves as their principal predators: they respond only acutely when challenged. It may also be the pattern for ungulates elsewhere, such as in Africa, where a different suite of mammalian predators is present (Sinclair, Mduma & Brashares 2003).
Synthesis of chronic stress effects of high predation risk
In all of the above examples, the prey populations are limited by their predators and direct predation plays a central role in that process. Krebs et al. (1998) concluded that this is the general pattern for all vertebrate herbivores, unless they have evolved escape mechanisms in size (e.g. elephants), space or time (such as migration). However, the indirect effects of predators on their prey acting through chronic stress are not invariant – only cyclic snowshoe hare and arctic ground squirrel populations in the boreal forest show evidence of chronic stress effects on their physiology, reproduction and body condition; cyclic vole and noncyclic elk populations do not. Thus, there is clearly plasticity among species in their response to a long-term, high predation risk. I suggest that this dichotomy is dependent on the fitness benefit of the chronic stress response. It is not simply that predators impose a long-term anticipatory fear on their prey, but rather how the prey ‘choose’ in the evolutionary sense to respond to that threat. If it is adaptive to be chronically stressed, they will evolve in that direction; if not, they will only respond acutely. If this is correct, what factors determine one evolutionary strategy versus another? First, the temporal variability of the predation threat varies markedly amongst these prey. For northern hemisphere ungulates, the threat from their predators is more or less a constant feature of their environment, whereas for the cyclic prey, it is only periodically intense (for snowshoe hares and arctic ground squirrels, predator numbers may be low for multiple years). However, this cannot be the whole story, given the lack of chronic stress effects in cyclic voles. Second, the lifespan of the prey varies markedly among them. In voles, the lifespan is generally much shorter than a year (Boonstra 1994), and the only option is to breed early and often (Kokko & Ranta 1996), with spring and early summer cohorts breeding in the year of their birth.
At the other extreme are the ungulates, which are long-lived herbivores with multi-annual reproductive lives; reproduction in females usually starts within the first several years after birth (e.g. Nussey et al. 2008). Species with intermediate life spans, such as snowshoe hare, live maximally about 4 years in nature, but most live <1·5 years; reproduction starts in the year after birth, with maximally four litters per breeding season (Hodges et al. 2001). Arctic ground squirrels are similar in terms of longevity, but have only one litter per year (Karels et al. 2000; Gillis et al. 2005). Thus, I suggest that it is only in species with intermediate life spans where being chronically stressed by high predator risk is adaptive.
The benefits of a species being chronically stressed during periods of high predation risk are twofold. First, the individual benefits directly as it redirects resources towards enhancing immediate survival. Second, its offspring benefit as a result of perinatal maternal programming, assuming that chronic maternal stress is a good predictor of the environment the offspring are about to enter and if, by so doing, this programming increases offspring fitness (survivorship and reproduction) during the period of high predation risk. Given that the period of high predation risk during the 10-year cycles lasts 2–3 years and that the offspring breed only in the year after birth, this anticipatory programming is plausible. Programming, before or just after birth, results in the organization of target tissues and/or patterns of gene expression that affects function throughout life (Meaney, Szyf & Seckl 2007; see Love, McGowan & Sheriff 2013 for a review of the impact of maternal stress on offspring HPA axis function in natural and laboratory studies). I expect that one of the manifestations of the offspring programming should be increased anti-predator behaviour during the peak and decline phases (Sheriff, Krebs & Boonstra 2010). We do not know whether this occurs here, but it has been found in other species (e.g. enhanced flight performance in starlings: Chin et al. 2009; increased shoaling behaviour in sticklebacks: Giesing et al. 2011). Such transgenerational, adaptive anti-predator programming of the behaviour of offspring has also been seen in invertebrates (e.g. Storm & Lima 2010). Thus, I hypothesize that the chronic stress response is an adaptive trait selected for under certain life histories.
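The conditions under which such anticipatory maternal programming is favoured can be sketched with a standard adaptive-plasticity inequality (an illustrative formalization in my own notation, not taken from the cited studies):

```latex
% p             = probability that offspring actually enter a high-risk
%                 environment (high when the decline phase lasts 2-3 years
%                 and offspring breed in the year after birth)
% w_{P,H}, w_{P,L} = fitness of programmed offspring in high- and
%                    low-risk environments
% w_{N,H}, w_{N,L} = fitness of unprogrammed offspring in the same
%                    environments
% Programming is favoured when its expected fitness exceeds that of
% non-programming:
\[
  p\,w_{P,H} + (1-p)\,w_{P,L} \;>\; p\,w_{N,H} + (1-p)\,w_{N,L}
\]
% With maternal stress a reliable cue (p near 1) and w_{P,H} > w_{N,H},
% the inequality holds even if programmed offspring pay a cost
% (w_{P,L} < w_{N,L}) when mismatched to a low-risk environment,
% which is the situation of hares in the low phase.
```

On this reading, the temporary mismatch of programmed hares during the low phase is the cost term of a strategy whose expected payoff is set during the decline.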
My analysis of the role of chronic stress in wild animals leads to two conclusions: first, that there is no evidence that the experience of chronic stress in nature results in pathology; and second, that chronic stress is not the inevitable result of a persistent, severe stressor, but is an evolved, adaptive response. However, adaptive arguments are easy to formulate when animals show the effects of chronic stress, but to test them, one needs to carry out field experiments to assess whether the animals are indeed responding in an adaptive way and thus maximizing their fitness (for an elegant experimental study in which glucocorticoid levels are manipulated directly and fitness measured, see Love & Williams 2008). Hence, field experiments are needed that assess the benefits and costs of alternative responses to the same chronic stressor.
I thank Ben Dantzer and two reviewers for incisive comments, Caitlin Riebe for making Fig. 1 available and for modifying it for me, and the Natural Sciences and Engineering Research Council of Canada for funding.