• Oliver Otti
    1. University of Bern, Zoological Institute, Division of Evolutionary Ecology, Wohlenstrasse 50a, CH-3032 Hinterkappelen, Switzerland
    2. Animal and Plant Sciences, University of Sheffield, Western Bank, Sheffield S10 2TN, United Kingdom
  • Iris Gantenbein-Ritter
    1. University of Bern, Zoological Institute, Division of Evolutionary Ecology, Wohlenstrasse 50a, CH-3032 Hinterkappelen, Switzerland
  • Alain Jacot
    1. University of Bern, Zoological Institute, Division of Evolutionary Ecology, Wohlenstrasse 50a, CH-3032 Hinterkappelen, Switzerland
    • Present address: IEE – Conservation Biology, University of Bern, Erlachstrasse 9a, CH-3012 Bern, Switzerland.
  • Martin W. G. Brinkhof
    1. University of Bern, Zoological Institute, Division of Evolutionary Ecology, Wohlenstrasse 50a, CH-3032 Hinterkappelen, Switzerland
    • Present address: Swiss Paraplegic Research, CH-6207 Nottwil, Switzerland.


Why do individuals have an imperfect immune system? Most studies suggest trade-offs associated with immunity and metabolism, and neglect ecological factors such as predation. We provide one of the first experimental studies demonstrating a context-dependent survival cost of immune activation. In the presence of a predator, immune-challenged male field crickets showed significantly lower survival than controls, whilst there was no difference in a predator-free environment. Immune-challenged males spent more time outside their burrows and reacted more slowly to a simulated predator attack. We conclude that some costs of immunity are expressed via increased susceptibility to predation, indicating the importance of integrating the ecological context when investigating optimal investment in immunity.

Hosts bear a number of costs after infection by a parasite. Impairment of a host by parasitism may result from direct resource use by the parasite, from parasite strategies maximizing transmission to a new host (Moore 2002; Thomas et al. 2010), and from costs related to host immune defense (Moret and Schmid-Hempel 2000; Schmid-Hempel 2003). Optimal investment in immune defense is in part determined by the ratio of the physiological costs of activating immune defense mechanisms to the benefits of this activation in removing parasites (Sheldon and Verhulst 1996; Schmid-Hempel 2003). In this framework, the resources used to mount an immune response curtail resource availability to other life-sustaining functions (Moret and Schmid-Hempel 2000; Schmid-Hempel 2003). Such physiological trade-offs lead to context-dependent resource allocation to an immune response; the ecological context will therefore feed back on optimal investment in immunity (Moret and Schmid-Hempel 2000; Rolff and Siva-Jothy 2003; Adamo et al. 2008; Sadd and Schmid-Hempel 2009). This interdependency results in a situation in which investment in immunity pulls an infected individual away from its optimal strategy in its ecological context, whilst the ecological context precludes optimal investment in immunity. Parasitized individuals often suffer increased predation (Hoogenboom and Dijkstra 1987; Moore 2002; Møller and Nielsen 2007), which may result not only from the harmful effects of the parasite but also from the activation cost of host immunity. Moreover, parasites with trophic transmission have been shown to enhance host susceptibility to predation as a means of reaching the subsequent or final host, which has mainly been taken as evidence for parasite manipulation of the host (Poulin 2007). However, we suggest that part of the predation cost of parasitism is due to activating host immunity.

Predation is a major selective force in shaping the evolution of morphological and behavioral traits (Lima and Dill 1990; Losos et al. 2006; Svensson and Friberg 2007), which are orchestrated through complex physiological interactions (Mikolajewski et al. 2010). The presence of a predator often induces stress in prey and may change the physiological response to other stresses present at the same time, for example, pesticides or parasites (Relyea and Mills 2001; Joop and Rolff 2004; Yin et al. 2011). Generally, studies have investigated the effect of predation or its perceived risk on behavioral or physiological changes in the prey (Lima and Dill 1990; Joop and Rolff 2004; Yin et al. 2011). However, a study by Eraud et al. (2009) showing decreased survival of immune-challenged pigeon fledglings suggests that predation might be facilitated by physiological change in the prey. In the present study, we explore how a physiological change, that is, immune system activation, influences predator avoidance and mortality due to predation.

We therefore examined predation risk as a potential ecological cost of immunity in males of the field cricket Gryllus campestris in a series of experiments. In each of these experiments, males were immune challenged with a standard dose of lipopolysaccharides (LPS) (Jacot et al. 2004). In a capture–recapture study, we established the effect of immune system activation on the mortality of males under field conditions allowing predation. Assuming a similar temporal effect of LPS on antimicrobial activity as in Haine et al. (2008), we predicted that immune system activation is costly over a short-term period. In a laboratory experiment, we investigated predator avoidance traits (PATs). We hypothesized that immune system activation may augment predation risk by prolonging reaction time and/or reducing sprint speed in cricket males. Additionally, we directly investigated in outdoor enclosures whether immune system activation may enhance mortality under risk of predation by comparing the survival of control and experimental males in predator-free plots and predator plots, using the Greater white-toothed shrew, Crocidura russula, as a model predator. We hypothesized a higher short-term risk of mortality in immune-challenged males as compared to control males under risk of predation only. We further assessed exposure time in the predator plots as a potential behavioral trait affecting the risk of predator encounter.

Material and Methods


We used male field crickets G. campestris from a population that had been maintained for two generations as a laboratory stock. The founder population was caught on field sites near Berne, Switzerland. The crickets were reared and housed under standard laboratory conditions until emergence (Jacot et al. 2005). Freshly moulted adult males were kept singly in plastic boxes (13 × 9 × 18 cm) with ad libitum food and water. On day 7 (where day 1 is the day of adult eclosion), all males were temporarily CO2-anesthetized and we measured tibia length and pronotum size using a digital imaging system (NIH software, National Institutes of Health, Bethesda, MD).


For all experiments, we assessed body mass to the nearest 0.1 g (Mettler, Giessen, Germany) on day 11 after emergence to adulthood, and randomly assigned males to the control or immune-challenged treatment groups. To induce an immune response, we injected half of the crickets with 0.01 μg of LPS and peptidoglycans from the bacterium Serratia marcescens (L6136, Sigma-Aldrich, Munich, Germany; concentration = 1 mg/mL), dissolved in 10 μl Grace's insect medium (G8142, Sigma-Aldrich). Control males were injected with 10 μl Grace's insect medium only. Crickets were inoculated with a Hamilton syringe (Microlance, standard RN needle, 0.26-mm diameter, 0.01-mL total volume), inserting the needle between the third and fourth sternite. Both molecules are nonpathogenic and induce an immune response (Khush and Lemaitre 2000; Hultmark 2003; Cerenius and Söderhäll 2004; Steiner 2004; Strand 2008). The immune response to LPS and peptidoglycan is induced within a few hours and persists for approximately 4 and 10 days, respectively. Humoral antimicrobial activity peaks 24 h after injection and declines thereafter (Haine et al. 2008).

Capture–Recapture Study

To assess short- and long-term costs of immunity on survival, we constructed 16 plots (4 × 4 m) surrounded by a fence (40 cm), which effectively prevented crickets from escaping but permitted potential predation. The plots were located on a cricket-free meadow at the Ethological Field Station Hasli, University of Berne (lat 46°58.001′N; long 7°23.016′E). In each plot, we placed 16 artificial burrows (corrugated plastic tube sealed at one end and pushed into the ground, length 5 cm, diameter 1 cm), equally spaced from each other, to provide the male crickets with appropriate housing. The experiment was started on successive days. At the start of a capture–recapture trial day, we always filled at least two plots with 16 individually marked male crickets (eight experimental/eight control animals). A plot was only used once during the study period (N= 16). The individuals were assigned randomly to their burrows and were not provided with any food. All individuals readily accepted their burrows and within hours started to prepare an arena to perform their calling song. After release, burrows were checked daily and we stopped checking 10 days after the last animal in a plot was found.
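Daily burrow checks of this kind are typically encoded as binary capture histories before survival estimation in capture–recapture software such as MARK. A minimal sketch of that bookkeeping (the animal identifiers and resighting days below are hypothetical, not the authors' data):

```python
# Encode daily resighting records into MARK-style capture-history strings.
# Each male's record lists the days (1-7) on which it was resighted; a '1'
# marks a resighting on that occasion, a '0' a miss.

def capture_history(resight_days, n_occasions=7):
    """Return a binary capture-history string, e.g. '1101000'."""
    return "".join("1" if d in resight_days else "0"
                   for d in range(1, n_occasions + 1))

# Hypothetical males: sets of days on which each was resighted.
males = {"ctrl_01": {1, 2, 3, 5}, "lps_01": {1}, "lps_02": set()}
histories = {m: capture_history(days) for m, days in males.items()}
# histories["ctrl_01"] -> "1110100"
```

Such histories, together with the treatment and plot covariates, form the input from which survival (S) and recapture probability (P) are estimated jointly.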


We estimated the daily survival probability using the capture–recapture history of individual male crickets over the first seven days of the study. To control for recapture probabilities, we used MARK version 5.1 (Cooch and White 2009) to estimate daily survival (S) and recapture probability (P) simultaneously. Our main aim was to investigate the effect of immune treatment (i) on cricket survival. As LPS-induced antimicrobial activity in insects commonly peaks within 24 h after inoculation (Haine et al. 2008), we allowed the effect of i to vary between day 1 and the subsequent days of the study period (days 2 to 7; time period t) and included the interaction i×t. We further controlled for heterogeneity in the effect of the experiment on survival between the 16 study plots by including the variable z (i.e., plot as factor) and its interactions with i and t in the modeling. Capture probability may vary with immune treatment, time period, and study plot, and i, t, and z and their mutual interactions were examined in the capture rate part of the model. Model selection was conducted using Akaike's Information Criterion corrected for sample size (AICc; Akaike 1973), adjusting for overdispersion via the quasi-likelihood parameter c-hat (quasi-AICc, QAICc; Cooch and White 2009; see Table S1).
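The model ranking behind statements like "three times the support of the second best model" rests on converting (Q)AICc scores into Akaike weights. A minimal sketch of that arithmetic (the candidate-model scores below are hypothetical, not the values in Table S1):

```python
import math

def qaicc(qdev, k, n):
    """Quasi-AICc: quasi-deviance + 2k plus the small-sample correction."""
    return qdev + 2 * k + (2 * k * (k + 1)) / (n - k - 1)

def akaike_weights(scores):
    """Convert a list of (Q)AICc scores into normalized Akaike weights."""
    best = min(scores)
    rel = [math.exp(-0.5 * (s - best)) for s in scores]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical scores for three candidate models:
scores = [210.3, 212.5, 215.1]
w = akaike_weights(scores)
# Support ratio between the best and second-best model:
ratio = w[0] / w[1]
```

The ratio of weights between any two models depends only on the difference in their scores, which is why the Results can quote it directly from Table S1.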


Between days 10 and 14 after moulting, we measured PATs of 125 male field crickets. Starting 1 h after the immune treatment, each male was tested three times at an interval of 20 min. To test for potential mechanisms that could lead to a higher risk of predation in immune-challenged crickets, we assessed several PATs. To measure the reaction time and sprint speed of male crickets, we built an apparatus (henceforth called the PAT measurement tunnel) consisting of an entrance box, the main test tube, and a refuge box (Fig. S1). To induce predator avoidance behavior in male crickets, we used the air-puff method. This method consists of generating a sudden airflow over the abdominal cerci, which elicits escape behavior in crickets (Gras and Hörner 1992). The release of an air-puff (2.5 m/sec) and the subsequent PAT measurements were controlled by three light barriers (Fig. S1). The light barriers were connected to a timer and allowed us to measure the precise position of the cricket to the nearest 0.001 sec. By allowing the cricket to enter the main test tube voluntarily, thereby breaking the first light barrier (Fig. S1) and triggering the air-puff, we standardized our experiment for motivation to move. Reaction time was recorded as the time elapsed between the breaking and closing of the first light barrier. Sprint time, recorded as the time difference between the breaking of the second and third light barriers, was transformed into sprint speed by dividing 0.15 m (the distance between the second and third light barriers) by sprint time in seconds (following the formula for average velocity v = s/t).
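The timing arithmetic can be sketched as follows; the 0.15-m barrier spacing is from the text, while the timestamp values are hypothetical examples:

```python
# Derive reaction time and sprint speed from light-barrier timestamps.
# t_break1 / t_close1: breaking and closing of the first barrier (the
# break triggers the air-puff); t2 / t3: breaking of the second and
# third barriers. All timestamps are in seconds and are illustrative.

BARRIER_DISTANCE_M = 0.15  # distance between second and third barrier

def reaction_time(t_break1, t_close1):
    """Time the cricket takes to clear the first barrier after the puff."""
    return t_close1 - t_break1

def sprint_speed(t2, t3):
    """Average velocity v = s / t over the measured 0.15 m stretch."""
    return BARRIER_DISTANCE_M / (t3 - t2)

rt = reaction_time(0.000, 0.129)  # 0.129 s, i.e. 129 msec
v = sprint_speed(1.200, 1.950)    # 0.15 m / 0.75 s = 0.2 m/sec
```

The example reaction time is in the range reported for control males in the Results (128.6 msec).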


Four males did not react at all to the stimulus and were excluded from further analyses. The statistical analyses of reaction time, sprint speed, the probability that a cricket runs the whole distance, and predation risk were conducted using the statistical package R 2.8.0 (R Development Core Team 2011). Pronotum length, as a measure of structural size, was included as a covariate in all analyses. Reaction time and sprint speed were analyzed with linear mixed-effects models with Gaussian error distribution and individual as a random intercept. When analyzing the probability of running the whole distance, the number of trials was taken as the denominator and the number of positive trials (i.e., whole distance run) as the numerator in a glm with binomial error distribution.


To investigate whether predation risk is a potential cost of immune system activation, we used the Greater white-toothed shrew C. russula as a model predator. C. russula is a generalist predator and common at our study site. Our supplementary video shows C. russula to be an efficient predator of G. campestris (Supporting information). Six female shrews were caught near the University of Lausanne, Switzerland, and were housed singly in plastic cages (38 × 20 × 25 cm) containing sawdust bedding and a flowerpot for shelter. Each day the shrews were provided with 5 g of fresh animal food (alternately mealworms, field crickets, and freeze-dried arthropods) and ad libitum water from plastic vials. Two days prior to the start of the experiment, the shrews were put on a restricted food regime (2.5 g) to increase their motivation to hunt during experimental trials.

The experiments were performed in two plots (240 × 190 × 40 cm high plastic fence; Fig. 1A) within adjacent outdoor enclosures, serving as paired predator and nonpredator plots. The plots were covered with grass (Fenaco, Lyssach, Switzerland) and each plot contained two artificial burrows for the experimental crickets. In front of each burrow, we installed a heating lamp (Zoo Med, San Luis Obispo, CA) allowing male crickets to display behavioral fever (Fig. 1A). Behavioral fever is a response to infection shown in several insects, including orthopterans (Louis et al. 1986; McClain et al. 1988; Adamo 1998; Blanford et al. 1998), and can be induced by LPS (Bundey et al. 2003). The adaptive value of choosing warmer microhabitats to perform behavioral fever lies in a higher recovery rate from infection (Louis et al. 1986; Moore and Freehling 2002; Thomas and Blanford 2003). Behavioral fever can only be achieved by increasing exposure time outside the burrow, thereby enhancing the risk of predation. Temperature directly in front of the burrow varied between 25°C and 30°C. We also installed microphones to record any calling activity with data loggers (Lego RCX10), but none of the crickets called during the experimental period. Finally, each plot contained a flowerpot to house the shrew (Fig. 1A).

Figure 1.

Setup and summary of results from the field enclosure experiment. (A) Top view of one of the plots in the outdoor enclosure. (1) Artificial burrow, (2) shrew shelter, and (3) position of the heating lamp. (B) Odds ratio for an LPS-injected individual being predated first, that is, relative risk of predation. (C) Mean within pair difference in time spent outside the burrow. Control males spent 33% and experimental males 52% of the observed time outside the burrow. Error bars represent 95% confidence intervals.

Each shrew was used in four trials. One shrew was introduced about 2 h prior to the start of a trial by putting it under the flowerpot. Two hours after the immune treatment, we transferred two crickets (one LPS, one control) into the artificial burrows of the outdoor plot. To control for burrow-position effects, we alternated the burrows assigned to the two immune treatments every other day. Between predator and nonpredator plots, crickets were further matched for body size (test pairs). One hour after their introduction, we recorded the behavior of the crickets in the predator plots for 3 h on video. For the predator plots, we quantified a male's time spent outside its burrow by scoring the relative time spent inside and outside the burrow. We recorded several types of behavior for the time outside the burrow, that is, sitting, grooming, exploring, and digging (Table S2). In five of the 17 test pairs, one individual was predated before video recording, leaving 12 cricket pairs for evaluation. To record predation, we checked each burrow and plot repeatedly, that is, before and after video recording and, finally, on the following morning, 22 h after the start of the experiment. Missing crickets were in all cases predated by shrews, as confirmed by finding remains of the eaten crickets (mainly hind legs and wings) in the surroundings of their burrow.


As individuals were sitting on average for 75% of their time spent outside (control: 68%, 95% CI: 44–90%; immune-challenged: 82%, 95% CI: 66–97%) and the treatments did not differ in this respect (sign test on 12 matched pairs: P= 0.39), for simplicity we only differentiated between time spent inside and outside the burrow. For the analysis of the time spent outside the burrow, we conservatively used a nonparametric sign test (Zar 1999). The relative risk of predation in relation to the immune treatment was analyzed with a likelihood ratio test using the frequency distribution of test pairs in which the LPS-injected or control-injected male was eaten first. To further establish whether treatment effects on the probability of being eaten first might be explained by differences in the time spent outside the burrow, we compared the fit of binomial mixed-effects models using the logit link function, with pair as a random intercept and the proportion of time spent outside the burrow and/or treatment as fixed factors.
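The exact sign test on matched pairs reduces to binomial tail probabilities under Binomial(n, 0.5). A stdlib-only sketch of this computation (the 10-versus-2 split below is a hypothetical illustration, not the paper's observed pair counts):

```python
import math

def sign_test_p(n_pos, n_neg):
    """Exact two-sided sign test: sum the Binomial(n, 0.5) probabilities
    of all outcomes at least as extreme as the observed split."""
    n = n_pos + n_neg
    k = min(n_pos, n_neg)
    pmf = [math.comb(n, i) * 0.5 ** n for i in range(n + 1)]
    # Two-sided: add both tails of the symmetric binomial distribution.
    p = sum(pmf[: k + 1]) + sum(pmf[n - k:])
    return min(p, 1.0)  # guard against double-counting when tails overlap

# Hypothetical: in 10 of 12 matched pairs the LPS male spent more
# time outside the burrow than its control partner.
p = sign_test_p(10, 2)
```

Ties (pairs with no difference) are conventionally dropped before counting, which is why n here is the number of informative pairs.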



The best model indicates that the effect of immune treatment on cricket survival varied between day 1 and days 2 to 7, controlling for variation in recapture probability with immune treatment and study plot. This model obtained three times the support of the second best model (Ratio AICc weights: 0.52/0.17 = 3.1; Table S1). The estimated survival probability on day 1 was lower in experimental males (60%, 95% CI: 37–79%) than in control males (84%, 95% CI: 62–95%); from days 2 to 7 experimental males (94%, 95% CI: 80–98%) and control males (90%, 95% CI: 79–95%) showed similar daily survival (Fig. 2).

Figure 2.

Variation in estimates of survival with immune treatment (experimental males immune-challenged with bacterial lipopolysaccharides [LPS]) and time period. The estimated survival probability on day 1 was lower in experimental males (60%, 95% CI: 37–79%) than in control males (84%, 95% CI: 62–95%); from days 2 to 7 experimental males (94%, 95% CI: 80–98%) and control males (90%, 95% CI: 79–95%) showed similar daily survival.


The reaction time towards the predator-mimicking stimulus differed significantly between males in the two treatments (glmm with Gaussian error distribution and individual as random intercept, t118= 6.40, P= 0.013; covariate pronotum length t118 < 0.001, P= 0.99) (Fig. 3). Immune-challenged males took longer to react (154.1 msec, 95% CI: 139.9–168.3) than control individuals (128.6 msec, 95% CI: 114.5–142.7) (Fig. 3). A male's probability of running the full distance in the sprint speed trial did not depend on the immune treatment (glm with binomial error distribution, with the number of trials that an individual ran the whole distance as dependent variable and the number of trials as binomial denominator [logistic regression], χ21= 0.08, P= 0.78, n= 121). Immune treatment did not affect sprint speed in animals that covered the full measurement distance (control males 0.194 m/sec [95% CI: 0.170–0.226], experimental males 0.189 m/sec [95% CI: 0.165–0.220], t55= 0.26, P= 0.80, covariate pronotum length t55=−1.52, P= 0.13; Supporting information). Thus, the activation state of the immune system did not affect the crickets' running endurance.

Figure 3.

Mean reaction times (in msec) for control and LPS-injected individuals. Reaction time towards the wind stimulus was 128.6 (95% CI: 114.5–142.7) msec and 154.1 (95% CI: 139.9–168.3) msec in control and experimental males, respectively, indicating poorer escape response following immune treatment (t118= 6.40, P= 0.013; controlling for variation in cricket size using pronotum length, t118 < 0.001, P= 0.99). Error bars represent 95% confidence intervals.


All crickets survived (n= 48) in the predator-free plots, whereas at least one cricket was preyed upon in 17 out of 24 trials in the predator plots (Fisher's exact test: P= 0.009). Experimental males were 4.7 times more likely to be preyed upon first as compared to controls (odds ratio = 4.71 [95% CI: 1.33–16.63], Wald test, χ21= 5.80, P= 0.016; Fig. 1B). Thus, immune system activation significantly lowers the probability of survival in the presence of a predator.
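The reported odds ratio and its 95% confidence interval follow the standard Wald construction on the log-odds scale. A sketch of that calculation (the 2 × 2 cell counts below are hypothetical, chosen only to illustrate the machinery, and do not reproduce the paper's data):

```python
import math

def odds_ratio_wald(a, b, c, d, z=1.96):
    """Odds ratio (a*d)/(b*c) for a 2x2 table with cells a, b / c, d,
    plus a Wald 95% CI computed on the log-odds scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts of "eaten first" outcomes by treatment group:
or_, lo, hi = odds_ratio_wald(12, 5, 5, 12)
```

Because the interval is built on the log scale and back-transformed, it is asymmetric around the odds ratio, as in the interval quoted above (4.71, 95% CI: 1.33–16.63).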

In the presence of a predator, control males spent 33% (95% CI: 24–42%) and experimental males 52% (95% CI: 44–60%) of the observed time outside the burrow, indicating a 19% higher exposure time following immune treatment (sign test on 12 matched pairs: P= 0.039; Fig. 1C). There was a tendency that individuals spending proportionally more time outside the burrow were more likely to be eaten first (binomial mixed effects model: z= 1.770, P= 0.077). This association disappeared when adding treatment as predictor variable (for proportion time outside: z= 1.060, P= 0.289; treatment: z= 3.002, P= 0.003). The support for the best model, that is, the model with only treatment, was 1.5 times the support for the second best model, which additionally contained the proportion of time outside the burrow (Ratio AICc weights: 0.6/0.4 = 1.5, Table S3).


We first examined the impact of immune system activation on survival under field conditions. In a previous study, in which predators were excluded from the cricket harborage (the burrow area), all control and all experimental males survived the first 24 h following inoculation (Jacot et al. 2004). In contrast, when predators were not excluded, 20% fewer immune-challenged males survived compared to control males over the first 24 h in our capture–recapture study. After this initial difference in mortality, survival rates were similar over subsequent days, which is again consistent with the results of Jacot et al. (2004) (Supporting information; Fig. 2). Similar to the finding of Eraud et al. (2009), our results suggest a short-term survival cost of immune system activation. That mortality differed over the first day only is consistent with the immediate and temporary activation of the immune system by LPS (Haine et al. 2008). We propose that predation is a factor in this short-term mortality after immune challenge. Using a matched case–control study, we directly tested this hypothesis and found that immune system activation increases the relative risk of death in the presence of a predator. This represents the first empirical evidence for an immunity-mediated cost of predation. The fact that immune system activation measurably reduced survival of males solely under risk of predation indicates that optimal investment in immunity is related to the ecological context in which fitness costs are measured, in particular predation pressure. Other such costs may involve a temporary reduction in calling activity and reduced longevity (Jacot et al. 2004).

As potential mechanisms of the observed immunity-mediated cost of predation, we suggest two traits: one affecting the risk of predator encounter, and the other one affecting the risk of capture by the predator upon encounter. In the outdoor enclosure experiment, experimental males had an almost 20% higher exposure time than control males, thereby increasing their risk of detection by the shrew. Time spent outside was mostly used for sitting in front of the burrow in both treatments. Thermoregulation seems a likely explanation for why immune-challenged individuals sit longer outside. Therefore, we speculate that the prolonged exposure time is related to behavioral fever in response to the immune challenge (Adamo 1998; Blanford et al. 1998; Bundey et al. 2003).

In the laboratory experiment, experimental males also showed a poorer escape response. A slower reaction is likely to reduce the probability of a successful escape in case of a predator attack. The reduced vigilance of immune-challenged males might indicate impaired function of the nervous system. Depleting cricket brains of biogenic amines significantly reduces their responsiveness to a simulated predator attack (Stevenson et al. 2000). Furthermore, studies in honeybees and bumblebees show a negative effect of immune system activation on learning ability (Riddell and Mallon 2006). It remains to be tested whether immune system activation causes a depletion of energy or of specific substances involved in neurotransmission. In contrast, immune treatment did not affect sprint speed in those animals covering the full measurement distance. Our experimental design does not allow us to explain why reaction time and sprint speed appear to be decoupled. Once a cricket is detected by a shrew, it probably cannot escape by running away, and so effects on endurance or other aspects of locomotion would be of little relevance.

High exposure to a predator and reduced vigilance seem to impose a predation-mediated cost of immunity, which might be a component of the previously described predation-mediated costs of parasitism (Hudson et al. 1992; Lefcort and Eiger 1993; Lefcort and Blaustein 1995; Murray et al. 1997). Several nonexclusive scenarios involving different underlying mechanisms and trade-offs may explain why the cost of immune system activation is ultimately expressed as increased predation risk in field crickets. Invertebrates, like vertebrates, increase their metabolic rates during immune system activation. For instance, immune system activation has been shown to be costly in terms of energy expenditure (e.g., Moret and Schmid-Hempel 2000; Råberg et al. 2000; Freitak et al. 2003; Martin et al. 2003; reviewed in Schmid-Hempel 2003, 2005) as well as in its demand for substances, for example, essential amino acids (Peng et al. 2007) and micronutrients (Siva-Jothy and Thompson 2002; Lee et al. 2008; Malafaia et al. 2009). Increased investment in immunological defense against parasitism may make hosts more susceptible to predation.

A resource allocation strategy not only depends on the efficiency of fighting an infection, but also on its consequences for predator avoidance behavior. In the extreme case of infection with a potentially lethal parasite, the cost of not responding immunologically outweighs the fitness costs associated with reduced predator avoidance efficiency. Therefore, the trade-off between antiparasite defense and antipredator defense appears unavoidable under natural conditions. The resource expenditure may come at the cost of physiological and behavioral traits that are important in predator avoidance, as suggested in the present study. Alternatively, exhibiting behavioral fever in response to immune challenge may come at the cost of increased exposure to predators. Future studies addressing the different mechanisms and costs may clarify the importance of these and other scenarios. By linking host–parasite and predator–prey interactions, we conclude that the ecological context induces heterogeneity in optimal immunity between individuals and populations. A cost of immunity relating to predation risk implies that optimal immune defense will differ in relation to predator abundance. This could be tested in an even broader ecological context by using the approach of Rodríguez-Muñoz et al. (2010) combined with experimental manipulation of immunity to quantify the associated fitness costs and benefits.

Associate Editor: A. Read


We thank H. Scheuber for rearing crickets, N. Rossel for assisting with data collection, R. Eggler for building the predator avoidance measurement tunnel, N. Perrin and C. Bouteiller for providing the shrews. We thank M. Siva-Jothy, J. Rolff, K. Reinhardt, A. Dobson, B. Sadd, and T. Fountain for comments on a draft. We are also very grateful for the constructive comments and the suggestions made by two anonymous referees. Research Ethics Approval for use of Greater white-toothed shrews was obtained from the Office for Agriculture of the Canton Berne (notification number 103/01). The study was financially supported by the Swiss National Science Foundation (grant 3100–059223 to MWGB).

Author contributions: OO and MWGB had full access to all of the data in the study and take responsibility for the integrity of the data and the accuracy of data analyses. Study concept and design: OO, MWGB. Acquisition of data: OO, IG-R, AJ. Statistical analysis and interpretation of data: OO, AJ, MWGB. Drafting of the manuscript: OO, MWGB. Critical revision of the manuscript: AJ, IG-R. Obtained funding: MWGB.

The authors declare no conflict of interest.