Virulence evolution in response to anti-infection resistance: toxic food plants can select for virulent parasites of monarch butterflies

Authors


Jacobus C. de Roode, Biology Department, Emory University, 1510 Clifton Road, Atlanta GA 30322, USA.
Tel.: +1 404 727 2340; fax: +1 404 727 2880; e-mail: jderood@emory.edu

Abstract

Host resistance to parasites can come in two main forms: hosts may either reduce the probability of parasite infection (anti-infection resistance) or reduce parasite growth after infection has occurred (anti-growth resistance). Both resistance mechanisms are often imperfect, meaning that they do not fully prevent or clear infections. Theoretical work has suggested that imperfect anti-growth resistance can select for higher parasite virulence by favouring faster-growing and more virulent parasites that overcome this resistance. In contrast, imperfect anti-infection resistance is thought not to select for increased parasite virulence, because it is assumed that it reduces the number of hosts that become infected, but not the fitness of parasites in successfully infected hosts. Here, we develop a theoretical model to show that anti-infection resistance can in fact select for higher virulence when such resistance reduces the effective parasite dose that enters a host. Our model is based on a monarch butterfly–parasite system in which larval food plants confer resistance to the monarch host. We carried out an experiment and showed that this environmental resistance is most likely a form of anti-infection resistance, through which toxic food plants reduce the effective dose of parasites that initiates an infection. We used these results to build a mathematical model to investigate the evolutionary consequences of food plant-induced resistance. Our model shows that when the effective infectious dose is reduced, parasites can compensate by evolving a higher per-parasite growth rate, and consequently a higher intrinsic virulence. Our results are relevant to many insect host–parasite systems, in which larval food plants often confer imperfect anti-infection resistance. Our results also suggest that – for parasites where the infectious dose affects the within-host dynamics – vaccines that reduce the effective infectious dose can select for increased parasite virulence.

Introduction

Hosts have evolved many mechanisms to protect themselves against parasites. For example, hosts may avoid parasite-rich habitats (Hart, 1990; Parker et al., 2010), medicate themselves or their offspring with anti-parasitic substances (Christe et al., 2003; Singer et al., 2009; Lefèvre et al., 2010a) or tolerate parasite infection through compensatory physiological mechanisms (Corby-Harris et al., 2007; Råberg et al., 2007; Baucom & De Roode, 2011). Two defence mechanisms that have received much theoretical attention are anti-infection and anti-growth resistance (Van Baalen, 1998; Gandon & Michalakis, 2000; Gandon et al., 2001; Ganusov & Antia, 2006). Anti-infection resistance (also known as qualitative resistance or avoidance) reduces the probability of infection, whereas anti-growth resistance (also known as quantitative resistance or control) reduces parasite growth or increases parasite clearance upon infection. Anti-infection and anti-growth resistance can be determined by host genes and environmental factors (Brown et al., 2000; Thomas & Blanford, 2003; Cory & Hoover, 2006; De Roode et al., 2008a), but may also be imposed by the use of drugs and vaccines. Hence, there has been much interest in understanding these resistance mechanisms, in part to be able to predict the consequences of drug and vaccine campaigns on the evolution of parasite virulence (i.e. parasite-induced host mortality) (Williams, 2010).

Theoretical models have shown that anti-infection and anti-growth resistance can have drastically different consequences for parasite evolution (Gandon & Michalakis, 2000). In particular, several studies have found that anti-growth resistance selects for parasites with higher intrinsic virulence (Van Baalen, 1998; Gandon & Michalakis, 2000; Gandon et al., 2001), whereas anti-infection resistance does not (Gandon & Michalakis, 2000; Gandon et al., 2001). These models are based on the assumption that parasite evolution is driven by a trade-off between virulence and transmission, such that parasites cannot increase their transmission rate without simultaneously increasing their virulence (e.g. Levin & Pimentel, 1981; Anderson & May, 1982; Van Baalen & Sabelis, 1995a; Frank, 1996; Alizon et al., 2009). The existence of a trade-off has been shown experimentally for myxoma virus in rabbits, HIV in humans and a protozoan parasite of monarch butterflies (Fraser et al., 2007; De Roode et al., 2008b; Bolker et al., 2010). The outcome of such a trade-off is the selection for parasite genotypes with intermediate levels of within-host growth, at which parasite fitness is maximized. In the case of anti-growth resistance, within-host parasite growth is reduced to sub-optimal levels, resulting in lower parasite fitness; as a consequence, parasites evolve higher intrinsic growth rates to overcome this reduction. Parasites selected in such a way would have higher than optimal growth in nonresistant hosts and thereby cause higher virulence. In contrast, anti-infection resistance does not directly reduce within-host parasite growth and therefore does not select for parasites with increased rates of growth and virulence. In fact, anti-infection resistance may reduce the probability of infection and thereby reduce the incidence of mixed-genotype parasite infections in a population. Because it is often assumed that mixed-genotype infections select for higher parasite virulence (Levin & Pimentel, 1981; Bremermann & Pickering, 1983; Nowak & May, 1994; Van Baalen & Sabelis, 1995a; Frank, 1996; De Roode et al., 2005; Alizon & Van Baalen, 2008; Choisy & De Roode, 2010), a reduction in the occurrence of mixed infections may result in the selection for lower parasite virulence (Van Baalen & Sabelis, 1995b; Gandon et al., 2001). Note that this latter expectation is based on situations where parasite genotypes with higher growth rates are both more virulent and more competitive; however, in situations where parasites produce public goods (Brown et al., 2002; West & Buckling, 2003) or chemicals that kill each other (Gardner et al., 2004), a reduction in the occurrence of mixed infections may not lead to decreases in virulence.

The results of the theoretical studies on anti-growth and anti-infection resistance depend critically on the assumption that anti-infection resistance is an all-or-nothing trait: parasites have a certain probability of infecting a host, and when they do, the ensuing infection always results in the same growth, virulence and transmission. However, empirical studies have shown that in many diseases the probability of infection and the ensuing parasite growth, virulence and transmission depend strongly on the number of infectious particles that enter a host (Van Beek et al., 1988; Agnew & Koella, 1999; Ebert et al., 2000; Osnas & Lively, 2004; Brunner et al., 2005; De Roode et al., 2007; Ben-Ami et al., 2008). A host defence mechanism that reduces the initial dose acts before the initiation of infection and can therefore be considered an imperfect anti-infection resistance mechanism. Such a mechanism is likely to trigger an evolutionary response from the parasite. Our goal is to determine whether an increase in such anti-infection resistance selects for parasites with a higher intrinsic virulence.

Our study is based on a monarch butterfly–parasite system, in which resistance can be conferred by the food plants that monarchs consume as caterpillars (De Roode et al., 2008a). Moreover, higher infectious doses result in greater parasite replication and virulence in this system, and parasites vary genetically in their growth and virulence for a given dose (De Roode et al., 2007; De Roode & Altizer, 2010). Our approach relies on both mathematical modelling and an experimental study. We first show experimentally that food plants most likely confer imperfect anti-infection – and not anti-growth – resistance to monarch butterflies. We then combine these results with a recently demonstrated virulence–transmission trade-off in this system (De Roode et al., 2008b) to create a mathematical optimality model and study the consequences of food plant-derived host resistance for virulence evolution.

Experimental materials and methods

The host–parasite system

Monarch butterflies (Danaus plexippus) are commonly infected with the protozoan parasite Ophryocystis elektroscirrha (McLaughlin & Myers, 1970; Leong et al., 1997; Altizer et al., 2000). This parasite forms dormant spores on the outside of adult butterflies, which are scattered onto eggs and food plant leaves during monarch oviposition. Hatching larvae ingest these spores, after which the spores release sporozoites into the monarch midgut. These sporozoites do not replicate within the gut lumen, but invade the hypodermal tissues after which replication takes place. The parasite first replicates asexually and then sexually during larval and pupal stages to form a new generation of dormant spores on the newly emerged adult butterfly. Ophryocystis elektroscirrha replicates heavily in its host, with single-spore infections resulting in millions of spores on the adult butterfly (De Roode et al., 2007), and infections are highly virulent: infected monarchs have reduced pre-adult survival, mating success, lifetime fecundity, adult lifespan and flight ability (Bradley & Altizer, 2005; De Roode et al., 2008b, 2009), and higher spore loads result in greater reductions in host fitness (De Roode et al., 2008b, 2009). Monarch butterflies form a tight association with their milkweed food plants (Ackery & Vane-Wright, 1984), and a recent study has shown that different species of milkweed can confer different levels of anti-parasite resistance (De Roode et al., 2008a). In particular, monarchs reared on Asclepias curassavica (tropical milkweed) suffered lower parasite loads and lived longer as adults than monarchs reared on A. incarnata (swamp milkweed). Although these results clearly demonstrate that A. curassavica confers resistance to monarch hosts, they do not reveal whether this plant species reduces the per-parasite infection probability (a form of anti-infection resistance) or reduces parasite replication upon infection (anti-growth resistance).

Host, parasite and milkweed sources

Monarchs used here were the laboratory-reared grand-offspring of monarchs collected in Miami, FL, in April 2008. The parasite used was a cloned isolate denoted C1F3-P3-1 and was originally obtained from an infected butterfly collected in Miami in 2004. Milkweed seeds of the species A. curassavica and A. incarnata were obtained from Butterfly Encounters, CA, and were germinated and reared to adulthood in a climate-controlled greenhouse. Both milkweed species occur sympatrically with the monarchs and parasite used in this experiment.

Experimental design

We infected monarch butterfly caterpillars with a standard dose of 10 parasites and fed these caterpillars different combinations of the two milkweed species during three periods of monarch larval development time: a pre-infection period, an infection period, and a post-infection period (Fig. 1). Our hypothesis was that in the case of anti-infection resistance A. curassavica would reduce final parasite loads when larvae were fed on this species before or during infection by reducing the probability of infection or by reducing the effective dose of parasites that initiated the infection; in the case of anti-growth resistance, A. curassavica would reduce parasite growth in monarchs fed with this species after parasite infection had occurred.

Figure 1.

 Experimental design. Larval development was divided into three periods: a pre-infection period of 2 days, an infection period of 1 day, and a post-infection period of an average of 9 days. During each period, either Asclepias curassavica (indicated by letter ‘c’ and black bars) or A. incarnata (indicated by letter ‘i’ and grey bars) was provided as food. Monarch development is indicated on the right, with the following life stages shown (from bottom to top): egg, first instar larva, second instar larva, fifth instar larva and pupa (pictures not to scale).

Upon hatching from their eggs, larvae were held in a Petri dish with an A. curassavica or A. incarnata leaf for 2 days (pre-infection period). On the third day (infection period), larvae were transferred to a new Petri dish, in which they were provided with a 0.8-cm-diameter leaf disc of A. curassavica or A. incarnata; to infect larvae with the O. elektroscirrha parasite, we manually added 10 parasite spores to these leaf discs; control larvae were provided with a leaf disc without parasites. Once larvae had completely eaten their leaf disc (within 24 h), they were transferred to 1.34-L plastic tubes, in which they were provided with A. curassavica or A. incarnata cuttings in florist tubes and reared until pupation (post-infection period; an average of 9 days). As shown in Fig. 1, the experiment consisted of eight treatment groups, which differed in the combination of milkweed species fed during each period. Each treatment group consisted of 30 infected and 10 uninfected control larvae.

Larvae were held in a climate-controlled chamber at 26 °C, 50% RH and a 16L:8D light cycle. Six days after pupation – an average of 3 days before adult emergence – pupae were moved to a laboratory maintained under the same conditions to avoid potential contamination of the climate-controlled chamber with parasites derived from emerging adult butterflies. After emergence, adult butterflies were held in glassine envelopes at 12 °C, and their day of death was recorded to determine their adult lifespan. This measure of lifespan provides a combined index of adult monarch lifespan and starvation resistance and responds to parasite infection and increasing parasite numbers in a similar way to lifespan under more natural conditions (De Roode et al., 2009). After death, monarch bodies were vortexed at high speed in 5 mL of water to dislodge parasite spores; spores were then counted using a haemocytometer to determine the number of spores on the infected adult butterfly, a measure we refer to as parasite spore load.

Statistical analysis

We used generalized linear models (GLMs) with a quasibinomial error distribution to compare the proportion of monarchs that survived in inoculated and control groups and to analyse the effect of milkweed species during the pre-infection, infection and post-infection periods on the proportion of monarchs that became infected when inoculated with the parasite. Analysis of variance was used to study the effects of milkweed species during the pre-infection, infection and post-infection periods on parasite spore load on infected monarchs, and on adult lifespan of infected and uninfected monarchs. In all analyses, we first fitted maximal models including all factors and the interactions between them (note that this was not possible in the analysis of proportions, as each treatment group provided only one value, so there were not enough degrees of freedom to test for interactions). Models were then simplified by removing nonsignificant terms (P > 0.05). The significance of remaining terms was determined by term removal followed by model comparison using F-tests (Crawley, 2002). All analyses were carried out in R 2.7.0, and models were checked for homogeneity of variance and normality of errors.
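As an illustrative guide to this analysis pipeline, the sketch below reproduces the spore load ANOVA and the model-simplification step in Python (statsmodels) rather than in R; the data file and column names are hypothetical placeholders for the experimental data, and the quasibinomial GLM on infection proportions is omitted for brevity.

```python
# Hypothetical sketch of the ANOVA and model-simplification procedure
# described above, using Python/statsmodels instead of R. Column names are
# illustrative only: 'spores' = adult spore load; 'pre', 'inf_period' and
# 'post' = milkweed species fed during each larval period.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

infected = pd.read_csv("monarch_infected.csv")  # hypothetical data file

# Maximal model: all main effects and interactions between feeding periods
m_full = smf.ols("spores ~ pre * inf_period * post", data=infected).fit()

# Simplified model: nonsignificant interaction terms removed
m_main = smf.ols("spores ~ pre + inf_period + post", data=infected).fit()

# Significance assessed by term removal followed by an F-test comparison of
# nested models (cf. Crawley, 2002)
print(sm.stats.anova_lm(m_main, m_full))
```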

Experimental results

A total of 232 of 240 (96.7%) inoculated monarchs and 75 of 80 (93.8%) uninfected control monarchs survived to adulthood. These survival rates were not significantly different (F1,14 = 1.12, P = 0.31). The following analyses are restricted to surviving monarchs.

Infection probability

Of the 232 surviving inoculated monarchs, a total of 224 became infected (96.6%). Food plant species had a significant effect on the proportion of monarchs that became infected, but only during the infection period. Specifically, monarchs that were inoculated with parasites on A. curassavica had a lower probability of becoming infected than monarchs inoculated with parasites on A. incarnata (Fig. 2a,b; F1,6 = 7.03, P = 0.038). Overall, 93.9% of A. curassavica-inoculated monarchs became infected versus 99.1% of A. incarnata-inoculated monarchs. In contrast, monarchs fed A. curassavica before infection or after infection were as likely to become infected as monarchs fed A. incarnata during these periods (Fig. 2a,b; pre-infection period: F1,5 = 0.004, P = 0.95; post-infection period: F1,4 = 0.46, P = 0.53).

Figure 2.

 Effects of milkweed species on the proportion of monarchs that became infected (a, b), parasite spore load (c, d) and monarch adult lifespan (e, f). Panels on the left show data for monarchs that were fed Asclepias curassavica post-infection, whereas panels on the right show data for monarchs that were fed A. incarnata post-infection. Black bars show data for monarchs that were fed A. curassavica pre-infection, whereas grey bars show data for monarchs that were fed A. incarnata pre-infection. Following the same notation as in Fig. 1, food-order treatments are indicated in each bar, with ‘c’ indicating A. curassavica and ‘i’ indicating A. incarnata. Error bars in panels c–f show 1 SE.

Parasite spore load

Among monarchs that became infected, parasite spore loads were significantly lower on monarchs that were fed A. curassavica before infection (Fig. 2c,d; F1,221 = 21.7, P < 0.0001) or during infection (Fig. 2c,d; F1,221 = 27.5, P < 0.0001). However, A. curassavica did not reduce parasite spore loads on monarchs that were fed this species after infection had occurred (Fig. 2c,d; F1,220 = 0.11, P = 0.75). There were no significant interactions between pre-infection, infection and post-infection periods. The effects of A. curassavica pre-infection and during infection were additive, such that monarchs reared on A. curassavica pre-infection and infected on A. curassavica suffered lower parasite spore loads than monarchs that received A. curassavica during only one of these periods (Fig. 2c,d).

Monarch adult lifespan (infected monarchs)

Results on monarch adult lifespan mirrored those on parasite spore load: infected monarchs reared on A. curassavica before infection (Fig. 2e,f; F1,221 = 25.6, P < 0.0001) and those infected on A. curassavica (Fig. 2e,f; F1,221 = 26.8, P < 0.0001) lived longer as adults than those reared on A. incarnata pre-infection and those infected on A. incarnata. There was no effect of the plant species fed after infection on monarch adult lifespan (Fig. 2e,f; F1,220 = 0.78, P = 0.38), and none of the interactions between periods were significant.

Monarch adult lifespan (uninfected monarchs)

Overall, uninfected monarchs lived much longer as adults than did infected monarchs (22.3 ± 0.46 days vs. 8.82 ± 0.24 days for uninfected and infected monarchs, respectively; F1,297 = 800, P < 0.0001). Among uninfected control monarchs, milkweed species only had minor effects. None of the main effects of pre-infection, infection and post-infection period were significant, but there was a significant interaction between the infection and post-infection periods (F1,71 = 11.3, P = 0.001). This interaction indicated that monarchs mock-inoculated on A. incarnata lived slightly longer as adults when they were subsequently reared on A. curassavica than when they were subsequently reared on A. incarnata (mean ± SE: 23.17 ± 1.20 vs. 20.89 ± 1.73 days, respectively). Conversely, monarchs mock-inoculated on A. curassavica lived slightly longer as adults when they were subsequently reared on A. incarnata than when they were subsequently reared on A. curassavica (24.37 ± 1.21 vs. 20.89 ± 1.73 days, respectively). We currently have no explanation for these results, but we note that the differences in adult lifespan between uninfected monarchs reared on A. curassavica and A. incarnata are generally small (Lefèvre et al., 2010a) and sometimes do not occur at all (De Roode et al., 2008a).

Summary of experimental findings

The results from the experiment indicate that A. curassavica more likely confers a form of anti-infection resistance to monarch caterpillars than anti-growth resistance. Monarchs suffered a lower probability of infection when exposed to parasites on A. curassavica. Furthermore, among infected monarchs, parasite growth was reduced in monarchs that were reared on A. curassavica before infection or infected on A. curassavica, but parasite growth was not reduced in monarchs that were reared on A. curassavica after infection occurred. Although it is possible that the consumption of A. curassavica before and during infection resulted in anti-growth resistance later in life, it is more parsimonious to conclude that A. curassavica reduced the effective dose of parasites to below the 10 spores that were administered and thereby reduced parasite replication and final spore loads (De Roode et al., 2007). Thus, our results suggest that milkweeds can confer a type of imperfect anti-infection resistance to monarchs through which the parasite’s effective dose is reduced without fully blocking infection.

Theoretical model

We develop a theoretical framework to ask whether this imperfect anti-infection resistance acting through a reduction in initial spore load can select for higher parasite virulence. As stated in the introduction, evolutionary epidemiology models typically rely on a trade-off between parasite transmission and virulence (Van Baalen, 1998; Gandon & Michalakis, 2000; Gandon et al., 2001), and the monarch–parasite system is one of few systems in which such a trade-off has been demonstrated (De Roode et al., 2008b). In particular, De Roode et al. (2008b) showed that higher parasite spore loads result in decreased adult emergence probability and mating probability, both of which reduce the probability that a monarch reaches the egg-laying stage during which parasites are transmitted. However, higher parasite spore loads also increase the probability that spores are transmitted during an oviposition event. Higher spore loads also increase the number of spores transferred during successful transmission events and consequently increase the probability that the offspring caterpillars will become infected. This trade-off between virulence and transmission results in maximum parasite fitness at an intermediate spore load and De Roode et al. (2008b) showed that parasite fitness (ω) can be expressed as a function of parasite spore load on adults (p) as follows:

$$\omega(p) = E(p)\,M(p)\,T(p)\,I(p) \qquad (1)$$

where E(p) is the emergence probability of an infected butterfly, M(p) the mating probability, T(p) the parasite transmission probability and I(p) the probability that the offspring will become infected given the number of spores that are transmitted during a successful transmission event. The optimal parasite spore load, i.e. the spore load value p* that maximizes the function ω, can be derived from eqn 1 using an optimization approach (Otto & Day, 2007) by solving the equation

$$\left.\frac{\mathrm{d}\omega}{\mathrm{d}p}\right|_{p=p^{*}} = 0 \qquad (2)$$

This optimal spore load is associated with an optimal level of virulence for the parasite, because virulence increases with spore load (De Roode et al., 2008b).
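To make the optimality argument concrete, the sketch below maximizes eqn 1 numerically. The functional forms and parameter values used for E, M, T and I are hypothetical placeholders rather than the fitted functions of De Roode et al. (2008b); they only capture the qualitative trade-off, with E and M declining and T and I saturating as spore load increases.

```python
# Minimal sketch of the optimality calculation in eqns 1-2. All functional
# forms and constants below are illustrative placeholders, NOT the fitted
# relationships of De Roode et al. (2008b).
import numpy as np
from scipy.optimize import minimize_scalar

def E(p):  # emergence probability: declines with spore load (virulence cost)
    return np.exp(-p / 5e6)

def M(p):  # mating probability: declines with spore load (virulence cost)
    return np.exp(-p / 8e6)

def T(p):  # transmission probability per oviposition: saturates with spore load
    return p / (p + 1e5)

def I(p):  # probability that offspring become infected: saturates with dose
    return p / (p + 5e4)

def fitness(p):  # eqn 1: omega(p) = E(p) * M(p) * T(p) * I(p)
    return E(p) * M(p) * T(p) * I(p)

# eqn 2: the optimal spore load p* is the maximum of omega(p)
res = minimize_scalar(lambda p: -fitness(p), bounds=(1.0, 1e8), method="bounded")
print(f"optimal spore load p* ~ {res.x:.3g} spores")
```

With these placeholder shapes, the optimum falls at an intermediate spore load, as the trade-off argument requires.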

We build a model to investigate how an increase in the intensity of host plant-induced resistance affects the parasite’s optimal spore load. We summarize the plant effect with a parameter a, which corresponds to the reduction in initial spore load (infectious dose) because of the therapeutic effect of the plant, i.e. the intensity of the plant-derived anti-infection defence. For simplicity, we assume that the plant population is homogeneous, i.e. all plants have the same value of a.

We assume that the dynamics of the parasite density in the host during larval and pupal stages (denoted pL) follow logistic growth:

$$\frac{\mathrm{d}p_L}{\mathrm{d}t} = r\,p_L\left(1-\frac{p_L}{K}\right) \qquad (3)$$

where r is the parasite growth rate and K is the maximum spore load a parasite would produce given unlimited time (set at K = 10^7). However, the parasite will rarely reach the maximum attainable spore load because its growth is constrained by the pre-adult development time τ of its host (an average of 20 days under laboratory conditions). Therefore, once the host reaches adulthood, parasite growth stops. By solving eqn 3, we can write the final spore load in an adult host (i.e. the spore load at time τ, pL(τ)) as:

$$p_L(\tau) = \frac{K\,p_L(0)\,e^{r\tau}}{K + p_L(0)\left(e^{r\tau}-1\right)} \qquad (4)$$

where pL(0) is the initial number of spores on day 0 (i.e. the infectious dose). According to eqn 4, the higher the parasite growth rate, the closer the spore load on the adult infected host will be to K (Fig. 3). The same is true for the initial dose (the higher the dose, the higher the adult spore load).

Figure 3.

 Within-host parasite growth for strains with different growth rates (r = 1.0: solid line; r = 0.75: dashed line; r = 0.6: dotted line). When the host reaches adulthood (i.e. when t = τ), parasite replication stops. Given enough time, the parasite will reach a maximum density of K. Parameter values are pL(0) = 10 spores, K = 10^7 spores and τ = 20 days.
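The within-host dynamics of eqns 3–4 are straightforward to reproduce numerically. The sketch below uses the parameter values quoted in the text (K = 10^7 spores, τ = 20 days, an initial dose of 10 spores) and the illustrative growth rates of Fig. 3, and checks the closed-form solution of eqn 4 against direct integration of eqn 3.

```python
# Sketch of the within-host dynamics (eqns 3-4): numerical integration of the
# logistic ODE compared with its closed-form solution at host adulthood.
import numpy as np
from scipy.integrate import solve_ivp

K, tau, p0 = 1e7, 20.0, 10.0            # parameter values from the text

def adult_load(r, p0=p0, K=K, tau=tau):
    """Closed-form spore load at the end of pre-adult development (eqn 4)."""
    return K * p0 * np.exp(r * tau) / (K + p0 * (np.exp(r * tau) - 1.0))

for r in (1.0, 0.75, 0.6):              # growth rates illustrated in Fig. 3
    # eqn 3: dp/dt = r * p * (1 - p/K), integrated until development ends at tau
    sol = solve_ivp(lambda t, p: r * p * (1 - p / K), (0.0, tau), [p0], rtol=1e-8)
    print(f"r = {r}: numerical = {sol.y[0, -1]:.3g}, closed form = {adult_load(r):.3g}")
```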

To study the evolution of parasite spore load on adults, which is linked to virulence through the function E(p) (De Roode et al., 2008b), we also need to know the relationship between the spore load on an adult butterfly and the number of spores transferred to its offspring (the infectious dose). Denoting the spore load on an adult from generation N by pN and the initial spore load in the same host by pL,N(0), De Roode et al. (2008b) showed that this relationship can be expressed as:

$$p_{L,N+1}(0) = \frac{p_N^{\,b}}{c} \qquad (5)$$

with b = 0.65 and c = 45.41.

Here, we modify this expression to include the food plant effect on the initial dose:

$$p_{L,N+1}(0) = (1-a)\,\frac{p_N^{\,b}}{c} \qquad (6)$$

where a is the fraction of the initial dose lost due to the effects of the food plant. If the plant has no effect on the parasite, then a = 0; in contrast, if the plant confers resistance and reduces the effective infectious dose, then a > 0. Note that when a < 0, the food plant would increase the effective infectious dose and thus be beneficial to the parasite. Combining eqns 4 and 6, we obtain:

$$p_{N+1} = \frac{K\,(1-a)\,p_N^{\,b}\,e^{r\tau}}{cK + (1-a)\,p_N^{\,b}\left(e^{r\tau}-1\right)} \qquad (7)$$

with c ≥ 1 and 0 < b < 1. Given enough host generations, this system will reach an equilibrium, where $p_{N+1} = p_N = \tilde{p}$. The value of $\tilde{p}$ is then obtained numerically by looking for nonzero real solutions of the equation:

$$\tilde{p} = \frac{K\,(1-a)\,\tilde{p}^{\,b}\,e^{r\tau}}{cK + (1-a)\,\tilde{p}^{\,b}\left(e^{r\tau}-1\right)} \qquad (8)$$

Note that $\tilde{p}$ is a function of the parasite replication rate (r) and of the intensity of the plant-conferred host defence (a). Its value lies close to 0 when r is low or a is high, and approaches K when r is high. Importantly, $\tilde{p}$ is not the optimal spore load on an adult butterfly, i.e. the spore load that maximizes parasite fitness, but the spore load on adult butterflies in a population of infected butterflies where population dynamics are at equilibrium. (The optimal spore load (p*) is obtained by maximizing eqn 1.)
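The recursion defined by eqns 5–8 can be solved directly for its nonzero fixed point. The sketch below assumes the power-law dose-transfer form written above for eqn 5 (our reading of the fitted relationship of De Roode et al., 2008b) and finds the equilibrium adult spore load with a simple root search.

```python
# Sketch of the between-generation recursion (eqns 5-8), assuming the
# power-law dose-transfer form p**b / c for eqn 5.
import numpy as np
from scipy.optimize import brentq

K, tau, b, c = 1e7, 20.0, 0.65, 45.41   # parameter values from the text

def next_adult_load(p, r, a):
    dose = (1.0 - a) * p**b / c         # eqns 5-6: effective dose received by offspring
    return K * dose * np.exp(r * tau) / (K + dose * (np.exp(r * tau) - 1.0))  # eqn 4

def equilibrium_load(r, a):
    """Nonzero solution of p = next_adult_load(p, r, a), i.e. eqn 8 (0 if none exists)."""
    g = lambda p: next_adult_load(p, r, a) - p
    try:
        return brentq(g, 1e-3, K)       # look for a sign change between ~0 and K
    except ValueError:                  # no nonzero fixed point: parasite cannot persist
        return 0.0

print(equilibrium_load(r=0.8, a=0.0))   # no plant-conferred resistance
print(equilibrium_load(r=0.8, a=0.5))   # plant removes half of the inoculum
```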

The main contribution of eqn 8 is that it links spore load on adults (p), plant-induced host defence (a) and parasite replication rate (r). By replacing p in eqn 1 with $\tilde{p}$, we can investigate the consequences of increasing a (i.e. increasing the intensity of food plant-induced resistance) on the optimal parasite replication rate r. Figure 4 shows that when a becomes higher (i.e. the food plant confers greater resistance), maximum fitness is attained by parasites with higher intrinsic growth rates. This is because a higher value of a leads to a lower value of $\tilde{p}$, and this can be compensated for by an increase in r. Thus, by decreasing the effective inoculation dose, anti-parasitic plants can select for parasites with higher intrinsic growth rates. In this model, higher growth rates result in higher spore loads (Fig. 3); moreover, De Roode et al. (2008b) showed experimentally that the disease-induced mortality of monarchs always increases with parasite spore load (p). Therefore, our results suggest that anti-parasitic plants select for more virulent parasites. As with imperfect vaccines that block parasite replication (Gandon et al., 2001), it is important to stress that these changes will only be visible if a parasite adapted to hosts feeding on one type of plant infects a host that feeds on another type of plant with a different level of therapeutic effect. This is because the change in replication rate compensates exactly for the loss in replication caused by the plant’s therapeutic effect.

Figure 4.

 Parasite fitness as a function of plant-conferred resistance (a) and parasite replication rate (r). When plants reduce the effective parasite inoculum more strongly (i.e. increasing a), maximum fitness is attained at higher replication rates. Parameter values are K = 10^7 spores, τ = 20 days, b = 0.65 and c = 45.41. Note that some numerical simulations fail to converge when parasite suppression is nearly total (a close to 1).
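Continuing the sketches above (this snippet reuses the placeholder fitness() function and the equilibrium_load() root search, so the numbers are purely illustrative), the logic behind Fig. 4 can be reproduced by scanning, for each level of plant-conferred resistance a, for the replication rate r that maximizes parasite fitness evaluated at the equilibrium spore load.

```python
# Illustrative scan of the fitness-maximizing replication rate against plant
# resistance, reusing fitness() (eqn 1 placeholders) and equilibrium_load()
# (eqn 8 sketch) defined in the previous snippets.
from scipy.optimize import minimize_scalar

def optimal_r(a, r_bounds=(0.1, 3.0)):
    """Replication rate that maximizes fitness at the equilibrium adult spore load."""
    obj = lambda r: -fitness(equilibrium_load(r, a))
    return minimize_scalar(obj, bounds=r_bounds, method="bounded").x

for a in (0.0, 0.25, 0.5, 0.75):
    print(f"a = {a:.2f}: fitness-maximizing r ~ {optimal_r(a):.3f}")
```

Under these placeholder assumptions, the fitness-maximizing r increases with a, mirroring the qualitative pattern of Fig. 4.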

Discussion

Our experiment showed that milkweed species can have important effects on the growth, virulence and transmission of monarch butterfly parasites and that these effects are mediated pre-infection and during infection, but not after infection. A. curassavica reduced the proportion of monarchs that became infected when it was administered to monarchs at the same time as the parasite, suggesting that this milkweed species reduces the probability of parasite infection. Moreover, among successfully infected monarchs, A. curassavica also reduced parasite spore load and virulence in monarchs fed with this species before or during infection. However, parasite growth was not reduced in monarchs that were fed A. curassavica after infection had occurred. Although it is possible that the consumption of A. curassavica before and during infection resulted in anti-growth resistance later in life, we favour the more parsimonious conclusion that A. curassavica reduced the effective dose of parasites that enter the caterpillar tissues rather than their subsequent growth.

There are two ways in which milkweeds may reduce the effective parasite dose. First, A. curassavica may increase overall monarch health or increase the monarch’s immune response such that monarchs are better protected against parasite infection. Several studies have found host diet effects on insect immunity, including haemocyte concentration, phenoloxidase activity and encapsulation ability (Ojala et al., 2005; Lee et al., 2006; Klemola et al., 2007; Bukovinszky et al., 2009; Shikano et al., 2010), and it is possible that a similar effect occurs in this system. Under this scenario, the additive effect of A. curassavica before and during infection could be explained if an overall higher intake of A. curassavica increases parasite resistance. However, it is unlikely that higher host health explains the observed food plant effects on infection. This is because we found no main effect of food plants on the adult longevity of uninfected monarchs, and where we did find an effect (an interaction between infection and post-infection periods), it was inconsistent with the effects among infected monarchs.

Second, A. curassavica may directly interfere with the parasite in the monarch midgut and reduce the number of parasites that can initiate an infection. Direct negative effects of food plant chemicals have been found in many plant-insect systems (Felton & Duffey, 1990; Keating et al., 1990; Hunter & Schultz, 1993; Cory & Hoover, 2006). Previous work (De Roode et al., 2008a) has shown that A. curassavica and A. incarnata differ dramatically in their concentration and composition of cardenolides, toxic secondary chemicals that milkweeds use as defence against generalist herbivores (Malcolm & Brower, 1986; Agrawal & Fishbein, 2006). Moreover, a recent study found that higher concentrations of two nonpolar cardenolides were associated with lower parasite spore load (Lefèvre et al., 2010a), suggesting that these chemicals are toxic to the parasite. Under this scenario, we may expect A. curassavica to have a negative effect on the parasite only during the infection stage and not during the pre-infection stage. However, it is likely that some of the A. curassavica foliage ingested pre-infection is still present in the midgut when caterpillars are exposed to parasites during the infection phase.

Because food plants affect the growth, virulence and transmission of monarch butterfly parasites, they will likely have a crucial impact on parasite evolution. Our theoretical model shows that milkweed-induced resistance may select for parasites with higher intrinsic virulence. The reason for this is intuitive: when the effective infectious dose is reduced, parasites can respond by evolving a higher per-parasite growth rate to compensate for the reduction in final spore load caused by the reduction in initial dose. Monarchs occur in populations worldwide (Ackery & Vane-Wright, 1984) and utilize different species of milkweed in these populations (Woodson, 1954; Malcolm & Brower, 1989). For example, commonly used species in eastern North America are A. incarnata and A. syriaca (Malcolm & Brower, 1989), whereas Australian monarchs are restricted to A. fruticosa and A. curassavica (Oyeyele & Zalucki, 1990). Because milkweed species vary in the amount of resistance they confer to monarchs (De Roode et al., 2008a), we would expect that parasites have evolved higher intrinsic growth rates and virulence in populations where monarchs utilize milkweeds that confer greater resistance. Importantly, monarchs may actively contribute to the selection for more virulent parasites by using anti-parasitic milkweeds as medicine. A recent study has shown that, when given a choice between A. curassavica and A. incarnata, infected butterflies preferentially lay their eggs on A. curassavica and thereby reduce parasite growth in their offspring (Lefèvre et al., 2010a). Our theoretical model shows that such medication may select for parasites with higher intrinsic virulence.

It is important to note that we used our model to explore the evolutionary consequences of anti-infection resistance on parasite growth rate and virulence. However, it is possible that parasites may evolve resistance to therapeutic plants instead of compensating for the loss of growth caused by such plants (Williams, 2010). In that case, we may not expect parasites from different populations to vary in their intrinsic growth rate and virulence. Instead, parasites from populations with anti-parasitic plants would have higher growth rates on plants to which they have evolved resistance, but not on nontoxic plants, on which they would have similar growth rates as parasites from populations with such nontoxic plants.

Our results have important implications for understanding the potential consequences of drug and vaccine use on the evolution of parasite virulence. As reviewed by Williams (2010), there are three main ways in which anti-parasitic treatment can affect parasite evolution: (i) the evolution of drug resistance; (ii) the evolution of strains with antigens that are not recognized by existing vaccines (immune escape); and (iii) the selection for strains that can overcome treatment by increasing their host exploitation rate. Although researchers have traditionally focused on the former two evolutionary outcomes, it is becoming increasingly clear that many pathogens may adapt to drug and vaccine treatment by altering their exploitation rates (Williams, 2010). For example, it has been suggested that anthelminthic drugs may select for bigger and more fecund worms (Leignel & Cabaret, 2001) and that anti-malarial drugs may select for more virulent parasites (Schneider et al., 2008).

The evolution of altered growth and virulence is likely to occur when anti-parasitic treatment reduces parasite growth without completely clearing infections, as is the case for sub-curative drug treatment as well as ‘imperfect’ vaccines (Anderson & May, 1991). Imperfect vaccines have received much attention recently, and several authors have shown that such vaccines may strongly shape the evolution of parasite virulence: vaccines that reduce parasite growth (i.e. anti-growth resistance) select for higher intrinsic virulence, but vaccines that reduce the probability of infection (i.e. anti-infection resistance) do not (Van Baalen, 1998; Gandon et al., 2001, 2003; Mackinnon et al., 2008). The explanation for these effects is that anti-growth vaccines directly reduce parasite fitness and thereby select for parasites that can overcome this reduction in growth. In contrast, several models predict that anti-infection resistance could actually favour less virulent parasites. This is because, by reducing the prevalence of the parasite, such resistance also reduces the prevalence of multiple infections, which are thought to select for more virulent strains (Levin & Pimentel, 1981; Bremermann & Pickering, 1983; Nowak & May, 1994; Van Baalen & Sabelis, 1995a,b; Frank, 1996; De Roode et al., 2005; Alizon & Van Baalen, 2008; Choisy & De Roode, 2010). For instance, competition experiments show that Plasmodium strains that transmit best from a co-infected host are also those that are the most virulent (De Roode et al., 2005; Bell et al., 2006). A key assumption of existing models, however, is that anti-infection resistance is an all-or-nothing trait, whereby the probability of infection is reduced, but not the parasite growth, transmission and virulence upon successful infection. Yet anti-infection resistance – whether conferred by genes, drugs, vaccines or food – does not have to be an all-or-nothing mechanism, but may instead reduce the effective infectious dose of the parasite. When this happens, our model shows that parasites may be selected to compensate for this reduction by increasing their per-parasite growth rate and virulence.

Our results emphasize that the exact details of host–parasite interactions need to be understood to make accurate predictions on the effects of anti-parasite resistance on virulence evolution (Ganusov & Antia, 2003, 2006). Pathogens vary widely in the infectious dose they need to initiate an infection (Schmid-Hempel & Frank, 2007), but the relationships between infectious dose and disease severity are still poorly understood (Schmid-Hempel, 2009). Although theoretical models generally assume that diseases such as malaria and influenza are not subject to dose dependence, there is some evidence that higher infectious doses may in fact result in greater parasite growth and virulence in these diseases (McElroy et al., 1997; Paulo et al., 2010). A recent model showed that incorporating such dose dependence allowed for a better understanding of the huge mortality caused by the 1918 flu (Paulo et al., 2010), and as we show here, dose dependence may also contribute to virulence evolution in response to anti-infection immunity.

One challenge in determining the effects of anti-infection resistance will be to combine parasite dynamics at the within-host and population levels. For instance, the processes of anti-infection resistance and multiple infections are likely to operate simultaneously and thereby cause conflicting selection for higher and lower virulence. We also assumed that hosts did not vary in their genetic resistance and that the plant population was homogeneous with respect to the plant-conferred levels of resistance. This is certainly an oversimplification, as monarchs vary genetically in their resistance (Lefèvre et al., 2010b) and milkweed individuals vary in their anti-parasitic effects (Lefèvre et al., 2010a). Adding heterogeneity would not affect the results qualitatively, but it would impact the value of the parasite’s optimal growth rate. One way to deal with these complexities would be to use the recent Price equation framework, which combines epidemiology and population genetics. This framework has been used to follow transitory dynamics (i.e. short-term evolution) of virulence (Day & Proulx, 2004) and can also be used to study the effects of host heterogeneity (Gandon & Day, 2009). Such a framework may be especially useful for tri-trophic interactions – such as that between monarchs, their parasites and their food plants – where changes in any of these partners could have unexpected effects on short-term evolution.

Acknowledgments

We thank Rachel Rarick for help with the experiment and Sylvain Gandon, Andy Gardner, Thierry Lefèvre, Yannis Michalakis and an anonymous reviewer for constructive comments on the manuscript. Funding was provided by Emory University and National Science Foundation grant DEB-1019746 to JCdR. CLFdC was supported by Emory College’s Scholarly Inquiry and Research at Emory (SIRE) programme.
