Epidemiological trade‐off between intra‐ and interannual scales in the evolution of aggressiveness in a local plant pathogen population

Abstract The efficiency of plant resistance to fungal pathogen populations is expected to decrease over time, due to their evolution with an increase in the frequency of virulent or highly aggressive strains. These dynamics may differ depending on the scale investigated (annual or pluriannual), particularly for annual crop pathogens with both sexual and asexual reproduction cycles. We assessed this time-scale effect by comparing aggressiveness changes in a local Zymoseptoria tritici population over an 8-month cropping season and a 6-year period of wheat monoculture. We collected two pairs of subpopulations to represent the annual and pluriannual scales: from leaf lesions at the beginning and end of a single annual epidemic and from crop debris at the beginning and end of a 6-year period. We assessed two aggressiveness traits—latent period and lesion size—on sympatric and allopatric host varieties. A trend toward a decreased latent period, concomitant with a significant loss of variability, was established during the course of the annual epidemic, but not over the 6-year period. Furthermore, a significant cultivar effect (sympatric vs. allopatric) on the average aggressiveness of the isolates revealed host adaptation, arguing that the observed patterns could result from selection. We thus provide experimental evidence of an epidemiological trade-off between the intra- and interannual scales in the evolution of aggressiveness in a local plant pathogen population. More aggressive isolates were collected from upper leaves, on which disease severity is usually lower than on the lower parts of the plants left in the field as crop debris after harvest. We suggest that these isolates play little role in sexual reproduction, due to an Allee effect (difficulty finding mates at low pathogen densities), particularly as the upper parts of the plant are removed from the field, explaining the lack of transmission of increases in aggressiveness between epidemics.


| INTRODUCTION
Understanding how quickly plant pathogen populations respond to the deployment of host resistance in agrosystems is a real challenge.
The role of spatial and temporal variation in host and pathogen life-history traits in the evolutionary trajectories of plant pathogens remains poorly understood (Barrett, Thrall, Burdon, & Linde, 2008), and we still have few empirical data concerning the ways in which interactions between epidemiological and evolutionary processes influence the generation and maintenance of such variation in host-pathogen interactions (Tack, Thrall, Barrett, Burdon, & Laine, 2012).
Furthermore, the multidimensional nature of phenotypic adaptation is a key component of evolutionary biology. Common approaches focusing on single traits rather than multiple-trait combinations therefore probably limit our understanding of the adaptive value of a species (Laughlin & Messier, 2015).
Pathogenicity is defined as the ability of a plant pathogen to cause disease, that is, to damage a plant host. Virulence is the qualitative component of pathogenicity which allows a plant pathogen strain to infect a recognized susceptible host; it is largely shaped by host-pathogen interactions in accordance with the gene-for-gene model (Flor, 1971). Plant pathologists use the term "aggressiveness" to describe the quantitative variation in pathogenicity on a susceptible host (Lannou, 2012; Shaner, Stromberg, Lacy, Barker, & Pirone, 1992), which has recently been suggested to be conditioned also by minor-gene-for-minor-gene interactions (Niks, Qi, & Marcel, 2015).
The efficiency of plant resistance tends to decrease over time, due to the evolution of pathogen populations, with an increase in the frequency of virulent or highly aggressive strains (Geiger & Heun, 1989; Lo, van den Bosch, & Paveley, 2012). A breakdown of qualitative resistance due to a matching increase in pathogen virulence has been observed at different spatiotemporal scales: in a single field after a few infection cycles (Alexander, Groth, & Roelfs, 1985; Newton & McGurk, 1991) and in a landscape over several years in response to the deployment of host resistance genes (Hovmøller, Munk, & Østergård, 1993; Kolmer, 2002; Rouxel et al., 2003). So-called boom-and-bust cycles are typical of rapid selection for particular virulence genes in a pathogen population corresponding to resistance genes present in a local host population (Mundt, 2014).
Only a few experiments have addressed the issue of changes in the aggressiveness of pathogen populations over large time scales, due to the complex nature of the relationship between evolution of aggressiveness and evolution of virulence. In the Puccinia triticina-Triticum aestivum pathosystem, the dominance of a single pathotype is explained not only by its virulence on its sympatric host cultivar, but also by its greater aggressiveness (Pariaud, Goyeau, Halkett, Robert, & Lannou, 2012). In the wild Melampsora lini-Linum marginale pathosystem, a trade-off between the qualitative and quantitative components of pathogenicity may play a key role in generating local adaptation (Thrall & Burdon, 2003). This trade-off may also explain the inconsistency of results for the evolution of aggressiveness following the selection of Zymoseptoria tritici on susceptible versus moderately resistant wheat cultivars (Ahmed, Mundt, Hoffer, & Coakley, 1996; Cowger & Mundt, 2002).
Investigations of changes in pathogenicity have paid much less attention to aggressiveness itself, and the most significant changes have been established over the course of an annual epidemic in field experiments with partially resistant cultivars or cultivar mixtures (Caffier et al., 2016; Delmas et al., 2016; Montarry, Glais, Corbière, & Andrivon, 2008; Newton & McGurk, 1991). In several cases, evolution of aggressiveness was found to be independent of the host genetic background or of the virulence genes present in the pathogen population (Villaréal & Lannou, 2000). In pea, changes in the aggressiveness of Didymella pinodes populations over the course of an annual epidemic differ between winter and spring crops (Laloi et al., 2016), highlighting the potentially complex influences on selection of the cropping system, climatic conditions, and epidemiological processes, depending on the nature of the inoculum.
Evolution of aggressiveness due to selection over the course of an annual epidemic may be too weak to be empirically detected, relative to shifts in pathogenicity occurring at larger temporal scales (Miller, Hamm, & Johnson, 1997). Moreover, there may be no selection for aggressiveness traits over small spatiotemporal scales, or this selection may be less intense in natural conditions, for instance, due to genetic trade-offs between aggressiveness traits (Laine & Barrès, 2013). Alternatively, selection may be negligible relative to other antagonistic evolutionary forces (e.g., gene flow due to alloinoculum; Laloi et al., 2016; McDonald, Mundt, & Chen, 1996).
Selection for greater fitness in pathogen populations is known to be increased by within-host competition among pathogen strains (Zhan & McDonald, 2013), while acknowledging that the more aggressive strains may not always be the more competitive. Co-inoculating a host with a synthetic pathogen population consisting of isolates differing strongly in aggressiveness and then assessing the competition between these isolates is a convenient way to exemplify such selection (Pariaud, Robert, Goyeau, & Lannou, 2009; Zhan & McDonald, 2013). Several studies have shown that the adaptation of plant pathogens, in terms of aggressiveness traits, can occur after repeated cycling on the same host. "Serial-passage competition experiments" were designed, with the inoculation of a host plant with a pathogen population, followed by the inoculation of a new set of plants with the offspring of the initial pathogen population, repeated over several cycles: rearing a heterogeneous population of Puccinia graminis f. sp. avenae separately on two different oat genotypes for seven asexual generations caused the mean infection efficiency of the population to increase by 10%-15% by the end of the experiment on the host on which it had been maintained, but not on the other host (Leonard, 1969). In similar "artificial selection experiments," the use of only the subset of the pathogen population with the highest virulence or aggressiveness to inoculate the next generation of host plants resulted in a shortening of the latent period of an asexual P. triticina population after five generations (Lehman & Shaner, 1997). Some of the epidemiological processes driving selection within a pathogen population can act during the interepidemic period, partly because the local host population might change (cultivar rotation), so the time scale considered when investigating the evolution of aggressiveness is crucial.
However, it is rarely taken into account explicitly: "Most empirical studies have replicated sampling across space rather than through time, based on the argument that assessment across multiple populations in space provides a reasonable surrogate for variation through time" (Tack et al., 2012).
Selection for greater aggressiveness during an epidemic period may be followed by reciprocal counter-selection during the subsequent interepidemic period. Greater aggressiveness may impede interepidemic transmission, by limiting the persistence of the host organs on which the pathogen survives (e.g., potato tubers for P. infestans; Mariette et al., 2015; Pasco, Montarry, Marquer, & Andrivon, 2015) or by decreasing the ability to reproduce sexually (Abang et al., 2006; Sommerhalder, McDonald, Mascher, & Zhan, 2011; Suffert, Ravigné, & Sache, 2015; Suffert et al., 2016). The empirical detection of trade-off relationships between intra-epidemic multiplication and interepidemic transmission in agrosystems is challenging (Laine & Barrès, 2013): At least two different, nested selective dynamics act over two different time scales (annual and pluriannual) under common environmental conditions (same location, same host population) and have yet to be characterized.
The goal of this study was to test the hypothesis of a trade-off relationship between intra- and interepidemic evolutionary dynamics.
We therefore investigated changes in the aggressiveness traits of a field population of Z. tritici at the annual and pluriannual scales. This fungus causes recurrent epidemics of Septoria tritici blotch on wheat and has a dual sexual-asexual reproduction cycle. At relevant dates characterizing the annual and pluriannual scales, two pairs of Z. tritici subpopulations were sampled from a field in which wheat had been grown for several years. The intensity of intra- and interepidemic evolutionary dynamics was investigated by assessing the aggressiveness traits of the fungal isolates in planta. We quantified temporal changes in the between-isolate variance within each subpopulation, as this variance is expected to decrease under selection. We began by characterizing the epidemiological context in two different ways: We assessed disease variables reflecting the "pathogen pressure" at different dates characterizing key epidemiological periods, to estimate the temporal continuity in disease dynamics at the annual and pluriannual scales; we also assessed the aggressiveness of the isolates (lesion size and latent period) on both "sympatric" and "allopatric" host cultivars, for isolates sampled early and late over the time course of the experiment, for the detection of local host adaptation patterns.

| Host-pathogen system
During the plant-growing season, Z. tritici is clonally propagated by asexual pycnidiospores (conidia), which are splash-dispersed upwards, from leaf to leaf. The rate of spread of the epidemic is determined by the number of asexual, embedded infection cycles.
Wind-dispersed sexual ascospores, mostly produced on wheat debris during the period between crops, initiate the next epidemic (Suffert, Sache, & Lannou, 2011). Recombination maintains high levels of genetic diversity in Z. tritici populations. Selection for both virulence and aggressiveness on wheat cultivars leads to adaptation to the predominant host genotypes (Ahmed, Mundt, & Coakley, 1995; Ahmed et al., 1996; Cowger & Mundt, 2002; McDonald et al., 1996). In our study, Soissons was referred to as the "sympatric" host cultivar for the tested pathogen isolates because they were directly sourced from it. As a once-predominant cultivar now in decline, Soissons was assumed to have played a major role in the overall evolutionary trajectory of pathogen populations in France. Soissons played a specific role in the experimental study area because it was grown there in monoculture for 8 years. Apache was considered to be an "allopatric" host cultivar. It partially replaced Soissons as the predominant cultivar in France, but probably played a less important role than Soissons in the evolutionary trajectory of the local pathogen population, with potentially some isolates immigrating from commercial fields located around the field plot. We therefore considered the most likely origins of the local pathogen subpopulations to be, firstly, cv. Soissons, and, secondly, cv. Apache.

| Disease dynamics over ten years
The temporal dynamics of pathogen pressure in the wheat monoculture plot were characterized from 2008 to 2016 with five quantitative disease variables assessed under field conditions: the amount of primary inoculum at the onset of the epidemic, the earliness of attack, winter disease severity at the late tillering stage, early-spring disease severity during the stem extension stage, and late-spring disease severity during the grain-filling period (see complete definitions in Figure 3). The overall continuity in disease dynamics was investigated (i) at the intra-epidemic scale, by assessing correlations between variables recorded during a single annual epidemic; and (ii) at the interepidemic scale, by assessing correlations between variables (the same or different) recorded during two successive epidemics.
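The normalization and the overall pathogen-pressure index described above (and defined in Figure 3) can be sketched as follows. This is a minimal illustration with made-up annual values; the variable and function names are ours, not the study's:

```python
def min_max_normalize(values):
    """Scale a series of annual values to the range 0-1
    (lowest annual value -> 0, highest -> 1)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Hypothetical raw values for the 2008-2016 epidemics (one entry per year)
variables = {
    "earliness_of_attack":   [10, 25, 18, 30, 12, 22, 15, 27, 20],
    "winter_severity":       [3.1, 7.4, 5.0, 2.2, 8.9, 6.5, 7.0, 6.1, 5.8],
    "early_spring_severity": [12, 30, 21, 9, 35, 28, 30, 25, 24],
    "late_spring_severity":  [40, 65, 50, 30, 70, 60, 62, 55, 52],
    "primary_inoculum":      [150, 900, 400, 80, 1200, 700, 800, 600, 500],
}

normalized = {name: min_max_normalize(vals) for name, vals in variables.items()}

# Overall index = mean of the five normalized variables for each epidemic period,
# used as a proxy for annual pathogen pressure
n_years = len(next(iter(normalized.values())))
overall_index = [
    sum(norm[year] for norm in normalized.values()) / len(normalized)
    for year in range(n_years)
]
```

Each variable is scaled independently over the whole study period, so a value of 1 marks the year with, for example, the highest winter severity, and the overall index stays within 0-1 by construction.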

| Assessment of aggressiveness traits
The aggressiveness of the 60 isolates was assessed on adult plants.

| Data analysis
The aggressiveness traits of each pair of pathogen subpopulations were assessed in a separate glasshouse trial, by two different people.
This assessment is assessor-dependent and influenced by environmental conditions, so data from the two trials could not be pooled.
A nested ANOVA was used to assess differences between the two subpopulations of each pair, for each aggressiveness trait (maximum lesion size and latent period). For the pair of conidial subpopulations, we considered subpopulation (initial Ci-2009, final Cf-2010) and cultivar (Soissons, Apache) as fixed effects, isolate as a random factor nested within subpopulation, and their interactions. For the pair of ascospore-derived subpopulations, we considered subpopulation (initial Ai-2009, final Af-2015) and leaf layer as fixed effects, isolate as a random factor nested within subpopulation, and their interactions. We determined whether the between-isolate variance of lesion size and latent period on cv. Soissons decreased significantly between Ci-2009 and Cf-2010 and between Ai-2009 and Af-2015 by calculating the null distribution of the ratio of variances in permutation tests (100,000 permutations). ANOVA was performed using the S-PLUS 6.0 software (Lucent Technologies, Inc.) and permutation tests using the R software.

FIGURE 3 Normalized disease variables used to quantify Zymoseptoria tritici pathogen pressure in the wheat monoculture plot from 2008 to 2016. Each variable was normalized to give a value in the range 0-1 (from the lowest to the highest annual value calculated over the 2008-2016 period). An overall index (black circles), calculated as the mean of the five variables for each epidemic period, was used as a proxy for annual pathogen pressure. The "earliness of attack" (blue diamonds) corresponds to the date on which the epidemic reached a threshold intensity (mean of the date on which the proportion of diseased plants reached 5% and the date on which the mean disease severity for leaf layer L1 reached 20%): 0 = earliest date, 1 = latest date. "Winter disease severity" at the late tillering stage (light green diamonds) corresponds to the mean disease severity assessed on leaf layer L4 from mid-January to mid-February. "Early-spring disease severity" during stem extension (dark green diamonds) corresponds to the mean disease severity assessed on leaf layers L5 and L6 from mid-March to mid-April. "Late-spring disease severity" during the grain-filling period (yellow diamonds) corresponds to the mean disease severity assessed on leaf layer F2 in early June. The "amount of primary inoculum" available at the end of the epidemic period (red diamonds) corresponds to the mean number of ascospores collected from 1 g of wheat debris (mean of the numbers of ascospores collected in mid-October and in mid-November): 0 = lowest primary inoculum pressure, 1 = highest primary inoculum pressure. The circled numbers correspond to discontinuities in pathogen pressure.
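The permutation test on the ratio of between-isolate variances can be sketched as follows. This is a Python sketch rather than the R code actually used; the latent-period data are simulated, and the 100,000 permutations are reduced to 10,000 to keep the example quick:

```python
import random
import statistics

def variance_ratio_permutation_test(initial, final, n_perm=10000, seed=42):
    """One-sided permutation test for a decrease in variance between an
    initial and a final subpopulation (H1: var(final) < var(initial)).
    Returns the observed variance ratio and its permutation p-value."""
    rng = random.Random(seed)
    observed = statistics.variance(final) / statistics.variance(initial)
    pooled = list(initial) + list(final)
    n_initial = len(initial)
    count = 0
    for _ in range(n_perm):
        # Reassign phenotypes to the two subpopulations at random
        rng.shuffle(pooled)
        ratio = (statistics.variance(pooled[n_initial:])
                 / statistics.variance(pooled[:n_initial]))
        if ratio <= observed:  # as extreme as, or more extreme than, observed
            count += 1
    return observed, count / n_perm

# Simulated latent periods (ddpi): the final subpopulation is less variable
initial = [390, 405, 420, 435, 450, 465, 480, 400, 445, 470]
final = [405, 408, 410, 412, 415, 407, 411, 409, 413, 406]
obs_ratio, p_value = variance_ratio_permutation_test(initial, final)
```

The null distribution of the variance ratio is built by randomly reallocating the pooled phenotypes to the two subpopulations, so a small p-value indicates that the observed loss of variability is unlikely under random sampling alone.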

| Correlation between disease variables at different time scales
The correlation between the earliness of attack and the amount of primary inoculum available at the end of the previous epidemic period was clearly positive but not statistically significant (ρ = .505 with p > .1; Table 1): The higher the number of ascospores discharged from wheat debris in the fall, the earlier the first symptoms appeared after seedling emergence, consistent with previous experimental results. The correlation between earliness of attack and disease severity in the current season was positive, whatever the period of severity assessment (ρ = .407 with p > .1 for winter disease severity; ρ = .491 with p > .1 for early-spring disease severity; ρ = .663 with p < .1 for late-spring disease severity): The earlier the onset of the first symptoms, the higher the disease severity. The correlation between disease severity assessed on two dates within the same epidemic period was highly positive and statistically significant (ρ = .761 with p < .05 and ρ = .831 with p < .01), consistent with the generally accepted view that Septoria tritici blotch severity is proportional to the intensity of secondary infections, driven by pycnidiospores splash-dispersed from existing, sporulating lesions.
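The Spearman rank correlations reported here can be computed as follows. This is a pure-Python sketch assuming no tied values; the yearly data are illustrative, not the study's:

```python
def spearman_rho(x, y):
    """Spearman's rank correlation coefficient (assumes no ties)."""
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for pos, i in enumerate(order, start=1):
            r[i] = pos
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d_squared = sum((a - b) ** 2 for a, b in zip(rx, ry))
    # Classical formula for untied ranks
    return 1 - 6 * d_squared / (n * (n ** 2 - 1))

# Illustrative example: primary inoculum in the fall vs. earliness of the
# attack in the subsequent epidemic (hypothetical values for eight seasons)
inoculum = [150, 900, 400, 80, 1200, 700, 800, 600]
earliness = [0.2, 0.9, 0.5, 0.1, 1.0, 0.6, 0.7, 0.65]
rho = spearman_rho(inoculum, earliness)
```

Because the coefficient is computed on ranks, it captures the monotone relationships described in the text (e.g., more ascospores in the fall, earlier attack) without assuming linearity.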
The correlation between disease severity and the amount of primary inoculum available for the subsequent epidemic period was positive but not statistically significant (p > .1). Higher correlation coefficients were obtained for earlier assessments of disease severity during the epidemic period (ρ = .427 for winter disease severity; ρ = .209 for early-spring disease severity; ρ = .127 for late-spring disease severity; Table 1). This is consistent with the hypothesis that sexual reproduction is a density-dependent process (positively correlated with the density of lesions, that is, disease severity; Suffert, unpubl. data) probably occurring toward the base of the plants, where the proportion of mature Z. tritici pseudothecia among the overall fruiting bodies (pycnidia and pseudothecia) is systematically higher than on the upper part of plants over the course of an epidemic (Eriksen & Munk, 2003). This finding is also supported by previous experimental results showing that ascospore production is generally greatest after the most severe previous epidemics (Cowger & Mundt, 2002).

TABLE 1 Intra- and interannual Spearman's rank correlation coefficients (ρ) for the relationships between five disease variables (see definitions in Figure 3) characterizing the pathogen pressure in the wheat monoculture plot from 2008 to 2016. Correlations were assessed for the same and different variables from two successive annual epidemics and for different variables from the same annual epidemic. ρ values higher than .50 appear in bold, and significant ρ values are indicated by ***(p < .01), **(p < .05), and *(p < .1). a: Available for the subsequent epidemic period.

The distribution of the coefficients of correlation between different disease variables from a single annual epidemic was compared with that from two successive annual epidemics (Figure 5; Table 1). The overall temporal continuity of pathogen pressure was tighter between successive epidemiological stages than between identical epidemiological stages from two successive annual epidemics.
We defined a "significant discontinuity" in pathogen pressure as spring intra-epidemic period, with an early attack followed by moderate winter disease severity, due to weather conditions not conducive to disease development (low rainfall, low temperature; point ② in Figure 3); (iii) during the 2011-2012 late-winter intra-epidemic period, with very small amounts of primary inoculum followed by a very high winter disease severity, due to weather conditions conducive to disease development (high rainfall and temperature; point ③ in Figure 3). The discontinuity between late-spring disease severity in 2013 and the amount of primary inoculum at the beginning of the subsequent epidemic (point ④ in Figure 3) was not considered "significant" as previously defined, because the two disease variables were not positively correlated during the 2008-2016 period (Table 1). From 2012 to 2016, pathogen pressure remained high and no significant discontinuity was found.

FIGURE 5 Distribution of the coefficients of correlation between different disease variables from a single annual epidemic (black bars; intra-epidemic scale) and between different disease variables from two successive annual epidemics (white bars; interepidemic scale). X-axis: classes of Spearman's rank correlation coefficients.

| Subpopulation and cultivar effects on aggressiveness traits
Between-isolate variability was high for both latent period and lesion size, revealing a high level of phenotypic diversity in the four pathogen subpopulations: In the conidial and ascospore-derived subpopulations, lesion size (p = .052 and p < .001, respectively) and latent period (p < .001) differed significantly between isolates (Tables 2 and 3).
No difference in between-isolate variance (Vg) was detected between subpopulations for lesion size. Latent period was shorter in Cf-2010 than in Ci-2009 (409.1 ddpi vs. 428.0 ddpi; data not shown), although the difference cannot be considered statistically significant (p = .087; Table 2); there was no difference in lesion size between these two subpopulations (p = .881).
Lesion size was larger in Af-2015 than in Ai-2009 (61.9% vs. 58.0%), although this difference also cannot be considered statistically significant (p = .090; Table 3). No significant difference in latent period was detected (p = .281). The cultivar effect was significant only for latent period, which was longer on the allopatric cultivar than on the sympatric cultivar (491.4 ddpi in Apache vs. 426.3 ddpi in Soissons; p < .001; Table 3).

| DISCUSSION
The correlation between disease variables assessed at different epidemiological time scales revealed intra-and interannual continuity in pathogen pressure, consistent with current knowledge concerning the epidemiology of Septoria tritici blotch. The three significant discontinuities identified during the 6-year period may have been due to environmental factors, such as weather conditions in particular.
The significant effect of cultivar on the latent period of one of the two pairs of Z. tritici subpopulations may reflect the difference in resistance between the two cultivars. This assumption is, however, invalidated by results obtained in similar experimental conditions (same study area, same cultivars tested). Rather, it may indicate local host adaptation (better performance on the "local" than on the "foreign" host): Considering the latent period, after several years of monoculture, the resident pathogen population becomes more adapted to the sympatric host. This is consistent with the differential adaptation of resident and immigrant Z. tritici subpopulations to wheat cultivars previously established in the same study area and elsewhere for the same pathosystem (Ahmed et al., 1995, 1996), and, more generally, with evolutionary concepts (Gandon & van Zandt, 1998; Kawecki & Ebert, 2004; Tack et al., 2012).
Considering lesion size, this conclusion must, however, be qualified by the gain of adaptation observed to the allopatric host (only) over the 6-year period, which can be considered as evidence of maladaptation (worse performance on the "local" than on the "foreign" host; Kaltz, Gandon, Michalakis, & Shykoff, 1999). This result is also consistent with previously reported findings (Zhan & McDonald, 2013). However, the results of Ahmed et al. (1996) and Cowger and Mundt (2002) concerning selection for higher levels of aggressiveness on susceptible versus moderately resistant wheat cultivars were inconsistent, possibly because of an artifact due to a genetic trade-off between virulence and aggressiveness (Zhan, Mundt, Hoffer, & McDonald, 2002).
Finally, the overall temporal continuity in disease development over the 6-year period and the evidence of local host adaptation (winter conditions; Suffert et al., 2015) suggest that aggressiveness increased over the course of a single annual epidemic, although we could not demonstrate this formally. The trend observed during a single year results from differential selection effects that would have needed to be maintained over several years to produce more significant effects. This evolution reflects a pattern of adaptation, interpreted as the outcome of short-term selection driven by seasonal environmental conditions. Our interpretation is supported by the significant decrease in the between-isolate variance for latent period at the intra-annual scale, compared with the stability of this variance at the interannual scale. McDonald et al. (1996) previously suggested that selection affects the genetic structure of Z. tritici populations, but they found no experimental evidence for adaptation to any of the host genotypes over the growing season. However, the neutral genetic markers used in their study were not appropriate for this purpose; the use of markers of aggressiveness or the phenotyping of isolates would have been more relevant approaches. Moreover, the sample size was too small to allow the detection of a change in the frequency of pathogen genotypes: The probability of detecting the same clone several times is very low, given the high diversity of the population. Finally, with the experimental design used by McDonald et al. (1996), it was not possible to exclude the possibility of sexual reproduction during the growing season (Duvivier, 2015), which might conceal the effects of short-term selection.
By contrast, no difference in aggressiveness (latent period or lesion size) was found between the initial and final ascospore-derived subpopulations. Selection led to a short-term increase in the aggressiveness of the pathogen subpopulation primarily responsible for secondary infections (Suffert et al., 2015) at the annual scale, with no significant impact at the pluriannual scale. Highly aggressive strains could be selected in the pathogen population after only a few embedded asexual multiplication cycles, because sexual reproduction plays a lesser role in disease development during the intra-epidemic period than during the interepidemic period. The impact of such intra-annual selection was nullified at the beginning of the next epidemic, probably because sexual reproduction played a crucial role during the early epidemic stages, in which ascospores are the main form of primary inoculum.
Using a field design connecting the epidemic and interepidemic periods (Abang et al., 2006; Laine & Barrès, 2013; Pasco et al., 2015; Sommerhalder et al., 2011; Susi & Laine, 2013; Papaïx, Burdon, Lannou, & Thrall, 2014; Papaïx, Goyeau, du Cheyron, Monod, & Lannou, 2011). It should be noted that assessments of the aggressiveness of the isolates on the allopatric host (cv. Apache) did not highlight this trade-off. Similar apparently inconsistent results obtained for cultivar mixtures were interpreted as an expression of disruptive selection affecting the evolution of Z. tritici populations (Mundt et al., 1999). The aggressiveness of the pathogen population per se should therefore not be considered independently of the nature and level of host resistance.
The strength of the difference in the evolution of aggressiveness between the intra- and interepidemic periods is probably determined by the balance between processes leading to selection over the course of a single annual epidemic and processes hampering interepidemic transmission. It is clearly challenging to assess this balance in field conditions. In our study area, during a growing season conducive to disease (2009-2010), the pathogen probably completed six asexual infection cycles. We therefore compared the effects of these six asexual reproduction cycles with those of six sexual reproduction cycles over the 6-year duration of the whole experiment.
As no functional trade-off between asexual and sexual reproduction was found at the plant scale in previous experiments (Suffert et al., 2015), we suggest that the counter-selection observed during the interepidemic period results principally from an Allee effect. Indeed, more aggressive Z. tritici isolates (Cf-2010) were collected from the upper leaves, on which disease severity was generally lower than on the lower parts of the plants (Eriksen & Munk, 2003), as indicated by the positive correlation between winter disease severity (reflecting pathogen density on lower leaf layers) and the subsequent amount of primary inoculum (reflecting the intensity of sexual reproduction). These processes probably provide the best explanation for the low interannual transmissibility of the most aggressive pathogen isolates, selected during the annual epidemic for their ability to propagate clonally, and for the difference in the evolution of aggressiveness in the pathogen population between the intra- and interannual scales.
The results of our study call for more thorough investigations of the quantitative balance between epidemiological processes leading to a trade-off relationship in the evolution of aggressiveness during the intra- and interepidemic periods, to improve the deployment of host resistance in a landscape over several years. This issue is particularly important for pathogens of annual crops that, like Z. tritici, have a dual reproduction cycle, including a saprophytic survival stage on crop debris (e.g., Rhynchosporium secalis, Abang et al., 2006; Phaeosphaeria nodorum, Sommerhalder et al., 2011). It is also important for strictly biotrophic pathogens of annual or perennial crops, such as rusts (e.g., Melampsora larici-populina, Pernaci, 2015; P. triticina, Soubeyrand et al., 2017), for which alternative hosts or volunteers act as a green bridge during the interepidemic period. The epidemiological processes involved depend on the biology of the pathogen, climatic conditions, and the agronomic context. Changes in the management of crop debris and volunteers in cropping systems, such as the development of simplified cultivation practices, may account for the past or future evolution of aggressiveness in pathogens.