Aggressiveness and its role in the adaptation of plant pathogens




Aggressiveness, the quantitative component of pathogenicity, and its role in the adaptation of plant pathogens are still insufficiently investigated. Drawing mainly on examples of biotrophic and necrotrophic fungal pathogens of cereals and of Phytophthora infestans on potato, this review summarizes empirical knowledge on the nature of aggressiveness components and their evolution in response to host and environment. Means of measuring aggressiveness components are considered, as well as the sources of environmental variance in these traits. The adaptive potential of aggressiveness components is evaluated by reviewing evidence for their heritability, as well as for constraints on their evolution, including differential interactions between host and pathogen genotypes and trade-offs between components of pathogenicity. Adaptations of pathogen aggressiveness components to host and environment are analysed, showing that: (i) selection for aggressiveness in pathogen populations can be mediated by climatic parameters; (ii) global population changes or remarkable population structures may be explained by variation in aggressiveness; and (iii) selection for quantitative traits can influence pathogen evolution in agricultural pathosystems and can result in differential adaptation to host cultivars, sometimes leading to erosion of quantitative resistance. Possible links with concepts in evolutionary ecology are suggested.


Understanding why and how pathogens harm their hosts is a central focus in plant pathology and is of particular importance in the context of cultivated host plants. Upon encountering a potential host, a pathogen may be able to cause infection or not. This compatibility relationship has so far largely monopolized the attention of plant pathologists and shaped the discipline (Barrett, 1985; Thrall & Burdon, 2003). One good reason for this may be that compatibility relationships are relatively easy to investigate empirically. In addition, a convincing genetic mechanism was proposed early on [the ‘gene-for-gene’ model (Flor, 1955)] and was rapidly supported by empirical evidence (Flor, 1971). Finally, plant epidemiology has repeatedly proved the dramatic importance of these compatibility relationships for disease dynamics in crop systems (e.g. Hovmøller et al., 1993). In contrast, relatively few studies have examined the quantitative aspects of host–pathogen interactions and their consequences for the dynamics and evolution of pathogen populations.

Most studies on the quantitative aspects of the host–pathogen interaction make reference to pathogen ‘aggressiveness’. Defining aggressiveness may not be simple, however, as the practical (and often implicit) definition in use in many papers often differs from both the original and the ‘official’ definitions.

Originally, Van der Plank (1963) defined aggressiveness as the non-specific component of pathogenicity. Later, Van der Plank (1968) illustrated this definition with data from an experiment (Paxman, 1963) in which isolates of Phytophthora infestans were grown for successive generations on a potato cultivar with no major resistance R-genes (i.e. all isolates were virulent, according to the gene-for-gene model). Van der Plank (1968) summarized the results of Paxman's experiment in an anova table in which the isolate effect (main effect) represented aggressiveness. The absence of isolate × cultivar interaction supported Van der Plank's concept of pathogenicity, according to which differential interactions always relate to ‘virulence’ (which is linked to the presence or absence of R-genes), whereas ‘aggressiveness’ describes a quantitative component of pathogenicity that is, by definition, non-specific relative to host genotypes. In the same book, Van der Plank (1968) distinguished the effects of ‘vertical’ and ‘horizontal’ resistance (counterparts in the host of ‘virulence’ and ‘aggressiveness’ in the pathogen): ‘vertical’ (qualitative) resistance, determined by the presence of R-genes, only reduces the amount of initial inoculum (by removing avirulent spores), whereas ‘horizontal’ (quantitative) resistance reduces the epidemic rate by altering spore infection efficiency, lesion development rate, time from infection to sporulation and the abundance of spores produced. Thus, Van der Plank clearly related ‘virulence’ to the presence or absence of R-genes and considered that ‘aggressiveness’, determined by quantitative traits of the pathogen cycle, cannot lead to differential adaptation to susceptible host genotypes. 
However, this paradigm was challenged by Caten (1974), who presented a diagram clearly contradicting Van der Plank's definition and concluded that ‘the criterion of the presence or absence of a differential interaction used by Van der Plank for the classification of ... pathogenic variation is of limited use’. Indeed, in the phytopathological literature, ‘aggressiveness’ is often used as an equivalent for ‘quantitative traits related to pathogenicity’ (referred to as the ‘practical definition’ in this paper) and many papers present significant interactions for such traits among isolates and cultivars that are not accounted for by R-genes (see section: Adaptive potential of aggressiveness components).

The last definition in use is that of specialized dictionaries. It departs from the two others and either rejects the term ‘aggressiveness’ or defines it as the relative ability of a plant pathogen to colonize and cause damage to plants, without distinguishing between quantitative and qualitative aspects (FBPP, 1973; Shurtleff & Averre, 1997; Holliday, 1998). This review is not an attempt to redefine the term, but will simply follow the most general definition, compatible with most cited papers. ‘Aggressiveness’ will therefore refer to the quantitative variation of pathogenicity on susceptible hosts, without any restriction related to specificity.

In practice, aggressiveness may be measured on different scales. For authors dealing with the epidemic level (field and landscape scales), aggressiveness is usually evaluated by directly measuring epidemic rates (Cumagun & Miedaner, 2003). In another context, authors interested in the interaction between plant and pathogen genotypes at the host-plant scale measure aggressiveness through a variety of quantitative traits expressed during the host–pathogen interaction. These traits, referred to as aggressiveness components, are: infection efficiency, latent period, spore production rate, infectious period and lesion size. Neither scale is exclusive: aggressiveness components measured on the scale of a given plant largely determine the rate of epidemic development (Sackett & Mundt, 2005).

Aggressiveness is traditionally assumed to be polygenically determined (as are most quantitative traits; see below for a detailed discussion of this point). Therefore, quantitative adaptation to the host is theoretically expected to be slower than the acquisition of additional qualitative virulence factors, and quantitative plant resistances are generally expected to be more durable than qualitative resistances. However, both experimental evidence and theory remain scarce, and much work is still to be done to investigate the existence and to evaluate the modalities of adaptation of pathogens to their hosts for quantitative traits. It is therefore necessary to gather empirical knowledge on the nature of aggressiveness (which traits are significantly involved), as well as its genetic and environmental determinants, and on the ability of pathogens to respond to the selection pressures imposed by quantitative host resistances. In this paper, adaptation is defined as the result of selection for heritable traits that confers an increase in reproductive performance.

This paper reviews the literature on quantitative aspects of the host–pathogen interaction and their evolution in response to host and environment. Most published papers on aggressiveness concern fungi and oomycetes, but data from other plant pathogens will be presented when available. First, this review describes (i) how aggressiveness components are measured, (ii) the sources of environmental variance on these traits, and (iii) what is known about their genetic basis. Secondly, the potential for quantitative traits to evolve in response to environmental or host-related selection pressures is evaluated. Finally, published data on responses of pathogen aggressiveness components to host and environment are presented.

Measuring quantitative components of the host–pathogen interaction

Aggressiveness is often separated into elementary quantitative traits of the pathogen life cycle, such as infection efficiency, latent period, sporulation rate, infectious period or lesion size. For some pathogens, the capacity for toxin production is also sometimes evaluated.

Infection efficiency is defined as the probability that a spore deposited on a receptive host surface produces a lesion in the absence of competitive interactions. It is usually measured as a percentage of successful infections resulting from a controlled number of deposited spores (Mehta & Zadoks, 1970; Sache, 1997). For practical reasons, infection efficiency may be indirectly measured by the observed numbers of lesions or chlorotic flecks per unit of leaf area (Clifford & Clothier, 1974; Milus & Line, 1980; Knott & Mundt, 1991). This trait is difficult to precisely estimate because it depends on the number of spores deposited as well as on microclimatic conditions, which are difficult to control (Milus & Line, 1980; Lehman & Shaner, 1997).

The latent period is the time interval between infection and the onset of sporulation from that infection. It determines the duration of epidemic cycles and thus largely controls the rate of epidemic development. The definition of latent period is clear when applied to a single lesion. In most experimental studies, however, artificial inoculations result in a large number of infections per leaf, and the variation in observed latent period among infection sites on a leaf may be considerable (Shaner, 1980). To cope with this difficulty, several criteria are used to estimate latent periods, such as the time from inoculation to first sporulation (Jeffrey et al., 1962; Jinks & Grindle, 1963; Knott & Mundt, 1991; Miller et al., 1998) or the time needed for half of the final number of lesions (T50) to sporulate (Knott & Mundt, 1991; Flier & Turkensteen, 1999) or to show apparent sporulation structures (Johnson, 1980; Tomerlin et al., 1983). The most precise method for estimating T50 was proposed by Shaner (1980) and is based on an adjustment of the dynamics of lesion emergence to a sigmoid curve. Since latent period is highly dependent on temperature, it is recommended to express the time in degree-days to allow comparisons between different experiments (Lovell et al., 2004). It has been observed that using different methods to measure latent period (e.g. T50 vs. the time to first sporulation) could lead to differences in its estimation [see Knott & Mundt (1991) for Puccinia triticina on wheat and Flier & Turkensteen (1999) for P. infestans on potato]. Such differences might, of course, result from uncontrolled environmental effects. A more interesting alternative is that variability in latent period among pathogen genotypes could reveal heterogeneity in both the time at which the first sporulation occurs and the dynamics of lesion maturation, as discussed by Shaw (1990).
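To make the T50 criterion concrete, the sketch below estimates, from hypothetical observations, the time at which half of the final lesion count has begun to sporulate, and expresses a duration in degree-days. Data, function names and the 0°C base temperature are illustrative assumptions; Shaner's (1980) method fits a full sigmoid curve rather than interpolating linearly between observations.

```python
def t50(times_h, cum_lesions):
    """Estimate T50: the time at which half the final number of lesions
    sporulate, by linear interpolation between the two observations
    bracketing 50% of the final count (a simplification of Shaner's
    sigmoid-fit method)."""
    half = cum_lesions[-1] / 2.0
    for (t0, c0), (t1, c1) in zip(zip(times_h, cum_lesions),
                                  zip(times_h[1:], cum_lesions[1:])):
        if c0 <= half <= c1:
            if c1 == c0:
                return t0
            return t0 + (half - c0) * (t1 - t0) / (c1 - c0)
    raise ValueError("50% threshold never crossed")

def to_degree_days(hours, mean_temp_c, base_temp_c=0.0):
    """Express a duration in degree-days, as recommended for
    comparisons between experiments (Lovell et al., 2004)."""
    return hours / 24.0 * max(mean_temp_c - base_temp_c, 0.0)

# hypothetical cumulative counts of sporulating lesions on one leaf
times = [120, 144, 168, 192, 216]   # hours after inoculation
lesions = [0, 4, 18, 27, 30]        # cumulative sporulating lesions
lat = t50(times, lesions)           # falls between 144 and 168 h
```

The same data scored with a different criterion (time to first sporulation, here 144 h at the latest) would give a different estimate, which is the discrepancy discussed above.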

Sporulation rate is the number of spores produced per lesion per unit of time (Clifford & Clothier, 1974; Sache, 1997). In practice, spores are either weighed (Imhoff et al., 1982; Kardin & Groth, 1989) or counted (Leonard, 1969; Rouse et al., 1980). Sporulation is sometimes expressed as spore production per unit area of diseased leaf (Clifford & Clothier, 1974) or relative to lesion size (Miller et al., 1998). It has repeatedly been shown that spore production per lesion is highly density-dependent (Kardin & Groth, 1989). It can be useful to consider spore production per unit area of sporulating tissue (Hamid et al., 1982a; Subrahmanyam et al., 1983; Dowkiw et al., 2003), which is considerably less density-dependent (Robert et al., 2004).

The infectious period is the time from the beginning to the end of sporulation. This component is difficult to precisely estimate since sporulation often shows an early peak followed by an asymptotic decrease (Leonard, 1969; Robert et al., 2004), but more irregular patterns may be obtained (Imhoff et al., 1982). For cereal rusts, sporulation can last for more than 40 days under controlled conditions on adult plants (Leonard, 1969; Mehta & Zadoks, 1970; Imhoff et al., 1982; Robert et al., 2004).

Since many pathogen species have two or more forms of propagule related to sexual or asexual reproduction (Pringle & Taylor, 2002), infection efficiency, sporulation rate, latent period and infectious period may, in principle, be measured for each type of spore (e.g. Gilles et al., 2001; Karolewski et al., 2002). Nevertheless, a given parameter may not have the same meaning when measured on sexual and asexual spores. For example, the latent period associated with sexual spores is very different from that of asexual spores because it depends on the fortuitous encounter and merger of two sexually compatible lesions. Therefore, while there seems to be room for adaptive adjustment of asexual latency, sexual latency is expected to be highly dependent on environmental stochasticity. Moreover, organs resulting from sexual reproduction often ensure inter-season survival, as in Blumeria graminis or in Leptosphaeria maculans, and the latent period, as defined above, has no meaning in such cases.

Lesion size is another quantitative trait that is measured as an aggressiveness component (Kolmer & Leonard, 1986; Mundt et al., 2002b). It is generally defined as the surface area that produces spores. For some pathogens, such as P. triticina, lesion size remains limited, but it can dramatically increase in some species such as P. infestans or Puccinia striiformis, for which lesion growth is semisystemic (Emge et al., 1975). In this case, lesion size accounts for a large part of the quantitative development of epidemics and lesion growth rate is a key factor in pathogen competition for available host tissue. Lesion size is not easy to precisely determine for pathogens such as Mycosphaerella graminicola that induce necrosis on the host leaf (Cowger et al., 2000). Moreover, such pathogens often indirectly cause apical necrosis on the leaves that can be confused with a diseased area.

Aggressiveness is sometimes estimated through disease severity, measured as the percentage of the infected plant organ (root, leaf or spike) covered by pathogen lesions (Krupinsky, 1989; Ahmed et al., 1995, 1996; Gilbert et al., 2001; Cowger & Mundt, 2002; Zhan et al., 2002; Cumagun & Miedaner, 2003). Disease severity here is a composite variable resulting from the integrated effect of infection efficiency and lesion size, but also, when assessed at the crop scale, sporulation and dispersal.

The production of mycotoxins (inducing host necrosis) is generally considered an aggressiveness component, but the relationship between disease severity and mycotoxin production is not straightforward. In fusarium head blight of wheat, some studies did not show any relationship between aggressiveness (measured as disease severity) and toxin production (Gilbert et al., 2001), whereas other authors found that DON-toxin concentrations in grains were closely correlated with disease severity (Cumagun & Miedaner, 2004). Deciphering the exact function of toxin production is important to determine its role in aggressiveness. In necrotrophic species, where toxin production is only used to kill plant cells and convert them into resources for growth, toxin production may indeed be correlated with within-host multiplication. In contrast, in species where toxin production is needed to allow spore release (e.g. by accelerating host death), it is not expected to directly correlate with other measurements of within-host multiplication (Day, 2002).

Effects of environment on expression of aggressiveness components

The effects of climatic parameters (mainly temperature and relative humidity) on the expression of disease have been extensively described in the literature. In addition, it is known that host physiological status (e.g. nitrogen content, tissue age) affects disease development, particularly for biotrophic pathogens (Eversmeyer et al., 1980; Tomerlin et al., 1983; Turechek & Stevenson, 1998; Robert et al., 2004). Finally, some components of the pathogen life cycle (e.g. spore production per lesion) are strongly influenced by lesion density (Katsuya & Green, 1967; Mehta & Zadoks, 1970; Rouse et al., 1980; Kardin & Groth, 1989; Robert et al., 2004). These sources of variation are generally considered unwanted effects in aggressiveness measurements, but may account for a large portion of variability. For instance, in a study of the adaptation of P. triticina to wheat cultivars (Knott & Mundt, 1991), most of the variability was accounted for by the growth chamber for two out of three parameters measured. Similarly, ranking by disease severity of isolates of M. graminicola, as well as cultivar-by-isolate interaction, were found to vary between a greenhouse and a growth-chamber experiment (Krenz et al., 2008).

Effect of climatic conditions

Variation among years in field experiments can be considerable, underlining the strong sensitivity of aggressiveness measurements to climatic conditions. For instance, in a field study with Fusarium graminearum, Cumagun & Miedaner (2004) calculated that the isolate-by-environment interaction accounted for 29% of the variance for aggressiveness (measured by disease severity) and 19% of the variance for mycotoxin production.
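Percentages of this kind come from partitioning trait variance across the terms of an ANOVA. The sketch below, with hypothetical data and function names, illustrates the simpler decomposition of the total sum of squares for a balanced isolate-by-environment trial; published figures such as the 29% above are usually variance components from a mixed-model analysis rather than raw sum-of-squares shares.

```python
from itertools import product

def ss_shares(data):
    """Percentage of the total sum of squares attributable to isolate,
    environment, isolate-by-environment interaction and residual, for a
    balanced table data[isolate][environment] = list of replicate values."""
    g, e, r = len(data), len(data[0]), len(data[0][0])
    grand = sum(x for row in data for cell in row for x in cell) / (g * e * r)
    iso_m = [sum(x for cell in row for x in cell) / (e * r) for row in data]
    env_m = [sum(data[i][j][k] for i in range(g) for k in range(r)) / (g * r)
             for j in range(e)]
    cell_m = [[sum(cell) / r for cell in row] for row in data]
    ss_i = e * r * sum((m - grand) ** 2 for m in iso_m)
    ss_e = g * r * sum((m - grand) ** 2 for m in env_m)
    ss_ie = r * sum((cell_m[i][j] - iso_m[i] - env_m[j] + grand) ** 2
                    for i, j in product(range(g), range(e)))
    ss_res = sum((x - cell_m[i][j]) ** 2
                 for i, row in enumerate(data)
                 for j, cell in enumerate(row) for x in cell)
    total = ss_i + ss_e + ss_ie + ss_res
    return {name: 100 * ss / total for name, ss in
            [("isolate", ss_i), ("environment", ss_e),
             ("interaction", ss_ie), ("residual", ss_res)]}
```

A large "interaction" share indicates that isolate rankings change across environments, which is precisely why aggressiveness comparisons made in a single environment can be misleading.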

Most studies linking aggressiveness and climate are limited to the effect of temperature. It is well known that temperature influences pathogen development as well as the expression of host resistance. The effect of temperature on aggressiveness components has been established for many pathogen species and presents an optimum for spore germination, lesion development and sporulation. However, the response to temperature may differ among individuals (Milus et al., 2006). For instance, Milus & Line (1980) showed that the spore production rate of two leaf rust isolates (P. triticina) was identical at 2–18°C but different at 10–30°C.

Interestingly, differences in aggressiveness among pathogen isolates have sometimes been reported to be greater under non-optimal conditions: differences in the latent period among isolates were more effectively observed at suboptimal temperatures for pathogen development in P. triticina and P. striiformis f.sp. tritici (Eversmeyer et al., 1980; Johnson, 1980; Milus et al., 2006). This result suggests that differential responses in terms of aggressiveness may be less detectable under optimal environmental conditions (Eversmeyer et al., 1980; Johnson, 1980).

Effect of host physiological status

For several biotrophic parasites, high nitrogen content in host tissues results in increased infection efficiency and spore production (Tiedemann, 1996; Jensen & Munk, 1997; Robert et al., 2004). Spore production of biotrophic parasites was also reported to increase when host photosynthesis was stimulated (Cohen & Rotem, 1970). Moreover, the response to infection may depend on host growth stage (Eversmeyer et al., 1980; Johnson, 1980; Milus & Line, 1980; Tomerlin et al., 1983) or on the type or age of host tissues (Turechek & Stevenson, 1998). Changes in the quantitative expression of disease with host development stage were reviewed by Develey-Rivière & Galiana (2007) and probably largely relate to differences in the expression of resistance factors. Host status may affect the quantitative host–pathogen interaction through the amount of available resources or through the expression of resistance genes (the latter is not considered further).

Milus & Line (1980) observed that the relative spore production of two P. triticina cultures changed with host growth stage (seedling or adult plant) on some cultivars. Turechek & Stevenson (1998) showed that the age of host tissues can have a strong effect on a tree disease such as pecan scab (caused by Cladosporium caryigenum) for aggressiveness components such as infection efficiency, incubation period, lesion size and sporulation. Knott & Mundt (1991) found a significant difference between upper and lower leaves for latent period and infection efficiency when measuring the aggressiveness of field populations of P. triticina on wheat. On average, spore populations exhibited 25% higher infection efficiency and a 3–4% shorter latent period on the upper leaves than on the lower leaves. This was attributed either to greater susceptibility of the upper leaves or to physiological effects. Katsuya & Green (1967) even found significant differences in the latent period of wheat stem rust, depending on the position of the lesions along the leaves: latent period was about 1 day shorter at the leaf base than toward the tip.

Measurements performed on detached leaves, although generally considered reliable, may sometimes alter the differences observed: Miller et al. (1998) compared the responses of three to five P. infestans isolates on two potato cultivars, either on detached leaflets or whole plants. They found that one isolate (537) had a significantly greater sporulation capacity on whole plants than two others (367 and 416), whereas no significant differences were found among isolates on detached leaflets, regardless of cultivar.

Effect of lesion density

For many biotrophic pathogens, lesion size and spore production are highly density-dependent (e.g. Robert et al., 2004), probably because of increased competition among lesions for host resources and available tissues. This density effect may have major consequences for experimental measurements of aggressiveness components such as spore production and lesion size, particularly when differences in infection efficiency among isolates result in different lesion densities. In such cases, observed differences in spore production may result from a density effect rather than from genetic differences among isolates. For instance, in a study by Clifford & Clothier (1974) on barley leaf rust (Puccinia hordei), the greater sporulation capacity observed on moderately resistant cultivars than on the susceptible control could have partly resulted from a density effect generated by differences in infection efficiencies.

Pathogen genotypes may respond differently to this competition effect, however, and some individuals seem to be less affected than others by lesion density (Katsuya & Green, 1967; Kardin & Groth, 1989). Katsuya & Green (1967) observed the competition between two isolates belonging to two different pathotypes of Puccinia graminis during 14 generations on wheat seedlings in a greenhouse. One isolate was predominant at low densities (less than 10 lesions per leaf) on both cultivars, whereas the other isolate became predominant on one of the cultivars at high densities (> 100 lesions per leaf). Additional evidence of genotype-by-density interaction was found by Kardin & Groth (1989) with bean leaf rust (caused by Uromyces appendiculatus). They observed that the relative lesion size of several isolates changed when lesion density was increased: isolates with the largest lesions at low lesion density presented the smallest lesions at high density. The authors concluded that a reproductive advantage found at low densities might be obviated at higher densities. However, the consequences of these differential responses to competition might be weak on an epidemic scale (see Lannou & Mundt, 1997).

Effect of pathogen physiological status

Variability in aggressiveness measurements may also result from the physiological state of the pathogen. In particular, storage or multiplication conditions may alter pathogen aggressiveness. This was clearly shown for P. infestans in a study by Day & Shattock (1997), in which isolates collected in different years and stored in liquid nitrogen were compared with ‘standard’ reference isolates. Jeffrey et al. (1962) and Jinks & Grindle (1963) observed a decrease in aggressiveness of P. infestans strains after repeated transfers on chickpea medium, and reported that different strains underwent these changes to varying extents and at differing rates. Nonetheless, Jinks & Grindle (1963) observed that cycling on potato tubers could enable P. infestans to recover its initial aggressiveness level, suggesting that the observed changes resulted from phenotypic plasticity. Mundt et al. (2002b), using the bacterial pathosystem Xanthomonas oryzae pv. oryzae on rice, tested two cultures resulting from separate maintenance of the same initial material. The difference between cultures approached significance (P = 0.06) in one trial out of two, and variance component analysis indicated that the culture environment of the pathogen accounted for 14% of the total variation in lesion length. The authors concluded that culture variability should be considered more often in aggressiveness measurements.

The great sensitivity of the quantitative host–pathogen interaction to environmental variations (including climatic effects, lesion density, host physiological status, isolate maintenance and culture conditions) has at least two consequences. First, it constitutes a constraint for empirical studies of aggressiveness, since it appears to be of primary importance to perform aggressiveness measurements under well-defined and controlled conditions and with standardized host and pathogen material. Secondly, it is questionable from an evolutionary perspective whether aggressiveness components, variable as they are with environment, may respond to selection. The genetic basis of aggressiveness traits and their adaptive potential are discussed below.

Genetic basis of aggressiveness components

Compared with studies on the genetics of quantitative resistance, investigations of the genetic bases of aggressiveness components in fungi are scarce. An example of the former is the meta-analysis of the genetic support of quantitative and qualitative resistance of rice to Magnaporthe grisea by Ballini et al. (2008). Generally, the number of effective genetic factors for quantitative resistance tends to range from two to 10 or more, but it is commonly determined by three to five loci (Parlevliet, 1993; Young, 1996; Singh et al., 2005). Moreover, it has been suggested that quantitative loci could be allelic versions of qualitative resistance genes with intermediate phenotypes (Young, 1996, Ballini et al., 2008). It thus appears that the genetic determination of quantitative resistance is both complex and diversified in terms of the number and nature of genes involved. As a consequence, the genetic control of aggressiveness is likely to be complex and variable, and is probably of polygenic determination in most cases.

Several studies have suggested that aggressiveness components are polygenically determined (Blanch et al., 1981; Caten et al., 1984; Hawthorne et al., 1994; Cumagun & Miedaner, 2004). This was shown, for instance, with segregation for wheat leaf necrosis and production of pycnidia in Phaeosphaeria nodorum, by tetrad analysis (Halama et al., 1999).

The genetic architecture of quantitative traits can be further studied through quantitative trait loci (QTL) mapping. Only a few QTL were detected for components such as lesion length or fungal growth in Gibberella zeae (Cumagun et al., 2004) and Heterobasidion annosum (Lind et al., 2007). However, several obstacles make this approach generally difficult for plant pathogenic fungi. First, the number of loci detected in QTL analysis depends on several factors, including the genetic properties of the QTL that control the traits studied, environmental effects, population size and experimental error (Collard et al., 2005). Secondly, QTL analysis cannot be applied to asexually reproducing species and thus to a large number of pathogenic fungi. Lastly, QTL analysis requires phenotypic evaluation and, as mentioned above, quantitative measurements in plant pathogens are time-consuming and subject to high variability, which limits the number of progeny and components analysed.

Molecular genetic and genomic approaches have also been used to identify the genes involved in pathogenicity, either qualitatively or quantitatively (aggressiveness-related genes) (Xu et al., 2006). The pathogenicity-related genes identified in the rice blast model M. grisea were recently reviewed (Ebbole, 2007). The vast majority of these genes were involved in the ability to infect the host, but some were found to control quantitative variations of aggressiveness components.

To conclude on this point, empirical evidence for genetic control has been found for at least some aggressiveness components. In all cases, this evidence implies that several genes are involved, even if the precise number of genes and their interactions remain to be determined. An alternative approach to elucidating the genetic basis of adaptation involves genome scans of DNA polymorphism (Schlötterer, 2003; Storz, 2005), based on the identification of neutral markers with deviant behaviour in natural populations. This population-based approach makes it possible to identify loci undergoing selection without having to evaluate the phenotypes themselves, but is again limited to sexually reproducing fungi. Although promising, it has not yet been applied to fungal pathogens.

Adaptive potential of aggressiveness components

The fact that aggressiveness components have a genetic basis opens up opportunities for pathogen adaptation. Such adaptation is only possible, however, if heritable genetic variation exists in pathogen populations for these traits. Assuming that such heritable variation in aggressiveness components exists, the evolution of aggressiveness will be shaped by genetic constraints such as (i) differential interactions between hosts and pathogen isolates for quantitative traits, (ii) trade-offs between qualitative virulence and aggressiveness (the ‘cost of qualitative virulence’) and (iii) trade-offs between aggressiveness components or between aggressiveness and survival capacity.

Variability for aggressiveness within pathogen populations

Within-population variation for quantitative traits is a basic prerequisite for adaptation. However, it is important to note that potential genetic diversity in aggressiveness has often been underestimated. In fact, pathotypes have sometimes been considered as homogeneous genetic units and compared on the basis of a single isolate per pathotype (Katsuya & Green, 1967). Nevertheless, apparent uniformity within a pathogen population studied on the basis of qualitative indicators does not exclude the possibility of a certain level of variability in aggressiveness: many studies have documented differences in aggressiveness among isolates belonging to the same pathotype or sharing similar genotypes as defined by neutral markers (Jeffrey et al., 1962; Hamid et al., 1982b; Miller et al., 1998; Carlisle et al., 2002; Mundt et al., 2002b; Milus et al., 2006). Hamid et al. (1982b) found significant intra-pathotype variations for infection efficiency, lesion length and sporulation capacity among isolates of the same pathotype of Cochliobolus carbonum, with differences between the least and the most aggressive isolate of up to 91%. Milus et al. (2006) found that two isolates of P. striiformis f.sp. tritici belonging to the same pathotype not only had different latent periods at 18°C on wheat seedlings, but also presented different optimal temperatures for this trait, one isolate developing faster at 12°C and the other at 18°C. Significant differences in aggressiveness (lesion expansion rate, latent period, sporulation and infection efficiency) were found among 17 P. infestans individuals collected from a Northern Ireland population and sharing an identical multilocus genotype (allozyme profiles and mtDNA haplotypes), the same mating type, the same capacity to overcome the specific R1 resistance gene and the same sensitivity to a fungicide (Carlisle et al., 2002). Mundt et al. (2002b) investigated the potential association of aggressiveness variation among isolates and phylogenetic classification in the bacterial pathogen X. oryzae pv. oryzae, and found significant differences in aggressiveness between isolates of the same clonal lineage, and even between isolates of the same RFLP haplotype within a clonal lineage, suggesting that mutations leading to increased aggressiveness had rapidly accumulated within the phylogenetic lineages.

Heritability of aggressiveness components

Heritability is defined as the proportion of phenotypic variance attributable to genetic variance and can be estimated by different methods (Hamid et al., 1982a; Hill & Nelson, 1982; Kolmer & Leonard, 1986; Lehman & Shaner, 1996, 2007). Lehman & Shaner (1996) estimated the heritability of the latent period in P. triticina from an analysis of variance involving all possible combinations of seven single-uredinial isolates on four wheat cultivars expressing different levels of quantitative resistance. Except on the most susceptible cultivar, on which all isolates responded equally, heritability values for the latent period ranged from 0.28 to 0.76, depending on the cultivar tested. The largest proportion of the variation among isolates attributable to genetic factors (42–49%) was found on cultivar CI-13227. This result was later confirmed (Lehman & Shaner, 2007) by a selection experiment: heritability of the latent period was then about 0.70 on cultivar CI-13227 and 0.20 on another partially resistant cultivar. This suggests that heritability of aggressiveness strongly depends on the genetics of the host–pathogen interaction. Hill & Nelson (1982) estimated the heritability of several aggressiveness components in progeny populations obtained from different crosses between five isolates of Cochliobolus heterostrophus race T on one maize line. Their estimates ranged from 0 to 0.6 for lesion length, from 0.23 to 0.52 for sporulation capacity and from 0.21 to 0.58 for infection efficiency. According to the authors, the low heritability value for lesion length confirmed the low genetic variation within race T for this aggressiveness component. In C. carbonum race 3, Hamid et al. (1982a) estimated lesion length heritability at 0.87, based on an analysis of variance of the crossed responses of 22 isolates on two corn inbred lines.
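As an illustration, broad-sense heritability can be estimated from the variance components of a one-way analysis of variance of isolates replicated on a single cultivar. The sketch below uses invented latent-period data; only the variance-component formulas are standard:

```python
# Broad-sense heritability of an aggressiveness trait (e.g. latent period)
# from a one-way ANOVA across pathogen isolates. All data are invented.
from statistics import mean

# Hypothetical latent periods (days): 4 isolates x 3 replicates
isolates = [
    [8.0, 8.4, 8.2],
    [9.5, 9.8, 9.6],
    [7.1, 7.3, 7.0],
    [8.9, 9.1, 9.0],
]

n = len(isolates[0])                      # replicates per isolate
k = len(isolates)                         # number of isolates
grand = mean(x for iso in isolates for x in iso)

# Mean squares between and within isolates
ms_between = n * sum((mean(iso) - grand) ** 2 for iso in isolates) / (k - 1)
ms_within = sum((x - mean(iso)) ** 2
                for iso in isolates for x in iso) / (k * (n - 1))

# Variance components: among-isolate (genetic) vs. within-isolate (environmental)
var_g = (ms_between - ms_within) / n
var_e = ms_within
h2 = var_g / (var_g + var_e)              # broad-sense heritability
print(f"H2 = {h2:.2f}")
```

With these invented data, nearly all of the phenotypic variance is among isolates, so the estimate is close to 1; real estimates, as reviewed above, are usually much lower and depend on the cultivar tested.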
However, Kolmer & Leonard (1986) commented that heritability values determined from genetic responses in multiple environments were more realistic than values determined by analysis of variance and based on restricted environmental variation and a limited number of phenotypes. They estimated heritability of lesion length in C. heterostrophus race O in an artificial selection experiment on a single corn line: starting with a set of bulk isolates, the largest lesions were selected to constitute the following generation and the rate of increase of mean lesion length throughout generations was calculated by regression. On the basis of this method, a heritability value of 0.27 was estimated for lesion length.
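The realized-heritability approach of Kolmer & Leonard (1986) can likewise be sketched: the slope of the regression of the population mean trait value on the cumulative selection differential estimates heritability. All numbers below are invented for illustration:

```python
# Realized heritability from an artificial selection experiment: slope of
# the regression of the population mean on the cumulative selection
# differential across generations. All numbers are invented.

# Hypothetical mean lesion length (mm) at each generation
gen_means = [10.0, 10.3, 10.55, 10.8, 11.1]
# Selection differential applied each generation (selected parents' mean
# minus population mean)
sel_diff = [1.0, 1.0, 1.0, 1.0]

# Cumulative selection differential, starting at 0 for generation 0
cum_s = [0.0]
for s in sel_diff:
    cum_s.append(cum_s[-1] + s)

# Least-squares slope of gen_means on cum_s = realized heritability
n = len(gen_means)
mx = sum(cum_s) / n
my = sum(gen_means) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(cum_s, gen_means))
         / sum((x - mx) ** 2 for x in cum_s))
print(f"realized h2 = {slope:.2f}")
```

Because this estimate integrates selection responses across generations grown in varying environments, it is less sensitive to the restricted environmental variation of a single ANOVA trial.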

Other authors evaluated the variability explained by genotype vs. environment without formally calculating heritability. In a study of P. infestans on artificial medium, Caten (1974) found that 83% of the total variation between isolates was of genetic origin. In a greenhouse trial with the bacterial pathogen X. oryzae pv. oryzae on rice, Mundt et al. (2002b) found that genetic factors accounted for 47–55% of the total variation in aggressiveness measurements.

Differential interactions between isolates and cultivars for aggressiveness components

According to the original definition of aggressiveness (Van der Plank, 1963), variation for quantitative traits was considered to occur among isolates, but without interactions with the host genotypes (see Introduction). Nevertheless, many studies have shown that these kinds of interactions can be observed. Most of these studies were undertaken to evaluate quantitative resistance in host plants, but indirectly revealed differential interactions between host and pathogen genotypes for quantitative traits.

Using the pathosystem Hordeum vulgare–P. hordei, Parlevliet (1977) compared the latent period and infection efficiency of five isolates on adult plants of three cultivars, both under controlled conditions and in the field. The cultivar-by-isolate interaction was significant for the latent period in both trials. With P. infestans on potato, Carlisle et al. (2002) found cultivar-by-isolate interactions for latent period, lesion expansion and sporulation capacity. Cultivar-by-isolate interactions were also found for infection efficiency, latent period and sporulation capacity with P. triticina (Kuhn et al., 1978; Milus & Line, 1980; Lehman & Shaner, 1996), for infection efficiency and sporulation capacity with B. graminis f.sp. tritici (Rouse et al., 1980), for infection efficiency with P. nodorum (Scharen & Eyal, 1983), and for lesion expansion with Septoria musiva on poplar (Krupinsky, 1989). It should be mentioned, however, that such interactions are not always found: Van Ginkel & Scharen (1988) analysed the responses of 14 wheat cultivars to 34 Septoria tritici (M. graminicola) isolates for lesion size (necrotic area) on seedlings. Significant differences were found among cultivars and isolates, but with no significant interactions.

Cost of virulence: a trade-off between qualitative virulence and aggressiveness

The cost of qualitative virulence is the reduction in pathogen fitness induced by a mutation from avirulence to qualitative virulence. This concept was originally developed by Van der Plank (1963) and has since been widely discussed in the literature (see below). With this concept, changes in pathogen aggressiveness result directly from the loss of avirulence gene function.

Estimating this fitness cost may be challenging because (i) fitness varies with experimental conditions (Weltz et al., 1990) and (ii) the effect of genetic background has to be eliminated (Østergård, 1987). When comparing aggressiveness components of B. graminis f.sp. tritici isolates differing in number of qualitative virulence genes, Menzies & MacNeill (1987) could not clearly distinguish how much of the reduction in fitness was attributable to the qualitative virulence genes and how much to the isolates' genetic backgrounds. Bronson & Ellingboe (1986) even showed by a progeny study that reduced fitness segregated independently of the qualitative virulence loci.

Some attempts to evaluate the cost of qualitative virulence have remained inconclusive. In an artificial selection experiment with P. infestans on potato, a single-virulence pathotype was shown to predominate in a pathotype mixture after two to nine successive generations (Thurston, 1961). However, in the field, this pathotype was not always the most aggressive, and more complex pathotypes could be more frequent on a susceptible cultivar. In a similar experiment with P. triticina on wheat, Kolmer (1993) found that even though pathogen fitness sometimes appeared to be positively or negatively associated with certain individual qualitative virulences, no general relationship could be established between the number of unnecessary qualitative virulence factors and pathogen fitness.

Several authors, however, obtained fitness differences between avirulent races and races carrying unnecessary virulence factors that could be clearly attributed to the cost of qualitative virulence. Because the effect of genetic background cannot be easily separated from the effect of the avirulence gene itself, these studies were based on indirect measurements. Measured values for virulence cost (reduction in aggressiveness) ranged between 14 and 39% for P. graminis f.sp. avenae (Leonard, 1969), 12 and 30% for C. heterostrophus, depending on the experimental procedure used (Leonard, 1977), 4 and 5.2% for P. graminis f.sp. tritici (Grant & Archer, 1983), and 5.4 and 6.1% for B. graminis f.sp. hordei (Grant & Archer, 1983). The first direct evidence of a cost of virulence was obtained in the causal agent of bacterial blight on rice (X. oryzae pv. oryzae) by comparing virulent and avirulent isogenic lines (Vera Cruz et al., 2000). Since then, the same result was obtained with other cloned avirulence genes (Leach et al., 2001).
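The percentage costs quoted above correspond to a simple relative-reduction calculation, sketched here with invented spore counts for a hypothetical pair of near-isogenic races:

```python
# Cost of qualitative virulence expressed as a relative reduction in an
# aggressiveness component. The spore counts are invented; near-isogenic
# races differing only at the avirulence locus are assumed.

def virulence_cost(aggr_avirulent: float, aggr_virulent: float) -> float:
    """Proportional reduction in aggressiveness of the virulent race
    relative to the avirulent race."""
    return 1.0 - aggr_virulent / aggr_avirulent

# e.g. spores per lesion measured on a host lacking the resistance gene
cost = virulence_cost(aggr_avirulent=2000.0, aggr_virulent=1700.0)
print(f"cost = {cost:.0%}")
```

A cost of zero would indicate a neutral mutation to virulence, while values approaching one correspond to the nearly lethal mutations discussed below.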

Recent progress in plant pathogen genomics has shown that mutations from avirulence to qualitative virulence may stem from very different events, ranging from a single-base mutation to a large chromosome deletion (e.g. Gout et al., 2007). Depending on the function of the qualitative virulence gene, its redundancy in the genome and the nature of the mutation, the cost of qualitative virulence may vary from neutral to nearly lethal, the latter obviously not being selected for in field populations. Moreover, it has been shown in other biological systems that fitness costs resulting from the acquisition of resistance to antibiotics or insecticides (Levin et al., 2000; Schoustra et al., 2006; Labbé et al., 2007) can be progressively compensated for by subsequent mutations. Similar mechanisms could reduce the cost of virulence in plant pathogen populations, possibly explaining why pathotypes carrying multiple virulence genes can be present at high frequencies and over long periods of time (e.g. Goyeau et al., 2006).

This link between qualitative virulence and aggressiveness probably has considerable consequences for pathogen evolution, since genotypes that accumulate a large number of qualitative virulence genes might never be the most aggressive on a given host (Thrall & Burdon, 2003).

Genetic correlations between quantitative traits

Genetic correlations between traits, either positive or negative (trade-offs), should constrain quantitative trait evolution. Trade-offs between aggressiveness components or between aggressiveness and other life-history traits should therefore be of the utmost importance for pathogen adaptation. Nevertheless, there are surprisingly few datasets available to show the existence of such trade-offs in fungal plant pathogens.

Trade-offs between aggressiveness and survival were suggested in a study by Leonard et al. (1988), which showed that C. carbonum presented a low aggressiveness level on maize but high survival ability, whereas C. heterostrophus expressed high aggressiveness but low survival ability. At the intra-species level, Carson (1998) found evidence of a trade-off between C. heterostrophus lesion length and survival rate on the soil surface during winter. In a recent study on P. infestans, Montarry et al. (2007) found no trade-off between aggressiveness (measured by combining lesion size, sporulation and latent period) and overwinter survival on potato tubers.

Correlations between aggressiveness components have rarely been investigated in fungal plant pathogens. Leonard et al. (1988) found that B. maydis race T, which produces a host-specific toxin, quickly disappeared from the pathogen population when the host genotypes susceptible to the toxin were removed from the host population, suggesting the existence of a trade-off between toxin production and fitness of the fungus. Authors often consider that components of the pathogen life cycle related to aggressiveness are positively correlated, and sometimes suggest that they are under pleiotropic control (Ohm & Shaner, 1975; Milus & Line, 1980). For instance, Milus & Line (1980) found that long latent periods were associated with small lesion sizes in wheat leaf rust.

Adaptation for quantitative traits in pathogen populations

Most population studies on pathogen adaptation for quantitative traits deal with agricultural systems, but interesting results have also been obtained with wild pathosystems. Four points are examined here: (i) selection for aggressiveness mediated by climatic parameters; (ii) global population changes related to aggressiveness; (iii) adaptation to host cultivars for quantitative traits; and (iv) adaptation to identified quantitative resistances.

Adaptation to environmental conditions

Most studies linking aggressiveness components to the environment have aimed to understand quantitative epidemic development in a range of environmental situations, only considering the species level. In some cases, however, it has been demonstrated that the relative fitness of different genotypes within the same pathogen species can vary according to environmental effects (Eversmeyer et al., 1980; Johnson, 1980; Milus & Line, 1980; Milus et al., 2006) and a few studies have attempted to link pathogen population structures and climate.

Milus et al. (2006) demonstrated that better adaptation to warmer temperatures might explain the observed changes in P. striiformis f.sp. tritici populations in the south-central USA around the year 2000. In a trial under controlled conditions, ‘new’ and ‘old’ isolates had similar aggressiveness levels at 12°C, whereas at 18°C the latent period of ‘new’ isolates was about 2 days shorter and their germination rates were double those of the ‘old’ population. The authors concluded that these differences may have contributed to the recently expanded geographic range for P. striiformis. Similarly, Katsuya & Green (1967) explained the replacement of wheat stem rust (P. graminis) pathotype 15B by a new pathotype (56) in Canada by a differential effect of temperature. In a competition experiment performed under controlled conditions, they showed that the relative fitness of these pathotypes was reversed between 15 and 20°C, with the ‘new’ pathotype being more frequent at higher temperatures. It should be mentioned, however, that Katsuya & Green (1967) used a single isolate for each pathotype and thus ignored the potential within-pathotype diversity.

These studies suggest that temperature may sometimes lead to large population shifts within a pathogen species. However, more subtle effects can also operate in a local adaptation context. There is indeed evidence that the effect of temperature on pathogen aggressiveness may affect parasite performance through genotype-by-environment interactions (Price et al., 2004; Laine, 2008). With a fitness estimate based on latent period and spore production, Laine (2008) demonstrated that both the strength and the direction of local adaptation of the powdery mildew fungus Podosphaera plantaginis, in a metapopulation of Plantago lanceolata, could change with temperature. In particular, in one of the host subpopulations, the sympatric pathogen population was better adapted than the allopatric populations at 17°C, whereas it was clearly maladapted at 23°C.

Changes in population structure related to aggressiveness

Major shifts in pathogen populations have sometimes been linked to invasion by a more aggressive population. One of the best-documented situations is the relatively recent replacement of the clonal lineage US-1 of P. infestans in the USA by new genotypes. Almost all isolates of this pathogen collected from the Columbia Basin, Idaho, in 1992 were of the US-1 genotype, whereas 97% were identified as US-8 by 1995 (Miller et al., 1997). Miller et al. (1998) compared the aggressiveness level of 22 isolates from different clonal lineages, including six US-1 isolates and three US-8, on four potato cultivars with different levels of quantitative resistance. They found that US-8 isolates had a higher lesion expansion rate, a higher sporulation capacity and a shorter latent period than US-1 isolates. Moreover, US-8 isolates rotted tuber slices faster than other isolates, confirming previous studies (Lambert & Currier, 1997). From these data, the authors concluded that relative differences in aggressiveness may partially explain the shift in the P. infestans population in the Columbia Basin from the US-1 to the US-8 lineage. Even though they relied on a small number of isolates (one to six per lineage), and despite significant intra-lineage variability, the quantitative differences between lineages presented by Miller et al. (1998) were consistent for several aggressiveness components and were confirmed by other similar studies (e.g. Kato et al., 1997; Lambert & Currier, 1997) and studies investigating defence responses by potato genotypes to virulent US-1 and US-8 genotype isolates (e.g. Wang et al., 2008). A comparable situation occurred in Europe, where exotic P. infestans genotypes displaced the old population (mating type A1) in only a few years in the 1980s. Day & Shattock (1997) postulated that higher infection efficiency and greater spore production per lesion in the new genotypes explained this displacement.
Interestingly, ‘old’ isolates were clearly less aggressive on two cultivars with quantitative resistance, but differences were less distinguishable on a less resistant cultivar. In addition to higher aggressiveness, resistance to the fungicide metalaxyl may have influenced these population shifts (Day & Shattock, 1997; Miller et al., 1998).

Differential adaptation to host cultivars, related to quantitative traits, may sometimes influence the structure of pathogen populations. Goyeau et al. (2006) surveyed P. triticina populations in France between 1999 and 2002. On wheat cv. Soissons, the pathogen was present at a relatively high frequency (9–15%) in the host population, and a single pathotype represented 30–60% of the pathogen population, even though more than 10 other compatible pathotypes were detected on the same cultivar at much lower frequencies. Since this pathotype distribution was found only on cv. Soissons and on a closely related cultivar, and lasted for several years, the authors hypothesized that a greater level of aggressiveness could explain the dominance of a single pathotype. This hypothesis was later confirmed in greenhouse experiments: the dominant pathotype had a shorter latent period, greater spore production and larger lesion size on cv. Soissons than did other pathotypes (Pariaud et al., 2007).

One of the clearest demonstrations of the central importance of aggressiveness to pathogen evolution has been made with the wild pathosystem Melampsora lini–Linum marginale by Thrall & Burdon (2003). In southern Australia, M. lini develops recurrent rust epidemics in an L. marginale metapopulation. In this system, the pathogen disperses more broadly than the host and a pattern of local adaptation to the host population has been demonstrated. The authors showed a negative relationship between aggressiveness (measured as spore production per lesion) and average qualitative virulence (defined here as the average ability of a pathogen population to overcome the diversity of resistance genes present in the host population). This trade-off was identified as the central cause preventing the most virulent pathotypes from invading all host sub-populations and finally dominating the system. It is likely that such trade-offs between qualitative virulence and aggressiveness play an important role in generating local adaptation in gene-for-gene systems by impeding the emergence and evolution of pathotypes that are both highly aggressive and capable of multiplying on all host genotypes.

Quantitative adaptation to host cultivars

Most of what we know about pathogen evolution in agricultural systems is based on qualitative gene-for-gene virulence. However, it is now clear that selection for quantitative traits influences pathogen evolution in agricultural pathosystems, and it has been repeatedly demonstrated that selection for quantitative traits can result in differential adaptation to host cultivars. This was shown by artificial selection experiments (Leonard, 1969), but a differential adaptation to the host of origin in field epidemics was also demonstrated (Ahmed et al., 1996).

Differential adaptation to host cultivars in artificial selection experiments

One of the first studies of pathogen quantitative adaptation based on artificial selection was published by Leonard (1969). He maintained a genetically heterogeneous population of P. graminis f.sp. avenae on two different host genotypes for seven asexual generations and showed that the mean infection efficiency of the population had increased by approximately 10–15% at the end of the experiment on the host on which it had been grown, but not on the other one. Leonard's results were later confirmed by two different studies. Chin & Wolfe (1984) sampled powdery mildew (B. graminis f.sp. hordei) on two different barley cultivars grown either in pure stands or in mixture. When collected in pure stands, isolates had a higher multiplication rate (up to 22–24%) on the cultivar from which they were isolated than on the other one, but this did not occur when the cultivars were grown in mixture. This result was confirmed and extended in a field study with wheat powdery mildew (B. graminis f.sp. tritici) by Villaréal & Lannou (2000). They demonstrated that selection for quantitative traits operated in the pathogen population on the scale of a single epidemic and resulted in a higher aggressiveness level at the end of the crop season on the host genotype on which the pathogen population multiplied. They compared the average infection efficiency of a spontaneous B. graminis population before and after seven successive pathogen generations on pure stands of two different cultivars, a mixture of both, or alternate stands of each cultivar. In this system, the average infection efficiency of the pathogen increased on the pure stands, but did not significantly change in the host mixture or when the cultivars were changed at each pathogen generation (alternate stands). Moreover, in some plots, mixtures and alternate sowings tended to select for pathogen populations with identical aggressiveness levels on both cultivars.

Differential adaptation to the host cultivar was tested in selection experiments with different pathosystems. Caten (1974) multiplied six isolates of P. infestans for successive generations on tubers from three potato cultivars and then tested their growth capacity on tubers of the same cultivars. After six generations, and except for a resistant cultivar, pathogen aggressiveness increased by 10% in homologous combinations (where source and test cultivars were the same) compared to heterologous combinations. In other studies, however, experimental selection did not demonstrate quantitative adaptation to the host (Alexander et al., 1985; Kolmer, 1990).

Adaptation to the cultivar of origin

Several authors have investigated whether populations isolated from a given cultivar in the field are more aggressive on this cultivar than on others.

In both field and greenhouse experiments, Bonman et al. (1989) observed that Korean isolates of M. grisea induced more disease on japonica rice cultivars than Philippine isolates. Japonica cultivars are predominant in Korea, whereas indica cultivars are more frequent in the Philippines. Since the isolates tested in this experiment produced compatible reactions on all the cultivars tested, the authors concluded that specificity in adaptation to genetic background was the primary cause of these differential interactions, underlining, however, that japonica and indica rice cultivars represent different germplasms.

Andrivon et al. (2007) obtained similar results in P. infestans, comparing French and Moroccan populations on the potato cvs Bintje (prevalent in France, but not grown in Morocco) and Désirée (popular in Morocco, but cultivated to a very small extent in France). Overall, French and Moroccan populations had greater lesion sizes and sporulation capacities on detached leaflets of cvs Bintje and Désirée, respectively.

Similarly, Ahmed et al. (1995) found that M. graminicola isolates from California induced more disease on Californian than on Oregonian cultivars, while the reverse result was obtained for Oregonian isolates. In another study, Ahmed et al. (1996) sampled M. graminicola isolates from winter wheat cultivars in field plots near crop maturity and measured their aggressiveness, defined by disease severity, on seedlings of the same wheat cultivars in the greenhouse. In two separate experiments, the linear contrast between homologous (where source and test cultivars are the same) and heterologous combinations was highly significant. On two susceptible cultivars, aggressiveness was 17.2% greater in homologous than heterologous combinations. The authors concluded that M. graminicola isolates were better adapted to the host cultivar from which they originated than to other cultivars. However, in another study with the same pathosystem, Cowger & Mundt (2002) found only weak evidence that the fungal population was subject to selection for greater aggressiveness on its host of origin.

Some studies gave mixed results or found no evidence at all for quantitative adaptation to the cultivar of origin. Jeffrey et al. (1962) compared nine isolates of P. infestans grown on potato tubers of three cultivars, including their cultivar of origin. They found evidence of adaptation to the cultivar of origin for lesion growth rate, but not for latent period. Knott & Mundt (1991) sampled populations of P. triticina in field plots and tested their aggressiveness in a growth chamber on seedlings of the same cultivars, but found no evidence of increased infection efficiency or shortened latent period on the cultivar of origin. Similarly, Zhan et al. (2002) found no clear evidence of increased lesion size for M. graminicola isolates on the wheat cultivar from which they were isolated.

This lack of clarity could partly result from the limited number of isolates tested (three in Bonman et al., 1989; five in Zhan et al., 2002), which may not represent the original population properly. In addition, since a differential effect between seedlings and adult plants was shown (Milus & Line, 1980), it is possible that seedling tests (Knott & Mundt, 1991; Zhan et al., 2002) may not reveal quantitative differences in aggressiveness components selected on adult plants.

Adaptation to quantitative resistance

Among the studies on pathogen adaptation to host cultivars, a few specifically refer to identified quantitative resistances.

Lehman & Shaner (1997) studied adaptation of P. triticina on a partially resistant cultivar in an artificial-selection experiment. They established a population from 200–300 uredinia collected from volunteer seedlings in the field. The population was grown for five asexual generations under greenhouse conditions on adult plants of a wheat cultivar with quantitative resistance (determined by four different genes with unequal effects), then tested on five different cultivars, including three other partially resistant cultivars and a susceptible check. At each generation, a truncation selection procedure was applied: the spores produced by early erupting uredinia (lesions) were collected separately from those of later erupting uredinia. This resulted in strong selection for a shortened latent period. Before selection, the latent period was 4.3 days longer on the resistant cultivar than on a susceptible control. This difference was reduced to 2.3 days after selection, which means that the selected population overcame 47% of the resistance. The authors estimated that the selected population had overcome at least one of the resistance genes with partial effects. It is interesting to note that the data (Fig. 2 in their paper) suggested that the latent period was reduced on all resistant cultivars but did not change on the susceptible control (this effect was not statistically significant on the additional cultivars). This selection for a shorter latent period also changed spore production per lesion, which increased on the resistant cultivar but decreased on the susceptible check. These results were globally confirmed in a later experiment (Lehman & Shaner, 2007).

Kolmer & Leonard (1986) obtained an increased lesion size in C. heterostrophus by artificial selection on maize cultivars with partial resistance. They studied successive pathogen generations on five different cultivars and, at each pathogen generation, they mated the seven (out of 25) most aggressive isolates, i.e. those showing the greatest lesion sizes. This resulted in a significant increase in lesion length, both across all the cultivars tested (5–10%) and specifically on the cultivar of selection (14%).

Clifford & Clothier (1974) sampled P. hordei isolates on three different cultivars and multiplied these field populations in the laboratory for between four and seven generations, on seedlings of the host of origin. One of the cultivars (Vada) was known to exhibit quantitative resistance that reduced infection efficiency. This study clearly showed a differential adaptation to the cultivar of origin, with significant interactions between ‘isolate population’ and cultivar. Moreover, populations multiplied on Vada showed increased infection efficiency on this cultivar, as well as on the other cultivars. In this experiment, the increase in infection efficiency was associated with a decrease in spore production per lesion. However, given the density dependence of spore production for such pathogens, it is possible that the decrease in spore production simply resulted from a density effect.

In order to evaluate the long-term durability of quantitative resistance, it is of primary importance to understand how the deployment of such resistance affects aggressiveness in pathogen populations. Very few reports have compared the selective effect of susceptible and quantitatively resistant cultivars on pathogen populations. A theoretical model (Gandon & Michalakis, 2000) predicts that increasing levels of quantitative host resistance will select for increasing levels of damage caused by the parasite to its host (‘virulence’ as defined in ecology). An experimental study by Cowger & Mundt (2002) was in accordance with this prediction: changes in average aggressiveness (estimated by disease severity) of M. graminicola were compared during field epidemics on six cultivars differing in their resistance levels. The highest levels of aggressiveness at the end of the epidemics were found on the more resistant cultivars, and the data presented tended to support the conclusion that resistant hosts select for more aggressive pathogens than susceptible hosts. In a recent study on the same pathosystem (Krenz et al., 2008), evidence for adaptation to Madsen, a quantitatively resistant wheat cultivar, remained equivocal because results were inconsistent among trials. However, it was previously shown that the quantitative resistance of this cultivar was gradually eroded (Mundt et al., 2002a), which, together with other studies (Ahmed et al., 1996; Zhan et al., 2002), suggests an adaptation of pathogen populations to Madsen's quantitative resistance.

Results obtained with M. graminicola on the selective effect exerted by quantitative resistance on a fungal pathogen are consistent with other reports on different systems. For example, potato cyst nematodes (Globodera pallida) reared for 12 generations on four partially resistant potato genotypes exhibited an increased reproductive rate, whereas those raised on susceptible potato cultivars did not (Phillips & Blok, 2008). Moreover, selection was specific to the source of resistance used: populations selected on a particular source of resistance reproduced better on hosts with that source of resistance. Pink et al. (1992) suggested that the multiplication of lettuce mosaic virus in lettuce (Lactuca sativa) cultivars with quantitative resistance may have contributed to the emergence of more aggressive viral strains.

It is interesting to note that a similar adaptation effect was found in M. graminicola toward a multisite fungicide (Cowger & Mundt, 2002): isolates from sprayed plots were more aggressive than isolates from unsprayed ones. The authors suggested that the multisite fungicide and quantitative host resistance exercised similar selective pressures on the fungal populations.

Aggressiveness from an evolutionary perspective

Current uses of the term ‘aggressiveness’ in plant pathology are operational for a discipline that primarily aims at reducing the impact of diseases on crop yield and quality. However, to understand how adaptation of pathogens to their hosts and environments translates into changes in aggressiveness, it is essential to be able to link the concept of aggressiveness to other concepts used in evolutionary epidemiology, i.e. fitness and virulence (Galvani, 2003).

Fitness is generally defined as the per capita rate of increase of an individual or a gene copy (Futuyma, 1997). It is frequently measured as the average number of secondary infections produced from a single infected host in the absence of density-dependent constraints [also known as R0 (May & Anderson, 1983); see Salvaudon et al. (2007) for a plant pathogen application]. Although focussing on among-host transmission seems natural for many animal pathogens, it may be more relevant in some plant pathogens to measure fitness as the average number of secondary lesions produced from a single initial lesion in the absence of density-dependent constraints, including alloinfection (among-host transmission) and autoinfection (within-host multiplication).
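Under simple assumptions, this lesion-level analogue of R0 can be sketched as a product of measurable aggressiveness components; the symbols below are introduced here for illustration only and do not appear in the studies cited:

```latex
% Lesion-level reproductive number (illustrative sketch; symbols are ours):
%   E       infection efficiency (new lesions per deposited spore)
%   \sigma  sporulation rate (spores per lesion per day)
%   \tau    infectious period (days)
R_0 \;=\; E \times \sigma \times \tau
% Units check:
% (lesions/spore) x (spores/lesion/day) x (days) = lesions per lesion
```

Each factor corresponds to an aggressiveness component discussed earlier, which is what makes such components natural proxies for pathogen fitness, whether the secondary lesions arise by alloinfection or by autoinfection.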

In evolutionary epidemiology, as well as in animal and human epidemiology, and unlike in plant pathology, virulence is defined as the quantity of damage inflicted by a pathogen on its host, and is measured in units of host fitness and/or mortality (Poulin & Combes, 1999; Read et al., 1999). Virulence is generally assumed to be a direct consequence of within-host pathogen multiplication, although this direct causative relationship can be questioned (Day, 2002).

Aggressiveness, because it describes the ability of a pathogen to cause severe epidemics at the host population scale, combines both notions of pathogen fitness and virulence. Both fitter and more virulent pathogens are generally considered as more aggressive since they will cause faster epidemics or more damage to the host population, respectively. However, aggressiveness cannot be considered strictly equivalent to pathogen fitness. Fitness-related traits such as spore viability or inter-seasonal survival are not usually considered aggressiveness traits. Similarly, aggressiveness in plant pathology is not a synonym for virulence in evolutionary epidemiology since many aggressiveness components (e.g. sporulation rate) do not quantify a decrease in host fitness or survival. In addition, other parameters that would be relevant to measuring pathogen virulence, such as decrease in host photosynthetic ability (usually larger than the effect accounted for by lesion size, see Bastiaans, 1991) and induced necrosis (pathogen-induced senescence, distinct from necrotic infected tissue, e.g. Magboul et al., 1992) are not usually measured.

As seen above, aggressiveness can be broken down into several components and each of these components is likely to evolve (e.g. Leonard, 1969; Kolmer & Leonard, 1986; Lehman & Shaner, 1997, 2007). Recognizing that aggressiveness results from the expression of elementary quantitative components should make it possible to benefit from the predictions of evolutionary epidemiology models. For instance, most of these theoretical approaches assume that within-host multiplication harms hosts (i.e. causes virulence), so that fitness results from a trade-off between pathogen transmission and virulence (Day, 2003; Galvani, 2003). As a consequence, aggressiveness components that are more closely related to transmission are not expected to evolve under the same evolutionary forces as aggressiveness components linked to virulence. Establishing the correspondence between aggressiveness components, transmission and virulence is not always easy, however. Lesion size can be related to virulence (Bastiaans, 1991). The latent period may have a twofold status, since a shorter latent period accelerates transmission but a longer latency could allow a greater development of the pathogen's organs within host tissues and increase its ability to exploit the host. Infection efficiency, spore production rate and infectious period are transmission traits, but also participate in within-host multiplication through autoinfection.
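The transmission–virulence trade-off invoked here can be made concrete with the standard formalization of evolutionary epidemiology (in the spirit of May & Anderson, 1983, and Day, 2003); the functional forms below are generic assumptions for the sake of illustration, not estimates for any pathosystem discussed above:

```latex
% Virulence \alpha increases host loss; transmission \beta(\alpha)
% increases with \alpha but with diminishing returns (concave \beta).
% \mu is the background host loss rate.
R_0(\alpha) \;=\; \frac{\beta(\alpha)}{\mu + \alpha}
% Maximizing R_0 over \alpha yields the marginal-value condition:
\frac{\mathrm{d}R_0}{\mathrm{d}\alpha} = 0
\;\Longrightarrow\;
\beta'(\alpha^{*}) \;=\; \frac{\beta(\alpha^{*})}{\mu + \alpha^{*}}
% The optimal virulence \alpha^{*} balances the gain in transmission
% against the shortening of the infectious period 1/(\mu + \alpha).
```

In these terms, aggressiveness components closely tied to β (infection efficiency, sporulation rate) and those tied to α (e.g. lesion growth) would be expected to evolve under different constraints, as argued above.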

Some difficulties met in this analysis obviously come from evolutionary epidemiology models making a distinction between within- and among-host scales, which is not always relevant in plant pathology. Although many animal diseases are caused by parasites with systemic effects that globally reduce host viability, many plant parasites have localized effects and do not affect their host in a systemic manner. This is particularly true for foliar parasites, for which it has been demonstrated that the effect on the host is largely limited to a local reduction in photosynthetic capacity (Bastiaans, 1991; Robert et al., 2004). However, the pathogen population can increase dramatically on an infected host leaf through autoinfection (see Lannou et al., 2008), which is analogous to within-host multiplication in animal diseases. The consequence is that the correspondence between aggressiveness components, transmission and virulence depends on the scale considered. At the scale of the lesion for instance, lesion growth may be considered as causing virulence (i.e. killing local host tissues), and other traits (infection efficiency, spore production, etc.) would correspond to transmission.

Theoretical work on pathogen evolution progresses much faster than experimental work, and the lack of experimental evidence to evaluate theoretical predictions and model hypotheses has been emphasized (Ebert & Bull, 2003). The growing body of experimental data on pathogen adaptation for quantitative traits, based on the study of crop pathogens, could offer opportunities to fill this gap, provided that the traits measured are clearly linked to parameters that underlie evolution.

The extent of phenotypic variability and heritability of these traits is likely to strongly condition the durability of resistant cultivars. Collecting more data on the genetic architecture of aggressiveness components would therefore be valuable for the development of quantitative resistance. Nonetheless, considering the huge evolutionary potential of plant pathogens, the design of quantitative resistance should take advantage of the potential trade-offs between aggressiveness components, in order to enhance the sustainability of crop resistance. Such trade-offs would reflect the constraint for the pathogen to simultaneously invest in different traits, such as sporulation or within-host growth. It is therefore most important that plant pathologists record, to the greatest extent possible, all aggressiveness components in pathogen adaptation studies, and check for any negative correlations between them. Further study on the survival of pathogens between cropping seasons, although difficult in most pathosystems, would bring additional valuable information for both understanding pathogen evolution and improving disease management.

A major question related to host resistance durability is to evaluate to what extent plant pathogens can be considered specialists on a given host cultivar or generalists. It remains difficult at this time to produce an answer from the experimental data presented above. Clifford & Clothier (1974) observed a non-specific increase of aggressiveness in P. hordei induced by repeated multiplication on a cultivar with quantitative resistance, but other results obtained in artificial-selection experiments (Leonard, 1969; Chin & Wolfe, 1984; Villaréal & Lannou, 2000), as well as field data (Ahmed et al., 1996), suggested that selection for aggressiveness led to specific adaptation to a given host. Here again, more data are needed on the level of specificity of each individual aggressiveness component.

A growing body of evidence shows that pathogen populations may quantitatively adapt to their compatible hosts under laboratory, experimental field and natural conditions, resulting in modifications of their aggressiveness. The speed and nature of such adaptations will determine the durability of quantitative resistance in cultivated landscapes.