The use of cultivar mixtures to control foliar fungal diseases is well documented for windborne diseases, but remains controversial for splash-dispersed diseases. To improve this strategy, a cultivar mixture was designed consisting of two wheat cultivars with contrasting resistance to Mycosphaerella graminicola, the agent responsible for the rainborne disease septoria tritici blotch (STB), in a 1:3 susceptible:resistant ratio rather than the 1:1 ratio commonly used in previous studies. The impact of natural STB epidemics in this cultivar mixture was studied in field experiments over 4 years. Weekly assessments of the number of sporulating lesions, pycnidial leaf area and green leaf area were carried out on the susceptible cultivar. In years with sufficient STB pressure, disease impacts on the susceptible cultivar in the mixture were always significantly lower than in the pure stand (e.g. a 42% reduction of pycnidial leaf area for the three upper leaves in 2008 and 41% in 2009). In years with low STB pressure (2010 and 2011), a reduction of disease impacts was also observed but was not always significant. After major rainfall events, the number of sporulating lesions observed on the susceptible cultivar after one latent period was reduced on average by 45% in the mixture compared to the pure stand. All the measurements showed that the susceptible cultivar was consistently protected, at least moderately, in the mixture under low to moderate STB pressure. The results therefore show that an efficiently designed cultivar mixture can contribute to the control of STB, among other foliar diseases.
Septoria tritici blotch (STB) is one of the foliar diseases with the greatest economic impact on wheat worldwide, especially in temperate humid regions where it may lead to substantial yield losses. For example, during the 1998–1999 cropping season in France, yield losses of 1–2 t ha−1, and sometimes 3–3·5 t ha−1, were attributed to STB (Oste et al., 2000). Commonly used methods to control Mycosphaerella graminicola (anamorph Septoria tritici), the causal agent of this disease, are based on fungicide spraying and resistant cultivar breeding. In the case of the former, the effectiveness of fungicides has declined over the years because of intensive selection on pathogen strains (Leroux et al., 2007). Indeed, the emergence since the early 2000s of M. graminicola resistance to QoIs, one of the major groups of fungicides, represented mainly by the strobilurin family, has been widely documented (e.g. Torriani et al., 2009). More recently, a loss of triazole fungicide efficacy has been noticed in the field (Cools & Fraaije, 2008; Leroux & Walker, 2011). With regard to plant breeding, the classic way to select resistant cultivars consists of using genetic markers to introgress a major resistance gene into a new line. So far, 13 major genes for resistance to STB have been identified and mapped in wheat (Orton et al., 2011). Nevertheless, with major race-specific resistance genes, loss of efficacy may occur rapidly under field conditions because of the unidirectional selection pressure exerted on the pathogen population (McDonald & Linde, 2002).
In addition to the previously mentioned methods, complementary approaches that enlarge the current range of STB management solutions should be developed. Cultivar diversity within crops is one such solution. Finckh & Wolfe (2006) reported that ‘genetic diversity within and among species appears to be a concomitant of survival and of stability in communities'. Several studies have shown that mixing cultivars provides effective control against some pathogens (e.g. Wolfe, 1985; McCann, 2000; Mundt, 2002; de Vallavieille-Pope et al., 2006). Moreover, this strategy provides additional potential benefits such as yield stability (Pfahler & Linskens, 1979).
The two main dispersal processes for cereal diseases are wind and rain. For windborne cereal diseases such as rusts and powdery mildew, cultivar mixtures provide interesting protective effects against pathogen populations (Finckh et al., 2000; Mundt, 2002; de Vallavieille-Pope, 2004; Huang et al., 2012) and are used commercially in different countries. For example, barley mixtures have been cultivated in Danish fields since the 1980s to limit the severity of powdery mildew (Blumeria graminis f. sp. hordei; Finckh et al., 2000), and wheat cultivar mixtures are sown in the United States to reduce epidemics of yellow (stripe) rust (Puccinia striiformis f. sp. tritici), brown (leaf) rust (Puccinia triticina) and powdery mildew (Finckh & Wolfe, 2006). The efficiency of cultivar mixtures in controlling rainborne diseases such as STB varies according to the pathosystem, climatic conditions, pathogen biology and the proportion of each component of the mixture. Significant (Jeger et al., 1981b; Mundt et al., 1995; Mille & Jouan, 1997) and non-significant (Cowger & Mundt, 2002) benefits have both been demonstrated. Newton et al. (1997) and Mille et al. (2006) showed, for example, that cultivar mixtures with more than two components perform better than two-component mixtures against two typically splash-dispersed fungi, Rhynchosporium secalis on barley and M. graminicola on wheat.
Wheat and other cereals are considered as crops that benefit from using cultivar mixtures to control foliar diseases because of their small host genotype unit area (defined by Mundt & Leonard, 1985). However, according to Garrett & Mundt (1999), M. graminicola possesses biological traits unfavourable to the efficiency of this agricultural practice: large lesion size (compared with brown rust, for example), long pathogen generation time (average latent period of around 270–500 degree-days under field conditions; Lovell et al., 2004a), absence of strong host specialization, and a steep dispersal gradient. This last feature is directly related to the splash dispersal mechanism. Splash-dispersed pycnidiospores of M. graminicola are carried over very short horizontal and vertical distances, up to 1 m (Fitt et al., 1989; Saint-Jean et al., 2004), compared with distances of several hundred kilometres for windborne cereal rust urediospores (Brown & Hovmøller, 2002).
Although the protective effects are expected to be weaker than in other pathosystems, the increasing concern for sustainable pest management (e.g. Savary et al., 2012) and the impacts of climate change on plant diseases (Pautasso et al., 2012) prompted a re-examination of cultivar mixtures in the case of the wheat–STB pathosystem. In this context, the use of cultivar mixtures, considered as a preventive measure (Pautasso et al., 2012), should not be ignored. Combined with other STB management methods, it adds one more obstacle to disease progression. Compared to windborne diseases, for which extensive literature is available, the utility of cultivar mixtures to control splash-dispersed diseases remains controversial, and more experimental data are needed to assess the epidemiological and dispersal processes involved in disease progression. In particular, there is a need to investigate how temporal disease development is affected within a cultivar mixture, and whether the effect of a cultivar mixture on disease development can be consistent over several host growing seasons.
This study created intrafield diversity by mixing two cultivars, a very susceptible one and a fairly resistant one, in a 1:3 ratio. This ratio has been shown to be more effective than the commonly studied 1:1 ratio for controlling windborne disease epidemics (de Vallavieille-Pope, 2004). The contribution of a cultivar mixture to the control of STB is demonstrated here through the decrease in disease severity on the most susceptible cultivar. The objectives were to: (i) determine when and how the cultivar mixture affects STB epidemics; and (ii) quantify the effect of the mixture on the progression of natural STB epidemics over four host growing seasons.
Materials and methods
Wheat cultivars and field experiments
Experiments were conducted over four consecutive wheat-growing seasons, from 2007–2008 to 2010–2011. For convenience, growing seasons are referred to by the harvest year only. Experimental fields were located at the INRA-Grignon Research Station (48°50′ N, 1°56′ E, 583 mm mean annual rainfall from 1992 to 2011), approximately 30 km west of Paris, where STB epidemics regularly occur. Commercial winter bread wheat cultivars were chosen based on their contrasting scores of resistance to M. graminicola and intermediate to high resistance levels to the other main pathogenic foliar fungi, and also for similar earliness and plant height. The two-way cultivar mixtures studied were cv. Koreli – cv. Sogood in 2008 and cv. Maxwell – cv. Sogood in 2009, 2010 and 2011. Cvs Koreli and Maxwell were among the most resistant wheat cultivars commercially available (score 7 on a 1–9 scale, with 1 corresponding to highly susceptible and 9 to fully resistant), and cv. Sogood was very susceptible (score 4) to M. graminicola (Anonymous, 2009). For convenience, cv. Koreli or cv. Maxwell are referred to as the resistant component, and cv. Sogood as the susceptible component of the cultivar mixture. To allow for early identification of mixture components, the seeds of the resistant and susceptible cultivars were coloured differently before sowing, using two contrasting colouring agents from the seed industry. In addition, a criterion based on presence (for the resistant cultivar) or absence (for the susceptible cultivar) of ear awns made it possible to unambiguously distinguish mixture components after heading. Both cultivars were mixed together in the seed drill in seed number proportions of 75% for the resistant cultivar and 25% for the susceptible one. The seed number proportions in cultivar mixtures were adjusted by weighing, taking the 1000-kernel weight difference of each cultivar into account.
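As a worked illustration of this weighing adjustment, the sketch below converts a target 75:25 seed-number proportion into seed masses using 1000-kernel weights (TKWs). The TKW values, plot dimensions and function name are illustrative assumptions, not figures from the study:

```python
def seed_masses(total_seeds, prop_resistant, tkw_resistant_g, tkw_susceptible_g):
    """Convert target seed-number proportions into masses to weigh out,
    using each cultivar's 1000-kernel weight (TKW, grams per 1000 seeds)."""
    n_res = total_seeds * prop_resistant
    n_sus = total_seeds * (1 - prop_resistant)
    mass_res = n_res * tkw_resistant_g / 1000.0  # grams of resistant cultivar
    mass_sus = n_sus * tkw_susceptible_g / 1000.0  # grams of susceptible cultivar
    return mass_res, mass_sus

# Hypothetical TKWs of 45 g and 50 g; seeds for one 3.5 m x 10 m plot
# sown at 220 seeds per square metre (values illustrative only).
total = 220 * 3.5 * 10
m_res, m_sus = seed_masses(total, 0.75, 45.0, 50.0)
```

Because the two cultivars rarely have the same TKW, equal masses would not give the intended 75:25 seed-number ratio; scaling each mass by its own TKW does.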
For each cropping season, the experimental plan was a randomized block design with four replicates for each modality, namely the two pure stands and the mixture previously described. Plots were 3·5 m wide (18 rows spaced 17·5 cm apart) and 10–14 m long. To limit cross-contamination, triticale (×Triticosecale cv. Maximal, non-host for M. graminicola) borders at least 3·5 m wide were sown between plots. The experimental design was fungicide-free. Wheat crops were sown at a density of 220 seeds m−2, which is comparable to common practices in the region, on 11 October 2007, 29 October 2008, 27 October 2009 and 26 October 2010.
Nitrogen fertilization was adjusted according to the balance-sheet method (Rémy & Hébert, 1977). Two applications took place between Growth Stage (GS) 30 and GS 69 (Zadoks et al., 1974) for a target grain yield of 5·5–6·0 t ha−1 (Meynard et al., 1997). Herbicide treatments were applied in early spring on all the experimental plots.
A meteorological data logger (Campbell Scientific Ltd) was set up near the experimental plots to record rainfall and air temperature at 2 m above ground level, in order to compute thermal time from the sowing day. The base temperature, equal to 0°C for wheat, is the temperature below which plant development is considered to stop (Gate, 1995). To compare years with each other, wheat thermal time from sowing, in degree-days, was used to take wheat development into account (Lovell et al., 2004b).
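Thermal time accumulation amounts to summing daily mean temperatures above the base temperature. A minimal sketch, with hypothetical daily values:

```python
def thermal_time(daily_mean_temps, base_temp=0.0):
    """Accumulate degree-days from sowing: sum of max(T_mean - T_base, 0)
    over days, so days colder than the base temperature contribute nothing."""
    return sum(max(t - base_temp, 0.0) for t in daily_mean_temps)

temps = [8.5, 10.2, -1.0, 6.3]  # hypothetical daily mean temperatures (deg C)
tt = thermal_time(temps)        # the -1.0 day contributes 0 degree-days
```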
Assessment of disease severity and statistical analysis
At the beginning of stem elongation (GS 30), five main stems per replicate and per cultivar were randomly selected, avoiding plot borders and using the seed colour difference for the mixture. They were marked with a coloured plastic ring so that STB severity could be followed on them until ripening (GS 90). This allowed non-destructive visual scoring to be carried out for about 3 months on the same stems. For each of the six upper leaf levels, the number of sporulating lesions due to infection by M. graminicola, the percentage of leaf area covered by pycnidia (pycnidial leaf area) and the percentage of green leaf area were assessed weekly, as long as green area was present.
An approach based on areas under progress curves (AUPCs) was used. For a period ranging from day 0 to day N, an AUPC was calculated with the trapezoidal rule:

AUPC = Σd=0…N−1 [(Sd + Sd+1)/2] × (Td+1 − Td)

where Sd and Td refer to the scoring value and the wheat thermal time on day d, respectively. Differences between the mixture and the pure stand for the susceptible cultivar were computed for the number of sporulating lesions (ΔAUPC for lesion number), the pycnidial leaf area (ΔAUPC for pycnidial area) and the green leaf area (ΔAUPC for green area).
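The AUPC computation (trapezoidal integration of scores over thermal time, as standard for areas under disease progress curves) can be sketched as follows; the score and thermal-time values are hypothetical:

```python
def aupc(scores, thermal_times):
    """Area under the progress curve by the trapezoidal rule:
    for each pair of consecutive assessments, mean score times
    the thermal-time interval, summed over the period."""
    assert len(scores) == len(thermal_times)
    area = 0.0
    for d in range(len(scores) - 1):
        area += (scores[d] + scores[d + 1]) / 2.0 * (
            thermal_times[d + 1] - thermal_times[d])
    return area

# Hypothetical weekly pycnidial leaf area (%) vs thermal time (degree-days)
s = [0.0, 5.0, 20.0, 35.0]
t = [1500.0, 1600.0, 1700.0, 1800.0]
a = aupc(s, t)
```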
Differences in median thermal times, corresponding to the thermal times at which 50% of the final value was reached, were also calculated for the pycnidial leaf area (ΔTT50 for pycnidial area) and the green leaf area (ΔTT50 for green area) for the three upper leaf levels. Median thermal times were estimated by logit adjustments, fitted by maximum likelihood in the R statistical software (R Development Core Team, 2010).
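The study estimated TT50 from maximum-likelihood logit fits in R; as a rough stand-in that conveys the idea, the sketch below interpolates linearly between the two assessments bracketing half of the final value (data hypothetical, function name ours):

```python
def tt50(scores, thermal_times):
    """Thermal time at which half of the final score is reached,
    by linear interpolation between the bracketing assessments
    (a simplified stand-in for the logit fit used in the study)."""
    half = scores[-1] / 2.0
    for d in range(1, len(scores)):
        if scores[d] >= half:
            s0, s1 = scores[d - 1], scores[d]
            t0, t1 = thermal_times[d - 1], thermal_times[d]
            return t0 + (half - s0) / (s1 - s0) * (t1 - t0)
    return thermal_times[-1]

s = [0.0, 5.0, 20.0, 35.0]
t = [1500.0, 1600.0, 1700.0, 1800.0]
m = tt50(s, t)  # half of 35 is 17.5, reached between the 5% and 20% scorings
```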
As upper leaves are the most important for explaining potential yield loss (e.g. Thomas et al., 1989), part of the analysis was focused on the three upper leaf levels during the post-heading period (from GS 55 to GS 90). Data from the three upper leaves were aggregated, taking into account their relative mean area measured after heading, when wheat growth is considered complete. Destructive samplings of all consecutive plants over 50 cm in two adjacent rows were made in each of the replicates of the mixture and the pure stand. Leaf scans of the susceptible cultivar made it possible to assess the mean lamina area of each of the three upper leaves (Table 1).
Table 1. Average contribution of each leaf level to the total leaf area of the three upper leaf levels over 4 years
Leaf areas were measured using destructive sampling between 1650 and 1850 degree-days from sowing (1744 degree-days in 2008, 1772 degree-days in 2009, 1809 degree-days in 2010 and 1651 degree-days in 2011), at the end of inflorescence emergence (from GS 57 to GS 59).
To compare the number of sporulating lesions, the pycnidial leaf area and the green leaf area of the susceptible component in the cultivar mixture and in the pure stand, Student's t-test was used in R. Up to three significance levels (1, 5 and 10%) were used to highlight significant differences despite the great variability inherent in most field experiments. Twenty stems per modality and per year were tested, taking into account the three or six upper leaf levels according to the indicator considered. The mixture protective effect corresponds to the relative difference for the susceptible cultivar between the mixture and the pure stand.
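A minimal sketch of this comparison, using a pooled-variance two-sample t statistic and the relative protective effect; the helper names are ours, and the replicate data in the example are invented for illustration:

```python
import math

def student_t(a, b):
    """Two-sample Student's t statistic with pooled variance."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))

def protective_effect(mixture_value, pure_value):
    """Relative difference for the susceptible cultivar; negative
    values indicate protection within the mixture."""
    return (mixture_value - pure_value) / pure_value

# Illustrative final pycnidial leaf areas (%): mixture vs pure stand
pe = protective_effect(26.0, 38.0)  # about -0.32, i.e. a 32% reduction
t_stat = student_t([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])
```

In practice one would also compare the t statistic against the t distribution with na + nb − 2 degrees of freedom to obtain a P value.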
Results

April and May rainfall is known to be predictive of overall STB severity (Shaner & Finney, 1976; Thomas et al., 1989). According to meteorological data recorded at Grignon over a 20-year period (1992 to 2011), 2010 and 2011 were among the three driest years in terms of cumulative April and May rainfall, with 42 and 37 mm, respectively. 2008 and 2009 were about average in terms of cumulative rainfall during April and May, with 126 and 82 mm, respectively. These figures show that the springs of 2008 and 2009 were favourable for moderate epidemics, whereas the springs of 2010 and 2011 were not conducive to STB epidemics.
Detection of a protective effect over the epidemic period
Each year, one or two major peaks of weekly emergence of sporulating lesions were identified and related to major rainfall events that occurred about one latent period earlier (Fig. 1). Major rainfall events were defined as at least 10 mm of rain in a single day (Thomas et al., 1989). Six major peaks of lesion emergence were identified: two in 2008 (at 1279 and 2024 degree-days), one in 2009 (at 1637 degree-days), one in 2010 (at 2035 degree-days) and two in 2011 (at 1293 and 1716 degree-days). For these peaks, reductions of 36, 8, 68, 25, 58 and 75% were observed, respectively, in the number of STB sporulating lesions newly emerged after 1 week on the susceptible cultivar in the mixture, compared to the pure stand. Among these reductions, half were significant (at P <0·01). There was no obvious correlation between: (i) the quantity of rain during the major daily rainfall event; (ii) the peak amplitude of the number of sporulating lesions; and (iii) the reduction of the number of sporulating lesions within the mixture compared to the pure stand.
For each of the six upper leaf levels, assessments were made of the differences in two STB impact indicators (the total number of sporulating lesions and the final pycnidial leaf area due to M. graminicola) between the cultivar mixture and the pure stand for the susceptible cultivar (Table 2). All the significant differences (at P <0·1) indicate protective effects within the mixture. These significant values ranged from −30 to −89% with a mean of −49% for the total number of sporulating lesions, and from −33 to −88% with a mean of −56% for the final pycnidial leaf area. The significant protective effects of the mixture (observed mainly on upper leaves in 2008, 2009 and 2010, and on lower leaves in 2011) appeared at different leaf levels, depending on the year.
Table 2. Absolute and relative differences (±standard errors) of final values reached for sporulating lesion number and pycnidial leaf area between the 1:3 susceptible:resistant cultivar mixture and the pure stand, for the susceptible wheat cultivar (cv. Sogood), on the six upper leaf levels and for 4 years
Columns: difference in number of sporulating lesions; difference in pycnidial leaf area.
The significance (sign.) of each absolute difference is assessed by a Student's t-test. Leaf level number 1 corresponds to the flag leaf.
The next steps focused the analysis on the three upper leaf levels during the post-heading period, taking into account their relative mean area (Table 1) to aggregate the data.
Concerning the increase of pycnidial leaf areas on the susceptible cultivar (Fig. 2) in 2008, a significant difference was observed for the susceptible cultivar between the mixture and the pure stand (t =3·67, P =0·001). At the last scoring, carried out at 2024 degree-days, the proportion of pycnidial area over the total leaf area (for the three upper leaves) reached 26 and 38% for the mixture and the pure stand, respectively, which corresponded to a reduction of more than 30% of the pycnidial leaf area within the mixture. For 2009, a significant difference was observed between the modalities (t =3·72, P <0·001). For the last scoring, at 2166 degree-days, the pycnidial leaf area proportion of the susceptible cultivar reached 23% in the mixture and 34% in the pure stand, which corresponded to a reduction of more than 30% of the pycnidial leaf area within the mixture. In 2010, no significant difference was reported between the two modalities (t =1·28, P =0·21). For the last scoring, at 2344 degree-days, the pycnidial leaf area proportion of the susceptible cultivar reached 10% in the mixture and 12% in the pure stand. Finally, for 2011, despite the very low STB pressure (less than 2% of pycnidial leaf area in the susceptible pure stand at ripening stage), a significant difference was observed between the two modalities (t =3·38, P =0·002). For the last scoring, at 2289 degree-days, the pycnidial leaf area proportion of the susceptible cultivar reached 0·4% in the mixture and 1·6% in the pure stand. The differences in pycnidial leaf area on the susceptible cultivar between the mixture and the pure stand at ripening were all negative: −33% in 2008, −31% in 2009, −19% in 2010 and −78% in 2011. Three of these differences were significant, i.e. 2008 (t =3·12, P =0·004), 2009 (t =3·20, P =0·003) and 2011 (t =2·87, P =0·009).
The disease severity on the susceptible cultivar, assessed with final values of pycnidial leaf area, was significantly lower in most cases in the mixture than in the pure stand.
With regard to the decrease in green leaf area for the susceptible cultivar over the post-heading period (Fig. 3), in 2008 a significant difference was revealed between the mixture and the pure stand (t =4·01, P <0·001). At the date corresponding to the maximal difference between the modalities (i.e. 2024 degree-days), the green leaf area proportion for the susceptible cultivar reached 63% in the cultivar mixture and 42% in the pure stand, and the difference was significant (t =4·82, P <0·001). In 2009, there was also a significant difference between the two modalities (t =4·46, P <0·001). At the date of maximum difference between the modalities (1861 degree-days), the green leaf area proportion of the susceptible cultivar reached 64% in the mixture and 34% in the pure stand, and the difference was significant (t =4·46, P <0·001). Finally, in 2010 and 2011, no significant difference was observed (t =1·03, P =0·31 and t =5 × 10−4, P ≈ 1, respectively).
Quantification of the protective effect of the cultivar mixture
Differences for the susceptible cultivar between the mixture and the pure stand for the number of sporulating lesions, the pycnidial leaf area and the green leaf area were computed, based on AUPC over the entire post-heading period (ΔAUPC; Fig. 4). For the 4 years, the ΔAUPC for the sporulating lesion number was significant (t =3·18, P =0·003 for 2008; t =3·92, P <0·001 for 2009; t =2·77, P =0·009 for 2010; and t =3·01, P =0·006 for 2011) and ranged from −74% in 2011 to −30% in 2008. The ΔAUPC for the pycnidial leaf area was significant (t =3·67, P =0·001 for 2008; t =3·72, P <0·001 for 2009; and t =3·38, P =0·002 for 2011) except for 2010 (t =1·28, P =0·21), ranging from −77% in 2011 to −18% in 2010. The ΔAUPC for the green leaf area, always positive (or null for 2011), was only significant for 2008 (t =4·01, P <0·001) and 2009 (t =4·46, P <0·001), with values of 8 and 26%, respectively.
The difference in median thermal time for the susceptible cultivar between the mixture and the pure stand (ΔTT50) was assessed for pycnidial and green leaf areas. The ΔTT50 was calculated for all the years except 2011, for which the disease pressure was too low to obtain relevant logit adjustments. The ΔTT50 for pycnidial leaf area, significant in both 2008 (t =1·93, P =0·07) and 2009 (t =2·76, P =0·02), ranged from 14 degree-days in 2010 to 80 degree-days in 2009. The ΔTT50 for the green area was only significant in 2009 (t =4·48, P <0·001) and ranged from −12 degree-days in 2008 to −98 degree-days in 2009.
Discussion

Temporal variations in the protective effect of the cultivar mixture on the susceptible component were investigated in relation to pathogen dispersal events by means of weekly field assessments of STB progression. Although modulated by microclimatic conditions during the latent period, especially temperature and wetness (Magboul et al., 1992), the number of newly emerged sporulating lesions after one week is a relevant indicator of the intensity of pycnidiospore dispersal. Over the 4 years, the major dispersal events responsible for substantial spreading of pycnidiospores were identified (Fig. 1). All of these dispersal events led to fewer sporulating lesions on the susceptible cultivar within the mixture than in the pure stand. This reduction averaged 45% over the 4 years. Although Garrett & Mundt (1999) mentioned that the steep splash dispersal gradient is not favourable for slowing down epidemics with the use of cultivar mixtures, the experimental conditions here showed that the mixture limited STB development on the susceptible cultivar by restricting spore dispersal.
This study also showed that the protective effect of the cultivar mixture on the susceptible cultivar could appear on each of the six upper leaf levels (Table 2), depending on the occurrence of rainfall when a given leaf level was present. Nevertheless, a link between the intensity of rainfall events and the level of protective effect was not shown. For example, although the highest daily rainfall recorded during spring occurred in 2010 (49·5 mm at 1730 degree-days), no significant reduction in the number of sporulating lesions was observed one latent period later on the susceptible cultivar in the mixture (Fig. 1). Inoculum present in the field when dispersal events occurred (Suffert & Sache, 2011), microclimatic conditions (Magboul et al., 1992) and precise characterization of rain properties such as raindrop energy (Saint-Jean et al., 2006) need to be taken into account in order to further understand the correlation between rainfall events and protective effects of cultivar mixtures against STB epidemics.
On average over the 4 years, for the three upper leaf levels, the susceptible cultivar within the mixture showed 45% fewer sporulating lesions, 45% less pycnidial leaf area and 9% more green leaf area than in the pure stand (Figs 2, 3 and 4). Host compensation for various abiotic and biotic stresses (e.g. drought, phytophagous insects) and wheat developmental plasticity could explain the relatively low protective effect for the green leaf area, which was only significant in the years with sufficient disease pressure (2008 and 2009). The differences in median thermal time for both pycnidial and green leaf area provide evidence that the photosynthetically active period for the three upper leaves of the susceptible cultivar was longer within the mixture than within the pure stand in the presence of STB. Indeed, these differences could reach up to about 90 degree-days in absolute value (in 2009), which is comparable to one phyllochron (Gate, 1995), corresponding to the establishment of one leaf level.
Previous experiments on equiproportional two-component mixtures of cultivars without contrasting STB resistance levels showed inconsistent protective effects (Cowger & Mundt, 2002). The present study used a cultivar mixture with contrasting resistance levels and a susceptible:resistant ratio of 1:3, basing these choices on assembly criteria known to provide significant control of windborne diseases (de Vallavieille-Pope, 2004). The importance of contrasting resistance scores for cultivar mixture components had already been experimentally demonstrated for the control of rainborne diseases (Jeger et al., 1981b; Mundt et al., 1995). Despite year-to-year variation, a degree of consistency was shown over the years: under low to moderate disease pressure, the susceptible component within the cultivar mixture was always less impacted by STB than in the pure stand. In such a cultivar mixture, the resistant component experiences an increased inoculum load compared to being grown in pure stand, because of the higher disease pressure caused by the presence of the diseased susceptible component (Garrett & Mundt, 1999). After analyses similar to those presented for the susceptible cultivar, it was shown that, under these experimental conditions, the resistant cultivar (cv. Koreli or cv. Maxwell, depending on the year) in the mixture was not significantly more affected by STB than in the pure stand, whether for the number of sporulating lesions, the pycnidial leaf area, or the green leaf area, for the different years studied (data not shown). Although no decrease in efficacy of the resistant cultivar was noted during the experimental period, it is recommended that different resistant components are introduced successively within a mixture over the cropping seasons to minimize this risk (Mundt, 2002).
In these experimental conditions, the mixture effect, defined as the relative difference between the mixture and the mean of the pure stands, was computed based on pycnidial leaf area data and ranged from 3% in 2010 to 35% in 2008. Although, as expected, the mixture effect was low in comparison to mixture effects of 60–70% observed in the cases of windborne diseases (Wolfe, 1985; Burdon, 1987; Kølster et al., 1989), it was consistently positive over the years. Furthermore, the average yield of the whole cultivar mixture for the 2008–2011 period was 0·4 t ha−1 greater than for the mean of pure stands in the same 1:3 cultivar ratio (data not shown). No significant effects were observed for yield within a cropping season, which concurs with some observations of Mille et al. (2006). In their field experiments with several cultivar mixtures, there were no correlations between mixture efficiencies for grain yield and disease severity reduction. Moreover, estimation of grain yield mixture efficiency is more appropriate in larger fields (Garrett & Mundt, 1999).
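The mixture effect defined here reduces to a one-line computation; in the sketch below the pycnidial leaf area values are invented to illustrate the formula, not figures from the study:

```python
def mixture_effect(mixture_value, pure_values):
    """Relative reduction in the mixture versus the mean of the pure stands."""
    mean_pure = sum(pure_values) / len(pure_values)
    return 1.0 - mixture_value / mean_pure

# Hypothetical final pycnidial leaf areas (%): mixture, then the two pure stands
me = mixture_effect(13.0, [6.0, 34.0])  # 0.35, i.e. a 35% mixture effect
```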
Although cultivar mixtures have not generally been regarded as well suited to controlling STB epidemics (Garrett & Mundt, 1999), this study provides evidence that, with contrasting resistance levels and a 1:3 susceptible:resistant ratio, a two-component cultivar mixture can provide a consistent protective effect on the most susceptible component over several years, rainfall events and leaf levels, despite differing disease pressures. Under these experimental conditions, similar behaviours were observed with the two different resistant cultivars in the years with similar disease pressure (2008 and 2009). Nevertheless, it is usually impossible to predict the efficiency of a given cultivar mixture before planting, notably because of genotype–environment interactions and the degree of competition between mixture components (e.g. Finckh & Mundt, 1992; Mundt et al., 1995). Rather than the integrative STB resistance score currently available for each wheat cultivar, precise knowledge of the factors underlying host resistance, such as infection efficiency, sporulation rate and latent period, should substantially help to design cultivar mixtures with greater control of STB. Indeed, models from Jeger et al. (1981a) on non-specialized fungal pathogens, such as M. graminicola, predicted that the use of quantitative resistances in a cultivar mixture may decrease, increase, or have no effect on disease severity, depending on the differences in infection frequencies and sporulation rates between mixture components.
Acknowledgements

The authors are grateful to Laurent Gérard, Brigitte Durand and Olivier Maury for their invaluable technical assistance, and Christophe Montagnier and his team (INRA Experimental Unit, Thiverval-Grignon) for crop management. This experimental study received financial support from the French Ministry of Agriculture (contract CTPS C-03-2010). Managed by the ANRT (Association Nationale de la Recherche Technique), the PhD grant of Christophe Gigot was co-financed by ARVALIS – Institut du Végétal and the French Ministry of Research and Education. The authors thank the anonymous reviewers for both general and specific comments on the manuscript.