Potential strategies and future requirements for plant disease management under a changing climate
Climate change will probably influence the occurrence, prevalence and severity of plant diseases. This will also affect disease management with regard to timing, preference and efficacy of chemical, physical and biological measures of control and their utilization within integrated pest management (IPM) strategies. Prediction of future requirements in disease management is of great interest for agroindustries, extension services and practical farmers. A comprehensive analysis of potential climate-change effects on disease control is difficult because current knowledge is limited and fragmented. This review reveals that certain existing preventive plant protection measures, such as use of a diversity of crop species in cropping systems, adjustment of sowing or planting dates, use of crop cultivars with superior resistance and/or tolerance to diseases and abiotic stress, use of reliable tools to forecast disease epidemics, application of IPM strategies, and effective quarantine systems, may become particularly important in the future. Effective crop protection technologies are available and will provide appropriate tools to adapt to altered climatic conditions, although the complexity of future risks for plant disease management may be considerable, particularly if new crops are introduced in an area. Overall, the challenge of adapting disease control measures to climate change is not likely to be fundamentally different from the adjustments to technological innovations or changes in the economic framework already required in current crop protection. Potential beneficial effects of climate change, such as longer growing seasons, fewer frosts and shifted precipitation patterns, must not be neglected, as they could counteract the presumed enhancement of particular diseases.
Current evidence suggests a rise in global mean surface temperatures leading to generalized global warming (Karl & Trenberth, 2003). This is thought to be the result of emissions of so-called greenhouse gases, although natural climate variability may also have contributed to recent global warming (Hulme et al., 1999). Depending on future emissions scenarios, atmospheric CO2 concentration is estimated to increase from about 380 p.p.m. currently to between 500 (B1 emissions scenario) and 800 p.p.m. (A2 emissions scenario) by the end of this century (Meehl et al., 2005). A temperature increase of about 1·1–3·5°C, depending on the emissions scenario applied in models, is possible (Meehl et al., 2005). However, temperature projections are greatly influenced by the climate model used (Knutti et al., 2008). The next generation of climate projection models will take into account recent climate observations and new information on climate system processes (Moss et al., 2010), which may alter projections of the future climate to an unknown extent. Future changes in cloud development and distribution are the single largest source of uncertainty in climate projections (Karl & Trenberth, 2003), underlining the difficulty of projecting future precipitation patterns, especially on regional and local scales. This has particular relevance to plant diseases, since moisture-related variables such as leaf wetness duration, together with temperature, largely determine the occurrence and spread of most foliar diseases (Harvell et al., 2002). In general, optimal temperature conditions may compensate for reduced canopy wetness and vice versa (Salinari et al., 2006). Pathogen spore germination and infection of the host plant often require close to 100% relative humidity. These moist conditions usually occur during overnight dewfall, which makes the temperature pathogens experience at this time particularly important (Harvell et al., 2002).
Under temperate climatic conditions, night-time minimum temperatures are expected to increase more than daytime maximum temperatures. This may facilitate dew formation and thereby increase leaf wetness duration. Dew formation is principally dependent on (i) the radiative cooling of the surface, and (ii) the water-vapour concentration of the air (Agam & Berliner, 2006).
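The link between these two drivers and leaf wetness can be made concrete with the standard Magnus approximation for dew point: dew condenses once radiative cooling brings the leaf surface down to the dew point of the surrounding air, so warmer, moister nights need less surface cooling before wetness occurs. The following sketch is purely illustrative (the Magnus coefficients are the commonly used values for liquid water; the example temperatures and humidity are invented):

```python
import math

def dew_point_c(temp_c: float, rel_humidity: float) -> float:
    """Approximate dew point (deg C) from air temperature (deg C) and
    relative humidity (0-100%) using the Magnus formula."""
    a, b = 17.27, 237.7  # common Magnus coefficients for water, ~0-60 deg C
    gamma = (a * temp_c) / (b + temp_c) + math.log(rel_humidity / 100.0)
    return (b * gamma) / (a - gamma)

def dew_expected(surface_temp_c: float, air_temp_c: float,
                 rel_humidity: float) -> bool:
    """Dew (leaf wetness) forms when the radiatively cooled surface
    falls to or below the dew point of the air above it."""
    return surface_temp_c <= dew_point_c(air_temp_c, rel_humidity)

# A mild, humid night (15 deg C, 90% RH) has a dew point near 13.4 deg C,
# so a leaf surface cooled to 12 deg C would wet, while 14 deg C would not.
print(round(dew_point_c(15.0, 90.0), 1))
print(dew_expected(12.0, 15.0, 90.0))
```

Under this relation, higher night-time minimum temperatures at a given relative humidity raise the dew point, narrowing the cooling gap that must be bridged before wetness begins.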
According to Tilman et al. (2002), global grain demand will double by 2050 as a result of population growth. Arable land will be limited, particularly in areas most affected by population growth, because different sectors are competing for land use. Thus, increasing crop productivity is a major challenge. However, neither planting more productive crops nor cultivating more land, where possible, can by itself meet the need for more food if yield losses from pests and diseases remain at current levels (Oerke & Dehne, 2004) or increase further. Therefore, improvements in plant disease and pest control are needed to reduce both pre- and postharvest yield losses (Strand, 2000). Solutions to pest and disease problems must be specifically tailored to location, crop and the type of pest. They must also consider existing standards for reduced human and/or agroecosystem exposure to agricultural contaminants such as heavy metals, mycotoxins, plant protection products (PPPs) and harmful microorganisms (Boxall et al., 2009). Hence, the challenge will be to find a balance between public demand for a reduction in PPP use and the need to efficiently control harmful diseases potentially driven by changes in climate (Hannukkala et al., 2007).
In view of global population growth, climate change is regarded as a threat to global food production and security. Higher global temperatures, altered precipitation regimes and increases in the frequency of extreme events, particularly in regions expected to suffer greater heat and water stress as a result of projected climate change (Rosenzweig & Parry, 1994; Rosenzweig et al., 2001; Ortiz et al., 2008), raise concerns among the scientific community, the public and policy makers.
Both climate change and climate variability are relevant drivers for plant disease epidemiology. Potential effects of inter-annual climatic variability on plant disease development were addressed in an earlier review (Coakley, 1988). Shifts in climatic seasonality may result in alterations of the synchrony between crop phenology and disease or pest patterns. Ignoring such effects of altered coincidence and/or shifted escape may bias disease or pest forecasts. In addition, changes in climate or climatic seasonality may modulate host susceptibility/resistance responses to pathogens (van Maanen & Xu, 2003). Projected atmospheric and climate change will thus affect the interaction between crops and pathogens in multiple ways (Coakley, 1988; Manning & Tiedemann, 1995; Sutherst et al., 1996; Tiedemann, 1996; Chakraborty et al., 1998, 2008; Scherm et al., 2000; Runion, 2003; Weigel, 2005; Garrett et al., 2006; Jeger & Pautasso, 2008; Legreve & Duveiller, 2010).
This inevitably will also affect the management of plant diseases (Coakley et al., 1999; Sutherst et al., 2007; Ghini et al., 2008; Wolfe et al., 2008). Under worst-case scenarios, several crops may require more fungicide sprays to control diseases as effectively as under present management regimes (Salinari et al., 2006). The effectiveness and durability of chemical control measures are affected by environmental conditions. Therefore, climate change may have both positive and negative effects on the efficacy of PPPs (Olesen & Bindi, 2002). If the prevalence of existing insect pests, diseases and weeds increases under climate change, this could result in more frequent applications of PPPs (Bloomfield et al., 2006). Further consequences may be higher application rates, an increased likelihood of resistance of pests and pathogens to current PPPs, lower cost-effectiveness for farmers and higher product prices for consumers. More frequent applications of PPPs may also have environmental implications, and tolerance of crops to PPPs might be reduced as a result of increased climatic stress, which may encourage industry to develop new phytocompatible ingredients and additives (Bloomfield et al., 2006). The half-life of PPPs in soils may fall significantly as a result of higher soil temperatures, as suggested by a modelling study of the persistence of the herbicide isoproturon (Bailey, 2004). This may make autumn and winter weed management in winter wheat more difficult than at present, but may benefit the environment. On the other hand, larger reductions in summer rainfall and soil moisture, as projected by climate models for temperate areas, may increase problems of herbicide carryover to subsequent sensitive crops (Bailey, 2004).
Bloomfield et al. (2006) concluded that the main climatic drivers for changing the environmental fate and behaviour of PPPs are rainfall seasonality and intensity, and increased temperatures. The translocation and degradation of PPPs are likely to be variable and difficult to predict, because there are conflicting effects of various environmental factors on the fate of PPPs. In addition to the use of appropriate fungicides, preventive plant protection measures may become particularly important to control plant disease incidence and severity under projected climate change (Ghini et al., 2008; Tiedemann & Ulber, 2008). Integrated, biologically balanced crop management and protection will provide appropriate tools to adapt to potentially altered conditions in the future (Tiedemann, 1996).
The scope of the present review is to compile and discuss strategies for plant disease management under a changing climate, focussing on fungal plant pathogens in agriculture and horticulture. Information on viral (Canto et al., 2009; Jones, 2009; Reynaud et al., 2009) and bacterial diseases is available elsewhere (Manning & Tiedemann, 1995; Boland et al., 2004; Garrett et al., 2009). This review will provide a brief overview of important strategies used for plant disease management and how these may be influenced by future climate changes.
Plant disease management tools affected by climate change
Plant disease control through host-plant resistance and fungicide treatment makes a major contribution to both climate-change mitigation and sustainable crop production systems to ensure global food security (Berry et al., 2008; Mahmuti et al., 2009; Legreve & Duveiller, 2010). Particularly in high-input production systems, greenhouse gas emissions can be decreased by greater nitrogen use efficiency by a healthy crop (Mahmuti et al., 2009). Disease management strategies may require adjustments under climate change (Garrett et al., 2006). To our knowledge there is no previous review article available that specifically and solely focuses on climate change and strategies for plant disease management, except for a paper by Strand (2000) which addresses agrometeorological aspects of pest and disease management for the 21st century. Thus, the information on climate-change impact on plant disease management available in the literature is scarce and fragmented (Chakraborty et al., 2000). Most review articles related to climate change and plant diseases include disease management as a relatively small section only. Influences on host-plant resistance and on biological control agents (BCAs) have been the most frequently treated aspects (Table 1). In the following paragraphs some selected plant disease management methods will be discussed in relation to climate change (see also Table 2). This review will consider potential mitigating effects of such crop protection methods and also give particular attention to cases where the efficacy of disease management may be modulated under climate change (Coakley et al., 1999). In a few cases the review presents examples from temperate regions side-by-side with examples from tropical countries, despite the large climatic differences between temperate regions and the tropics (Coakley, 1988). 
On the other hand, some preventive plant protection methods, such as delaying planting/sowing dates in order to escape periods of enhanced disease risk, are used by farmers in any climate zone worldwide. Thus, there are not only differences but also similarities across climatic regions which can be considered.
Table 1. Selected review articles addressing aspects of the impact of global climate and atmospheric change on plant disease management
| Ecosystem(s) | Disease management aspects addressed | Reference |
|---|---|---|
| A, F, H | Cultivar breeding for host resistance and tolerance, novel selection techniques, diagnostic tools to detect newly introduced pathogens on new crops, appropriate irrigation technologies, careful selection of crop species and site to minimize disease pressure | Boland et al., 2004 |
| A | Fungicides, biological control agents, host resistance | Chakraborty et al., 2000 |
| A, F, H, N | Host resistance, fungicides, disease suppression by soil organic matter, biological control agents, induced systemic resistance | Chakraborty & Pangga, 2004 |
| A, F, N | Host resistance, chemical control, microbial interactions, quarantine and exclusion | Coakley et al., 1999 |
| A, N | Biological control, risk models for movement of invasive pathogens, host resistance and its durability, uncertainty in decision making | Garrett et al., 2006 |
| A, F, N | Preventing spread of exotic pathogens, host landscape structure, proximity to higher risk areas, in situ and ex situ conservation of genetic resources for maintaining resistance and tolerance | Garrett, 2008 |
| A | Fungicides, non-chemical methods, host resistance, biological control | Ghini et al., 2008 |
| A, F | Resistance breeding, agricultural practices, rotation, planting time, avoidance, chemical control, forecasting models, quarantine regulations, phytosanitary measures, IPM | Legreve & Duveiller, 2010 |
| A | Choice of crop and suitable location, appropriate cultivar, crop rotation, tillage practices, manipulation of planting date, intercropping, crop management such as suitable water and nutrient management, disease forecast/prediction, timing of PPP application, biological control agents, modification of microclimate, habitat management | Strand, 2000 |
| A, F, N | Host resistance and tolerance, biological control agents, procedures to prevent invasion of pathogens | Sutherst et al., 2007 |
| A | Crop management, increase of crop diversity, development of new fungicides, increase of range of preventive plant protection methods, reduction of the risk of invasive species, development of IPM tools, e.g. models to forecast disease occurrence | Tiedemann & Ulber, 2008 |
Table 2. Some assumptions on the potential influence of changing atmospheric composition and climate on selected plant disease management strategies/tools
| Strategy type | Management tool | Potential influence of atmospheric/climate change | Possible consequence |
|---|---|---|---|
| Avoidance | Barrier to entry (quarantine) | Climate-mediated change in pathogen dispersal – frequency, abundance, distance, speed | Altered efficacy of quarantine practices |
| Preventive | Crop rotation | No direct effect; diversity in cropping systems will remain important to reduce risks of diseases | Crop species better adapted to local climatic conditions may be required |
| Preventive | Plant residue management | Potential increase in crop biomass through the CO2 fertilizing effect, unless high temperature and drought counterbalance the fertilizing effect | Innovative approaches needed to reduce inoculum level and saprotrophic colonization |
| Preventive | Sowing/planting date | Adjustments likely to be necessary; a simple and cheap method to escape biotic and abiotic stress, although disadvantages are also possible | Apparently a powerful tool, but may be limited in overly warm winters (e.g. late sowing in autumn under temperate conditions) |
| Preventive | Host-plant resistance | Temperature-dependent resistance may be overcome by pathogens; changes in plant morphology and physiology with effects on resistance; potentially accelerated pathogen evolution may erode disease resistance prematurely | Altered efficacy of host-plant resistance (higher, same or lower efficacy, depending on resistance (R) gene, pathogen population, etc.) |
| Preventive | Cleaning machinery and tools | Presumably no major effects | Phytosanitary methods will remain important |
| Preventive | Use of healthy seeds and plantlets | Presumably no major effects | Preventive methods will remain important |
| Preventive | Input levels, e.g. of irrigation | Presumably higher temperature will promote irrigation in more crops and regions | Water conservation may demand efficient technologies such as drip irrigation, thereby reducing risk of foliar diseases |
| Preventive and/or curative | Field monitoring and use of decision support systems (DSS) | Presumably no major effects | Field monitoring and DSS will remain or become more important |
| Preventive and/or curative | Soil solarization | Global warming may facilitate its use (effective in more plant–pathogen systems and regions, heat may reach deeper soil layers, shorter duration of mulching period) | Altered efficacy, generally positive effects, unless drought counterbalances temperature effects |
| Preventive and/or curative | Antagonists, biological control agents (BCAs) | Presumably, vulnerability of BCAs will be higher as a result of climate variability | Altered efficacy (higher, same or lower, dependent on product, environment, management, etc.) |
| Preventive and/or curative | Contact fungicides | If rainfall occurs more frequently, more applications may be triggered; faster/slower crop growth may shorten/lengthen the time between applications | Altered efficacy (higher, same or lower, dependent on product, environment, management, etc.) |
| Preventive and/or curative | Systemic fungicides | More knowledge of the foliar uptake of systemic fungicides is needed to make reliable predictions | Altered efficacy (higher, same or lower, dependent on product, environment, management, etc.) |
Agronomic practices such as crop rotation, soil tillage, fertilization, liming and irrigation play an important role in preventing or reducing the risk of diseases (Heitefuss, 1989). Their impact on diseases is large and operates on a much shorter timescale than long-term climatic change, which makes them efficient tools for counterbalancing potential increases in disease risk.
In general, farmers first of all have to select the optimal site for their crop species and may introduce new crop species that would benefit from climate change and elevated levels of atmospheric CO2 (Wolfe et al., 2008). Besides the likely benefits from planting new crop species, there is a risk that newly introduced crops will be accompanied by new pathogens (Boland et al., 2004; see Quarantine section).
Enhanced diversity in cropping systems may reduce the risks of diseases that otherwise, in monoculture, would become more severe as a result of climate change. For example, Hannukkala et al. (2007) concluded that the increased and earlier occurrence of late blight (Phytophthora infestans) epidemics in potato (Solanum tuberosum) was probably associated with both climate change and lack of rotation. They analysed data on the cropping history of 235 fields and experiments in Finland from 1992 to 2002 and found that, from 1998 to 2002, epidemics of late blight started 9 days earlier when the preceding crop was potato than when it was another crop. This emphasizes the importance of crop rotation, including the integration of cover crops and intercropping, in reducing specific disease risks associated with expected climate change.
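The core of such an analysis is simple: group epidemic onset dates by the preceding crop and compare group means. The sketch below illustrates the calculation with invented day-of-year onset values (not the Finnish data, which are not reproduced here):

```python
from statistics import mean

# Hypothetical day-of-year onsets of late blight epidemics, grouped by
# the crop grown in the same field the previous season.
# All values are invented for illustration only.
onsets = {
    "potato": [172, 168, 175, 170, 169],  # potato after potato (short rotation)
    "other":  [180, 178, 183, 177, 182],  # potato after a non-host crop
}

mean_potato = mean(onsets["potato"])  # earlier onset: more local inoculum
mean_other = mean(onsets["other"])

print(f"Epidemics start {mean_other - mean_potato:.1f} days earlier "
      f"when the preceding crop was potato")
```

With real survey data, the same grouping would of course be followed by a significance test and adjustment for confounders such as region and year.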
Changing planting and/or harvest dates of annual crops and short-lived perennials can be an effective, low-cost option to render the crop less vulnerable to pests and diseases or adverse abiotic conditions (Srivastava et al., 2010). However, the practicability of changing planting and/or harvesting dates is dependent on region, cultivar, magnitude of climate change, and market situation (Wolfe et al., 2008). Under certain circumstances, delayed planting as an avoidance strategy against pathogens may become less reliable under climate change (Garrett et al., 2006), for example, if in temperate climates mild conditions prevail in autumn and early winter. Srivastava et al. (2010) concluded that more low-cost adaptation strategies, such as changing sowing date and cultivar, should be explored to reduce the vulnerability of crop production to climate change, although these can have trade-offs. For example, in the African highlands, potato growers must choose between avoiding the rainy season in order to reduce the risk of late blight or accepting an increased risk of drought stress (Hijmans et al., 2000).
Planting of cultivar mixtures and intercropping (e.g. potatoes and faba beans, Vicia faba) might be another way to reduce disease risks, for example by slowing epidemic rates and thereby facilitating disease control with fungicides, as shown for late blight in potatoes (Garrett et al., 2001). However, this effect varied across years and locations. Particularly under high disease pressure, these methods were not sufficiently effective and had to be combined with other integrated pest management (IPM) measures, such as genetic resistance and fungicide application (Wolfe, 1985; Garrett et al., 2001). Often, the disease management advantage of cultivar mixtures is associated with yield penalties (Wolfe, 1985), and attention should be given to using cultivars with similar earliness and quality. In some cases the growth of the intercrop might be too poor to function as a physical barrier protecting the susceptible genotype (Garrett et al., 2001), whilst in other cases the intercrop might grow too vigorously and compete with the main crop, actually reducing its yield and quality. Cultivar mixtures and intercropping in field experiments frequently have only small effects on disease severity. However, their cumulative effects over many growing seasons, such as maintaining the durability of resistance genes, could be important (Garrett et al., 2001). More research is needed to evaluate the usefulness of cultivar mixtures and intercropping to cope with potential disease risks associated with a changing climate.
Management of overwintering pathogens in crop residues left on the field is one important agronomic practice. Many residue-borne plant diseases caused by necrotrophic pathogens can be managed through crop rotation and other agronomic practices designed to reduce inoculum levels (Melloy et al., 2010). Crop residue management for disease control has gained importance with the expansion of conservation agriculture. More powerful methods should be developed to impede saprotrophic colonization of crop residues by pathogens in order to decrease the carry-over of inoculum between cropping seasons (Melloy et al., 2010). Under worst-case conditions, farmers may consider ploughing in order to turn the soil and bury diseased residues, although the benefits from conservation tillage related to soil structure and water retention are lost. Thus, there may be trade-offs between strategies to cope with climate change and possible penalties resulting from diseases. In summary, an obvious strategy in the context of climate change is conservation agriculture, because it benefits crop resilience in stressed environments (Ortiz et al., 2008). To run conservation agriculture successfully in the long term, cultivars are needed with better water use efficiency, improved root health and durable resistance to the economically important pathogens that emerge from crop residues or result from adoption of conservation practices (Ortiz et al., 2008).
Warmer temperatures, longer growing seasons and increased summer drought may increase water use in agriculture (Wolfe et al., 2008). Increased water supply and fertilizer application may also be needed to support the greater plant growth and development resulting from the CO2 fertilization effect. Pathogen aggressiveness may also be favoured by increased plant canopy density (see below, section on host-plant resistance), which could result in increased fungicide applications (Bloomfield et al., 2006). Increasing PPP application could lead to greater transfer of PPPs to soil and groundwater. On the other hand, water conservation may require efficient technologies such as drip irrigation, as already used under arid climatic conditions (Boland et al., 2004). Such systems may contribute to reduced transfer of PPPs to soil and groundwater, and may also reduce leaf wetness and humidity in the crop canopy, resulting in lower foliar disease occurrence and severity compared with more traditional practices such as overhead irrigation.
Soil solarization prior to planting is a non-chemical, physical disease-control method for soil disinfection using plastic mulches (Garibaldi, 1987). Typical target pathogens are soilborne and include Verticillium dahliae, Sclerotium rolfsii, Rhizoctonia solani and Fusarium species (Katan, 1981). Soil solarization fits the concept of IPM, although it has disadvantages, such as its non-selectivity and therefore its detrimental effects on beneficial organisms, including antagonists of soilborne plant pathogens. The longer the treatment period, the higher the pathogen-killing rates and the deeper the effectiveness of soil solarization (Garibaldi, 1987). Climate warming may facilitate the use of soil solarization, both in the greenhouse and the open field, because it could be used successfully in more pathogen–plant systems and more regions, with the heat penetrating to deeper soil layers and the mulching periods (usually 4 weeks or longer) becoming shorter. However, in regions where global warming is associated with reduced rainfall, the effectiveness of soil solarization might be lower, as dry soils reduce efficacy, unless irrigation is applied. Thus, global warming may improve the efficacy and handling of certain disease control methods, particularly those requiring high temperature, such as soil solarization.
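One way to see why warming could shorten mulching periods is to treat solarization as a cumulative heat dose: the pathogen-killing effect accumulates during the hours each day that soil temperature stays at or above a lethal threshold. The sketch below is a deliberately simplified illustration; the 40°C threshold and the hourly readings are assumptions, not measured values, and real thermal death models are pathogen- and moisture-dependent:

```python
def lethal_hours(soil_temps_c, threshold_c: float = 40.0) -> int:
    """Count hourly soil-temperature readings at or above a (hypothetical)
    lethal threshold; thermal inactivation of soilborne pathogens is often
    approximated as a cumulative dose of such hours."""
    return sum(1 for t in soil_temps_c if t >= threshold_c)

# Hypothetical daytime hourly readings at shallow depth under clear
# plastic mulch, for a present-day and a 2 deg C warmer scenario.
day_current = [32, 35, 38, 41, 43, 44, 42, 39, 36, 33]
day_warmer = [t + 2 for t in day_current]

# Warming widens the daily window above the threshold, so a fixed
# target dose is reached in fewer days of mulching.
print(lethal_hours(day_current))
print(lethal_hours(day_warmer))
```

The same accounting also shows the caveat in the text: if drought lowers soil thermal conductivity and daily temperature curves flatten, the hours above threshold shrink again despite higher air temperatures.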
Host-plant resistance and pathogen aggressiveness
Host-plant resistance involves the use of cultivars that are able to resist or tolerate pathogen attack (Legreve & Duveiller, 2010). The use of resistant or tolerant cultivars is easy, cheap, environmentally sound and effective (Dodds & Rathjen, 2010), unless pathogens overcome the resistance. Expression of plant resistance is determined by the interaction between genetic factors in the pathogen and the plant, and the environmental impact represented by cultural practices and climate.
Exposure to unsuitable environmental conditions can cause loss of resistance (Strand, 2000). For example, disease resistance can be temperature-dependent (Hartleb & Heitefuss, 1997), as in the case of phoma stem canker (Leptosphaeria maculans) of oilseed rape, where resistance is expressed at 15°C but not at 25°C (Huang et al., 2006). There is a need for improved resistance of oilseed rape to phoma stem canker, which is favoured at increased temperatures (Stonard et al., 2010). Therefore, strategies for breeding cultivars with improved resistance to pathogens may need to include trials in countries with warmer climates that represent the predicted climates of currently cooler latitudes (Butterworth et al., 2010). Conversely, a rice bacterial blight resistance gene was reported to benefit from high temperature (Webb et al., 2010). This is also true for some resistance genes in wheat against the rust fungi Puccinia recondita and Puccinia striiformis (Uauy et al., 2005; Legreve & Duveiller, 2010). However, there is evidence for increased aggressiveness at higher temperatures of isolates of P. striiformis, demonstrating that wheat rust fungi can adapt to and benefit from warmer temperatures to cause severe disease in previously unfavourable environmental conditions (Milus et al., 2009). These adapted isolates also challenge the effectiveness of disease resistance genes that operate at higher temperatures. Disease resistance is also affected by other environmental factors, such as duration of leaf wetness, nutrient status (e.g. nitrogen fertilization), soil type and availability of water. Thus, temperature is not necessarily the single most important factor affecting disease resistance. Typically, it takes more than 10 years to breed a new disease-resistant cultivar (Chakraborty et al., 1998), so breeding programmes must be planned and started well in advance of serious disease problems, or be based on minor genes, which are expected to be more durable.
Therefore, breeders need advice on which plant pathogens might become economically important in the future (Chakraborty et al., 1998). The projection of which pathogen might become a threat in the future is complex, because several factors might interfere, such as unusual wind or storm patterns which influence the global movement of pathogen species and isolates. Long-distance dispersal of fungal spores by wind or storms can spread plant diseases across and even between continents (Brown & Hovmøller, 2002).
Changes in atmospheric composition and climate may influence host–pathogen interactions and host resistance through numerous types of morphological and physiological alterations, such as stomatal morphology and physiology, increases in rates of net photosynthesis, papillae formation, silicon accumulation at appressorial penetration sites, carbohydrate accumulation in leaves, increased cuticular waxes, additional epidermal cell layers, increased fibre content, reduced nutrient concentration and modified production of resistance-related enzymes such as rubisco (Chakraborty et al., 2000). For example, enhanced aggressiveness of powdery mildew (Erysiphe cichoracearum) on Arabidopsis thaliana under elevated CO2 was recently shown (Lake & Wade, 2009). Infected leaves were altered in stomatal density, guard cell length and trichome numbers compared with non-infected leaves. Eastburn et al. (2010) showed that elevated CO2, alone or in combination with O3, consistently decreased downy mildew (Peronospora manshurica) in soybean (Glycine max), but increased septoria brown spot (Septoria glycines). The reduction in downy mildew may have been the result of accelerated soybean leaf senescence, and hence a reduced susceptible period, caused by elevated O3 concentration. This study was conducted in a FACE (free-air CO2 enrichment) facility under field conditions, allowing realistic disease assessment because plants were exposed to natural pathogen inoculum and microclimatic conditions. Such CO2-focused field studies also allow researchers to assess the concurrent effects of natural variability in temperature and precipitation (Eastburn et al., 2010). These studies should be extended to more plant–pathogen systems.
Of particular concern is the emergence of new virulent and aggressive isolates of a pathogen which may rapidly erode disease resistance in crop plants under climate and atmospheric change (Chakraborty & Pangga, 2004). Preliminary evidence (Chakraborty & Datta, 2003) suggested that pathogen evolution for increased aggressiveness was enhanced under elevated atmospheric CO2 as a result of greater pathogen fecundity and a more favourable microclimate for disease development, providing larger populations in which pathogen evolution was accelerated. However, more studies across a range of pathosystems are needed before general conclusions about pathogen aggressiveness under elevated atmospheric CO2 can be drawn (Scherm & Coakley, 2003). Research to understand such evolutionary processes will be of utmost importance in predicting responses of pathogens to climate change (Scherm, 2004). More information can be obtained from Eastburn et al. (2011) and Pangga et al. (2011).
In summary, future plant disease management strategies should include resistance breeding approaches for broad adaptation to multiple environments. Breeding for resistance against several pathogens should be combined with breeding for tolerance to abiotic stresses such as drought and heat (Legreve & Duveiller, 2010). However, when several plant diseases occur at the same time, multiple host-plant resistance may not be sufficient and needs to be supported by integration of other plant protection methods, such as crop rotation and the use of spatial and temporal crop diversity (Tilman et al., 2002).
Chen & McCarl (2001) analysed how the costs of plant protection product (PPP) application were influenced by temperature and precipitation in the USA. They found that more rainfall increased average per-acre costs of PPP application in corn, cotton, potatoes, soybeans and wheat, whereas warmer weather increased PPP application costs in corn, cotton, potatoes and soybeans, but decreased them in wheat.
Fungicides are used to control pathogenic fungi and oomycetes. They may also have beneficial side effects on plant physiology, resulting in enhanced general stress resistance and higher yield (Wu & Tiedemann, 2001). Fungicides are predominantly applied as foliar sprays or seed dressings. There are only a few reports on how fungicide treatments may be affected by climate change (Ghini et al., 2008). Most authors have focused on precipitation patterns, whereas the temperature dependency of fungicide efficacy is rarely addressed. Extreme temperatures may affect the efficacy of PPPs, including fungicides, or increase their phytotoxicity (Strand, 2000). In contrast, a few preliminary studies are available on whether and how climate and atmospheric changes may affect herbicide efficacy, particularly in relation to elevated CO2 (Edis et al., 1996; Ziska et al., 1999; Ziska & Teasdale, 2000). Bunce (2001) assumed that weed control with herbicides may become less effective or more expensive because elevated CO2 may alter leaf characteristics such as leaf and cuticular thickness, stomatal density and stomatal conductance, ultimately leading to reduced herbicide uptake. Ziska & Runion (2007) indicated that chemical control of weeds will still be possible under climatic changes or rising CO2 concentrations, but additional sprayings or increased doses may be necessary. Martin & Edgington (1980) showed that a higher temperature (15 vs. 10°C) reduced the efficacy of seed-treatment fungicides in controlling loose smut (Ustilago nuda) of barley (Hordeum vulgare). Such studies may help to elucidate the potential effects of global warming on fungicide efficacy.
Changes in temperature, rainfall, wind speed, soil/air moisture and light conditions can influence the effectiveness of PPP applications (Bedos et al., 2002; Ziska & Runion, 2007). These environmental factors may alter fungicide dynamics in the soil (Monkiedje et al., 2007) and on/in the foliage, including uptake, degradation and volatilization (Bedos et al., 2002). Precipitation during the post-application period is particularly critical: it may either improve fungicide distribution or deplete the fungicide layer on the foliage, altering the fungicide’s efficacy (Chakraborty et al., 2000). According to Wolfe et al. (2008), the more frequent rainfall events projected for northern latitudes during winter and spring may trigger more frequent fungicide applications, because of the difficulty of keeping contact fungicides on the plant canopy. The introduction of fungicides with greater rainfastness might help to reduce this problem (Hannukkala et al., 2007). Uptake, translocation and mode of action of systemic fungicides could be negatively affected by plant morphological responses, such as smaller stomatal openings or thicker epicuticular waxes on the leaves, in response to elevated CO2 and/or increased air temperature. This may reduce or slow down the uptake of systemic fungicides, although increased plant metabolic rates at warmer temperatures could increase fungicide uptake (Coakley et al., 1999). Optimizing the timing of fungicide application can be a simple way to increase fungicide efficacy (Bedos et al., 2002). For example, Augusto et al. (2010) reported that applying fungicide early in the morning to folded, wet leaves in peanut (Arachis hypogaea) improved spray deposition in the lower canopy, thereby increasing stem rot control (caused by Sclerotium rolfsii) and crop yield.
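The link between rainfall frequency and re-application need can be illustrated with a minimal sketch. All parameters here (wash-off rate per mm of rain, protective threshold) are hypothetical illustrative values, not measured product properties: residual contact fungicide on the canopy is assumed to decay exponentially with cumulative rainfall, and a re-application is triggered whenever the residue falls below a protective threshold.

```python
# Illustrative sketch of rainfall-driven wash-off of a contact fungicide.
# Parameters are hypothetical; real decay depends on product, formulation,
# crop canopy and application technology.
def sprays_needed(daily_rain_mm, wash_off_per_mm=0.04, threshold=0.3):
    """Count applications (1.0 = full dose) when residue drops below threshold."""
    residue, sprays = 1.0, 1  # initial spray at full dose
    for rain in daily_rain_mm:
        residue *= (1.0 - wash_off_per_mm) ** rain  # fractional loss per mm rain
        if residue < threshold:
            residue, sprays = 1.0, sprays + 1       # re-apply full dose
    return sprays

# Two hypothetical 8-week weather sequences (daily rainfall in mm)
dry_season = [0, 2, 0, 0, 1, 0, 0] * 8   # little rain
wet_season = [5, 0, 12, 3, 0, 8, 2] * 8  # frequent rain, same length
```

Under these assumptions the wetter sequence triggers several re-applications while the drier one needs none, mirroring the qualitative expectation of Wolfe et al. (2008) that more frequent rainfall events may increase spray frequency.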
Farmers, advisors and researchers in temperate regions facing a rise in temperature could learn from fungicide application principles in tropical countries. However, most PPP performance data are generated in North America and Europe and extrapolated to tropical countries, where much less research on PPPs is done (Garcia, 2004). For specific fungicides, data have been obtained in laboratory experiments and do not necessarily represent field conditions, as shown by Monkiedje et al. (2007). The fate of PPPs in the tropics follows distinct patterns. In general, high temperature and humidity seem to favour degradation and volatilization of the active ingredients. Nonetheless, warm and humid conditions may increase the efficacy of most PPPs, because penetration through plant tissues may be easier and the uptake of active ingredients faster (Garcia, 2004). Data are lacking on whether this enhanced PPP uptake would generally increase or reduce the effectiveness of fungicide applications. Climate change may also affect the phytotoxicity of PPPs. How far these general principles can be applied to fungicide treatments in temperate regions under climate change is unknown. Even under temperate climatic conditions, foliar uptake of PPPs is a complex event governed by more than one factor or mechanism (Wang & Liu, 2007). For a specific active ingredient, foliar uptake varies greatly depending on adjuvants, plant species, cultivars, environmental conditions and management practices (Bedos et al., 2002; Wang & Liu, 2007). On the other hand, foliar uptake can be increased to a certain degree by minimizing volatilization of PPPs, because the penetration process competes with the volatilization process (Bedos et al., 2002).
Measures that minimize volatilization of PPPs include appropriate (i) physicochemical characteristics of the active ingredient, (ii) formulation of the product, (iii) timing of PPP application related to prevailing environmental conditions, (iv) application dose, (v) distribution of PPPs on the leaf surface, and (vi) application technology. In summary, current knowledge is scarce and does not allow predictions of future alterations in PPP foliar uptake processes.
More fungicide sprays could be necessary to control diseases as effectively as under present management practices. For example, simulation studies showed that, under the most severe climate-change scenario, two additional fungicide treatments might become necessary to manage downy mildew (Plasmopara viticola) in grapevine (Vitis vinifera) as a result of the projected temperature rise in northwest Italy by the end of this century. A potential adaptation response to future climate change may require farmers to pay more attention to the management of initial downy mildew infections (Salinari et al., 2006). The prediction of more severe epidemics was related to more favourable temperature conditions during May and June; the disease-promoting effects of increasing temperature outweighed disease reduction through reduced precipitation. The costs of downy mildew management would therefore increase in future scenarios, unless product costs decrease. In contrast, McElrone et al. (2010) found in a 5-year FACE experiment under field conditions in North Carolina, USA, that cercospora leaf spot incidence and severity in redbud (Cercis canadensis) and sweetgum (Liquidambar styraciflua) trees were greater in years with above-average rainfall, whereas in years with above-average temperatures disease incidence decreased significantly. Fewer fungicide treatments may be needed to control phoma stem canker (Leptosphaeria maculans) in oilseed rape (Brassica napus) in Scotland in the 2020s and 2050s, because simulations suggest that this disease will cause less yield loss there; indeed, fungicide use may no longer be economically justified (Butterworth et al., 2010). In contrast, fungicide treatments in southern England may become essential to maintain oilseed rape yields, because increased temperatures could considerably increase yield losses from phoma stem canker and offset the potential yield benefits of increased atmospheric CO2.
These yield losses are predicted to be greatest for susceptible cultivars, suggesting that in the absence of improved resistance to phoma stem canker there will be an even greater need for precisely timed fungicide sprays to control this disease (Butterworth et al., 2010). The above-mentioned examples suggest that the adaptation of fungicide use under climate change will be specific for the disease/crop pathosystem and location. These specific relationships will also determine whether higher or lower associated costs will result for the growers.
Biological control with antagonistic organisms
Biological control agents (BCAs) may be effective either upon introduction by application or through strengthening their natural occurrence. In general, their effectiveness requires specific, conducive environmental conditions. If appropriate temperature and moisture are not consistently available, BCA populations may fail to reduce disease incidence and severity, and may not recover as rapidly as pathogen populations when conducive conditions recur (Garrett et al., 2006).
According to Ghini et al. (2008), there is little information on the potential effects of projected climate change on biological control methods for managing plant diseases. The few available results focus on the composition and dynamics of below-ground microbial communities, which can be important for both below- and above-ground plant health. In general, the impact of antagonists may be modified by various interacting factors, such as plant species, soil type, soil temperature, moisture and nutrient availability. According to Wolfe et al. (2008), it is not well understood how naturally occurring antagonists of pathogens may respond if microbial populations shift under altered temperature and moisture regimes. In some cases, antagonistic organisms may out-compete pathogens, whilst in others, pathogens may be favoured (Pritchard, 2011).
Garrett et al. (2006) assumed that the vulnerability of BCAs will be higher under climate change, because greater climate variability would impair the survival and activity of applied antagonists. On the other hand, Ghini et al. (2008) argued that, in spite of the potential problems in applying BCAs, research efforts must be strengthened to develop biocontrol measures that are more tolerant of variable conditions. This, however, is not a new challenge, as it has been a perpetual requirement for successful biocontrol since the discovery of the first BCAs.
Integrated pest management (IPM)
Plant pathology research and extension work have emphasized the integration of diverse control strategies in an IPM framework, the early foundations of which were laid by entomologists (Jacobsen, 1997). IPM is an ecosystem-based strategy that focuses on long-term prevention of pests or their damage through a combination of methods such as biological control, use of resistant cultivars, habitat management and cultural practices (Strand, 2000). PPPs are used following thorough monitoring and based on established economic damage thresholds. Integrated crop management (ICM) extends IPM with components such as soil tillage, nutrient supply and water management in order to develop sustainable production systems (Juroszek et al., 2008).
There is potential for an overall increase in the number of outbreaks and northward migration of a wide variety of pathogens in the northern hemisphere as a result of global warming (Wolfe et al., 2008). Educational outreach efforts and other tools, such as appropriate pathogen diagnostics, will need to be intensified and updated to keep pace with the changing disease situation (Weber, 2009). Disease-forecasting models of further improved quality are needed for more pathogens to guide farmers (De Wolf & Isard, 2007). Such prediction tools may allow farmers to respond in a timely and efficient manner with PPP applications. A frequent review of IPM recommendations is necessary to minimize the negative environmental and economic impacts associated with the likely increase in PPP use, including fungicides (Wolfe et al., 2008).
Some pathogens may be more harmful in a new environment (Garrett et al., 2010). Avoidance of pathogen spread through quarantine may become more difficult for authorities as novel or invasive pathogens may be transferred more frequently with imported plant products (Wolfe et al., 2008) or by air-stream drift. There will be new opportunities for crops and cultivars to be introduced in regions and locations where they have not been grown before, but effective systems, such as diagnostic tools, must be in place to detect and follow invading pathogens and monitor their behaviour under such altered conditions (Boland et al., 2004). Use of climate matching tools and geographical information systems may assist quarantine agencies in determining the threat posed by a given pathogen in different regions under current and future climate shifts (Coakley et al., 1999; Legreve & Duveiller, 2010).
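The climate-matching approach mentioned above can be sketched very simply. The sketch below uses hypothetical monthly temperature normals and a naive Euclidean similarity measure (operational climate-matching tools use much richer bioclimatic indices): the climate of a pathogen's current range is compared with candidate regions, and the closest match is flagged for quarantine attention.

```python
import math

# Naive climate-matching sketch: Euclidean distance between monthly mean
# temperatures (degrees C); a smaller distance means a climatically more
# similar region and hence, crudely, a higher establishment risk.
def climate_distance(source, target):
    return math.sqrt(sum((s - t) ** 2 for s, t in zip(source, target)))

# Hypothetical monthly temperature normals (Jan..Dec), for illustration only.
pathogen_home = [22, 23, 24, 25, 26, 27, 27, 27, 26, 25, 24, 23]
region_a      = [21, 22, 24, 25, 27, 28, 28, 27, 26, 24, 23, 22]  # subtropical
region_b      = [2, 3, 6, 10, 14, 18, 20, 19, 15, 10, 6, 3]       # temperate

# The region with the smaller distance would be prioritized for surveillance.
```

Re-running such a comparison with projected (rather than current) normals for the candidate regions is the essence of assessing pathogen threat under future climate scenarios.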
It is probable that climate change will significantly influence plant disease management (Chakraborty et al., 2000; Strand, 2000; Ghini et al., 2008; Gregory et al., 2009), degradation and uptake of PPPs (Bloomfield et al., 2006), average external costs of PPP use (Koleva & Schneider, 2009), and their environmental distribution and toxicity (Miraglia et al., 2009; Noyes et al., 2009). A particular challenge is the effective dissemination and use of known but currently underutilized techniques. For example, significant contributions can derive from better field monitoring of diseases and pests, improved timing of PPP application and better systems for delivering PPPs to their targets (Strand, 2000). Adaptation will also require in situ and ex situ conservation of genetic diversity in crop species, including their wild relatives, in order to increase the availability of genes for breeding resistance to existing and new pathogens and tolerance to abiotic stresses (Garrett, 2008). The requirement to breed new cultivars with more durable resistances within shorter time periods will possibly give biotechnological approaches a key role in crop adaptation to a changing climate. Accumulation of minor genes and race-non-specific resistance to pathogens by conventional breeding programmes will remain equally important to achieve durable resistance in diverse environments. Preventive plant protection measures may become particularly important under climate change. These may include greater heterogeneity in cropping systems (Christen, 2008) to reduce disease risks, use of superior cultivars resistant and/or tolerant to abiotic and biotic stress (Ordon, 2008), and reliable tools for forecasting pathogen occurrence in order to respond in a timely manner to plant pathogens (Tiedemann & Ulber, 2008).
The precise prediction of pathogen responses to climate change will be limited (Dukes et al., 2009) by a lack of comprehensive, current multi-factor and multi-species data (Ziska & Runion, 2007) and, moreover, by the diversity and adaptability of pathogen populations (Garrett, 2008). Given the many interactions between ecosystem processes, human influences, environmental conditions, pathogen populations and their potential for adaptation, long-term predictions will remain particularly difficult (Fuhrer, 2003). This similarly applies to predictions of future changes in plant disease management strategies.
Disease-forecasting models based on weather data can help to identify the meteorological factors (and the time periods) that are significantly correlated with disease (Coakley et al., 1988). Such disease-forecasting models can be combined with general circulation models in order to simulate future scenarios of disease epidemics, although most general circulation models operate at coarser spatial resolutions. Down-scaling of climate models can help to bridge this gap (Soussana et al., 2010). Nevertheless, the challenge remains to account for variability in disease epidemiology (Legreve & Duveiller, 2010). Based on future scenarios of disease epidemics, disease management practices can be suggested and/or improved (Salinari et al., 2006).
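A weather-based forecasting rule of the kind discussed here can be illustrated with a minimal sketch. The thresholds below (favourable temperature window, required run of wet hours) are hypothetical placeholders, not calibrated values for any real pathosystem: an infection warning is issued when leaf wetness persists for a sufficient run of hours at pathogen-favourable temperatures, reflecting the moisture–temperature interaction emphasized throughout this review.

```python
# Minimal weather-based infection-risk rule (illustrative thresholds only).
def infection_warning(hourly, t_min=10.0, t_max=25.0, wet_hours_needed=8):
    """hourly: list of (temperature_degC, leaf_wet: bool) observations.
    Returns True if an uninterrupted run of favourable wet hours reaches
    the threshold, i.e. conditions sufficient for spore germination."""
    run = 0
    for temp, wet in hourly:
        if wet and t_min <= temp <= t_max:
            run += 1
            if run >= wet_hours_needed:
                return True
        else:
            run = 0  # favourable period interrupted; germination assumed to fail
    return False

# Hypothetical 12-hour overnight records (temperature, leaf wetness)
cool_dry_night  = [(8.0, False)] * 12   # too cold, no dew
mild_dewy_night = [(15.0, True)] * 12   # mild with persistent dew
```

Driving such a rule with down-scaled climate-model output instead of observed weather is, in essence, how future scenarios of epidemic pressure (and hence future spray requirements) are simulated.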
Large-scale projections of disease risks based on climatic conditions can also be used to identify priorities for research (Hijmans et al., 2000). Likewise, projections are also essential to provide strategic guidance for industry and government policy for adaptation to climate change so that national agricultural production is able to meet future demands of food security (Evans et al., 2008; Garrett et al., 2009). Projections may suggest that, unless there is effective control of a certain disease available, it may be necessary to move future production of a certain crop from the south to the north (northern hemisphere), from the north to the south (southern hemisphere) or from lowlands to uplands of a certain region or country, in order to escape the detrimental effects of temperature rise (Butterworth et al., 2010).
Besides potential risks, it should be noted that there are also potential gains associated with the expected climate change. Agricultural systems have shown considerable adaptability to climate through management changes (Chakraborty, 2005), which provides a high likelihood that significant climate shifts may be buffered. Moreover, there may be some gains from fewer frosts, altered precipitation patterns, CO2 fertilization and longer growing seasons (Chakraborty, 2005). For example, global warming may favour the use of soil solarization practices in horticulture, using plastic mulches over moist planting beds, which are known to control soilborne pathogens such as Verticillium dahliae and Fusarium spp. (Strand, 2000; see above).
A fundamental conclusion from this review is that agriculture, in its recent history, has evolved a large set of powerful tools to adapt crop production and crop protection to the local climatic situation and changing economic constraints. This has resulted in constant adjustments to conditions differing from region to region, year to year, and within a season. In addition, the ongoing, yet accelerating progress in agricultural technologies, mainly new cultivars, novel PPPs and agrotechnical innovations, has constantly required significant adaptations of the production systems by farmers, with increased yields and higher revenues being the main drivers. The same tools and adaptability are needed and can be utilized to cope with the climate change currently in question. Therefore, climate change does not represent a completely novel challenge to agriculture, although the situation may be more difficult if new crops are introduced into areas where they have not been grown before and/or if climate change promotes invasion of pathogens which are novel for a certain region. Nonetheless, there is a clear requirement for consistently utilizing, adapting and improving crop protection strategies and tools and enhancing the relevant knowledge base. Improved understanding of the drivers of disease cycles, epidemic development and host responses will continue to be a fundamental prerequisite to enable farmers and advisors to predict and manage diseases under changing local conditions (Chakraborty et al., 2000; Melloy et al., 2010). Consequently, diverse, flexible and resilient crop production systems will be needed, even more than today, that can cope more readily with conditions in a changing environment (Garrett, 2008).
This work was funded by the Ministry for Science and Culture of Lower Saxony, Germany, through the Climate Change Research Network ‘KLIFF’. We are grateful to Dr Sukumar Chakraborty and two anonymous reviewers for their helpful comments on our manuscript.