Experiments were conducted on olive plants in controlled environments to determine the effect of conidial concentration, leaf age, temperature, continuous and interrupted leaf wetness periods, and relative humidity (RH) during the drier periods that interrupted wet periods, on olive leaf spot (OLS) severity. As inoculum concentration increased from 1·0 × 10² to 2·5 × 10⁵ conidia mL⁻¹, the severity of OLS increased at all five temperatures (5, 10, 15, 20 and 25°C). A simple polynomial model satisfactorily described the relationship between the inoculum concentration at the upper asymptote (maximum number of lesions) and temperature. For the three leaf age groups tested (2–4, 6–8 and 10–12 weeks old), OLS severity decreased significantly (P <0·001) with increasing leaf age at the time of inoculation. Overall, temperature also affected (P <0·001) OLS severity, with lesion numbers increasing gradually from 5°C to a maximum at 15°C and then declining to a minimum at 25°C. When nine leaf wetness periods (0, 6, 12, 18, 24, 36, 48, 72 and 96 h) were tested at the same temperatures, the numbers of lesions increased with increasing leaf wetness period at all temperatures tested. The minimum leaf wetness periods for infection at 5, 10, 15, 20 and 25°C were 18, 12, 12, 12 and 24 h, respectively. The wet periods during early infection processes were interrupted with drying periods (0, 3, 6, 12, 18 and 24 h) at two levels of RH (70 and 100%). The length of the drying period had a significant (P <0·001) effect on disease severity, the effect depending on the RH during the interruption: high RH (100%) resulted in greater disease severity than low RH (70%). A polynomial equation with linear and quadratic terms of temperature, wetness and leaf age was developed to describe their effects on OLS infection; this equation could be incorporated as a forecasting component of an integrated system for the control of OLS.
Olive leaf spot (OLS), also called peacock spot, is caused by the fungus Spilocaea oleagina (syn. Cycloconium oleaginum). OLS is the most important leaf disease of olive in many olive-growing countries and regions, such as Italy, Spain, the USA, South America, Australia and New Zealand (Graniti, 1993; Teviotdale & Sibbett, 1995; MacDonald et al., 2000), and is particularly serious in moist, cool climatic regions such as New Zealand (MacDonald et al., 2000). The symptoms usually occur on the upper surface of the leaves (Graniti, 1993), expanding and coalescing to cover a large proportion of the leaf area and often causing premature leaf-fall. Spots are usually more abundant on foliage from the lower parts of olive trees, as is defoliation and sometimes dieback. After successive periods of defoliation and leaf regeneration, affected trees may show poor growth and reduced yields (Laviola, 1992; López-Doncel et al., 2000). Under very wet conditions, small sunken brown lesions may appear on the petioles, fruit peduncles and fruit (Graniti, 1993), resulting in leaf or fruit drop and decreased oil yields (Verona & Gambogi, 1964). In New Zealand, a survey of olive trees found that about 40% of trees were affected by OLS, suggesting that the disease may play a role in the low productivity reported by MacDonald et al. (2000). In California, Wilson & Miller (1949) reported severe outbreaks of OLS in the period 1941–1949, with yield losses of up to 20% in some areas. There is wide variation in the susceptibility of olives to OLS (Graniti, 1993; Sutter, 1994); cultivars range from highly susceptible to moderately or highly resistant. The most commonly cultivated olive variety in New Zealand, particularly in the Canterbury region, is Barnea, which is highly susceptible to OLS (MacDonald et al., 2000).
OLS can be controlled by application of fungicides prior to winter rains (Teviotdale et al., 1989). However, since the timing of fungicide applications is reported to be critical for effective control of the disease (Graniti, 1993; Obanor et al., 2008b), and increased disease severity is known to be associated with wet seasons, accurate forecasting of OLS risk periods is desirable. In addition, the development of disease forecasting systems may improve disease control and/or reduce fungicide use, thereby helping to allay environmental, economic and health concerns about fungicide use.
Forecasting systems have not been developed for OLS, but have been for apple scab caused by Venturia inaequalis, a pathogen closely related to S. oleagina, which has a similar mode of infection (Graniti, 1993). The apple scab-forecasting systems are based on the relationships between amount and duration of rain or leaf wetness, at a range of temperatures favourable for infection and disease development. They commonly result in the reduction of fungicide input while maintaining effective control of apple scab (MacHardy, 1996; Hindorf et al., 2000; Berrie & Xu, 2003). For OLS, field observations have indicated that environmental factors, such as temperature and moisture, are the driving forces of the infection and spread of the disease (Graniti, 1993), but precise information on the effects of environmental variables on OLS infection and development has not been reported.
OLS infection can occur at any time of the year, but usually during late autumn through to early summer if environmental conditions are favourable (Graniti, 1993; Guechi & Girre, 1994). Obanor et al. (2008a) reported that S. oleagina conidia can germinate on detached olive leaves only if supplied with free moisture, and at temperatures ranging from 5 to 25°C, with an optimum of 20°C. They developed models describing the relationship between leaf wetness duration, conidium germination and germ tube growth on detached leaves at different constant temperatures. However, these simple models can have only limited usage in the development of disease forecasting systems in olive groves where environmental conditions are variable. In particular, periods of rain are often intermittent during the main infection periods (autumn and spring to early summer) and the effects of dry periods, in which there is wide variation in the relative humidity (RH), on germination and infection processes are unknown, making calculation of infection periods difficult. For V. inaequalis, the duration of dry periods interrupting continuous wet periods has been reported to reduce spore survival and development of apple scab (Becker & Burr, 1994; MacHardy, 1996), while the RH during the dry periods affected the survival of V. inaequalis spores (Aylor & Sanogo, 1997).
The inoculum for primary infection by S. oleagina is provided by existing lesions on leaves that have overwintered on the trees, which produce fresh conidia in cool, moist conditions (Guechi & Girre, 1994). The numbers of S. oleagina conidia and their viability vary between olive groves and across the seasons (Guechi & Girre, 1994; Obanor et al., 2005), often with the greatest numbers of conidia being produced in spring from the overwintering lesions (Obanor et al., 2005). These conidia may remain viable for several months, but once detached from the conidiophores they lose their viability within a week (Laviola, 1966; Tosi & Zazzerini, 2000). The relationship between inoculum concentration and disease severity has not been reported for OLS as it has for other related pathogenic fungi, such as V. inaequalis (MacHardy, 1996) and Venturia pirina (Villalta et al., 2000). According to Palm (1987) (cited in MacHardy, 1996), inoculum dose in an orchard was vital for predicting levels of apple scab infection: in orchard experiments with inoculum concentrations ranging from 5·0 × 10² to 1·0 × 10⁵ conidia mL⁻¹, disease levels increased up to a dose of about 5·0 × 10⁴ conidia mL⁻¹ and thereafter did not increase further.
This study investigated factors affecting the epidemiology of OLS, namely the quantitative effects of (i) temperature and duration of leaf wetness on infection by conidia of S. oleagina, (ii) dry periods interrupting infection processes at two temperatures, and (iii) inoculum concentration and temperature on disease development. Modelling of these data indicated that the effects were consistent, such that data of this type could be used to develop a dynamic forecasting system for OLS in the future.
Materials and methods
One- to two-year-old olive plants of the highly susceptible cultivar Barnea were used in all experiments. Plants were produced from cuttings and grown in a greenhouse (22 ± 5°C) as described previously (Obanor et al., 2008a). The youngest pair of fully expanded leaves was marked on each plant after 2 months of growth, and thereafter at fortnightly intervals for 10 weeks. Prior to inoculation, plants were placed for 24 h inside growth chambers set at the constant temperature specified below to ensure acclimation of the plant tissues.
For each experiment, including repeats, inoculum was freshly prepared from naturally infected leaves of cv. Barnea grown in a nearby commercial grove, as described by Obanor et al. (2008a). For the inoculum dose experiment, the stock suspension was adjusted to 2·5 × 10⁶ conidia mL⁻¹ using a haemocytometer, and serial dilutions were made from the stock suspension to provide the required range of conidium concentrations. For all experiments except the one investigating effects of conidium concentration on disease development, the inoculum was adjusted to 5·0 × 10⁴ conidia mL⁻¹ (Obanor et al., 2008a). Conidium viability of all conidial suspensions was determined according to the methods of Saad & Masri (1978), and ranged from 50% to 60%.
Effect of conidial concentrations on OLS severity
The effect of different conidial concentrations at five constant temperatures (5, 10, 15, 20 and 25°C) was investigated. The experiment was conducted consecutively in a single growth chamber (Conviron PGV36; Controlled Environments Limited), with the order of incubation temperatures selected at random. Temperature and RH inside the chamber were monitored with Hobo temperature and RH sensors (Onset Computer Corp.), which had accuracies of ±0·2°C and 2%, respectively.
For each given temperature, 36 plants were inoculated and the experiment was conducted twice. Six plants per treatment were sprayed with an atomizer with 0, 1·0 × 10², 2·5 × 10³, 1·0 × 10⁴, 5·0 × 10⁴, 2·5 × 10⁵ or 2·5 × 10⁶ conidia mL⁻¹ until all the adaxial leaf surfaces were completely covered but not to run-off. Immediately after inoculation, the plants were arranged in a completely randomized design inside the growth cabinet, which was set at the designated temperature. Continuous wetness in the growth chamber was provided by a misting unit that produced fine water droplets for 30 s every 4 h, which maintained RH at 98–100% and kept the plants continuously wet for 48 h. Subsequently, the plants were transferred to a shadehouse, where an automatic overhead sprinkler system that ran for approximately 10 min per day provided conditions favourable for disease development. The mean daily temperature in the shadehouse ranged from 5 to 15°C. Plants were monitored for lesion development at weekly intervals from 4 to 12 weeks, by which time no new lesions appeared to be developing. Disease was then assessed after 12 weeks of incubation on the top six leaves that had been fully expanded (2–12 weeks old) at the time of inoculation. The number of leaf lesions per plant provided the measure of disease severity. This measure was considered valid for OLS because earlier studies (MacDonald et al., 2000; Obanor et al., 2005) found high correlations (P <0·001) between diseased leaf area and both the number of leaf lesions per plant and the number of OLS-infected leaves per tree, probably because OLS lesions expand very slowly.
Effect of wetness periods, temperature and leaf age on OLS severity
The same growth chamber as above was used for each temperature, with the temperature order selected at random, and each temperature–leaf wetness combination was tested three times. The eight leaf wetness periods (6, 12, 18, 24, 36, 48, 72 and 96 h) were tested at five temperatures (5, 10, 15, 20 and 25°C). Plants were spray-inoculated with 5·0 × 10⁴ conidia mL⁻¹ of S. oleagina prepared as described previously, and non-inoculated control plants were sprayed with sterilized water. Continuous wetness in the growth chamber was provided as described above.
For each temperature experiment, four randomly selected plants were removed from the chamber after each wetness period. The measured wetness period included the time (30–40 min) required for leaves to dry after removal from the growth chamber; leaves were dried by placing the selected plants 90 cm from a fan set at slow speed at room temperature and ambient RH (about 50%). After the leaves were dry, the plants were transferred to a shadehouse for disease development. After 12 weeks of incubation, disease severity was assessed as before, but on the first 12 fully expanded leaves. The leaf age categories were young (2–4 weeks old), intermediate (6–8 weeks old) and old (10–12 weeks old).
Effect of dry periods (70% and 100% RH) after an initial wet period on OLS severity
The same type of growth chamber as above was used, set to provide 70% RH at each temperature tested (10 and 20°C). The chamber was divided into three sections, one of which provided the ambient 70% RH dry conditions that interrupted the wet periods. The remaining two sections were occupied by two humidity tents (145 × 95 × 145 cm), which had frameworks of polyvinyl chloride (PVC) pipes and were completely covered in polyethylene plastic sheet. The tent providing 98–100% RH during interruption of wetness periods had its bottom completely covered with moistened absorbent paper whose ends were dipped into two containers filled with water. These sections provided the ‘dry’ conditions, during which leaves appeared dry throughout, that interrupted the ‘wet’ periods, during which leaves carried a film of free water on their surfaces. Moisture in the continuous leaf wetness tent was provided as previously described. Air exchange and some escape of water vapour were allowed by three 2–3 cm diameter openings in each side of the tents.
The effect of dry periods that interrupted continuous wetness on OLS development was investigated on inoculated olive plants in a growth chamber maintained at 10 or 20°C. After spray-inoculation as described previously, the plants were exposed to either continuous wetness or interrupted periods of wetness, which consisted of an initial 12 h wet period followed by a dry period (0, 3, 6, 12, 18 or 24 h) of either low (70%) or high (100%) RH and a final wet period of 24 h. At each temperature, four replicate plants were also exposed to a period of continuous wetness that lasted for 36, 39, 42, 48, 54 and 60 h. Thus, for each interrupted wet period treatment, there was a corresponding continuous wet period treatment with the same total leaf wetness duration.
The effects of different periods of initial leaf wetness on conidium infection were also investigated. After spray-inoculation, the plants were exposed to an initial wet period of 0, 3, 6, 12, 18 or 24 h, followed by a fixed dry period of 12 h and a final wet period so that the total length of the cycle was 48 h. The dry period consisted of either low (70%) or high (100%) RH. For both experiments, there were eight plants for each wetness period, of which four plants were for each of the high or low RH. Both the effects of dry periods that interrupted continuous wetness and the effects of different periods of initial leaf wetness on infection experiments were conducted twice.
For the wetness treatments, the plants were placed inside the tent equipped with a misting unit. Each dry period was initiated by removing eight randomly selected plants from the misting unit after the designated time and drying their leaves with a fan as described earlier. When foliage was dry (30–40 min), four plants were placed in the growth chamber (70% RH), and four plants in the 100% RH humidity tent constructed inside the same growth chamber. After the designated period, the plants were returned to the misted tent for the second wet period. Inoculated plants were incubated in darkness during wet and dry periods. At the end of the second wet period, all plants were transferred to the shadehouse to allow development of OLS symptoms, and then disease severity was assessed after 12 weeks as described previously.
All experiments were arranged in completely randomized designs and the data collected were analysed using analysis of variance (ANOVA) (Genstat 7·2, Lawes Agricultural Trust). A preliminary F-test was conducted to determine whether the data from the repeated experiments were similar. Pairwise comparisons were conducted using Fisher’s least significant difference test to identify the sources of heterogeneity within experiments.
For the inoculum concentration experiment, a three-parameter logistic function:

YInoc = α/[1 + e(β−γ*Inoc)]

was fitted to the data at each temperature, where YInoc is the estimated disease severity at inoculum concentration Inoc, α is the estimated upper asymptote (the estimated maximum disease severity), γ is the relative rate parameter of disease development and β is the inflection point. The fitted model was used to estimate α, γ and β for each temperature. The inoculum concentration at the upper asymptote was then regressed on temperature.
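The logistic fit described above can be sketched with SciPy's curve_fit. The data values and starting estimates below are illustrative assumptions only; the measured severities and fitted parameters are those reported per temperature in Table 1.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(inoc, alpha, beta, gamma):
    """Three-parameter logistic: Y = alpha / (1 + exp(beta - gamma * inoc)).

    Here inoc is log10(conidia per mL); alpha is the upper asymptote
    (maximum lesion number), beta the inflection point, gamma the rate.
    """
    return alpha / (1.0 + np.exp(beta - gamma * inoc))

# Illustrative data (log10 inoculum concentration vs. lesions per plant);
# these are NOT the measured values from the experiments.
log_inoc = np.array([2.0, 3.4, 4.0, 4.7, 5.4, 6.4])
lesions = np.array([2.0, 8.0, 20.0, 45.0, 55.0, 57.0])

# p0 gives rough starting estimates for alpha, beta and gamma.
popt, pcov = curve_fit(logistic, log_inoc, lesions, p0=[60.0, 8.0, 2.0])
alpha, beta, gamma = popt
```

In practice one such fit would be run for each of the five temperatures, and the estimated asymptotes then regressed on temperature as the text describes.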
For the effect of leaf wetness duration, leaf age and temperature on infection and disease development, disease severity (L) was square root transformed (√L) to reduce heterogeneity in the variance. Because there were no significant differences between experiments, the data were pooled before model fitting. The selected model was of the form:

Y = ƒ(T, W, A)

where Y is √L and ƒ(T, W, A) is a function of temperature (T), leaf wetness duration (W) and leaf age (A). Linear and quadratic terms of T, W and A, and their interactions, were tested.
Candidate models were compared using the Akaike Information Criterion (AIC):

AIC = −2 log[L(θ/y)] + 2K

where log[L(θ/y)] is the log-likelihood function at its maximum point and K is the number of estimated parameters in the model. The model with the lowest AIC value was selected as the best model (Pinheiro & Bates, 2000). The goodness-of-fit of the models was assessed based on the size of the asymptote, the standard errors of the estimated parameters, the P values of the estimated parameters and analysis of residual plots.
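The AIC comparison can be illustrated for least-squares fits, for which (under Gaussian errors and up to an additive constant) the log-likelihood term reduces to n·ln(RSS/n). The data below are synthetic, not the experimental measurements; the hump-shaped temperature response merely mimics the kind of curvature the candidate models had to capture.

```python
import numpy as np

def aic_least_squares(rss, n, k):
    """AIC for a Gaussian least-squares fit, up to an additive constant:
    AIC = n * ln(RSS / n) + 2 * k, where k counts estimated parameters."""
    return n * np.log(rss / n) + 2 * k

# Synthetic severity data with a quadratic temperature response plus noise.
rng = np.random.default_rng(0)
x = np.linspace(5, 25, 30)
y = -0.1 * (x - 15.0) ** 2 + 50.0 + rng.normal(0.0, 1.0, x.size)

# Compare a linear and a quadratic polynomial; lower AIC is preferred.
aics = {}
for degree in (1, 2):
    coeffs = np.polyfit(x, y, degree)
    rss = float(np.sum((np.polyval(coeffs, x) - y) ** 2))
    aics[degree] = aic_least_squares(rss, x.size, degree + 1)
```

For data with genuine curvature, the quadratic model attains the lower AIC despite its extra parameter, which is the trade-off the criterion formalizes.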
For the effect of interrupted wetness periods on disease severity, the mean number of leaf lesions per plant for each treatment was expressed as a percentage of the maximum number of lesions per plant (PML) observed on plants from the corresponding continuous wetness treatment. Regression analysis was then used to establish a relationship between PML and length of the dry period or initial wetness duration at the two relative humidity regimes and temperatures.
Effect of conidial concentrations of S. oleagina on disease severity
Inoculum concentration had a significant (P <0·05) effect on disease severity, with OLS severity increasing with increasing inoculum concentration. The number of lesions at each inoculum concentration was consistently lower at 25°C than at the other temperatures tested. Disease severity (numbers of OLS lesions on leaves of plants inoculated with the different inoculum concentrations at the different temperatures) was significantly (P <0·05) lower in the repeat experiment than in the first experiment. However, the treatment effects were similar in both experiments and there was no statistically significant interaction of experiment with other factors, so the data were pooled before model fitting.
The logistic function fitted to the data at each temperature estimated the asymptote (α), the maximum lesion number, to be highest at 15°C and lowest at 25°C (Table 1). Similarly, the relative rate of disease development (γ) was higher at 15°C than at the other temperatures tested. The inoculum concentration at the asymptotes was regressed against temperature, and the relationship was best described by the equation:
where Y is the logarithm (base 10) of the asymptotic inoculum concentration and T is the temperature during the time of infection (Fig. 1). The model accounted for 94·7% of the variance, and the estimated regression coefficients were each significant at P ≤0·001.
Table 1. Estimated parameters for the logistic function, YInoc = α/[1 + e(β−γ*Inoc)], which describes the relationship between olive leaf spot severity (YInoc) and inoculum concentration (Inoc) of Spilocaea oleagina at five constant temperatures after 48 h of leaf wetness
aLogistic parameter estimates where α is the upper asymptote or the maximum disease severity, γ is the relative rate parameter of disease development and β is the inflection point.
Leaf lesions were first observed 8 weeks after inoculation with the higher concentrations (≥5·0 × 10⁴ conidia mL⁻¹), compared with 12 weeks for the lower concentrations (≤1·0 × 10⁴ conidia mL⁻¹). In addition, inoculation with the two highest concentrations (2·5 × 10⁵ and 2·5 × 10⁶ conidia mL⁻¹) resulted in OLS lesions on leaf petioles and along the midribs on the lower leaf surfaces of some plants, at all temperatures tested except 25°C.
Effect of wetness periods, temperature and leaf age on OLS severity
The effect of leaf age on severity did not differ significantly (P =0·788) among the repeated experiments. Young leaves developed significantly (P <0·001) more lesions than either intermediate or old leaves (Fig. 2). Temperature and leaf wetness period had significant (P =0·005) effects on OLS severity. There was no overall difference in OLS severity between experimental repeats, although at 10°C after 96 h of wetness the mean number of lesions was lower in Experiment 2 (35 lesions per plant) than in Experiments 1 (47 lesions per plant) and 3 (50 lesions per plant). The number of lesions averaged over all leaf wetness durations increased gradually from 20 lesions per plant at 5°C to a maximum of 57 lesions per plant at 15°C, and then declined to 13 lesions per plant at 25°C. Generally, OLS infection and development on the leaves of inoculated plants increased with increasing periods of leaf wetness after inoculation at all temperatures tested. OLS developed at all leaf ages, temperatures and leaf wetness periods tested, except for the 0 h and 6 h wetness periods, after which no lesions developed at any temperature. The minimum leaf wetness periods for lesion development at 5, 10, 15, 20 and 25°C were 18, 12, 12, 12 and 24 h, respectively.
At higher leaf wetness durations (>48 h) and at 5–15°C, lesions were also observed on leaf petioles, and midribs on the undersides of some of the inoculated leaves, but no lesions were observed on the lower leaf blade surfaces. The time at which the first lesion was observed on each leaf seemed to vary according to temperature and leaf wetness conditions during the infection process. For instance, under optimum conditions of 15°C and 96 h of wetness, it took 8 weeks for the first lesions to appear in all experimental repeats. However, it took up to 12 weeks post-inoculation for lesions to develop in the less optimum conditions.
T, W, A and the interaction of W and A all had significant (P <0·001) effects on the square root of OLS severity. However, there was no significant interaction between T and A (T × A; P =0·568) or between T, W and A (T × W × A; P =0·816), so both interactions were excluded from the model. The model that best fitted the pooled data from the three experimental repeats was:

Y = β0 + β1A + β2T + β3W + β4(A × W) + β5T² + β6W²  (Eqn 2)

where Y is √(disease severity), A is leaf age (weeks), T is temperature (°C), W is wetness period (h), and β0…β6 are the estimated parameters. Parameter estimates and their standard errors are given in Table 2. All the estimated parameters in the model were significant at P <0·001. The response surface of the model for the effects of temperature, wetness duration and leaf age on OLS severity is presented in Figure 2. The model accounted for 89·5% of the variance.
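The behaviour of a model of this form can be sketched as follows. The coefficients below are hypothetical placeholders, not the estimates reported in Table 2; they are chosen only so that the negative quadratic temperature term produces a peak at T = −β2/(2β5) = 15°C, matching the observed optimum.

```python
# Hypothetical coefficients for illustration only; the actual estimates
# (beta0..beta6) are those reported in Table 2 of the study.
beta = dict(b0=-2.0, bA=-0.15, bT=0.9, bW=0.12,
            bAW=-0.004, bT2=-0.03, bW2=-0.0005)

def sqrt_severity(A, T, W, b=beta):
    """Eqn 2 form: Y = b0 + b1*A + b2*T + b3*W + b4*(A*W) + b5*T^2 + b6*W^2,
    where Y is the square root of the number of lesions per plant."""
    return (b["b0"] + b["bA"] * A + b["bT"] * T + b["bW"] * W
            + b["bAW"] * A * W + b["bT2"] * T ** 2 + b["bW2"] * W ** 2)

# Predicted sqrt-severity for a 3-week-old leaf after 48 h of wetness,
# across the five experimental temperatures.
temps = [5, 10, 15, 20, 25]
preds = {t: sqrt_severity(A=3, T=t, W=48) for t in temps}
best_T = max(preds, key=preds.get)
```

Squaring Y recovers the predicted lesion count, since the model was fitted to square-root transformed severity.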
Table 2. Estimated parameters and associated standard errors for the prediction model [Y = β0 + β1A + β2T + β3W + β4 (A × W) + β5T² + β6W²] to describe the relationship between temperature (°C; T), leaf wetness (h; W), leaf age (weeks; A) and square root of olive leaf spot severity (Y)
aEstimated regression parameters for the fitted terms.
bAll parameters were significant at P <0·001.
Effect of dry period and RH after an initial wet period on OLS severity
In the first experimental series, with fixed initial wetness periods of 12 h, there was a significant (P <0·001) difference in the disease severity on the leaves between the two experiment repetitions conducted at 10°C, with the severity being consistently higher in the first experiment than in the second. However, at 20°C there was no significant (P =0·420) difference between the experiment repeats, therefore the results were pooled.
The length of the dry period that followed the initial 12 h wet period, and the RH during the dry period, significantly (P <0·001) affected severity on olive leaves inoculated with S. oleagina conidia at 10 and 20°C. At both temperatures, plants exposed to high humidity (100% RH) during the dry periods had higher numbers of leaf lesions with increasing dry periods, whereas at low relative humidity (70% RH) the numbers of leaf lesions decreased with increasing dry periods.
For the continuous wetness treatments at 10 and 20°C, numbers of leaf lesions increased with increasing continuous wetness period, except for similar numbers of lesions after 36 and 39 h of wetness at 20°C. Overall, the rate of increase over time was much greater (P <0·0001) under continuously wet incubation conditions than with the corresponding interrupted incubation periods that incorporated 70% or 100% RH. Regression analysis was used to establish a relationship between PML (the percentage of the maximum number of lesions per plant) and the length of the dry period at the two relative humidity regimes and temperatures (Fig. 3). The relationship was best described by:

Y = c + bD + aD²

where Y is PML, D is the dry period (h), and a, b and c are estimated parameters (Table 3).
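A quadratic regression of this form can be fitted with numpy.polyfit. The PML values below are illustrative only (a declining 70% RH series); the measured values underlie the parameters in Table 3.

```python
import numpy as np

# Dry-period lengths (h) and illustrative PML values (percentage of the
# maximum lesion number); NOT the measured data behind Table 3.
D = np.array([0, 3, 6, 12, 18, 24], dtype=float)
pml_low_rh = np.array([100.0, 85.0, 70.0, 48.0, 35.0, 30.0])  # 70% RH

# np.polyfit returns the highest-order coefficient first: [a, b, c]
a, b, c = np.polyfit(D, pml_low_rh, 2)

# Coefficient of determination (R^2) for the fitted quadratic.
fitted = np.polyval([a, b, c], D)
ss_res = np.sum((pml_low_rh - fitted) ** 2)
ss_tot = np.sum((pml_low_rh - pml_low_rh.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
```

The same procedure, applied per temperature and RH regime, yields the parameter sets and R² values of the kind tabulated in Table 3 (and, with W replacing D, Table 4).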
Table 3. Estimated parameters and associated R² values for the regression model, Y = c + bD + aD², which describes the relationship between percentage of maximum lesions of olive leaf spot per plant and the length of dry period at two different temperatures and relative humidity treatments
aThe estimated regression parameters.
For the second experiment in the series, in which initial wetness periods varied from 0 to 24 h, there were significant differences in lesion numbers between experiment replications at 10°C (P <0·0001) and at 20°C (P <0·05). At both temperatures, the numbers of lesions were consistently lower in the first experiment than in the second. However, the differences did not affect the consistency of the overall trend.
Initial wetness of various durations, followed by a fixed dry period of 12 h, had a significant (P <0·001) effect on the number of lesions that developed on the inoculated olive plants. Plants subjected to a 12 h initial wetness period followed by a 12 h dry period prior to a final wet period developed significantly fewer lesions than plants in the other treatments, at both temperatures and RH levels. At both temperatures, the effects of the two RH levels during the 12 h dry periods followed the trends observed in the previous experiment, with fewer lesions recorded for plants under low RH (70%) than for those under high RH (100%) during the dry period.
Numbers of lesions per plant at low and high RH treatments were significantly different from the continuous wetness control, being P =0·0008 and 0·002 respectively at 10°C, and P =0·006 and 0·001 respectively at 20°C. For both humidity treatments and temperatures, percentage of maximum lesions (PML) values decreased with increasing duration of initial wetness up to 12 h and then increased with increasing wetness period (Fig. 4a,b). No disease developed on plants that were dried immediately after inoculation (0 h).
The relationship between initial wetness duration and percentage of maximum lesions (PML) was best described by:

Y = c + bW + aW²

where Y is PML, W is the initial wetness period (h), and a, b and c are estimated parameters (Table 4).
Table 4. Estimated parameters and associated R² values for the regression model, Y = c + bW + aW², which describes the relationship between percentage of maximum lesions of olive leaf spot per plant and the length of initial wetness period at two different temperatures and relative humidity treatments
aThe estimated regression parameters.
In this study, inoculum concentration played a significant role in determining OLS disease severity and the rate of symptom development, which increased with increasing inoculum concentration up to maxima of 2·5 × 10⁵ and 5·0 × 10⁴ conidia mL⁻¹, respectively. A relationship between inoculum concentration and symptom development has also been reported by Hartman et al. (1999) for the incidence of scab lesions caused by V. inaequalis on apple seedlings in controlled experiments: increasing inoculum dose, over the range 1·5 × 10³ to 2·5 × 10⁵ conidia mL⁻¹, increased apple scab incidence up to 8·12 × 10⁴ conidia mL⁻¹, after which incidence did not increase further.
Plant tissue susceptibility to a particular pathogen can vary with the age of the tissue. In the present study, the youngest leaves developed more OLS lesions than older, more physiologically mature leaves, especially when the temperature and leaf wetness durations were optimal. Typically, disease severity on the youngest leaves was about three to five times higher than on the oldest leaves, and there was a more rapid increase in the severity between 6 and 24 h of wetness for the youngest leaves than for the other two leaf ages. These results are consistent with those of López-Doncel et al. (2000) who found that younger leaves were more susceptible to OLS than older leaves.
The decrease in OLS severity with increasing leaf age was probably associated with the greater rate of conidium germination on young (2 weeks) than on old (10 weeks) leaves (Obanor et al., 2008a). The differences could also be associated with the water-repellent waxes that are built up on leaf surfaces as they age, which prevent formation of water films in which pathogens might germinate (Agrios, 1997). In addition, the thicker cuticle of older leaves can better impede penetration (Jeyaramraja et al., 2005), an important factor for S. oleagina which infects by direct cuticle penetration (Graniti, 1993).
Field reports on the effect of olive leaf age on their susceptibility to S. oleagina are contradictory. Guechi & Girre (1994) demonstrated that in France, OLS lesions appeared on olive leaves in December and were most abundant on the first three pairs of leaves of young shoots that developed in spring of the same year. However, in California, Wilson & Miller (1949) reported that OLS lesions were more abundant on older leaves, which were more susceptible to the disease. This may have been due to the slow symptom expression found in this study, even with young foliage where lesions appeared no sooner than 8 weeks after inoculation.
Duration of wetness during conidium germination and infection affected lesion development, with more than 6 h of wetness being required for infection even at optimal temperatures. Temperature also affected lesion development: lesions developed most quickly at 15°C and most slowly at 5 and 25°C. These results are consistent with a previous detached olive leaf study, which demonstrated that longer incubation periods are required for S. oleagina conidium germination at <10 and >20°C (Obanor et al., 2008a). The results also showed that the temperature effects were more pronounced for longer wetness periods than for shorter periods, indicating that leaf wetness is probably the most important environmental variable, with temperature regulating the rapidity and level of disease development. The importance of these factors was further demonstrated by the good fit of the proposed polynomial model (Eqn 2) to the observed data. The model predicted OLS severity from leaf age, temperature and leaf wetness duration, of which only leaf age is relatively unusual compared with other models. However, leaf age was also included in a prediction model developed for phomopsis leaf blight of strawberry, caused by Phomopsis obscurans (Nita et al., 2003).
Interrupting the periods of leaf wetness at 10 and 20°C also affected OLS development, with the effect depending on the relative humidity during the dry period, an interaction not previously reported for OLS. Similar results were documented by Villalta et al. (2000) for Venturia pirina on pear: after an initial wet period of 3–5 h, dry periods of 1–90 h at high relative humidity (>90%) and 20°C reduced the number of lesions from 2·7 to 0·15 per cm2, whereas no lesions developed if leaves remained dry for more than 12 h at low relative humidity (<70%). Similarly, Arauz & Sutton (1990) showed that Botryosphaeria obtusa infection of apple leaves stopped irreversibly if the wetness period was interrupted for one or more hours.
In the first interrupted wetness experiment reported here, high RH (100%) during the dry periods resulted in disease severity that increased with the length of the dry period, exceeding that observed at equivalent times under continuous wetness. This result agrees with previous findings that, at 20°C, conidia of S. oleagina could germinate when high relative humidity followed 6 h of wetness (Obanor et al., 2008a). Similar results were found by Shaw (1991) for Mycosphaerella graminicola infection of wheat, in which interruption of periods at 100% RH by dry periods at 75% RH did not affect infection, whereas infection was significantly reduced when RH during the dry periods was 50%.
The ability of the conidia to withstand a degree of drying, and to exploit successive periods of leaf wetness cumulatively to germinate and infect olive leaves, depended on the length of the initial wetness period. At both temperatures tested (10 and 20°C), a 12 h dry period after an initial 12 h wet period caused the greatest reduction in lesion numbers. This may be because at least 12 h of wetness was needed to initiate germination. This hypothesis is also supported by the results of the first interrupted wetness experiment, in which all treatments had an initial 12 h wet period, but reductions in lesion numbers occurred only when the subsequent dry periods at 70% RH lasted ≥12 h. It is also possible that germlings are most vulnerable to desiccation at this stage of development. A similar observation was reported by Vloutoglou et al. (1999), who found that at 15°C, Alternaria linicola conidia applied to linseed plants were susceptible to drying, particularly when the dry period was 12 h long and occurred after 1–12 h of wetness.
These results, together with those from a previous study on conidium germination (Obanor et al., 2008a), indicate that it is the ability of S. oleagina conidia to survive drying, rather than to germinate and penetrate rapidly, that is critical to successful infection under interrupted wet periods. The conidia clearly need an initial wetting period of at least 6 h, and the amount of moisture available thereafter determines whether they can complete germination and infection. However, even in the least conducive conditions some conidia were able to germinate and infect, producing 60–70% of the lesions recorded under optimum conditions. Variation in the requirements of individual conidia, or between microsites on the plants, evident even within such a uniform environment, makes it likely that the germination requirements of some conidia will be met under field conditions.
Under field conditions, germinating spores are exposed to sunlight and to fluctuating temperatures and relative humidity, which could cause much higher mortality than in a controlled environment. For B. obtusa on apple leaves, Arauz & Sutton (1990) showed that in no instance did germ tubes resume growth after rewetting following dry periods of 4–8 h, and that the relative humidity during the dry period had no effect on the ability of the germ tubes to resume elongation upon rewetting. Further research is required to investigate the effect of dry periods in situ, including the influence of direct sunlight on S. oleagina conidium survival, germination and infection.
In conclusion, the data presented here demonstrate that young olive leaves are more susceptible to OLS than older leaves, and that this effect is moderated by temperature and leaf wetness duration. Inoculum level, leaf wetness duration, and the length and RH of the dry periods that interrupt wet periods are important factors in infection and development of the disease. The infection model developed here could be extended into a disease forecasting system, a tool that could then be used to devise spray schedules that time fungicide applications accurately in response to climatic factors. However, development of a comprehensive OLS forecasting system also requires information on the factors influencing S. oleagina conidium production; this is currently under investigation.
We would like to thank the New Zealand Foundation for Research, Science and Technology (FRST), and the New Zealand Olive Association for funding. Thanks also to Brent Richards of Lincoln University Nursery for maintaining the olive plants used for this study.