Use of organic fertilizers, such as animal manures and slurries (Beuchat 1996; Natvig et al. 2002), abattoir wastes (Avery et al. 2005) and sewage sludge (Al-Ghazali and Al-Azawi 1990), introduces pathogens directly to the field, and run-off can contaminate irrigation water.
Over 90 million tonnes of animal waste are applied to land annually in the UK (Food Standards Agency 2004a, Aviation House, London, UK). Comprehensive guidelines are available to growers, advising on sufficient treatment of wastes and the correct timing of application, with the aim of limiting contamination of crops. In the UK, these guidelines are set out in the Safe Sludge Matrix (ADAS) (Anon 2001b) and the Codes of Good Agricultural Practice (Department for Environment, Food and Rural Affairs). The Safe Sludge Matrix, for example, states that even when enhanced-treated sludge is applied to land, a 10-month harvest interval is necessary, and the use of conventionally treated sludge requires a 30-month harvest interval for salad crops. These intervals should be sufficient to ensure the microbial quality of produce at harvest. Similar recommendations are set out in the United States Environmental Protection Agency Part 503 Biosolids Rule (Anon 1993) and Canadian Ministry of the Environment guidelines (Anon 1996).
Irrigation water quality
Faecal material, soil and other inputs such as sewage overflow introduce enteropathogens directly to watercourses from which irrigation water may be extracted. In the UK, 71% of irrigation water is obtained from surface waters, which receive treated sewage effluent (Tyrell et al. 2006). The potential for contamination via irrigation water is greater in the developing world, where untreated wastewater is used to irrigate around 10% of crops (Anon 2003). Wastewater-irrigated crops show an increased incidence of enteropathogens (Steele and Odumeru 2004). Wachtel et al. (2002a) describe E. coli contamination of the roots of cabbage irrigated with sewage-contaminated stream water, although the edible part of the plant was unaffected. Islam et al. (2004) demonstrated that a single application of S. Typhimurium-inoculated irrigation water resulted in contamination of carrot and radish at harvest, with Salmonella surviving in soil for 203 days postapplication. Lettuce plants irrigated with a single application of E. coli O157:H7-contaminated water tested positive for E. coli O157:H7 at harvest (30 days postinoculation), and plants contaminated at days 7 and 14 of the study yielded larger populations (Solomon et al. 2003). Quantitative risk assessment models for the use of reclaimed water show that risk varies between crops: lettuce was found to pose a higher risk than cucumber, but one comparable to that of broccoli and cabbage (Hamilton et al. 2006). The interval between irrigation and harvest will affect the likelihood of pathogenic bacteria surviving to reach the consumer. A survey of UK-based salad vegetable producers showed that over 50% of growers harvest baby-leaf crops within 24 h of the last irrigation (Tyrell et al. 2006).
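The crop-to-crop differences in risk reported by such models arise largely from differences in the dose of pathogen ingested per serving (e.g. how much irrigation water is retained on the edible surface). A minimal sketch of this style of calculation is given below; it is not the Hamilton et al. (2006) model, and the dose-response parameters, per-serving doses and consumption frequency are all illustrative assumptions, not values from the source.

```python
# Hedged sketch of a quantitative microbial risk assessment (QMRA) for
# reclaimed-water irrigation, using a beta-Poisson dose-response model.
# All parameter values below are illustrative assumptions only.

def beta_poisson_infection_risk(dose, alpha=0.1705, n50=1120.0):
    """Probability of infection from a single ingested dose (organisms).

    alpha and N50 (the dose infecting 50% of subjects) are illustrative
    parameters in the style commonly used for enteric pathogens.
    """
    return 1 - (1 + dose * (2 ** (1 / alpha) - 1) / n50) ** (-alpha)

def annual_risk(per_serving_dose, servings_per_year=365):
    """Annual probability of infection from repeated daily exposure."""
    p = beta_poisson_infection_risk(per_serving_dose)
    return 1 - (1 - p) ** servings_per_year

# Crops that retain more irrigation water on their edible surfaces
# (e.g. lettuce) carry a larger dose per serving than smooth-skinned,
# peeled crops (e.g. cucumber) -- hypothetical doses, organisms/serving:
for crop, dose in [("lettuce", 1.0), ("cucumber", 0.01)]:
    print(crop, annual_risk(dose))
```

Even this toy model reproduces the qualitative finding that leafy crops dominate the risk ranking, because annual risk compounds the per-serving dose over many exposures.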
A number of outbreaks have been traced to the use of contaminated water for irrigation. Iceberg lettuce imported from Spain during 2005 caused cases of S. Typhimurium infection throughout the UK and Finland after wastewater was used to irrigate the crop (Takkinen et al. 2005). Cases of E. coli O157:H7 infection in Sweden in 2005 were traced to lettuce irrigated with water from a stream contaminated with cattle faeces (Söderström et al. 2005). Water may also act as a vehicle for the dissemination of viral particles: Beuchat (1996) reports an atypical outbreak of norovirus linked to celery, and irrigation with sewage-contaminated water has resulted in outbreaks of hepatitis A linked to consumption of lettuce (Seymour and Appleton 2001) and spring onions (Josefson 2003).
Hilborn et al. (1999) describe an outbreak of E. coli O157:H7 attributed to mesclun lettuce, presumed to have been irrigated with water contaminated by cattle grazing a nearby field. Solomon et al. (2002) showed that E. coli O157:H7 in contaminated water can enter the vascular system of lettuce and reach the edible parts of the plant, although the authors point out that unrealistically high inoculum concentrations were used.
Pathogens may be naturally present in soil, for example Listeria spp. (Nicholson et al. 2005), or may become incorporated in the soil matrix from organic wastes added as fertilizer. Pathogens within soil may contaminate crops directly when heavy rain or water gun irrigation causes leaf splash.
The ability of the pathogen to survive in the environment will impact on the likelihood of crop contamination and pathogen viability at harvest and through to consumption. Initially, the pathogen must survive in the propagation environment until crops are planted out, or in organic wastes applied to the land. Table 2 lists survival times for each enteropathogen from a number of studies.
Survival times are often inconsistent and reflect the variability in propagation environments and organic waste treatments. Kudva et al. (1998) demonstrated that aeration of ovine manure decreased survival of E. coli O157:H7 from >365 to 120 days. The application method used for organic wastes may increase survival time: clumping of material applied above ground, and injection application of liquid manures can protect bacteria from desiccation and high temperatures (Hutchison et al. 2004).
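Survival times such as those in Table 2 are often summarized with a simple first-order (log-linear) die-off model, in which the population falls tenfold for every decimal reduction time D that elapses. The sketch below illustrates how a treatment that shortens D (as aeration of ovine manure appears to do in Kudva et al. 1998) sharply reduces persistence; the D values and starting populations are illustrative assumptions, not figures from the studies cited.

```python
# Hedged sketch: first-order (log-linear) die-off of an enteropathogen
# in manure or soil. N(t) = N0 * 10**(-t / D), where D is the decimal
# reduction time in days. All numeric values are illustrative only.
import math

def surviving_population(n0, t_days, d_days):
    """Population remaining after t_days, given decimal reduction time D."""
    return n0 * 10 ** (-t_days / d_days)

def days_to_detection_limit(n0, limit, d_days):
    """Days until the population declines to the detection limit."""
    return d_days * math.log10(n0 / limit)

# A treatment that shortens D (cf. aeration in Kudva et al. 1998)
# greatly shortens the window during which crops can be contaminated.
print(days_to_detection_limit(1e6, 1, d_days=60))  # untreated (illustrative): 360.0
print(days_to_detection_limit(1e6, 1, d_days=20))  # aerated (illustrative): 120.0
```

The same arithmetic underlies harvest-interval guidance: the interval needs to cover enough multiples of D to bring a plausible initial load below an infectious dose, which is why higher-than-expected pathogen loads (e.g. from seasonal shedding peaks) can undermine a fixed interval.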
Stresses encountered during passage through the gut, for example the acidity of the environment, may increase survival by inducing entry into stress-survival states. Escherichia coli and Salmonella exhibit the general stress response, producing stress proteins that can confer cross-resistance to a variety of stresses (Barker et al. 1999). Such cross-protection mechanisms may extend bacterial survival in the environment by reducing the impact of abiotic factors. Leyer and Johnson (1993) report that, after acid adaptation, S. Typhimurium displayed increased tolerance of heat and osmotic stress, whilst Hartke et al. (1995) demonstrated that pre-irradiation of Lactococcus lactis increased resistance to lethal acid challenges. The stress response of L. monocytogenes is similar to that of E. coli, but is regulated by the sigma factor σB, which has been suggested to increase virulence (Wonderling et al. 2004). Campylobacter may enter a viable but nonculturable state (Buswell et al. 1998), but its main mechanism of survival is production of large numbers of cells within the host (Jones 2001). Seasonal variation in shedding of pathogens can result in higher than expected microbial loads in faecal material: Campylobacter shedding increases in spring and autumn (Stanley and Jones 2003) and E. coli shedding increases during spring and summer (Chapman et al. 1997). If increased pathogen loads are present, then simply following guidelines may not be sufficient to prevent crop contamination.
Survival in the phyllosphere
Interest is shifting towards the fitness of the enteropathogen on the leaf surface (phylloplane): if a pathogen can persist on the phylloplane, then the chance of an infectious dose remaining at consumption is increased. Beuchat (1999) showed that E. coli O157:H7 contained in bovine faeces and inoculated onto lettuce could be isolated from lettuce up to 15 days after inoculation. Fett (2000) suggested that transient occupants of the leaf, such as enteropathogens, may become incorporated into phylloplane biofilms.
Biofilms are complex structures composed of many species of bacteria, filamentous fungi and yeasts, at densities of 10⁶–10⁸ cells g⁻¹ fresh weight (Morris et al. 1998). Cells are enclosed within an exopolymeric matrix, which can buffer environmental changes such as nutrient stress and desiccation (Monier and Lindow 2005); therefore, bacteria within biofilms will have an increased survival rate. Between 30% and 80% of the total bacterial population on a leaf surface will be contained in these aggregates (Morris and Monier 2003), which tend to be associated with sources of nutrients such as leaf veins and glandular trichomes (Monier and Lindow 2005). Fett (2000) showed that biofilms were present on the cotyledons, hypocotyls and roots of alfalfa, broccoli, sunflower and clover sprouts by 2 days postgermination.
Enteropathogens can adapt to the phyllosphere environment, but may be outcompeted by epiphytic bacteria (Cooley et al. 2006), especially if both species compete for the same carbon source. Interactions between immigrant bacteria and epiphytes are diverse: Salmonella enterica has been demonstrated to aggregate with Pantoea agglomerans on the leaf surface of cilantro (Brandl and Mandrell 2002), and Wautersia paucula was shown to actively support the survival of E. coli O157:H7 in the rhizosphere and on the leaf surface of lettuce (Cooley et al. 2006). Barak et al. (2002) demonstrated that S. Newport attached to alfalfa sprouts as efficiently as the plant-associated bacteria Pseudomonas putida, Pan. agglomerans and Rahnella aquatilis, and significantly better than E. coli O157:H7. However, epiphytes may also limit survival of immigrant bacteria: Cooley et al. (2006) demonstrated that S. Newport and E. coli O157:H7 could be outcompeted on lettuce by Enterobacter asburiae, which repressed growth of the enteropathogens 10-fold. Janisiewicz et al. (1999b) reported that co-inoculation of E. coli O157:H7 and Pseudomonas syringae into wound sites on apples suppressed the growth of E. coli, and Carlin et al. (1996) demonstrated that the background flora present on endive prevented the growth of L. monocytogenes. These results suggest that the naturally occurring microflora could potentially be used as a biocontrol agent, to prevent enteropathogenic bacteria becoming established on the leaf.
Solomon and Matthews (2006) showed that bacterial processes such as gene expression, motility and production of extracellular compounds were not necessary for initial attachment, but are likely to be important in further colonization and survival on the leaf. Plant-associated bacteria produce acyl-homoserine lactones (AHLs) for communication via quorum sensing, and Brandl (2006) hypothesizes that AHLs may upregulate factors in enteropathogens that are beneficial to their survival on the leaf, such as expression of rpoS, which increases resistance to stresses commonly encountered there, for example desiccation.
A major factor limiting bacterial survival in the phyllosphere is UV radiation. Ecologically successful phylloplane bacteria are efficient at repairing UV-induced DNA damage or preferentially colonize sites that are protected from UV, such as the interior of the leaf (phytopathogens) or the base of structures such as trichomes (saprophytes) (Jacobs and Sundin 2001). The biofilm matrix also shields against the damaging effects of UV irradiation (Elasri and Miller 1999). Phyllosphere communities exhibit a marked shift towards UV-tolerant phenotypes, for example pigmented bacteria, as the growing season progresses (Jacobs and Sundin 2001). In Pseudomonas syringae, expression of the gene rulAB confers DNA repair capabilities and therefore increased UV tolerance (Sundin and Murillo 1999). Escherichia coli and S. enterica possess homologues of rulAB (Brandl 2006), suggesting an ability to withstand UV irradiation.
Enteropathogens frequently encounter osmotic stress when passing through the host gut, and consequently display a number of stress-avoidance mechanisms, mediated by rpoS, which may induce cross-resistance to stresses encountered on the leaf (Brandl 2006). For this to affect survival, pathogens would have to become established on the leaf surface relatively quickly after excretion from the host. Exposure to plant-produced antimicrobials upregulates a homologue of the sap operon in Erwinia chrysanthemi. In S. enterica, induction of the sap operon promotes acid resistance and therefore survival in the acidic conditions of the gut; Brandl (2006) therefore suggests that a period of residence in the phyllosphere may increase the virulence of enteropathogens.
Further protection from environmental stresses may be afforded by movement into the internal tissue of the plant. This is normally a passive process, unlike the destructive entry of many phytopathogens: enteropathogens in irrigation water can be taken up by the root system and enter the edible portion of the crop (Wachtel et al. 2002a), for example in lettuce (Seo and Frank 1999; Takeuchi and Frank 2000; Solomon et al. 2002), apple (Burnett et al. 2000) and tomato (Guo et al. 2001, 2002). Enteropathogens may also gain entry via wounds (Janisiewicz et al. 1999) or structures such as lenticels (Burnett et al. 2000) and stomata (Seo and Frank 1999). Infiltration of enteropathogenic bacteria through these structures can occur when bacteria are present in water on the surface of fruits (Burnett et al. 2000) and can be increased during processing if wash water is cooler than the fruit, creating a negative temperature differential (Buchanan et al. 1999).
The presence of phytopathogens may increase the penetration and growth of enteropathogenic bacteria, because of disruption of the cuticle and increased release of nutrients (Wells and Butterfield 1999). Richards and Beuchat (2005) report that co-inoculation of wound sites on cantaloupe with S. Poona and the phytopathogens Cladosporium cladosporioides or Penicillium expansum increased penetration of Salmonella into the internal tissues of the fruit, because of the tissue breakdown caused by the fungi. Stopforth et al. (2004) demonstrated that E. coli O157:H7 can survive and proliferate in injured apple tissue, even after the use of sanitizers.
Oron et al. (1995) demonstrated that poliovirus applied to the roots of tomato plants can be recovered from the leaves, but not the fruit. This was attributed to the presence of antiviral substances, not to an inability of the virus to reach the fruit. Dingman (2000) analysed the proliferation of E. coli O157:H7 in bruised apple tissue and observed that growth was suppressed in McIntosh apples, unlike in the other cultivars used. This was thought to be due to production of an unstable or volatile inhibitory factor, as the effect was reduced during storage. Reinders et al. (2001) studied the effect of caffeic acid on E. coli O157:H7 survival in a model apple juice medium and demonstrated a reduction in E. coli populations, suggesting that phenolic acids play an important role in limiting bacterial survival in planta. A range of phenolic acids, including caffeic acid, were shown to inhibit the survival of L. monocytogenes in vitro (Wen et al. 2003), and Delaquis et al. (2006) report evidence that an antilisterial factor, thought to be phenolic in nature, is produced on wounding (shredding) of iceberg lettuce and is therefore likely to play a role in limiting L. monocytogenes growth in bagged salads.
The interaction between the host plant and epiphytes, symbionts and phytopathogens has been extensively studied. However, the role of plant–microbe interactions in limiting colonization by enteropathogenic bacteria is not so well described. Production of antimicrobial factors may be a direct response to the presence of pathogenic bacteria. Dong et al. (2003) suggest that genetic aspects of both the host plant and Salmonella are involved in endophytic colonization. Barak et al. (2004) showed that virulence factors, including aggregative fimbriae and expression of rpoS, were involved in S. enterica attachment to alfalfa. Interestingly, these virulence genes are essential for infection in an animal host. There are a number of similarities between the systems employed by plant and animal pathogens. The type III secretion system (TTSS), which enables the delivery of pathogenicity proteins to the host cell, is conserved across the Gram-negative plant and animal pathogens, although the proteins secreted differ (Hueck 1998). Recognition of elements of the TTSS in phytopathogens induces host plant defence mechanisms (Hueck 1998). The conserved nature of these factors in human pathogens suggests that plants may also respond to the presence of enteropathogenic bacteria in the phyllosphere. Iniguez et al. (2005) describe the role of host defences in limiting endophytic colonization by the enteric bacterium S. Typhimurium. Addition of ethylene, the signal molecule that induces systemic resistance, decreased Salmonella colonization of the roots of the legume Medicago truncatula. Salmonella mutants deficient in structural components of the TTSS or flagella showed increased colonization of alfalfa roots, and ethylene production was reduced, suggesting that the plant did not recognize the mutant bacteria.
Further colonization studies using an Arabidopsis thaliana mutant (npr1) provided evidence that recognition of TTSS components induces salicylic acid-mediated defence signalling (Iniguez et al. 2005). The authors suggest that overexpression of defence-related genes in crop plants may present a novel method to control enteropathogen colonization in the field.