Cattle and sheep farms as reservoirs of Campylobacter

Authors


Karen Stanley, Rowett Research Institute, Greenburn Road, Bucksburn, Aberdeen AB21 9SB, UK (e-mail: k.stanley@rowett.ac.uk).

  • 1 Summary
  • 2 Introduction
  • 3 General aspects of Campylobacter
    • 3.1 Historical aspects
    • 3.2 Biology of thermophilic campylobacters
    • 3.3 Cycles of growth and survival
  • 4 The role of ruminant animals in human infection
    • 4.1 General epidemiology in humans
    • 4.2 The significance of ruminant sources of infection
    • 4.3 The seasonality in human infection
  • 5 The incidence of Campylobacter in cattle and sheep
    • 5.1 Isolation rates at slaughter
    • 5.2 Patterns of Campylobacter shedding on the farm
    • 5.3 Seasonal variation in dairy herds
    • 5.4 Factors affecting shedding in dairy herds
    • 5.5 Campylobacter shedding by sheep
    • 5.6 Young animals on the farm
      • 5.6.1 Effects of lambing on colonization
      • 5.6.2 Colonization and shedding by calves
  • 6 Contamination of the farm environment
  • 7 Potential sources of new infection in adult cattle
  • 8 Conclusions
  • 9 References

1. Summary

Aim: To review the natural colonization of ruminant livestock by Campylobacter and its transmission within the dairy farm environment.

Methods and Results: Using cultural detection and enumeration techniques, the distribution of Campylobacter in ruminant animals at birth, on the farm, at slaughter and in the farm environment has been examined. Colonization and shedding rates are higher among young animals, while patterns of shedding in adult animals may be seasonal. Stored and land-dispersed slurries provide a reservoir for scavenging birds and flies and a source of contaminated runoff.

Conclusions: The dairy farm plays a significant role in the dissemination of Campylobacter sub-types that can cause disease in the human community.

Significance and Impact of Study: An understanding of the role of the dairy farm in the environmental cycle of Campylobacter is required in order to devise intervention strategies.

2. Introduction

Campylobacter jejuni is the most commonly isolated bacterial pathogen associated with diarrhoea in the UK (Ketley 1997) and other industrialized countries (Mead et al. 1999). Historically, the North-west region of England has had a higher than average rate of infection (Jones and Telford 1991), but otherwise the data from that area reflect the trends observed in the UK national data: an increase in the annual number of cases since 1981 and a late spring/early summer peak, followed by a lower secondary autumn peak.

Campylobacter jejuni colonizes the gastrointestinal tract of a broad range of animals but the most important risk factor for human Campylobacter infection is widely held to be the handling and consumption of raw poultry and cross-contamination to uncooked products (Tauxe 1992). However, there is now a growing body of molecular evidence that suggests the significance of non-poultry sources of human clinical infection has been underestimated. This review focuses on the significance of Campylobacter colonization of cattle and sheep, the incidence and rate of shedding among these animals and the role of the dairy farm as a reservoir of Campylobacter infection.

3. General aspects of Campylobacter

3.1 Historical aspects

For most of the last century Campylobacter was recognized exclusively as an animal pathogen; the description of a suitable selective isolation medium and of the organism's exacting growth requirements (Skirrow 1977) was required before its role as an aetiological agent of human enteritis was fully recognized. During the last 25 years new pathogenic species have been assigned to the genus. Although a number of these have been implicated in human enteritis, the most common by far are the so-called ‘thermophilic’ campylobacters, of which C. jejuni causes more than 90% of Campylobacter enteritis in the UK. The other ‘thermophilic’ species strictly include C. coli and C. lari, but C. hyointestinalis is also associated with enteric disease and can be isolated from cattle.

3.2 Biology of thermophilic campylobacters

Thermophilic campylobacters are small, non-spore forming, Gram-negative bacteria which are vigorously motile by means of a single polar flagellum at one or both ends of the cell. Unique properties of the flagellum impart increased cell motility in highly viscous environments (Alm et al. 1993) and allow the organism to colonize mucus within the intestinal and caecal crypts (Lee et al. 1986). The flagellum is the most intensively studied virulence determinant.

3.3 Cycles of growth and survival

Thermophilic campylobacters neither ferment nor oxidize carbohydrate but obtain energy from the oxidation of amino acids or tricarboxylic acid cycle intermediates. They have simple nutritional requirements and can be grown on a peptone base (Grau 1991), but they compete poorly with other flora. They grow best in an atmosphere containing 5–10% oxygen and are described as microaerophilic to distinguish their preferential use of oxygen as a terminal electron acceptor under reduced oxygen tensions (Krieg and Hoffman 1986). The ‘thermophilic’ species are so-named not so much for their ability to grow at high temperatures as to emphasize their inability to grow below 30°C. Such temperature constraints, considered along with their microaerophilic nature, suggest that they are unlikely to find suitable conditions for growth outside the avian or mammalian gut. We must also assume that thermophilic campylobacters cannot amplify on food or in water and therefore should be considered ‘food-borne’ rather than ‘food-poisoning’ organisms.

Indeed, they do not survive well at ambient temperatures. Further, thermophilic campylobacters are reportedly sensitive to many environmental stresses, including high temperatures, atmospheric concentrations of oxygen, reactive oxygen species such as free radicals and peroxides, and desiccation (Griffiths and Park 1990). This leads to the inevitable question: how does such a ‘fragile’ organism cause disease so frequently? The fact that Campylobacter outbreaks are rare, accounting for only 2% of outbreaks in England and Wales in which an aetiological agent was identified between 1994 and 1999 (Frost et al. 2002), is consistent with the assumption that Campylobacter does not grow on food. However, recent biochemical investigation (Kelly 2001) and the recently published genome sequence data (Parkhill et al. 2000) support the notion that C. jejuni is a versatile and metabolically active organism which may be able to exploit more diverse environments than these constraints suggest.

The major environmental reservoirs of thermophilic campylobacters are the intestines of warm-blooded mammals and birds, where it is thought that they are non-pathogenic, at least in older animals (Griffiths and Park 1990). The intestines of host animals are therefore a critical site of amplification in the Campylobacter life cycle, as this is where the maximum population size occurs. Enumeration studies, rather than simple presence/absence testing, are essential if we are to understand the ecology, distribution and ‘life cycle’ of these organisms. Once excreted into the environment, at least a small proportion of the voided cells must adopt a suitable survival strategy until ingested by another susceptible host. Vehicles and vectors that transmit Campylobacter between hosts must therefore play a significant role in the epidemiology of this organism.

4. The role of ruminant animals in human infection

4.1 General epidemiology in humans

Although Campylobacter has been the most frequently isolated pathogen associated with gastrointestinal infection in England and Wales since 1981, its transmission to humans is poorly understood. Much has been learnt about the epidemiology of Salmonella from studying outbreaks. In turn, these data have informed control strategies that have no doubt played a part in the current downward trend in Salmonella cases in the UK. However, most cases of human campylobacteriosis are thought to be sporadic and it is frustrating that the application of most available molecular subtyping techniques (for review see Frost 2001) to resolving Campylobacter infections has, over the last decade, yielded more questions than answers. Population genetic analysis has recently revealed that C. jejuni is genetically diverse with a weakly clonal population structure and that intra- and inter-species horizontal genetic exchanges are common (Dingle et al. 2001).

4.2 The significance of ruminant sources of infection

It is widely held that the major source of sporadic cases is the handling and consumption of contaminated poultry meat, and this is supported by case–control studies (Tauxe 1992). Broiler flocks are frequently contaminated with C. jejuni (see Corry and Atabay 2001 for review), as are raw poultry meat and its packaging at retail (Jorgensen et al. 2002). Many studies have shown that once Campylobacter is introduced into a broiler shed, birds rapidly excrete high numbers in their faeces and the organism spreads so rapidly that 100% of birds may be colonized within a few days. Avian species are purported to be the natural hosts of thermophilic campylobacters because their core temperature of 42°C is the optimum growth temperature of C. jejuni.

The significance of Campylobacter colonization of dairy and beef cattle and sheep relates not only to the potential for contamination of milk at the farm and of the carcass at slaughter, but also to contamination of surface and sub-surface water during disposal of abattoir effluents and animal slurries to land. Further, many studies have found that the presence of farm animals, such as cattle and sheep, on broiler farms is associated with an increased risk of infection in broiler flocks.

The relative direct and indirect contributions of cattle and sheep to sporadic human infections are currently unknown (Frost 2001). There is little evidence from either case–control studies or prevalence studies on meat to suggest that red meat is an important risk factor for human infection. This possibly reflects the more hygienic slaughter procedure for ruminants than for poultry, although chilling and desiccation have a significant effect on the survival of thermophilic campylobacters on the carcass (Grau 1991). Gross microbial contamination of the carcass with gut contents may occur during evisceration, but it is thought that most contamination occurs during removal of the hide or through cross-contamination from hide to carcass via the hands and instruments of slaughtermen (Gannon 1999). Although rates on offal tend to be higher (Bolton et al. 1985), most, if not all, surveys over the last two decades have found that the incidence of Campylobacter on red meat is low compared with poultry meat, even when other pathogens, such as Salmonella and Escherichia coli O157, can be isolated. An extensive survey of raw meats on sale at some 1400 butchery premises isolated Campylobacter spp. from 15 of 2330 raw meat products (four raw sausages, 10 raw burgers and one other product), compared with 84 Salmonella-positive samples (Little and De Louvois 1998). The prevalence on lamb appears slightly higher than on beef. Campylobacter is more frequently isolated from the lamb lairage than other pathogens (Small et al. 2002), and contamination of lamb offal (liver) at retail has been estimated at 73% (Kramer et al. 2000).

However, there is now a growing body of molecular subtyping evidence which suggests that the significance of non-poultry sources of human clinical infection has previously been underestimated (Owen et al. 1995; On et al. 1998; Nielsen et al. 2000; Fitzgerald et al. 2001). These studies have found that certain strains from poultry, cattle, sheep and humans are indistinguishable by a variety of molecular subtyping methods such as pulsed-field gel electrophoresis (PFGE), fla polymerase chain reaction-restriction fragment length polymorphism (PCR-RFLP) and ribotyping. In the Lancashire area, Fitzgerald et al. (2001) found that some strains appear to have a host-specific relationship, that is, some strains were only isolated from one host type. However, six different genotypes, accounting for 50% of strains in the study, had been isolated from healthy cattle, sheep and turkey faeces as well as from human clinical samples. This shows that cattle and sheep are colonized with, and excrete, strains of C. jejuni that are capable of causing disease in the local community. These findings have significant epidemiological implications for two reasons. First, there has been some reluctance to accept that ‘environmental’ or non-poultry strains pose as great an infection risk as poultry strains. Secondly, they call into question whether molecular subtyping and genotyping studies can help us understand the non-point sources of sporadic human infection when strains from a range of potential sources are genotypically indistinguishable.

The role of ruminant animals in outbreaks is better understood. Contaminated water, often arising from agricultural runoff, and improperly pasteurized milk have accounted for large outbreaks around the world. For example, the first reported outbreak of Campylobacter enteritis affected 3000 people in Vermont, USA, when contaminated water was distributed throughout the town (Vogt et al. 1982). Campylobacter jejuni was also isolated during the recent large outbreak of E. coli O157 associated with contaminated groundwater in the Canadian town of Walkerton (Mackay 2002). The importance of agricultural runoff to sub-surface waters is increasingly recognized, as outbreaks in the UK (Duke et al. 1996) and other countries have been associated with groundwater supplies.

Campylobacter is most likely to enter milk through faecal contamination, but on rare occasions outbreaks have been traced to asymptomatic Campylobacter mastitis, which causes high numbers of organisms to be excreted directly into the milk (Orr et al. 1995). Studies have suggested a low incidence (between 3·8 and 8·1%) of Campylobacter in UK bulk milk tank samples, and conventional methods of pasteurization readily destroy the organism. However, consumption of raw milk was implicated in 30 of 80 outbreaks reported to the Centers for Disease Control (CDC) in the USA between 1973 and 1992 (Altekruse et al. 1999) and in four of 21 outbreaks reported to the Communicable Disease Surveillance Centre (CDSC) in England and Wales between 1992 and 1994 (Pebody et al. 1997).

4.3 The seasonality in human infection

Rates of human infection correlate with temporal and climatic factors. In many temperate countries there is a striking spring or summer peak (Nylen et al. 2002), while in tropical countries there is little seasonal variation, except that there may be slightly more infection during the rainy season (Taylor 1992). A number of studies have sought to explain the spring peak, which in the UK consistently occurs 6–8 weeks earlier than the August peak in Salmonella infections. A similar seasonal peak is seen in the USA (Tauxe 1992) and in other European countries such as Italy (Stampi et al. 1992), but is reported a little later, in mid-late summer, in Northern Europe (Walder and Forsgren 1982). Interestingly, the spring peak is mirrored in temperate regions of the southern hemisphere, including New Zealand (Brieseman 1990), Australia (Grau 1991) and South Africa (Franco 1988).

Tauxe (1992) drew attention to the marked seasonality of common-source outbreaks of Campylobacter reported through the national Campylobacter surveillance system in the USA, where outbreaks caused by raw milk or contaminated water have a bimodal distribution with peaks in May and October. Epidemiological data suggest that the vehicles and circumstances of common-source outbreaks are different from those of sporadic infections. It was suggested that the seasonality of human infections might reflect important differences in the ecology of the poultry and bovine reservoirs of Campylobacter.

5. The incidence of Campylobacter in cattle and sheep

5.1 Isolation rates at slaughter

Thermophilic campylobacters are readily isolated from the intestinal tract of healthy ruminants, although the estimated carriage rate varies significantly between individual herds and flocks. Over the last two decades the overall prevalence in adult cattle has been estimated at 0·8% in Norway (Rosef et al. 1983), 19·5% in Portugal (Cabrita et al. 1992), 46·7% in Japan (Giacoboni et al. 1993), 23% in Denmark (Nielsen 2002), and 5% (Hoar et al. 2001) and 37% (Wesley et al. 2000) in the USA. Variables such as herd size and type, season, age of animal, sample site, sampling frequency and isolation method, geography, diet and husbandry practices have been suggested to account for these differences. For example, a higher incidence of C. jejuni has been observed in cattle raised in feedlots than in cattle on pasture (Garcia et al. 1985). Isolation rates also vary between herds: in the UK, Campylobacter spp. were isolated from 79, 40 and 37% of three different herds (Atabay and Corry 1998).

Stanley et al. (1998a,b) investigated the Campylobacter isolation rate of cattle and sheep slaughtered at a large abattoir in Preston, Lancashire, UK, over a 2 year period. Samples were taken from the small intestine and tested for thermophilic campylobacters by a combination of direct plating of a swab onto Campylobacter blood-free selective medium (mCCDA) and enrichment of the swab in Preston Selective Enrichment Broth (PSEB) for 24 h. By direct plating alone, Campylobacter was isolated from 26·7% of cattle, which compares with previous UK estimates of 21% and 23·5% of adult cattle carrying Campylobacter at slaughter (Bolton et al. 1982; Manser and Dalziel 1985). Including an enrichment step identified a further 62·7% of samples as Campylobacter positive, giving a total of 89·4% (Stanley et al. 1998a). Similarly, the isolation rate from lambs was estimated at 47·8% by direct plating and 91·1% by enrichment culture.

An enrichment step has often been found to increase the recovery of Campylobacter from ruminant samples. This may be because the average number of Campylobacter in adult bovine and ovine intestinal samples is lower than in broiler samples. Cattle faeces yielded an average of 2·8 log10 MPN (most probable number) Campylobacter per gram fresh weight (gfw−1), which is similar to the value of 2·1 log10 CFU observed in cows by Nielsen (2002). In the studies of Stanley et al. (1998a), only one sample throughout the 2 year sampling period failed to yield detectable numbers, and the maximum value was 7·4 log10 MPN Campylobacter gfw−1. Given the very low infectious dose of around 10–100 C. jejuni cells (Robinson 1981), this represents a considerable risk should the meat become contaminated with visceral contents during the slaughter process.
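The counts above are expressed as log10 MPN gfw−1. Purely as an illustration of how an MPN is derived from a dilution series, the following sketch computes the maximum-likelihood MPN from the number of positive enrichment tubes at each dilution. This is not the protocol of the cited studies; the sample masses, tube numbers and positive counts are hypothetical.

```python
import math

def mpn_estimate(masses_g, positives, tubes):
    """Maximum-likelihood MPN (organisms per gram) from a dilution series.

    masses_g  : sample mass (g) tested in each tube at each dilution
    positives : number of Campylobacter-positive tubes at each dilution
    tubes     : total number of tubes at each dilution
    """
    def score(lam):
        # Derivative of the log-likelihood with respect to the density lambda;
        # it is positive below the MLE and negative above it.
        s = 0.0
        for v, p, n in zip(masses_g, positives, tubes):
            s += p * v * math.exp(-lam * v) / (1.0 - math.exp(-lam * v)) - (n - p) * v
        return s

    lo, hi = 1e-6, 1e9
    for _ in range(200):             # bisection on the monotonic score function
        mid = math.sqrt(lo * hi)     # geometric midpoint copes with the wide range
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Hypothetical 3-tube series testing 0.1, 0.01 and 0.001 g of faeces per tube
mpn = mpn_estimate([0.1, 0.01, 0.001], positives=[3, 2, 0], tubes=[3, 3, 3])
print(f"{mpn:.0f} MPN/g = {math.log10(mpn):.1f} log10 MPN gfw-1")
```

In practice such estimates are usually read from published MPN tables; the numerical solution above simply makes the underlying calculation explicit.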

Lamb intestinal samples yielded an average of 4·0 log10 MPN Campylobacter gfw−1, and a significant seasonal trend was observed across the 2 years of data, with the Campylobacter MPN peaking in March and reaching a nadir in October (Stanley et al. 1998b). This spring peak correlates approximately with the lambing season and with outbreaks in young lambs of rotavirus, Salmonella and Cryptosporidium (Mawdsley et al. 1995). However, neither very young lambs nor gravid ewes should be represented among the daily kill at the abattoir. In comparison, no seasonal trend was found in the quantitative data from cattle at slaughter.

A particularly important factor affecting estimates of carriage rate is the sampling strategy. Generally, estimates based on intestinal samples exceed those observed in studies that test faeces or rectal contents. The latter may give an indication of recent shedding but may underestimate the true rate of carriage (Humphrey and Beckett 1987). Hoar et al. (1999) showed that rectal faecal samples were significantly more likely to yield Campylobacter than faeces sampled from the ground and suggested that this reflected die-off. It also highlights that a single faecal sample may not reliably indicate whether an animal is a carrier.

Stanley et al. (1998a,b) did not isolate Campylobacter spp. from the large intestine or caeca, but C. jejuni was isolated from the rumen of 30% of cattle and lambs. There are few published data to suggest that C. jejuni is able to grow or persist in the rumen, but its presence suggests recent ingestion (Grau 1991). An underdeveloped rumen may make infection of the lower intestinal tract easier in younger animals, but the organism must pass through the rumen of adult animals if re-infection is to occur during an animal's adult life.

5.2 Patterns of Campylobacter shedding on the farm

The factors that govern faecal shedding of zoonotic organisms such as Campylobacter and E. coli are poorly understood. Very few data are available on pathogen shedding in naturally colonized animals, and more are needed to help identify patterns of shedding. Such studies are difficult to perform as they necessarily involve labour-intensive quantitative analysis of a large number of potentially negative samples. The application of real-time PCR may offer the opportunity for quantitative analysis combined with a high throughput of samples.
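Should quantitative real-time PCR be applied in this way, quantification would typically rely on a standard curve relating threshold cycle (Ct) to log10 target copies. The sketch below illustrates that calculation only; the calibration Ct values and the 'unknown' sample Ct values are invented and do not come from any of the cited studies.

```python
import numpy as np

# Hypothetical calibration: Ct values for serial dilutions of a quantified
# Campylobacter DNA standard (log10 genome copies per reaction).
log10_std = np.array([6, 5, 4, 3, 2], dtype=float)
ct_std = np.array([17.1, 20.5, 23.8, 27.2, 30.6])

# Linear standard curve: Ct = slope * log10(copies) + intercept
slope, intercept = np.polyfit(log10_std, ct_std, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0       # amplification efficiency

def log10_copies_from_ct(ct):
    """Back-calculate log10 target copies per reaction from an observed Ct."""
    return (ct - intercept) / slope

for sample_ct in (22.4, 28.9):                # hypothetical unknown extracts
    print(f"Ct {sample_ct}: {log10_copies_from_ct(sample_ct):.2f} log10 copies/reaction")
print(f"Standard-curve slope {slope:.2f}, efficiency {efficiency:.0%}")
```

Converting copies per reaction to counts per gram of faeces would additionally require the extraction volume and sample mass, which are omitted here for brevity.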

Shedding of Campylobacter and E. coli O157 by naturally colonized cattle is intermittent (Robinson 1982; Zhao et al. 1995; Stanley et al. 1998a). Generally, studies agree that a lower prevalence of Campylobacter is observed on the farm than in continuous surveillance at slaughterhouses (Grau 1991; Stanley et al. 1998a; Nielsen 2002). A brief survey of 20 freshly voided faecal samples from a Campylobacter-positive dairy herd, carried out on a single day in May, found that 50% were positive. Of the 10 positive samples, seven yielded counts below 10^4, but three yielded between 10^5 and 10^6 CFU C. jejuni gfw−1 (Stanley 1996). This suggests that just a small proportion of a herd may be shedding high numbers of the organism at any one time.

The problem of ‘high shedders’ was recently highlighted by an environmental study that followed up an outbreak of E. coli O157 associated with public use of agricultural land (Ogden et al. 2002). High shedders, that is, those animals shedding more than 10^5 organisms per gram, may present a greater risk at the abattoir and act as ‘hot spots’ in a flock or herd, facilitating intraherd transmission through contamination of their hides, water troughs or grazing pasture. Pasture remains contaminated for longer when pathogens are excreted in higher numbers (Ogden et al. 2002). Methods for rapidly identifying animals that are shedding high numbers of pathogens are highly desirable if such animals are to be held back from slaughter or withdrawn from a flock or herd for a period.

5.3 Seasonal variation in dairy herds

A 2 year longitudinal study on four Lancashire farms measured the seasonal variation in the numbers of Campylobacter shed in the faeces of the dairy herds (Stanley et al. 1998a). Average numbers in dairy herd faeces were lower than in intestinal samples. Statistical analyses of the quantitative data gathered over the 2 years revealed a true seasonality, that is, the same periodicity in numbers from one year to the next. Each herd had two peaks per year, in approximately spring and autumn. Peaks coincided in herds on closely neighbouring farms, but on farms situated 20 miles to the north the peaks preceded those on the southerly farms by 2 months in spring and 1 month in autumn.
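To make 'true seasonality' concrete, i.e. the same periodicity recurring from one year to the next, the sketch below fits a simple cosinor (harmonic regression) model with annual and semi-annual terms to monthly log10 counts and locates the two fitted peaks per cycle. This is illustrative only, not the statistical method used in the cited study, and the monthly values are invented.

```python
import numpy as np

# Hypothetical monthly mean log10 MPN gfw-1 over 2 years (24 values)
months = np.arange(24)
counts = np.array([1.2, 1.5, 2.1, 2.6, 2.2, 1.6, 1.3, 1.8, 2.4, 2.7, 2.0, 1.4,
                   1.1, 1.6, 2.2, 2.7, 2.1, 1.5, 1.2, 1.9, 2.5, 2.6, 1.9, 1.3])

# Cosinor design matrix: intercept plus annual (12-month) and
# semi-annual (6-month) harmonics, allowing two peaks per year.
X = np.column_stack([
    np.ones_like(months, dtype=float),
    np.cos(2 * np.pi * months / 12), np.sin(2 * np.pi * months / 12),
    np.cos(2 * np.pi * months / 6),  np.sin(2 * np.pi * months / 6),
])
beta, *_ = np.linalg.lstsq(X, counts, rcond=None)
fitted = X @ beta

# Locate the fitted peaks within a single 12-month cycle (with wrap-around)
cycle = fitted[:12]
peaks = [m for m in range(12)
         if cycle[m] >= cycle[(m - 1) % 12] and cycle[m] >= cycle[(m + 1) % 12]]
print("Fitted peak months (0 = first month of sampling):", peaks)
```

Because the harmonic terms repeat exactly from year to year, a good fit to both years of data is itself evidence of a consistent periodicity rather than a one-off fluctuation.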

Other studies have reported seasonal periodicity in Campylobacter carriage rates within dairy herds in temperate areas of both the northern and southern hemispheres. During a previous longitudinal study of two dairy herds in the north-west of England, C. jejuni was isolated from approximately 10% of each herd during the summer and from neither herd during the winter, but it re-emerged during spring (Robinson 1982). In New Zealand, Meanger and Marshall (1988) measured the isolation rate during three seasons and found high isolation rates in both summer (24%) and autumn (31%) but a low rate in winter (12%). The authors suggested that climatic conditions in a New Zealand autumn closely match those of an English summer and implied that these observations were directly related to climate. The consistency of the peaks in the Lancashire data over 2 years further implies that there is a temporal regulatory factor.

5.4 Factors affecting shedding in dairy herds

The seasonal peaks in the Lancashire dairy herds could not be correlated with minimum or maximum air temperatures, hours of sunshine or rainfall (Stanley et al. 1998a). The similarity between the data from farms situated adjacent to one another, and their difference from farms situated only 20 miles away, suggests that there may be local sources. Indirect, temperature-dependent factors, for example migratory birds, rodents or insects, may be important.

It is not clear whether periodicity in Campylobacter populations is due to recrudescence, that is, fluctuation in the population levels of indigenous campylobacters, or indicates seasonal re-infection. The spring and autumn peaks roughly correlate with traditional milk flushes and periods of calving, which suggests that reproductive hormones and/or stress may exert a seasonal regulatory effect, particularly as these peaks are absent in animals reared for beef (Stanley et al. 1998a). Changes in shedding patterns may therefore reflect hormonally driven disturbances of the gut flora. However, UK farmers are nowadays encouraged to calve their dairy herds all year round to meet the constant consumer demand for milk and dairy products.

The peaks roughly coincide with the spring transition from winter housing to summer grazing and the autumn return to winter housing, and may reflect a change in diet or water. Seasonal patterns of shedding of other pathogens have been associated with diet. The prevalence of Listeria monocytogenes in the faeces of dairy cattle is higher during periods of winter housing than when animals are on pasture (Husu 1990). Feedstuffs used during winter, including silage, hay and concentrates, are the major sources of both pathogenic and non-pathogenic species of Listeria (Husu 1990). For dairy cattle, feeding of whole cottonseed or hulls and alfalfa has been identified as a risk factor for shedding C. jejuni (Wesley et al. 2000), although it was not suggested that the feed itself was contaminated with the pathogen.

5.5 Campylobacter shedding by sheep

The production of lamb is determined by geography. In the Lancashire area a system referred to as ‘stratified farming’ essentially means that breeding stock are reared on hill or high land and brought down to fatten on lower land before slaughter. Ewes are kept on lower land until the lambing season, which is usually between January and March. A study of Campylobacter shedding by sheep grazing lowland farm land, saltmarsh and upland fells, conducted on various farms in the North Lancashire region, revealed that the organism was shed intermittently depending on season. Campylobacter shedding was lowest (0%) in November and December, when sheep were fed on hay and silage, compared with periods when they were grazing pasture (Jones et al. 1999). The highest rates of shedding (100%) coincided with increased stress as a result of lambing, weaning and movement onto new pasture. Abrupt changes in diet have also been shown to cause increased shedding of E. coli O157:H7 by sheep (Kudva et al. 1997).

5.6 Young animals on the farm

5.6.1 Effects of lambing on colonization.

Peaks of disease outbreaks in sheep can be observed around lambing (Mawdsley et al. 1995). Jones et al. (1999) found that young lambs born on the farm are negative for Campylobacter at birth, which suggests that horizontal rather than vertical transmission is the mechanism by which they acquire infection. Ewes that were not shedding Campylobacter before lambing began shedding 3 days after lambing, while ewes that shed low numbers (1–2 log10 gfw−1) before lambing subsequently shed up to 5 log10 g−1. Within 5 days of birth, 100% of lambs born to either set of ewes were colonized. Seventeen isolates from lambs and ewes that were submitted for typing had the same macrorestriction profile (Fitzgerald et al. 2001).

5.6.2 Colonization and shedding by calves.

Similarly, newborn calves rapidly acquire the organism from the farm environment via horizontal transmission (Stanley et al. 1998a). In a study of three batches of calves, all calves were free of Campylobacter at birth but most began shedding within 4 days. High numbers could be found in faecal samples within 1–2 months of age, and some individual calves shed more than 8 log10 MPN campylobacters gfw−1 before they were 1 month old. One batch of calves was sampled until 7 months of age, by which time the average count had declined to between 2 and 3 log10 MPN, similar to that shed by the dairy herd (Stanley et al. 1998a).

Different systems of husbandry are practised in the production of meat and the maintenance of reproductive stock, and these may involve different risks of infection. In order to ensure the microbiological quality of milk, dairy herds are subject to stricter hygiene regulations than beef cattle. In contrast with dairy farming, it is common practice in the rearing of beef cattle for farmers to throw down fresh straw on top of old litter to soak up liquid waste. Calves have physical contact with their bedding at all times and are unlikely to avoid re-ingestion of material of faecal origin, either via their own contaminated hides or from food or water troughs. Animals housed in this way are probably constantly re-infected from their own litter as well as from other sources, and this may be reflected in the high carriage rates of beef cattle at slaughter.

What is striking about data from young calves aged between 30 and 60 days is that the Campylobacter numbers in their faeces are similar to the high numbers observed in broiler chickens before slaughter at 40 days of age. The age of the different groups of animals may account for the variation in average Campylobacter numbers detected in faecal samples: beef cattle are up to 30 months old by the time they come to slaughter, whereas the average age of the dairy herd is likely to be significantly greater. The average number of campylobacters in faecal samples from calves up to 6 months old is between 10 and 100 times (1–2 log10 units) higher than that found at the abattoir in visceral samples of beef cattle at slaughter or in the faeces of adult animals. Grau (1988) enumerated, on average, 3·7 log CFU C. jejuni per gram from the viscera of calves, approximately 10 times higher than the average level observed in adult cattle. Higher rates, as well as levels, of Campylobacter infection have been observed in calves than in adult cattle, for example 54% (Grau 1988), 52% (Gill and Harris 1982) and 97·1% (Giacoboni et al. 1993).

6. Contamination of the farm environment

On average, each dairy cow produces 57 l of faecal waste per day and each beef bullock 27 l day−1 (MAFF 1991). An average-sized dairy herd of around 100 cows will therefore produce approximately 40 000 l of faeces per week, which in summer may be deposited on pasture and spread around the farm when cows are brought in for milking, but during winter must be collected in storage tanks (Jones 2001). Campylobacter jejuni was isolated all year round from the slurry tanks of five Lancashire farms, in numbers between 1 and 2·4 log10 MPN g−1 (Stanley et al. 1998c), suggesting that there was little die-off in storage. Kearney et al. (1993) showed that the T90 (the time required for a 90%, or 1 log10, reduction) of C. jejuni was 438·6 days during full-scale mesophilic anaerobic digestion of beef slurries, more than 10 times longer than that of Salmonella, Listeria, Yersinia or E. coli.
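As a rough check of the arithmetic above, and of what a T90 implies for die-off during storage, the following sketch works through the weekly slurry volume for a hypothetical 100-cow herd and a first-order (log-linear) decay calculation. Only the 57 l day−1 figure and the 438·6 day T90 are taken from the text; the starting count, storage period and comparison T90 are assumed purely for illustration.

```python
# Weekly faecal volume for a hypothetical 100-cow dairy herd
cows, litres_per_cow_per_day = 100, 57
weekly_volume = cows * litres_per_cow_per_day * 7
print(f"~{weekly_volume:,} litres of faecal waste per week")   # ~39,900 l

# First-order (log-linear) die-off: log10 N(t) = log10 N0 - t / T90,
# where T90 is the time for a 1-log10 (90%) reduction.
def log10_count_after(log10_n0, days, t90_days):
    return log10_n0 - days / t90_days

# Illustrative comparison after 30 days, starting from an assumed 2.4 log10 MPN g-1
for organism, t90_days in [("C. jejuni, T90 438.6 days (Kearney et al. 1993)", 438.6),
                           ("hypothetical organism with a 30-day T90", 30.0)]:
    remaining = log10_count_after(2.4, 30, t90_days)
    print(f"{organism}: {remaining:.2f} log10 MPN g-1 after 30 days")
```

With a T90 of this length, a month of treatment removes less than a tenth of a log10 unit, which is consistent with the year-round isolation of Campylobacter from stored slurries noted above.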

Prior to intensification, livestock wastes contained large amounts of bedding, and pathogenic bacteria were killed when solid farmyard manure was composted (Mawdsley 1995). Herd sizes have now increased, and cattle are frequently housed on bare concrete or slatted floors from which the liquid slurry is collected and stored along with urine, rain water and parlour washings. Other non-noxious liquids generated through other activities on the farm may also be included, so that the contents of a given slurry tank may vary enormously from farm to farm. Pressure on storage space may be greater in winter, when rain or snow further increases the liquid volume in uncovered tanks and prevents spreading on land. Slurry from tanks is disposed of by spreading on land. Stanley et al. (1998c) showed that aerobically digested slurries put to land in Lancashire in summer contained fewer campylobacters than non-aerated slurries put to land in winter, but that survival times on land in winter were much longer than in summer. Further, in summer the slurry dried out and grass rapidly grew over it, whereas in winter hard rain washed slurries away, leading to a greater likelihood of contaminated runoff.

The importance of workers’ boots as vehicles of campylobacters has been shown by Kazwala et al. (1990), and Humphrey et al. (1993) demonstrated that good hygiene practices, including boot dipping, can delay the introduction of Campylobacter into poultry flocks. Hence boots should be considered an important vector in both introducing and maintaining exposure to Campylobacter in housed animals.

7. Potential sources of new infection in adult cattle

It might be expected that the underdeveloped gastrointestinal system of young ruminants allows easy colonization by Campylobacter or other zoonotic pathogens. It is not clear whether young calves, once infected, excrete the same genotypes intermittently or whether they become re-infected with new genotypes throughout their adult lives. Nor is it clear how ingestion of contaminated feed or water, or exposure to new genotypes, might influence the rate of Campylobacter shedding in adult animals.

As yet, only a few longitudinal studies have assessed changes in strain diversity within dairy herds, and without such data it is difficult to devise a control strategy. Fitzgerald et al. (2001) found 19 different macrorestriction profiles among 88 isolates from adult dairy cows on four neighbouring farms. Nielsen (2002) found that groups of adult animals harboured a broader range of serotypes than groups of calves; up to five different C. jejuni serotypes were carried by a single herd, and 8% of individual animals carried two serotypes. As only pure cultures are amenable to routine serotyping and molecular sub-typing techniques, colony picking from primary isolation plates is an important part of such studies. There are no phenotypic colony characteristics that help discriminate different Campylobacter genotypes, although our own studies have found that lower sample dilutions can yield different species and biotypes from higher dilutions (Stanley 1996; Stanley et al. 1998a), which helps to increase the estimated diversity from a given sample.

Campylobacter jejuni has been isolated from wild birds, such as pigeons, crows, geese, ducks and cranes, that might visit grazing pasture. Migratory birds travel long distances and could be a source of new genotypes within flocks or herds. Stored animal slurries and cowpats on pastures attract potential pathogen vectors such as birds and flies. Contaminated fomites, water supplies, rodents, insects and free-living birds have all been suggested as potential routes of infection in commercial poultry flocks (Shane 1992).

Humphrey and Beckett (1987) reported that 10 of 12 herds with access to river water shed Campylobacter, at least temporarily, while two herds that drank only tap water were culture-negative. Hanninen et al. (1998) also found lower isolation rates from cattle during winter, when the animals drank only municipal chlorinated tap water, and higher isolation rates during summer grazing, when their source of water was a lake. They isolated C. jejuni from most of the lake samples and, using PFGE and serotyping, were able to show that the lake was probably contaminated by one cow (number 25) and that this was a likely transmission route for infection of the rest of the herd. Their study also found that some cows may persistently excrete one sero/genotype.

Van Donkersgoed et al. (2001) suggested that contaminated water troughs may be a transmission route of E. coli O157 within herds. Springs and shallow groundwater may also be contaminated with Campylobacter spp. (Stanley et al. 1998d; Savill et al. 2001). Groundwater can rise some distance from where it drained into the ground and may carry microorganisms from neighbouring farms or beyond. Contaminated groundwater has been implicated as a source of introduction into broiler flocks, although the Campylobacter present may not be readily culturable (Pearson et al. 1993; Van de Giessen et al. 1996).

8. Conclusions

  • Young cattle and sheep born on the farm are exposed to Campylobacter infection within the first few days of life. Calves excrete very high numbers of Campylobacter in their faeces when young.
  • Sheep and cattle shed Campylobacter intermittently throughout their lives, and this shedding may show a seasonal pattern.
  • Most ruminants sent to slaughter in the north-west of England carried one or more species of thermophilic Campylobacter. Isolation procedures should include an enrichment step to measure the isolation rate fully.
  • Campylobacter is readily isolated from stored slurries, cowpats and bedding, all of which may be scavenged by rodents, wild birds and insects.
  • Contamination of surface and sub-surface waters may transmit Campylobacter within herds and between farms and other livestock groups.
  • Ruminants and other animals on farms carry Campylobacter genotypes that are capable of causing disease in the local community.
  • Farms rearing ruminant animals play a significant role in the global contamination cycle of Campylobacter.
  • A greater understanding of the patterns of shedding by animals on the farm and of the relationship between host, environment and Campylobacter genotype is necessary in order to devise practical intervention strategies, if indeed these are deemed worthwhile.
