Paul Gale WRc-NSF Ltd, Henley Road, Medmenham, Marlow, Buckinghamshire, SL7 2HD, UK (e-mail: firstname.lastname@example.org).
Aims: The aim is to determine the risk of transmission of BSE to humans and cattle through the application of sewage sludge to agricultural land.
Methods and Results: A quantitative risk assessment based on the Source–Pathway–Receptor approach is developed. Central to the model is the estimation of the arithmetic mean concentration of BSE agent in sewage sludge. The main sources of uncertainty in the risk assessment are the degree to which sewage sludge treatment destroys BSE agent, whether there is a threshold dose for initiation of BSE infection in cattle, and most importantly, the amount of brain and spinal cord material which enters the sewer from the abattoir. Assuming 1% of brain and spinal cord is lost to the sewer from abattoirs, the model predicts a risk of BSE transmission of 7·1 × 10–5 cow–1 year–1 for cattle grazing on land to which sewage sludge has been applied.
Conclusions: The risks to humans through consumption of vegetable crops are acceptably low. Although the risks to cattle are higher, because of higher exposure to soil and greater susceptibility, the model demonstrates that sewage sludge alone cannot sustain the BSE epidemic in the UK cattle herd. Furthermore, the model suggests that recycling of BSE agent through sewage sludge will not sustain endemic levels of BSE in the UK cattle herd.
Significance and Impact of the Study: The conclusions are consistent with the findings from epidemiological studies which so far have not detected horizontal transmission of BSE (which would include transmission from contaminated pastures). The model demonstrates the importance of containment of brain and spinal cord within the abattoir.
Risk assessment methods have been developed to identify the potential risks of variant Creutzfeldt–Jakob disease (vCJD) infection in humans through bovine spongiform encephalopathy (BSE) agent in the aquatic environment (DNV 1997; Gale et al. 1998). In 1996, the UK Government introduced the Over Thirty Month Scheme (OTMS) which provides a control and management strategy for the slaughter and disposal of cattle over the age of 30 months. Although all suspect and confirmed BSE cases are killed on the farm by lethal injection and the carcass disposed of by incineration, a small percentage of OTMS cattle slaughtered at abattoirs will be incubating BSE. This paper addresses the contribution of BSE agent from the slaughter and processing of OTMS cattle at abattoirs to sewage sludge and assesses the risks posed.
In the UK 480 000 tonnes of dry solids (tds) year–1 of sewage sludge (1996/7) were applied to agricultural land (WRc 1998). Under the provisions of the Safe Sludge Matrix (http://www.adas.co.uk/matrix/), the application of raw (untreated) sludge to agricultural land is prohibited. For treated sludge (e.g. treated by anaerobic digestion) a time interval of 12 months applies between application and the harvesting of vegetable crops (such as potatoes and leeks). Application of sewage sludge in accordance with the Safe Sludge Matrix therefore greatly reduces the loadings of conventional pathogens (e.g. salmonellas and viruses) in the soil at the point of harvest of vegetable crops. For pasture land, a grazing restriction of three weeks applies and the treated sludge must be deep injected or ploughed down immediately after application.
Brown (1998) suggests that it is plausible that surface or subsurface disposal of BSE-contaminated tissue will result in long-lasting soil infectivity, and speculates that, if the site were then used for herbivore grazing, or tilled as arable land, the potential for disease transmission remains. He concludes that whatever risk exists is extremely small, but not zero. The objective of this paper is to develop a risk assessment to quantify the risks from disposal of treated sewage sludge to agricultural land. In April 1996, The Fertilisers (Mammalian Meat and Bone Meal) Regulations (SI 1996/1125) were introduced by the UK Government prohibiting the use of meat and bone meal (MBM) as, or in, fertiliser used on agricultural land.
METHODS – DEVELOPING A QUANTITATIVE RISK ASSESSMENT
Environmental risk assessment models are based on the Source–Pathway–Receptor approach. Critical to defining the pathways is the identification of the barriers which attenuate or inactivate BSE agent.
In total, over the 4·5 years the cull has been in progress, 3 176 205 OTMS cattle (to the week ending 28 January 2001) have been slaughtered at abattoirs in England and Wales (http://www.maff.gov.uk/animalh/bse). On average this represents about 700 000 cattle year–1. Gale et al. (1998) assumed that 0·54% of OTMS cattle were infected with BSE in 1997. Therefore, according to the model, 700 000 × 0·0054 = 3780 BSE-infected cattle would have been slaughtered at abattoirs in England and Wales in 1997 under the OTMS cull. In practice, the actual number would have been less because confirmed and suspect BSE cases would have been killed on the farm by MAFF vets and the carcass incinerated instead of being slaughtered at abattoirs.
Quantifying the infectivity in the Source Term.
Studies in which groups of cattle were given oral doses of 1 g, 10 g, 100 g and 3 × 100 g of BSE-infected bovine brain have been undertaken at the Central Veterinary Laboratory, UK. Results obtained at 50 months were presented by Anderson et al. (1996). At the end of the experiment (at 9 years), seven of the 10 cattle given 1 g, seven of the nine cattle given 10 g, 10 of the 10 cattle given 100 g, and 10 of the 10 cattle given 3 × 100 g doses were infected (S. Hawkins and G. Wells, pers. comm.). The bovine oral ID50 was estimated to be 0·38 g (95% c.i. 0·03–5·27 g). It is generally assumed for the purpose of risk assessment, that the oral ID50 for cattle is 0·1 g of BSE-infected bovine brain (DNV 1997). Gale et al. (1998) assumed a value of 1 g of BSE-infected bovine brain for the human oral ID50 based on mouse data. Since the brain and spinal cord of a bovine weigh about 1000 g, each BSE-infected bovine will contain 1000 human oral ID50 units (Gale et al. 1998) which is equivalent to 10 000 bovine oral ID50 units.
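The conversion from tissue mass to ID50 units can be sketched as a short calculation. This is illustrative only; the constants are the assumed values quoted above, not measured quantities, and the variable names are invented here.

```python
# Conversion of brain/spinal cord mass to oral ID50 units,
# using the values assumed in the risk assessment.

BRAIN_SPINAL_CORD_G = 1000.0  # mass of brain + spinal cord per bovine (g)
HUMAN_ORAL_ID50_G = 1.0       # assumed human oral ID50 (g of BSE-infected brain)
BOVINE_ORAL_ID50_G = 0.1      # assumed bovine oral ID50 (g of BSE-infected brain)

human_id50_per_cow = BRAIN_SPINAL_CORD_G / HUMAN_ORAL_ID50_G    # 1000 units
bovine_id50_per_cow = BRAIN_SPINAL_CORD_G / BOVINE_ORAL_ID50_G  # 10 000 units

print(human_id50_per_cow, bovine_id50_per_cow)
```

The 10-fold difference between the two unit systems reappears later when the human exposure through root crops is derived from the bovine soil concentration.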
Event trees and identification of barriers
Figure 1 presents an event tree, which defines the pathway by which the BSE agent in OTMS cattle slaughtered at abattoirs could contaminate agricultural land through application of sewage sludge. The fractions define the proportion of BSE infectivity through the pathway and must add up to 1·0 for the arrows coming from each node.
Three barriers are in place at abattoirs to prevent entry of BSE infectivity from slaughtered cattle to the sewer.
(i) Cattle which are symptomatic on arrival are not slaughtered at abattoirs. Indeed suspected or confirmed BSE cases are killed on farms by lethal injection by MAFF vets and the carcass incinerated.
(ii) Organs known to contain infectivity, namely the brain, spinal cord and the specified bovine offals are removed as specified bovine material (SBM) and disposed of at rendering plants. Rendering inactivates at least 98% of the BSE infectivity (Taylor et al. 1995).
(iii) Screens with 4 mm mesh are in place across drains at abattoirs to prevent particles larger than 0·1 g entering the sewer. The trappings are treated as SBM and rendered.
The Specified Bovine Offal (SBO) Order (SI 1995 No. 1928), introduced in August 1995, prohibited the removal of the cow’s brain and eyes from the skull, so that the whole skull is disposed of as SBO; this greatly reduced the risk of contamination of abattoir waste water and effluents with bovine brain. The risk was lowered further by the SBM Order (SI 1996 No. 963), introduced in March 1996, which required the whole head of all cattle over 6 months to be treated as SBM. Therefore, from March 1996, the possible routes of contamination of waste waters with BSE infectivity at abattoirs are:
(i) small particles of spinal cord produced when the carcass is sawn in half up the vertebral column;
(ii) spinal cord dropped on the floor after removal and washed away in yard washings;
(iii) spinal fluid draining from the carcass from where the head is removed;
(iv) particles of spinal cord released when the head is removed;
(v) particles of brain released when the captive bolt is fired; and
(vi) particles of brain from pithing techniques. Pithing is the process where a metal rod is inserted into the brain through the hole in the skull made by the captive bolt. When the pithing rod is withdrawn from the skull, any brain material is wiped off on paper tissue which is then disposed of as SBM.
The practice with the greatest potential for contaminating abattoir waste waters with BSE infectivity is the splitting of the carcass along the vertebral column with a band saw. The event tree in Fig. 1 assumes 1% of the brain and spinal cord from each carcass does not go to the rendering plant but enters the waste water stream after the 4 mm screens. In 1999 the Health and Safety Executive recommended that OTMS carcasses be butchered off-centre of the vertebral column. This eliminates sources (i) and (ii) above, greatly reducing the amount of spinal cord material entering the waste water stream. Pithing is not practised in most OTMS abattoirs. A more realistic estimate of the amount of brain and spinal cord entering the sewer may therefore be 0·01%.
Fate of BSE infectivity in sewage treatment works.
Particulate matter in raw sewage is separated during treatment to form the raw sewage sludge while the treated water component is discharged as effluent usually to a water course or to sea. BSE infectivity has been described as ‘sticky’ (Gale et al. 1998) reflecting the amphipathic nature of the disease-related form of the prion protein. Infectivity is associated with biological membranes, and as such will tend to remain within the brain and spinal cord material on discharge to sewage. Gale et al. (1998) concluded that BSE infectivity will bind to particulate matter in the aquatic environment. For the purpose of risk assessment, it is assumed that all BSE infectivity partitions into the raw sludge during sewage treatment (Fig. 1). This presents a worst-case for risks from sewage sludge.
Destruction of BSE agent during sludge treatment.
According to the ‘protein-only’ hypothesis, replication of the BSE agent is mediated by the disease-related form of the prion protein, PrP-res (‘res’ for resistant to proteases) converting the cellular form of the prion protein, PrP-sen (‘sen’ for sensitive to proteases) into more PrP-res. It is concluded that BSE agent cannot replicate in the sewage sludge or aquatic environment because:
(i) the PrP-sen precursor is sensitive to proteolytic degradation and will be rapidly degraded, particularly in the sewage sludge environment; and
(ii) both PrP-res and PrP-sen will be diluted in the aquatic environment, greatly minimizing the chance of the two species coming into contact.
Scrapie and Creutzfeldt–Jakob disease (CJD) agent are unusually resistant (Brown et al. 1986). Although the disease-related form of PrP is more resistant to proteolytic digestion than PrP-sen and other proteins, both PrPSc (the scrapie prion) and scrapie infectivity are degraded by prolonged digestion by proteolytic enzymes. Thus, over a 16-h period, McKinley et al. (1983) reported a >5-log destruction of scrapie infectivity by proteinase K. Indeed, McKinley et al. (1983) concluded that although PrPSc was initially resistant to proteolytic digestion, it was eventually degraded after prolonged digestion. However, there are no specific data available for the effect of sewage sludge treatment on scrapie or BSE infectivity. Based on the degree of destruction reported by McKinley et al. (1983), proteolytic activity during sewage sludge digestion could result in some destruction of the BSE prions. Pavlostathis and Giraldo-Gomez (1991) reported a first-order degradation rate constant for the protein albumin by anaerobic digestion of 0·57 d–1. This is equivalent to a 1-log reduction in 4 d. However, this low level of proteolytic activity may not be sufficient to degrade PrP-res. It is therefore assumed in the risk assessment, that conventional treatment of sewage sludge (e.g. by anaerobic digestion) does not destroy any BSE infectivity (Fig. 1).
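The conversion from a first-order rate constant to a 1-log reduction time follows directly from the exponential decay law, C(t) = C₀e^(−kt), so the time for a 10-fold reduction is ln(10)/k. A minimal check of the figure quoted above:

```python
import math

# First-order degradation rate constant for albumin under anaerobic
# digestion (Pavlostathis and Giraldo-Gomez 1991), per day.
k = 0.57

# Time for a 10-fold (1-log) reduction: C(t)/C0 = exp(-k*t) = 0.1
t_1log = math.log(10) / k

print(round(t_1log, 1))  # approximately 4 days, as stated in the text
```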
Enhanced treatment of sewage sludge by using lime could potentially destroy at least 90% of BSE agent. Thus, 0·01 mol l–1 NaOH (pH 12) gives a 1-log destruction of sheep scrapie agent after 1 h exposure (Brown et al. 1986). Under the Code of Practice for Agricultural Use of Sewage Sludge (Department of the Environment 1996), quicklime or hydrated lime is added to raise the pH to greater than 12·0 for a minimum period of 2 h.
Calculating the arithmetic mean BSE concentration in sewage sludge.
Assuming 1% of the brain/spinal cord from each bovine were to break through into the sewer, then according to the Source Term above, the slaughter of 700 000 OTMS cattle year–1 in England/Wales would contribute 378 000 bovine oral ID50 year–1 (1997) to the raw sewage sludge. In England/Wales in 1996/7, sewage sludge totalling 967 000 tds year–1 was produced (WRc 1998). The arithmetic mean concentration of BSE agent in the treated sewage sludge is therefore 0·39 bovine oral ID50 tds–1.
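The Source Term arithmetic above can be sketched as follows. All inputs are the figures quoted in the text; the script is a sketch of the calculation, not part of the published model.

```python
# Arithmetic mean BSE concentration in treated sewage sludge,
# using the figures quoted in the text.

OTMS_CATTLE_PER_YEAR = 700_000
BSE_PREVALENCE = 0.0054            # assumed fraction of OTMS cattle infected (1997)
BOVINE_ID50_PER_INFECTED = 10_000  # bovine oral ID50 units per infected bovine
FRACTION_TO_SEWER = 0.01           # 1% of brain/spinal cord lost to the sewer
SLUDGE_TDS_PER_YEAR = 967_000      # sludge production, England/Wales 1996/7 (tds)

infected_cattle = OTMS_CATTLE_PER_YEAR * BSE_PREVALENCE           # 3780
id50_to_sludge = (infected_cattle * BOVINE_ID50_PER_INFECTED
                  * FRACTION_TO_SEWER)                            # 378 000 ID50/year
mean_conc = id50_to_sludge / SLUDGE_TDS_PER_YEAR                  # ~0.39 ID50/tds

print(round(infected_cattle), round(id50_to_sludge), round(mean_conc, 2))
```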
Dilution in soil.
In England and Wales in 1996/7 a total of 480 000 tds year–1 was applied to 73 031 ha of agricultural land (WRc 1998). Sewage sludge is therefore applied with an arithmetic mean loading rate of 6·57 tds year–1 ha–1 (10 000 m2) and is injected to a depth of 0·25 m. Thus 6·57 tds is added to 2500 m3 of soil. Assuming the dry bulk density of subsoil is 1·5 g cm–3 (Rowell 1997) then 2500 m3 of soil weighs 3750 tonnes. Injecting 6·57 tds sludge ha–1 gives a dilution factor of 570-fold (w/w). The BSE concentration in the soil from application of sewage sludge is calculated as 6·9 × 10–4 bovine oral ID50 tonne–1. In the event tree (Fig. 1), dilution is represented as the probability (0·0018) of a bovine ingesting a sludge particle compared to the much greater probability of a bovine ingesting a soil particle (0·9982). The risk assessment does not discriminate between the two extreme distribution scenarios:
(i) the 6·57 tds sludge being homogeneously distributed within the 3750 tonnes soil; and
(ii) the 6·57 tds sludge remaining as a single aggregate within 3750 tonnes soil.
This is quite acceptable because the risk of infection from pathogens is not related to the spatial variation in pathogen exposures (Gale 1998; Gale and Stanfield 2000). Indeed, the arithmetic mean pathogen exposure is sufficient for risk assessment (Haas 1996).
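The dilution step above can be reproduced as a short calculation. Values are those given in the text; the variable names are illustrative.

```python
# Dilution of sewage sludge in soil after injection.

LOADING_TDS_PER_HA = 6.57   # mean sludge application rate (tds/ha/year)
HA_M2 = 10_000              # one hectare in square metres
INJECTION_DEPTH_M = 0.25
BULK_DENSITY_T_M3 = 1.5     # dry bulk density of subsoil (Rowell 1997)
SLUDGE_CONC = 0.39          # bovine oral ID50 per tds (from the Source Term)

soil_volume_m3 = HA_M2 * INJECTION_DEPTH_M        # 2500 m3
soil_mass_t = soil_volume_m3 * BULK_DENSITY_T_M3  # 3750 tonnes
dilution = soil_mass_t / LOADING_TDS_PER_HA       # ~570-fold (w/w)

soil_conc = SLUDGE_CONC / dilution                # ~6.9e-4 ID50 per tonne soil
p_sludge_particle = LOADING_TDS_PER_HA / soil_mass_t  # ~0.0018, as in Fig. 1
```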
Decay and leaching in the soil.
Leaching of BSE agent away from the top soil would greatly reduce any exposure to cattle or vegetable crops. Brown and Gajdusek (1991) demonstrated no leaching of the scrapie agent from the top soil to lower layers of soil over a 3-year period. Brown and Gajdusek (1991) buried scrapie-infected hamster brains mixed with topsoil in dishes in a garden. In two separate experiments, 0·3–1·7% of the infectivity was recovered from the soil after 3 years’ burial (Table 1). The data suggest considerable decay in the soil environment over a 3-year period. However, the experiment was primarily designed to demonstrate survival. Indeed, the failure to recover 98·3% to 99·7% of the input infectivity could either be due to decay or to failure to wash all of the BSE agent from the soil particles prior to filtration through a 0·45 μm filter. The positive control used was hamster brains mixed with soil and then frozen at −70°C for the 3-year period. Scrapie infectivity is associated with biological membranes (see Gale et al. 1998). In the positive control at −70°C, the phospholipids comprising those biological membranes will be preserved. However, in the soil environment they will decay over the 3-year period. The aggregation properties of PrP in the positive control may therefore be different to those of PrP in the experiment and this may alter the amount of infectivity passing through the 0·45 μm filter. Thus a greater proportion of PrP may be stuck to the soil in the experiment than in the positive control. This would lead to an overestimation of the amount of decay over the 3-year period. The event tree (Fig. 1) therefore allows for no leaching or decay of the BSE agent in soil over the 12 or 30 month periods allowed for by the Safe Sludge Matrix between application of treated sludge and harvesting of vegetable or salad crops, respectively. This is a worst-case assumption.
It should be noted that the Brown and Gajdusek (1991) experiment is a more appropriate model for attenuation/decay of BSE agent by soil/chalk substrata as used in the drinking water risk assessment (Gale et al. 1998). This is because either mechanism for attenuation, be it decay or adsorption/filtration, serves to prevent BSE agent from entering the drinking water supply.
Table 1. Data for two experiments to monitor the survival of scrapie agent buried within soil-containing pots between Sept 86 and Aug 89 in the United States (Brown and Gajdusek 1991)
RESULTS – A RISK ASSESSMENT FOR SLAUGHTER OF OTMS CATTLE AT ABATTOIRS
Exposure to grazing cattle
Assuming 1% of brain and spinal cord breaks through into the sewer from abattoirs, the risk assessment predicts 0·00069 bovine oral ID50 units tonne–1 soil to which sewage sludge has been applied (Table 2). A bovine ingesting 0·41 kg (dry weight) d–1 of soil (EUSES 1997) would therefore ingest 1·0 × 10–4 bovine ID50 year–1 through grazing on land to which sewage sludge had been applied. Gale (1998) demonstrated that, according to the negative exponential dose–response relationship, the risk to an individual is calculated as 0·69 × fraction of ID50 ingested. (This dose–response model assumes that there is no threshold dose and that the BSE prions act independently during infection). The annual risk of BSE infection to each individual cow is therefore calculated as 7·1 × 10–5. In a herd of 700 000 cows grazing on such land, there would be 49·52 BSE cases year–1 (Table 2). Introducing processes at abattoirs to reduce the leakage of brain/spinal cord into the sewer by 100-fold (such that only 0·01% entered the waste water stream) would reduce the number of BSE cases in the 700 000 cows to just 0·50 per year (Table 2).
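The cattle exposure and dose–response step can be sketched as below. The inputs are the figures quoted in the text; the negative exponential model, risk = 1 − e^(−0.69d) ≈ 0.69d at low doses d (in ID50 units), follows Gale (1998). Small differences from the Table 2 values (e.g. 49·52 cases) arise from rounding of intermediate quantities.

```python
import math

SOIL_CONC = 6.9e-4           # bovine oral ID50 per tonne soil (sludge-amended)
SOIL_INGESTED_KG_DAY = 0.41  # dry-weight soil ingested per cow per day (EUSES 1997)
HERD_SIZE = 700_000

annual_soil_t = SOIL_INGESTED_KG_DAY * 365 / 1000   # ~0.15 tonnes/year
dose = SOIL_CONC * annual_soil_t                    # ~1.0e-4 ID50 per cow per year

# Negative exponential dose-response (no threshold, prions act independently);
# at these doses this is effectively the linear approximation 0.69 * dose.
risk = 1 - math.exp(-0.69 * dose)                   # ~7.1e-5 per cow per year
expected_cases = risk * HERD_SIZE                   # ~50 BSE cases per year
```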
Table 2. Risk assessment for transmission of BSE to cattle through application of treated sewage sludge to land
Exposure to humans through vegetable crops
There are 10-fold fewer human oral ID50s tonne–1 soil than there are bovine oral ID50s (Table 3). At point of harvest as much as 2% (w/w) of root crops may be soil. A tonne of potatoes will therefore contain 0·02 tonnes of soil. EUSES (1997) estimate the daily consumption of root crops to be 0·384 kg person–1 d–1. This is equivalent to 0·14 tonnes of root crops person–1 year–1, of which 2·8 kg is soil. The annual risk of vCJD infection is 1·32 × 10–7 person–1 year–1 (Table 3). Reducing the leakage of brain/spinal cord into the sewer to just 0·01% reduces the risk to 1·32 × 10–9 person–1 year–1 (Table 3).
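The human exposure pathway through root crops can be reproduced in the same way. Again the constants are the values quoted in the text and the calculation is a sketch of the chain of multiplications, not the published model itself.

```python
# Human exposure to BSE agent via soil on root crops.

HUMAN_SOIL_CONC = 6.9e-5   # human oral ID50 per tonne soil (10-fold fewer
                           # human ID50s than bovine ID50s per tonne)
ROOT_CROPS_KG_DAY = 0.384  # root crop consumption per person (EUSES 1997)
SOIL_FRACTION = 0.02       # 2% (w/w) soil on root crops at point of harvest

crops_t_year = ROOT_CROPS_KG_DAY * 365 / 1000   # ~0.14 tonnes/year
soil_t_year = crops_t_year * SOIL_FRACTION      # ~0.0028 tonnes (2.8 kg)
dose = HUMAN_SOIL_CONC * soil_t_year            # ~1.9e-7 human ID50/year
risk = 0.69 * dose                              # ~1.3e-7 per person per year
```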
Table 3. Risk assessment for transmission of vCJD to humans through consumption of root crops grown on land to which sewage sludge has been applied
The objective of the model is to determine the additional number of BSE cases in cattle in England/Wales through the agricultural disposal route for sewage sludge. Central to this is the estimation of the arithmetic mean concentration of BSE agent in sewage sludge. This is calculated on the basis of the slaughter of 700 000 OTMS cattle year–1 and the total, annual sewage sludge production in England/Wales. In effect, an arithmetic mean concentration is calculated for sludge across England/Wales as a whole. For agents such as Cryptosporidium parvum (Haas 1996; Gale and Stanfield 2000) and BSE (Gale 1998) the risk of infection is directly related to the arithmetic mean exposure. Therefore, this statistic is the appropriate estimator of exposure for the risk assessment. Undoubtedly, there will be spatial and temporal variation, with sewage sludges from works receiving abattoir waste waters potentially having higher loadings, and sludges from works with no abattoir inputs potentially having zero BSE infectivity. Similarly, higher sludge loadings may be applied on some fields than others. Thus, although the arithmetic mean loading is 6·57 tds ha–1 year–1, loadings of 10 tds ha–1 year–1 are not unusual (WRc 1998). However, assuming a linear dose–response curve for BSE infection, this variation is not important for risk prediction (Gale 1998). For a given arithmetic mean BSE concentration, there will still be the same additional number of BSE cases throughout England/Wales as a whole from exposure to sewage sludge on grazing land, irrespective of the variation in BSE loadings between different batches of sludge. Thus the model predicts the excess number of BSE cases for England and Wales as a whole.
The model assumes that all BSE agent in sewage, partitions into the raw sewage sludge. Sludge treatment by anaerobic digestion, and enhanced sludge treatment by lime in particular, may destroy some of the BSE agent. However, the risk assessment is worst-case in assuming:
(i) no destruction of BSE agent during sewage sludge treatment;
(ii) no decay in the soil; and
(iii) no threshold dose, such that just a single prion can initiate BSE infection in a human or bovine (Gale 1998).
Furthermore, no allowance is made for the fact that levels of BSE infectivity in the OTMS cattle culled at abattoirs will be lower than assumed here because confirmed and suspected BSE cases are removed and the carcasses disposed of by incineration. Indeed, the risk assessment relies on dilution in the soil and containment of brain/spinal cord at the abattoir as the only barriers (Fig. 1).
The model assumes that 1% of brain and spinal cord breaks through into the sewer from each abattoir. This is equivalent to the entire brain and spinal cord from one in every 100 cows being slaughtered at the abattoir and is an unrealistic worst-case assumption. The model predicts 49·52 cases of BSE per 700 000 cattle year–1 through exposure to sewage sludge (Table 2). This is 76-fold less than the 3780 cases per 700 000 OTMS cattle year–1 in the Source Term. To promote the BSE epidemic, the risk assessment would need to predict > 3780 BSE cases through application of sewage sludge to land. It is concluded that application of sewage sludge on land is not sufficient to sustain the BSE epidemic in the UK cattle herd. This is strongly supported by epidemiological evidence. Indeed, the incidence of BSE in the UK cattle herd has declined sharply since 1992 (Donnelly et al. 1999).
Bearing in mind the worst-case assumptions used here, the model suggests that, providing containment is maintained at abattoirs, the application of sewage sludge to agricultural land is not sufficient to maintain an endemic level of BSE in the UK cattle herd. This is supported by the available epidemiological data. Thus, Donnelly et al. (1999) write, ‘The rapid decline in BSE incidence observed in the last few years was found to be inconsistent with horizontal transmission occurring at a national rate sufficient to allow BSE to become endemic in the herds in Great Britain…’. However, Donnelly et al. (1999) conclude that further and extensive modelling is required to exclude the possibility of horizontal transmission occurring in a ‘core’ subset of cattle where within-herd case clustering has been seen. On the basis of the risk assessment developed here for sewage sludge, it is suggested that the available BSE data-bases should be analysed to compare the incidence of BSE on three types of farm, namely:
(i) (organic) farms not using sewage sludge;
(ii) farms using sewage sludge from works not receiving abattoir waste; and
(iii) farms using sewage sludge from works receiving abattoir waste.
The model demonstrates the importance of containment of bovine brain and spinal cord at abattoirs. Thus according to the model, if all of the brain/spinal cord were to enter the sewer (instead of 1%) there would be 4952 BSE cases per 700 000 cattle year–1. This would be sufficient to maintain endemic levels of BSE in the UK cattle herd and perhaps even sustain the epidemic. The barrier provided by abattoir containment is all the more important because subsequent destruction by sewage and sludge treatment cannot be assumed to remove a significant amount of infectivity. Where only one barrier exists it must be maintained to the highest possible degree.
The model is worst-case in assuming that there is no threshold dose for BSE infection. If there is, or if BSE prions co-operate during initiation of infection, then dispersion of the BSE prions in the sewage and sludge would virtually eliminate the risk of transmission of BSE through the aquatic environment (Gale 1998). It may be calculated from Table 2 that the arithmetic mean exposure to cattle would be equivalent to 1·0 × 10–5 g cow–1 year–1 of BSE-infected bovine brain. Data presented in Anderson et al. (1996) from cattle feeding studies show that the incubation time to infect 50% of animals increases with decreasing dose. Thus, the incubation time to infect 50% of the cattle was 40 months for a dose of 3 × 100 g of bovine brain, 43 months for 100 g, 48 months for 10 g, and > 52 months for 1 g of bovine brain. This raises the question as to how long the incubation period is for smaller doses, e.g. 10–5 to 0·1 g of BSE-infected bovine brain. Cattle feeding experiments using doses < 1 g are currently in progress (G. Wells, pers. comm.). If the trend of increasing incubation time with lower dose continues, then for very small doses the incubation time may exceed the natural lifetime of a bovine (and even a human) such that small doses essentially present zero risk if ingested orally. However, it should be noted that this observation is not proof of a threshold. Thus, Meynell and Meynell (1958) showed that the mean death time for mice challenged by intraperitoneal injection with Salm. typhimurium increased with decreasing doses greater than the ID50 but tended to become constant for doses less than the ID50. Indeed, they concluded that inoculated organisms acted completely independently such that at doses below the ID50, a mouse fatally infected will die following the multiplication of only one effective organism.
The exposure to humans through consumption of root crops (assuming 1% of brain and spinal cord is lost to the sewer from abattoirs) is 1·92 × 10–7 ID50 person–1 year–1 (Table 3). The individual risk from root crops (1·32 × 10–7 person–1 year–1) is almost an order of magnitude higher than that (1·5 × 10–8 person–1 year–1) estimated for drinking water consumers supplied from an aquifer potentially contaminated with rendering plant effluent (Gale 1998; Gale et al. 1998). It would seem reasonable that for a fatal brain disease, such as vCJD, the individual risk should be < 10–8 person–1 year–1. However, taking into account the worst case assumptions used here, it is concluded that the predicted risk to humans through consumption of root crops is acceptably low. In particular, the assumption that humans ingest the 2% (w/w) of soil on root crops at point of harvest is unrealistic. Indeed, food-processing companies wash such produce, and some root crops are peeled prior to consumption. The 1% loss of brain/spinal cord to the sewer allowed for here is unrealistic because OTMS carcasses are no longer sawn in half up the length of the vertebral column. This will greatly reduce the amount of spinal cord entering the waste water stream. Thus, if washing root crops removed 90% of the soil and only 0·01% of brain/spinal cord entered the sewers from abattoirs, then the risk would be 1·3 × 10–10 person–1 year–1.