An evaluation of United Kingdom environmental bovine spongiform encephalopathy risk assessment


  • Eric P.M. Grist

    School of Biological Sciences, Royal Holloway, University of London, Egham, Surrey TW20 0EX, United Kingdom


As a member of the group of diseases known as transmissible spongiform encephalopathies (TSEs), bovine spongiform encephalopathy (BSE) has been causally associated with a new variant of Creutzfeldt-Jakob disease (vCJD) in humans. Given the many uncertainties surrounding the transmission and persistence of TSE pathogens in the environment, the quantitative assessment of risks to humans and animals remains a public health issue. This paper reviews quantitative BSE risk assessments undertaken in the United Kingdom since 1997 to address the potential for human exposure and theoretical health risks through environmental pathways. The review focuses on how model assumptions and methodology may influence the results.


Bovine spongiform encephalopathy (BSE) was 1st recognized as a novel disease in cattle during the late 1980s (Wells et al. 1987). It was observed to occur more frequently in dairy than in beef cattle and, particularly, in dairy cattle fed with protein-rich mammalian meat and bone meal (MBM). This led to the hypothesis that contaminated MBM was probably the source of the BSE outbreak and that the BSE pathogen had entered MBM after changes in rendering practices were made during the 1970s. At that time, a high proportion of sheep, including scrapie-infected sheep, were included in the mix of rendered carcasses used to produce MBM. Although the Phillips Inquiry (2000) concluded that there was no association with the rendering and recycling of scrapie-infected sheep through the feed chain, it has been suggested that this might still help to explain why the United Kingdom, with high populations of both cattle and sheep, is where BSE 1st emerged (Wilesmith et al. 1991). Today, the contaminated MBM theory remains the predominant hypothesis for explaining the origin of the BSE outbreak in cattle in the United Kingdom.

Establishment of MBM as a pathogenic source led to a total ban on the use of MBM as feed for all ruminants in the United Kingdom in July 1988 and throughout Europe in June 1994 (per Directive 94/381/EC). However, the annual number of BSE cases reported in the United Kingdom did not decline until some years later, after a peak incidence of 37,280 reported cases in 1992. The time lag is explained by a protracted incubation period, defined as the time from infection until clinical onset of the disease (Donnelly et al. 2003). Since the establishment of a causal association between a variant of Creutzfeldt-Jakob disease (vCJD) and BSE pathogens (Bruce et al. 1997; Collinge et al. 1997), it has been accepted that vCJD in humans probably stems from the ingestion of BSE-infected bovine meat products (hereafter referred to by the term “infectivity”) originally derived from infected cattle. This is reinforced by results from experimental studies confirming that the BSE pathogen is highly transmissible by the oral route, not only in cattle (Wells et al. 1998) but also in sheep (Foster et al. 1993) and, most notably, in primates (Collinge and Palmer 1997). The time lag between the peaks of the outbreaks of BSE in cattle and vCJD in humans is consistent with a human incubation period of between 10 and 15 years (Haley 2002).

Somewhat perplexingly, it has been shown that clinical signs of BSE in sheep are difficult to distinguish from those of the long-established transmissible spongiform encephalopathy (TSE) disease referred to as “scrapie” (Foster et al. 1993). Scrapie poses no known health risks to humans (Harries et al. 1988). It is possible that BSE may have been introduced into sheep in the United Kingdom as a result of the use of contaminated animal feed (Baylis et al. 2002; Kao et al. 2002). At present, evidence of BSE in the sheep population has yet to be demonstrated, although surveillance studies to address this concern are ongoing in the United Kingdom (Schreuder and Somerville 2003).

Currently, with the single exception of Sweden, BSE has been confirmed in cattle populations in all European Union countries, as well as in Japan and Canada (OIE 2003). Since the disease was 1st recognized, nearly 200,000 cases have been diagnosed in the United Kingdom (OIE 2003), which remains the largest BSE epidemic recorded globally. Since 1993, the annual number of cases of BSE-infected cattle in the United Kingdom has continued to decline, reaching its lowest level (around 600 cases) in 2003 (OIE 2003).

Quantitative risk assessments undertaken in the United Kingdom since 1997 have attempted to estimate the health risks to humans associated with exposure to BSE infectivity through the food chain and environmental pathways. Environmental pathways broadly include the following routes of exposure:

  • Direct ingestion of material from the land (e.g., crops consumed on farmland where cattle waste has been spread),

  • Consumption of contaminated drinking water (e.g., groundwater sources contaminated with leachate from landfills or from rivers contaminated with runoff from land spreading),

  • Inhalation of particles in the atmosphere (e.g., from burning or incineration of infected cattle), and

  • Direct ingestion of untreated water (e.g., streams containing runoff from land spreading).

Most prominent among the environmental exposure studies are those performed by the London-based consultancy Det Norske Veritas (DNV 1997a, 1997b, 1997c, 1997d, 1997e, 2001a, 2001b, 2001c; hereafter, referred to as the DNV risk assessments), as well as several peer-reviewed papers (Comer et al. 1998; Gale 1998; Gale et al. 1998; Gale and Stanfield 2001). These assessments are examined in the light of modeling approaches recently adopted in environmental risk assessment. The potential influences of the underlying assumptions on the various estimates of exposure risk are discussed.


Early research into the possible causes of scrapie in sheep led to the suggestion that TSE pathogens were likely either a virus or a virus-like pathogen. Although some scientists still hold this as axiomatic (Manuelidis 2003), following the prion hypothesis advanced by Prusiner (1982), the virus theory has largely been dismissed. The prion hypothesis postulates that TSEs are caused by an abnormal prion protein (denoted by PrPSc), which accumulates in a host over time. The BSE pathogen is mainly perceived as a malformed macromolecule. Although the process of prion generation within a host remains unknown, it is conjectured to occur through catalytic conversion or replication. Rather than by synthesis, abnormal prions (PrPSc) are assumed to accumulate through the transformation of the normal cellular prion protein (PrPC). The buildup of abnormal prions in vital organs ultimately leads to the onset of illness and the death of the host.

Scientific understanding of TSEs is confounded by a lack of knowledge on the species barrier, which refers to the relative difficulty of transmitting a disease or pathogen between different species. At the same time, this concept is rendered less than transparent by the fact that some species have a capacity to act as pathogen reservoirs without displaying clinical signs. This property has been amply demonstrated in certain types of transgenic mice, which are capable of acting as carriers of the BSE pathogenic agent without displaying any signs of infection or disease (Aguzzi 1998). Indeed, initial clinical signs of BSE are vague or variable, making diagnosis difficult. Additionally, the long incubation period has constrained attempts to determine disease transmissibility. Experimental studies with cattle, however, suggest that the level of BSE infectivity peaks at the end of the incubation period when, by definition, disease signs 1st appear (Wells et al. 1998). Similar data, originally gathered in studies with mice and sheep, led to the Over Thirty-Month Scheme (OTMS) bovine safeguard rule in the United Kingdom, under which only cattle younger than 30 months, perceived as less infectious, are allowed to enter the human food chain.

Fundamental knowledge of TSE epidemiology, whether transmission occurs vertically (maternally) or horizontally, remains unclear (Ridley and Baker 1997). Evidence of the BSE pathogen in blood has recently been demonstrated in blood transfusions between sheep (Hunter et al. 2002), but this has yet to be observed in cattle (Wells et al. 1999). Confirmation of the presence of BSE infectivity in other body fluids (most notably milk) remains inconclusive (Kimberlin 1994). On a wider scale, to overcome the dearth of information on TSEs globally, active surveillance to determine the prevalence of both BSE and scrapie in all ruminant populations at local and national levels is now a concern throughout the European Union (EC 2001; Morignat et al. 2002).

Information on the factors that lead to TSE persistence in the environment remains sparse. It has been postulated by Brown (1998) that long-lasting contamination of soil with TSE pathogens may result from the disposal of BSE-contaminated tissue on land. This would imply a potential for disease transmission from soil if the land is subsequently used either for herbivore grazing or for growing arable crops. For this reason, the Fertilisers (Mammalian Meat and Bone Meal) Regulations, introduced in the United Kingdom in April 1996, prohibited the use of MBM as, or in, fertilizer applied to agricultural lands.

Although pathogen longevity and contamination of livestock grazing areas seem likely to enhance transmission risks, no evidence exists of contact- or pasture-borne (lateral) transmission of BSE (Donnelly et al. 2003). The probability of vertical (maternal) transmission has been statistically estimated at around 10%, reduced to approximately 1% under field conditions (Donnelly et al. 1997). A more recent study has suggested that transmission is limited to the last 6 months of the incubation period of the dam, with a transmission probability of around 0.5% (Donnelly et al. 2002). This is concordant with placental transmission, which has been proposed as a likely route for transmission of scrapie in sheep (Foster and Dickinson 1989). Thus, it is clear that BSE-exposure risks must depend on a multitude of factors and, in particular, the management of young livestock (Morley et al. 2003). For example, changes to the diets of young calves (especially dairy animals) in the United Kingdom during the 1970s resulted in MBM being regularly included in animal feed (Adam 2001). Because bovine susceptibility to BSE infection is more likely at a younger age (Anderson et al. 1996; Woolhouse and Anderson 1997), the original transmission of the disease in cattle herds in the United Kingdom likely began among young cattle.


Although the quantification of exposure risk in most risk assessments follows the recognized Codex Alimentarius stages of hazard identification, exposure assessment, hazard characterization, risk characterization, documentation, and reassessment (Vitale et al. 2001), these can be concisely compressed into 3 stages in environmental BSE risk assessment: infective load estimation, exposure pathway identification, and risk estimation. Each is now considered separately.

Infective load estimation

The most fundamental part of any BSE risk assessment is the estimation of the potential for BSE infectivity contained in the source under consideration. This quantity is referred to as the “infective load”. Infective load is usually expressed in ID50 units, where a unit is the estimated mass of infected tissue that each individual in a population would need to ingest for 50% of the population to become infected. The choice of 50% is traditional but arbitrary. It leads to a convenient working definition of infective load as the potential of a specific mass of tissue to cause infection (Comer et al. 1998).

More formally, the estimated infective load, L, expressed in human oral ID50 units is given by the following equation:

L = (N × p × m × i) / s

where N is the number of animals in the cohort under consideration in the risk assessment, p is disease prevalence in the cohort under consideration, m is the mass of infected tissue per animal (in grams), i is infectivity in bovine (or ovine) oral ID50 units (ID50 · g^-1), and s is the bovine (or ovine) to human species barrier factor (dimensionless). Unless otherwise stated, an ID50 unit is expressed in terms of an oral dosage throughout this article.
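As an illustration only, the calculation can be sketched in a few lines of code. The prevalence, infectivity, and species-barrier values below echo figures quoted later in this review (0.54%, 10 bovine ID50 units/g, and a barrier of 10); the herd size and tissue mass are hypothetical placeholders, not values from any published assessment.

```python
# Minimal sketch of the infective load formula L = (N x p x m x i) / s.
# All input values are illustrative placeholders.

def infective_load(N, p, m, i, s):
    """Infective load in human oral ID50 units.

    N: number of animals in the cohort
    p: disease prevalence in the cohort (fraction)
    m: mass of infected tissue per animal (g)
    i: infectivity in bovine (or ovine) oral ID50 units per gram
    s: bovine (or ovine) to human species barrier factor (dimensionless)
    """
    return N * p * m * i / s

# Hypothetical cohort of 10,000 cattle at 0.54% prevalence, 500 g of
# infected CNS tissue per animal, 10 bovine ID50/g, species barrier 10.
print(infective_load(N=10_000, p=0.0054, m=500.0, i=10.0, s=10.0))
```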

Disease prevalence (p)—Efforts have recently been made to determine values for TSE disease prevalences, both for scrapie and BSE, in the United Kingdom at the regional and national levels (Woolhouse and Anderson 1997; Hoinville et al. 2000; Donnelly et al. 2002; Kao et al. 2002). These include focused attempts at back-calculation to reconstruct historical trends in the incidence of BSE infection by considering disease incidence together with an incubation-period distribution (Donnelly et al. 2002, 2003). However, epidemiological modeling studies reported by Ferguson et al. (1999) have suggested that low-level, horizontal transmission of BSE (if it occurs) would not have been sufficient to sustain the BSE epidemic observed in cattle in the United Kingdom.

All such estimations are heavily reliant on accurate epidemiological data (Doherr et al. 1999; Hoinville et al. 2000); however, progress in the quantification of key parameters has been hindered by several factors typical of TSE disease surveillance. Most obviously, there are difficulties in the accurate diagnosis of disease signs and underreporting of disease incidence (Hoinville et al. 1999, 2000). Theoretical modeling studies have been constrained further by difficulties in establishing a minimum threshold dose to ensure the onset of any TSE disease in a given species. Such unknowns can, at best, still be quantified only in broad terms. In this context, the use of Bayesian estimation methods, increasingly applied elsewhere in epidemiology and risk estimation (e.g., Clough et al. 2003; Orr et al. 2003), has been surprisingly neglected in BSE risk assessment. Estimated TSE disease prevalences based on surveillance data gathered for bovine and ovine populations have usually been incorporated as a single value (point) rather than a range (interval), as in the DNV risk assessments and those by Gale and Stanfield (2001), Gale (1998), and Gale et al. (1998). Throughout such studies in the United Kingdom, the disease prevalence of BSE-infected cattle was generally assumed to be under the value for OTMS cattle in 1996 (0.54%) derived by Anderson et al. (1996). For BSE in sheep, a ceiling disease prevalence of 2% for scrapie was assumed, and hypothetical scenarios were constructed in which BSE prevalence was specified as a proportion of scrapie prevalence, between 0.01 and 10% (DNV 2001c).
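To indicate what a Bayesian treatment might look like, the sketch below uses a simple beta-binomial model to convert surveillance counts into an interval, rather than point, estimate of prevalence. The counts and the uniform Beta(1, 1) prior are assumptions for illustration only.

```python
# Hedged sketch of Bayesian prevalence estimation from surveillance data:
# with a Beta(1, 1) prior and a binomial likelihood, the posterior for
# prevalence is Beta(1 + positives, 1 + negatives).
from scipy import stats

positives, tested = 12, 10_000  # hypothetical surveillance counts
posterior = stats.beta(1 + positives, 1 + tested - positives)

print(posterior.mean())          # posterior mean prevalence
print(posterior.interval(0.95))  # 95% credible interval (a range, not a point)
```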

Infective tissue (m) and incubation period—Although evidence of the BSE pathogen has been found in the distal ileum of cattle as early as 6 months after oral exposure (Wells et al. 1998), BSE infectivity is known to accumulate only in the central nervous system (CNS) of cattle towards the end of the incubation period. In general, the duration of any TSE incubation period varies considerably within a species. Statistical analyses of several scrapie titration experiments with transgenic mice have demonstrated that the average incubation period decreases as dose is increased, whereas the variance in its duration becomes greater as dose is lowered (McClean and Bostock 2000).

In cattle, BSE disease signs usually take 4 to 6 y to appear, with a mean incubation period estimated at 5.2 y (Dealler 2001). However, no evidence exists, as yet, of bovine genetic susceptibility. In sheep, BSE disease signs typically appear much earlier, most frequently between 2 and 4 y of age (Jeffrey et al. 2001). Additionally, studies of both scrapie- and BSE-infected sheep have shown the existence of a wide range of genetic susceptibility to TSE diseases (Foster et al. 1996; Hunter 1997; Jeffrey et al. 2001). This susceptibility has been characterized at the molecular level in terms of 5 alleles, giving rise to 15 main sheep genotypes (Hoinville et al. 1999). In the most resistant homozygous genotype (denoted by ARR/ARR), no evidence to date shows that either scrapie or BSE can be induced by an oral challenge (Anon 2002; Houston et al. 2003). However, it has been shown that ARR/ARR sheep may become infected by intracerebral challenge (Houston et al. 2003), thus raising the possibility that ARR/ARR sheep might act as pathogen reservoirs or carriers. To further complicate matters, different genotypes are not uniformly distributed among the various breeds of sheep in the United Kingdom (Dawson et al. 1998). In the face of such complex uncertainties, it is justifiably expeditious to simply assume that sheep carcasses in the source term are all of the most susceptible genotype (DNV 2001c).

Environmental BSE risk assessments have proceeded by applying the precautionary principle to the potential infectivity in tissue. A strong emphasis is placed on worst-case scenarios, for example, by assuming an upper proportion (and sometimes all) of the animal carcasses under consideration are maximally infected. Quantification of the level of infectivity present in a cattle carcass is based on the principle that total infectivity carried by a symptomatic, infected bovine is directly proportional to the mass of its brain and connected spinal tissue (DNV 1997a, 1997b, 1997c, 1997d, 1997e, 2001a, 2001b). Each such carcass is assumed to contain the level of infectivity typically associated with a fully symptomatic individual (DNV 1997a, 1997b, 1997c, 1997d, 1997e, 2001a, 2001b). Although this should imply an upper estimate of risk, the simplifications avoid consideration of uncertainties connected with disease incubation period or the potential level of variability of the infectivity in tissue.

For sheep, the calculation is more complicated because, during the course of pathogenesis (in a susceptible genotype), both scrapie and BSE infectivity are known to spread extensively into a variety of tissues. It has been demonstrated experimentally that TSE infectivity may enter virtually all tissues in sheep, especially the lymphatic system, early after infection (Hadlow et al. 1982; Foster et al. 1996; Jeffrey et al. 2001). The level of TSE infectivity present in sheep tissue is, therefore, dependent on both tissue type and age. The total infected tissue mass m must be obtained by summing over all tissue and age categories. This approach was adopted in the assessment to determine BSE risks from the disposal of sheep carcasses (DNV 2001c), where scrapie data from Hadlow et al. (1982) were used to group ovine tissues into 3 and 4 main categories based on tissue type and age, respectively. However, it implies an underlying assumption that the pathogenesis of BSE and scrapie in sheep is identical, which in reality has yet to be established.
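The summation itself is straightforward, as the sketch below shows; the tissue and age categories, masses, and titres are hypothetical stand-ins for the Hadlow et al. (1982) groupings, not the published values.

```python
# Sketch of the per-carcass summation over tissue and age categories.
# Each entry maps (tissue type, age class) to a hypothetical pair of
# (tissue mass in g, infectivity in ovine oral ID50 units per g).
categories = {
    ("CNS", "lamb"): (60.0, 0.0),
    ("CNS", "adult"): (140.0, 10.0),
    ("lymphatic", "lamb"): (150.0, 0.5),
    ("lymphatic", "adult"): (300.0, 1.0),
    ("other tissue", "adult"): (2_000.0, 0.001),
}

# total infectivity per carcass, summed over all tissue/age categories
total_id50 = sum(mass * titre for mass, titre in categories.values())
print(total_id50)
```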

Infectivity (i)—Several experimental BSE studies have shown that the likelihood of inducing BSE infection through oral ingestion increases as the dose of infectious material is increased (e.g., Wells et al. 1998). However, it has not been established whether there is a minimum threshold dose required to initiate infection in a given species. Applying the precautionary principle yet again, it has been universally assumed in environmental BSE risk assessments that there is no threshold dose and that BSE infectivity accumulates in direct proportion (in other words, linearly) to the quantity ingested over the course of time. The no-threshold assumption appears to err on the side of safety because it implies that any dose, no matter how small, may cause death among some individuals in an exposed population, whether because of (1) variation in individual susceptibility to infection, (2) variation in infectivity of tissue, or (3) chance passage of infectivity through the gut wall; if anything, the risk should thereby be overestimated. However, the linearity assumption also implicitly assumes that any infectivity accumulates without replication during the same period.

Following advice from the Spongiform Encephalopathy Advisory Committee (SEAC), the majority of the DNV risk assessments assert that 0.1 g of BSE-infected bovine CNS tissue is equivalent to 1 bovine oral BSE ID50 unit. This estimate was based on data obtained in a single bovine attack-rate experiment (commenced in 1992 at the Central Veterinary Laboratory, UK), with results quoted in Anderson et al. (1996). The calculation of the ID50 was performed using the computer program QUAD, which used a logit model and the delta method to derive confidence intervals (Morgan et al. 1989). This approach generated an estimate of 0.38 g of BSE-infected CNS tissue for 1 bovine oral ID50 unit, with a wide 95% confidence interval of 0.03 to 5.27 g (the point value was prudently rounded down to 0.1 g after SEAC expert opinion).

Ideally, the ID50 estimate would have been inferred directly from a dose-response model established between exposure to BSE infectivity and mortality. This would be obtained by fitting a model curve (typically sigmoidal) to the plot of the mortality proportions of cohorts incurred under the different exposure levels (respectively, 1, 10, 100, and 300 g of BSE-infected bovine brain tissue). However, the sparse data collected in the 1992 attack-rate experiment implied that estimation of the effects of small fractions of an ID50 unit was subject to very high uncertainty. In the DNV risk assessments, no minimum threshold dose was assumed, and the infectivity contained in doses between 0 and 0.1 g was taken to be in direct proportion to the fractional quantity of an ID50 unit ingested (in other words, linear).
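The following sketch shows what such a direct dose-response fit might look like. Only the dose levels (1, 10, 100, and 300 g) come from the text; the group sizes and attack rates are invented for illustration, and the delta-method confidence interval used by QUAD is omitted.

```python
# Illustrative logit dose-response fit on log10(dose), from which an oral
# ID50 is read off. The attack-rate data here are hypothetical.
import numpy as np
from scipy.optimize import minimize

dose = np.array([1.0, 10.0, 100.0, 300.0])  # g of BSE-infected brain tissue
n = np.array([10, 10, 10, 10])              # animals per dose group (hypothetical)
k = np.array([6, 8, 9, 9])                  # animals infected (hypothetical)

def neg_log_lik(params):
    a, b = params
    p = 1.0 / (1.0 + np.exp(-(a + b * np.log10(dose))))  # logit model
    p = np.clip(p, 1e-12, 1.0 - 1e-12)                   # guard against log(0)
    return -np.sum(k * np.log(p) + (n - k) * np.log(1.0 - p))

fit = minimize(neg_log_lik, x0=[0.0, 1.0], method="Nelder-Mead")
a, b = fit.x
id50 = 10.0 ** (-a / b)  # dose at which the fitted attack rate is 50%
print(f"estimated bovine oral ID50: {id50:.2f} g")
```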

An alternative approach was adopted by Gale et al. (1998). Again, no minimum threshold was assumed. However, calculation of the level of infectivity present in fractions of a bovine oral BSE ID50 unit was based on an estimate of the number of PrPSc macromolecules that would make up 1 bovine oral BSE ID50 unit. Using a species-adapted scrapie model, a value of 1 × 10^5 PrPSc molecules had been previously estimated to make up an intracerebral bovine ID50 (McKinley et al. 1983). A radical assumption was then made: that the oral ingestion route is 1 × 10^5 times less efficient than intracerebral challenge (Kimberlin 1996) and that the cow-to-human species barrier factor is 1,000. This would imply that 1 human oral BSE ID50 unit would contain 10^5 × 10^5 × 1,000 = 1 × 10^13 PrPSc molecules.

Gale et al. (1998) then went on to consider how the β-Poisson and negative-exponential models compared with the dose-response data obtained for BSE infectivity in inbred mice (Taylor et al. 1995). The mathematical form of the negative-exponential model is P = 1 - exp(-rN), where P is the probability of infection from N pathogens and r is a species-specific parameter. At low doses, this model can be approximated by the linear function P = rN. However, it is unclear why this relationship should necessarily hold for a conglomerate of prions, which, having been described earlier as novel pathogens, are unlikely to be typical water-borne pathogens. Gale et al. (1998) also claimed that the (even simpler) relationship of p = 0.5 × fraction of 1 ID50 unit ingested compared well with the available dose-response data. By this logic, Gale et al. (1998) concluded that the probability of human infection from an aggregate of 1 × 10^5 PrPSc molecules, the largest supported to date by biochemical evidence (Prusiner 1984), would be [(0.5 × 10^5) / 10^13] = 5 × 10^-9, which formed the basis of their calculations.
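The negative-exponential model and its low-dose linear approximation are easy to compare numerically, as below; the parameter r is purely illustrative and is not an estimate from the Taylor et al. (1995) mouse data.

```python
# Negative-exponential dose-response model, P = 1 - exp(-r * N), against
# its low-dose linear approximation P ~ r * N. r is a hypothetical value.
import math

def p_infection(n_pathogens, r):
    return 1.0 - math.exp(-r * n_pathogens)

r = 1e-13  # hypothetical per-molecule infection parameter
for n in (1e5, 1e9, 1e13):
    print(f"N = {n:.0e}: exact = {p_infection(n, r):.3e}, linear = {r * n:.3e}")
```

The two agree closely at low doses and diverge only as rN approaches 1, which is precisely the regime in which the choice of model matters.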

Species barrier factor (s)—Probably the least-certain parameter in any BSE risk assessment is the livestock-to-human species barrier factor. Early advice by SEAC stated that the cattle-to-human species barrier factor could lie anywhere in the range of 1 to 10,000, with 10 as a best estimate (SSC 2000). The risk assessment by Gale et al. (1998) considered that the cattle-to-human species barrier factor could be as high as 1,000. More recent statistical research based on empirical disease data indicates that the barrier is likely to be several orders of magnitude higher than 10 (Donnelly et al. 2002; Ghani et al. 2003). No direct experimental data, of course, are available for verification.

Assuming a cattle-to-human species barrier of 10 and a BSE infectivity of 10 bovine ID50 units/g, the estimated infectivity translates to 1 human ID50 unit/g of BSE-infected bovine CNS tissue (i.e., 10 bovine ID50 units g^-1 divided by the species barrier of 10). This value was taken as a best estimate for both bovine and ovine CNS tissue in nearly all the DNV risk assessments. An exception is the assessment addressing the risk of BSE exposure to humans from large-scale disposal of sheep carcasses (DNV 2001c). In that case, the sheep-to-human species barrier factor was put at 50, albeit with no evidence of BSE in sheep in the United Kingdom, which resulted in an estimate of 0.2 human ID50 units/g (i.e., 10/50) of BSE-infected ovine CNS tissue.

In summary, environmental BSE risk assessments typically determine the infective load as the total mass of infected tissue contained in the source term under consideration divided by the estimated mass of tissue associated with 1 ID50 unit. More complex calculations to address the risk of exposure associated with BSE in sheep use additional assumptions; namely, that the spread and accumulation of BSE infectivity in sheep tissue occurs identically to that associated with scrapie (DNV 2001c), as recorded by Hadlow et al. (1982).


The task of risk assessment is assisted greatly by the construction of an event-tree diagram. In environmental BSE risk assessment, an event tree charts the pathways from the source of infection by which the transfer of BSE infectivity through the environment might occur (DNV 1997a, 1997b, 1997c, 1997d, 1997e, 2001a, 2001b, 2001c; Gale and Stanfield 2001; Gale 2003; Morley et al. 2003). An event tree provides a framework for estimating both individual and societal exposure risks. Proportions of the total infective load estimated to pass through each branch of the event tree are evaluated either as an average value or as an expert guess. In either case, these are effectively exposure probabilities, which, in tandem with a dose-response relationship, permit individual or societal BSE-infection risks to be calculated.
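A minimal sketch of this bookkeeping is given below: the infective load at the source is multiplied down each branch by its exposure probability. The branch structure and all numbers are hypothetical; the real event trees in the DNV assessments contain many more branches.

```python
# Toy event tree: each pathway is a chain of branch fractions (exposure
# probabilities) linking the source term to a human receptor.
source_load = 1_000.0  # hypothetical source term, human oral ID50 units

pathways = {
    "landfill -> leachate -> groundwater -> drinking water": [0.1, 0.01, 0.001],
    "incineration -> stack emission -> inhalation": [1e-6, 0.01],
    "land spreading -> runoff -> surface water -> ingestion": [0.01, 0.05, 0.001],
}

for name, fractions in pathways.items():
    exposure = source_load
    for f in fractions:
        exposure *= f  # attenuate the load at each branch
    print(f"{name}: {exposure:.2e} human ID50 units")
```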

A comprehensive event tree showing the major environmental pathways by which humans might be exposed to BSE is provided in DNV (1997a). This risk assessment is an overview of 4 other environmental risk assessments conducted at that time (DNV 1997b, 1997c, 1997d, 1997e). Although the connection between vCJD in humans and BSE in cattle had not yet been established in 1997, the key connection was made that vCJD could be caused by consumption of a sufficient quantity of BSE infectivity.

The DNV environmental BSE-risk methodology is a continuation of that used to determine human-exposure risks associated with BSE in wastewater effluent discharged from the Thruxted Mill rendering plant (DNV 1997b; Young 1997; Comer et al. 1998). As in other DNV risk assessments, environmental pathways by which transmission of the BSE agent might occur within the specified domain of England and Wales were meticulously identified. Long-term deposition contours for particulates in flue gases were estimated using the Atmospheric Dispersion Modeling System air dispersion model (CERC 1995) under the long-term atmospheric conditions that occur in Manchester (UK), which are considered representative of average weather conditions throughout the United Kingdom. Deposition of ash particles onto the ground was apportioned according to the national average fraction of land associated with different land-type categories (DOE 1996). Protracted calculations were included to estimate the quantity of particulates that would be inhaled by people before reaching the ground, deposited onto crops and then ingested by people, and run off to surface water (1% for agricultural land and woodland, 70% for urban land). Estimates of the level of BSE infectivity in cattle waste products were provided, as well as descriptions of their transport and eventual fate in the environment. In this sense, the event trees constructed by DNV (1997a, 1997b, 1997c, 1997d, 1997e) serve as a template for other environmental BSE risk assessments.

Exposure risks from environmental pathways, as calculated in the DNV risk assessments, ranged over several orders of magnitude. The median risk to an individual ranged from an upper limit of less than 1 × 10^-6 human ID50 units to the most exposed person in any group in 1996 (DNV 1997a) to less than 6 × 10^-10 human ID50 units per year assuming the worst-case exposure scenario to ash generated by a specified risk material (SRM) incinerator (DNV 2001a). The societal risks to the whole population of England and Wales ranged from an upper median risk of 3 human ID50 units associated with exposure via all potential environmental sources in 1996 (DNV 1997a) to as low as 2 × 10^-5 human ID50 units associated with exposure only to emissions from a foot and mouth disease pyre of 100 burning beef cattle carcasses (DNV 2001b).

In view of the possibility that a minimum number of prions may be needed to initiate infection, the validity of such low average theoretical exposure estimates has been seriously questioned (Gale 1998; Gale et al. 1998). Further, it has been suggested that the complex effects of hydrogeological and other physical environmental barriers are likely to be more critical assumptions in environmental BSE-exposure risk assessments than the magnitude of the cattle-to-human species barrier (Gale 1998). More notably, there is an absence of information on the effects of incineration or burning on prions. By default, incineration and burning have been assumed to reduce infectivity content by a factor of 1 × 10^6 (DNV 1997c, 1997d), in line with data describing the effect of incineration on proteins (CAMR 1996).


Monte Carlo simulation has been used in several environmental BSE risk assessments performed to date in the United Kingdom (DNV 1997a, 1997b, 1997c, 1997d, 1997e, 2001a, 2001b, 2001c). Some BSE risk assessments performed elsewhere have used multitiered simulations in which each tier (or module) in the risk assessment is assumed to be a complex dynamic system (Cohen et al. 2001; Habtemariam 2002). These studies have largely attempted to extrapolate the potential risks of BSE exposure in situations in which few epidemiological data were available. They provide risk estimates by considering strategic, what-if scenarios, rather than through statistical inference. However, the modeling approach by which risk is estimated is subject to a wide choice of method (Cummins et al. 2001). Foremost in all cases is the decision on whether to use a probabilistic or deterministic approach.

Probabilistic versus deterministic approaches—Deterministic risk assessments generate point estimates of risk by considering exposure scenarios based either on the worst-case assumptions or mean input-parameter values. Because there is no attempt to incorporate variability or uncertainty, the underlying calculations are straightforward and generate single (or point) estimates of the potential risks associated with exposure via each environmental pathway. In recognition of this shortcoming, a probabilistic risk assessment attempts to include the range of possible uncertainty by allowing one or more parameters to take any value within a range perceived as possible.

Instead of a single calculation based on a set of fixed-parameter estimates, probabilistic risk assessments use computationally intensive techniques such as Monte Carlo simulation. The risk assessment relies on repeated calculations in which model parameter values are drawn randomly from probability distributions. Output distributions are generated for exposure risks, giving a range (or interval) in which each risk estimate would be expected to occur. Such estimates are specified to a given level of confidence typically chosen as the 5 to 95% percentile range. In short, probabilistic risk assessment offers considerable benefits to environmental BSE risk assessment because it makes use of all available information and allows variability, uncertainty, and model sensitivity to be quantified. As a potential weakness, this approach involves greater computational time and complexity, as well as the problem of differentiating between uncertainty and variability.
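The contrast between the two approaches can be made concrete with a short Monte Carlo sketch of the infective load formula given earlier; every distribution below is an assumption chosen for illustration, not one used in the DNV assessments.

```python
# Probabilistic version of L = (N x p x m x i) / s: parameters are drawn
# repeatedly from (illustrative) distributions, and a percentile interval
# is reported instead of a single point estimate.
import numpy as np

rng = np.random.default_rng(1)
trials = 100_000

p = rng.uniform(0.001, 0.0054, trials)                        # prevalence
m = rng.normal(500.0, 100.0, trials).clip(min=0.0)            # tissue mass (g)
i = rng.lognormal(mean=np.log(10.0), sigma=1.0, size=trials)  # ID50 per g
s = 10.0 ** rng.uniform(0.0, 4.0, trials)                     # barrier, 1 to 10,000

load = 10_000 * p * m * i / s                                 # N = 10,000 animals
print(np.percentile(load, [5, 50, 95]))  # 5th-95th percentile interval estimate
```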

Specific approaches—The DNV risk assessments give both point and interval estimates for all exposure risks. In contrast, risk assessments by Gale and Stanfield (2001), Gale (1998), and Gale et al. (1998) provide only simple, deterministic calculations. Justification for this disparity in approach can be sought in philosophical terms. Gale et al. (1998) essentially argue that the combination of very high uncertainty (through environmental pathways) with very low levels of infectivity (through low amounts of prions) imply little can be gained from increased computational complexity in the risk assessment. According to Gale et al. (1998), uncertainties in livestock-to-human species barriers are unlikely to be overriding in such circumstances. It is claimed that the central question to be posed in environmental BSE risk assessment should not be what the risk of exposure to infectivity is, but rather, whether it is probable that a person could be infected by a given exposure route.

Congruent with the above, very little support is provided for the choice of distributions adopted in the Monte Carlo approaches used in the DNV risk assessments. An exception is found in the treatment of the species-barrier factor. The cattle-to-human species-barrier factor appears to lie between 1 and 10,000, with 10 believed to represent a “best estimate” (SSC 2000). However, this uncertainty is represented by a 5-point distribution in which the species-barrier factor has a probability of 0.2475 of being 10, 100, 1,000, or 10,000 and a probability of 0.01 of being 1. This implies there is zero probability of the species-barrier factor taking any intermediate value. An alternative would be simply to assign a uniform distribution with a range between 1 and 10,000. This could be modified further into a 2-step uniform distribution consisting of weighted-probability intervals, most obviously with weights of 10/10,000 and 9,990/10,000 for the respective intervals of 1 to 10 and 10 to 10,000, to reflect a lesser perceived likelihood of lying below 10 and a higher overall probability of lying between 10 and 10,000. Another alternative is to assign a triangular distribution ranging from 1 to 10,000 with a peak at 10 (as used in Cummins et al. 2002). A justification was also not provided in the DNV risk assessments for the selection of a log-normal distribution to characterize the uncertainty of the infectivity of CNS tissue. The distribution assumed for the data influences the output percentile range derived for each exposure-risk interval estimate. Most of the other selected distributions were normal, usually truncated (but without explanation) at specific lower and upper limits. In contrast, risk assessments by Gale and Stanfield (2001), Gale (1998), and Gale et al. (1998) do not incorporate any probabilistic risk approaches.
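The practical consequences of these choices are easy to demonstrate. The sketch below draws the species-barrier factor from the 5-point distribution described above and from the two suggested alternatives, then compares the resulting medians and 5th to 95th percentile ranges.

```python
# Comparison of three representations of species-barrier uncertainty:
# the DNV 5-point distribution, a uniform range, and a triangular
# distribution peaked at the SEAC best estimate of 10.
import numpy as np

rng = np.random.default_rng(0)
trials = 100_000

five_point = rng.choice(
    [1, 10, 100, 1_000, 10_000], size=trials,
    p=[0.01, 0.2475, 0.2475, 0.2475, 0.2475])
uniform = rng.uniform(1, 10_000, trials)
triangular = rng.triangular(1, 10, 10_000, trials)

for name, s in [("5-point", five_point), ("uniform", uniform),
                ("triangular", triangular)]:
    print(name, np.median(s), np.percentile(s, [5, 95]))
```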


It has been cautioned in BSE risk assessments that merely concluding that a risk is possible is not justifiable (Heim and Kihm 2003). It has been argued further that the language used in TSE research may have advanced more than the scientific understanding (Manuelidis 2003). Therefore, to serve as a decision tool, environmental BSE risk assessment must at least deliver estimated risks in terms of an ascertainable likelihood (Comer and Huntly 2003).

Recently, some of the difficulties associated with environmental BSE risk assessment have been highlighted. Gravenor and Kao (2003) emphasized that overspecification of exposure pathways may result in underestimation of BSE-exposure risk. As a case in point, Gravenor and Kao (2003) cite the BSE risk assessment by Morley et al. (2003), performed in accordance with OIE (2003) guidelines, to estimate the risk of importing BSE to Canadian cattle herds over the entire period 1979 to 1997. An event-tree model was constructed, and input distributions thought to be appropriate were assigned to the input parameters. Monte Carlo simulation in conjunction with Latin hypercube sampling was used to derive output distributions and probabilistic estimates of the potential risks of BSE exposure to Canadian cattle. Whereas the probability of importing BSE-infected cattle into Canada was estimated to be high, the likelihood of at least 1 infection occurring in indigenous cattle was calculated as 0.007, and less than 0.02 within a 95% confidence interval. This estimate was considered negligible because of the risk management measures in place. With the identification of BSE in the Canadian cattle population, it would, however, be incorrect to infer that the risk assessment was refuted, because negligible risk does not necessarily translate to a probability of zero. The acceptability of risk is an inherent part of risk management and has been addressed elsewhere; for example, in the nuclear power industry in the United Kingdom, acceptable risk translates to a risk of catastrophic failure of less than 1 in 1 million per year, which is regarded as an acceptable level by the United Kingdom Health and Safety Executive for risk to the public from major hazard sites (HSE 1992).
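For readers unfamiliar with the technique, the sketch below illustrates Latin hypercube sampling in the style used by Morley et al. (2003), here via scipy's quasi-Monte Carlo module; the two parameters and their ranges are hypothetical.

```python
# Latin hypercube sampling: each parameter's range is stratified so that
# a modest number of samples covers the input space evenly.
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=2, seed=0)
unit = sampler.random(n=1_000)  # stratified samples in the unit square

# map to hypothetical ranges: cattle imported per year, P(animal infected)
lower, upper = [0.0, 1e-5], [50.0, 1e-2]
samples = qmc.scale(unit, lower, upper)

expected_infected = samples[:, 0] * samples[:, 1]  # imports x P(infected)
print(np.mean(expected_infected), np.percentile(expected_infected, 95))
```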

The current lack of understanding of the fundamental mechanisms by which BSE may be transmitted must eventually be brought into the equation. Species population ratios and livestock densities, which vary both nationally and internationally, are likely to be contributory factors in disease transmission and persistence. This is acutely illustrated by the fact that BSE has continued to appear, albeit in declining numbers, in cattle in the United Kingdom reared after March 1996. It has been suggested that this may be connected with the import of contaminated feed from abroad after the MBM ban (Wilesmith 2002), but a quantification of those risks has yet to be made. Similarly, the influence of factors such as natural occurrence, maternal transmission, or even the effect of active surveillance in inflating prevalence estimates relative to previous years, remains unclear. Emphatically, it has been pointed out that a risk assessment performed on a global scale is required to evaluate the true BSE distribution worldwide (Heim and Kreysa 2002).

Environmental BSE risk assessments have played a role in identifying major risk-reduction measures implemented in the United Kingdom to minimize contamination of the environment and to reduce risk. Since the MBM ban in July 1988, there has been a stepwise tightening of controls in the United Kingdom on protein feed to ruminants. The specified bovine offals (SBO) ban in 1989 was followed by an SRM ban in 1990. The SRM ban was extended to include the intestines and thymus of young calves in 1994, followed by the ban on all mammalian-derived MBM in March 1996. In recognition of the marked effectiveness of MBM controls as a risk-reduction measure in the United Kingdom, most countries in the European Union have introduced a total ban of all types of MBM as feed for farm animals (EC 2001).

A notable risk area deserving future attention is the variability in size of particulates containing infective tissue, which to date has not been fully incorporated into any BSE risk assessment. In environmental risk scenarios in which a large number of exposure pathways potentially exist, the average amount of BSE infectivity passing through each plausible pathway would be expected to be low. However, if there is high variability in the size of the particulates, then calculations could underestimate the true exposure risk because the effect of exposure to large particulates would not be fully taken into account (a toy calculation after this paragraph illustrates the point). This would be especially pertinent if the existence of a minimum quantal (as opposed to cumulative) threshold prion dose is subsequently established. It may also be important in light of recent practical innovations in the abattoir process, such as the development of the loop saw (Knott 2002). In particular, Helps et al. (2002) demonstrated a statistically significant reduction in the risk of contamination of meat from both sheep and cattle carcasses when the spinal cord is removed from the spine with a loop saw before the carcass is split.
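In the toy calculation below, two hypothetical particle populations carry the same mean infective load per particle, but only the high-variance population produces particles exceeding an assumed quantal threshold; a mean-based calculation would treat the two identically.

```python
# Two lognormal particle populations with equal mean infective load but
# different variance; only one produces particles above the (assumed)
# quantal threshold dose.
import numpy as np

rng = np.random.default_rng(2)
threshold = 1.0  # hypothetical quantal dose, in ID50 units per particle

low_var = rng.lognormal(mean=np.log(0.01) - 0.005, sigma=0.1, size=1_000_000)
high_var = rng.lognormal(mean=np.log(0.01) - 2.0, sigma=2.0, size=1_000_000)

for name, particles in [("low variance", low_var), ("high variance", high_var)]:
    # same mean load, very different fraction above the threshold
    print(name, particles.mean(), (particles >= threshold).mean())
```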

In all cases, the question of how time dependence can be brought into the calculation of risk must also be confronted (Crane et al. 2002). Although BSE infectivity is consistently assumed to accumulate in proportion to the amount ingested (linearly), it is also assumed that prions do not replicate significantly over the same time period. For a disease that is demonstrably time dependent, the absence of a temporal component in BSE risk assessments is a striking oversimplification.
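A minimal sketch of what a temporal component could look like is given below, comparing linear accumulation of ingested infectivity with accumulation under a hypothetical replication rate; both parameter values are invented for illustration.

```python
# Linear accumulation of ingested infectivity versus accumulation with
# replication at rate k, i.e., the solution of dL/dt = c + k*L, L(0) = 0.
import math

c = 0.001  # ID50 units ingested per day (hypothetical)
k = 0.01   # replication rate per day (hypothetical)

def linear(t):
    return c * t

def with_replication(t):
    return (c / k) * (math.exp(k * t) - 1.0)

for t in (100, 1_000, 2_000):
    print(t, linear(t), with_replication(t))
```

Even a modest replication rate dominates the linear term over timescales comparable to the incubation period, which is why the omission matters.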

Environmental BSE risk assessment is dependent on predictions of the distribution, fate, and behavior of the BSE pathogen in the environment. Whereas risk assessment can demonstrate mitigating factors in the environment that reduce exposure risk, any analysis must scientifically address the uncertainties in (1) the prevalence of BSE-infected individuals in a specified population, (2) the magnitude of livestock-to-human “species barrier” factors, (3) the nature of prion transportation and destruction through environmental pathways, (4) whether there is a minimum (threshold) dose of prion required to initiate infection, and (5) whether prions ingested by an individual accumulate over the course of time.


Acknowledgement—I express thanks to Vincent Jansen, Robert Payne, and Kenneth Leung for several enlightening prionic discussions.