Culling wildlife hosts to control disease: mountain hares, red grouse and louping ill virus
1. Culling wildlife hosts is often implemented as a management technique to control pathogen transmission from wildlife to domestic or other economically important animals. However, culling may have unexpected consequences, can be expensive and may have wider implications for biodiversity and ecosystem functioning.
2. We assess the evidence that culling mountain hares Lepus timidus is an effective and practical way to control louping ill virus in red grouse Lagopus lagopus scoticus.
3. Evidence from the available literature is limited, restricting our ability to reliably assess whether culling mountain hares controls ticks or louping ill virus, or increases red grouse densities. Furthermore, the information required to assess the cost-benefit of this management strategy is lacking. The population response of mountain hares to culling is not well understood and the possible effects on their conservation status and the upland ecosystem remain unexplored.
4. We conclude that there is no compelling evidence base to suggest culling mountain hares might increase red grouse densities.
5. Synthesis and applications. Widespread culling of wildlife is not necessarily effective in reducing disease or improving economic returns. The use of wildlife culls for disease control should be proposed only when: (i) the pathogen transmission cycle is fully understood with all host-vector interactions considered; (ii) the response of wildlife populations to culling is known; and (iii) cost-benefit analysis shows that increased revenue from reduced disease prevalence exceeds the cost of culling.
The interaction between wild animals and domestic or other managed species may cause conflict through the effects of predation, competition, or the transmission of epizootic pathogens (Woodroffe, Thirgood & Rabinowitz 2005). The transmission of pathogens from wildlife to domestic animals can create conflict when the resulting disease reduces the economic viability of animal husbandry (Daszak, Cunningham & Hyatt 2000). Although sometimes controversial, a common strategy for disease control has been to cull the wildlife host (Carter et al. 2009). However, culling wildlife to manage pathogen transmission from wildlife to domestic animals may be fraught with difficulties, be ineffective and can generate unanticipated results.
For example, in North America, bison Bison bison and elk Cervus canadensis are hosts to the bacterium Brucella abortus, which causes brucellosis in cattle. Extensive culling of bison within Yellowstone National Park between 1996 and 1997 led to the neighbouring American states gaining brucellosis-free status (USDA-APHIS 2009). In the past decade, however, the Yellowstone bison population has more than doubled and now exceeds the original population target size. Future control of the bison population will therefore exceed the original cost predictions, reducing the cost-benefit of the initial cull (Kilpatrick, Gillin & Daszak 2009). Despite the extensive culling of bison, brucellosis-free status was lost in two states between 2004 and 2006 through transmission from elk (Cross et al. 2007). Management aimed at reducing contact between elk and cattle only increased elk aggregation, the prevalence of brucellosis amongst elk, and the risk of transmission to cattle (Roffe et al. 2004).
In the UK, bovine tuberculosis (bTB) in cattle herds has increased over the past 30 years. Its persistence, and the failure to control it, have been linked to a pathogen reservoir in sympatric badger Meles meles populations (Donnelly et al. 2006). Whilst badger culling has been successful in controlling bTB in Ireland (Eves 1999), culling has not reduced the incidence of bTB in England. Large-scale experimental studies have demonstrated that social perturbation associated with culling badgers may lead to increased immigration of badgers into culled areas, increasing the spread of bTB (Woodroffe et al. 2006).
In both these cases the response of the wildlife species to culling was unanticipated and reduced the effectiveness of the cull. The increased bison population growth rate led to a reduction in cost benefit, and the increased immigration by badgers resulted in spread rather than control of disease. Furthermore, the failure to consider all potential disease hosts reduced the effectiveness of the cull in the bison-elk-brucellosis-cattle system. In light of these problems, we discuss a third system where culling of a wild mammal is implemented to reduce the disease prevalence in economically important game bird populations.
Mountain hares Lepus timidus and sheep have been implicated in the transmission of louping ill virus (LIV) to red grouse Lagopus lagopus scoticus. LIV is a flavivirus transmitted by sheep ticks Ixodes ricinus and can cause high mortality in infected red grouse chicks (Reid 1975), reducing grouse density and the associated revenues from shooting (Laurenson et al. 2003). Therefore, controlling LIV is considered to be important for the continued viability of estates managed for grouse shooting (Hudson 1992). Although mountain hares do not show clinical symptoms of LIV, they are hosts for ticks (Laurenson et al. 2003) and laboratory trials have shown that non-viraemic tick-to-tick transmission of LIV can occur when ticks co-feed on mountain hares (Jones et al. 1997). Promulgation of research findings to moorland managers, and the proposal that culling wildlife hosts, including mountain hares, could be an effective management strategy to control ticks and LIV (e.g. Smith 2009), have led to increased culling of mountain hares in some areas of Scotland. Mountain hare populations have traditionally been harvested by land owners and paying clients with the aim of sustainable hunting for sport. However, there is evidence that an increasing proportion and number of hares are now being killed as part of tick control programmes (Patton et al. 2010).
Mountain hares are distributed widely across northern Europe and occur at particularly high density on heather Calluna vulgaris-dominated moors in Scotland. The mountain hare has recently been added to the UK Biodiversity Action Plan list of priority species (UK Biodiversity Action Plan 2008) and is on Annex V of the EC Habitats Directive (1992), which requires Member States to ensure that their exploitation ‘is compatible with their being maintained at a favourable conservation status.’ Combined with threats from habitat loss and climate change, culling has sparked concern amongst conservationists and government agencies about the status of UK mountain hare populations, and the effect of culling on upland ecosystem functioning (Patton et al. 2010).
Here we assess the published evidence for the effectiveness of culling hares as a management strategy to control ticks and LIV in order to increase grouse density, and consider whether it is either an effective control measure or sustainable given our understanding of this system.
What are the effects of culling on mountain hare population dynamics?
Whilst mountain hares are an important quarry species in Scotland, little is known about the impact of harvesting or culling on their demography and population dynamics. As mountain hare habitat in Scotland is fragmented (Robertson, Park & Barton 2001), dispersal, and any factors affecting it, are likely to be important for metapopulation viability (Hanski & Gilpin 1997). Studies of the movements and natal dispersal of mountain hares in the boreal forest of Sweden demonstrated high adult site fidelity and limited natal dispersal (Dahl & Willebrand 2005), corroborating earlier reports that mountain hares in Scotland show limited dispersal (Hewson 1976). Intensive, localized culling could potentially further fragment populations and, if the resulting distance between subpopulations exceeds dispersal distance, or if dispersing individuals are subject to greater mortality risk, dispersal rates and metapopulation persistence may be reduced (Hanski & Gilpin 1997).
A recent comparison of two questionnaire surveys conducted in 1995/1996 and 2006/2007 found no evidence at the 100 km2 scale of major changes in the distribution of mountain hares in Scotland during this time (Patton et al. 2010). Combining questionnaire data on hare management with long-term hunting statistics, Patton et al. (2010) found no evidence of a decline in the numbers of hares recorded in game bags between 1995/1996 and 2006/2007.
Direct evidence of the effect of hare culling on hare density is limited to one study in the Central Highlands where hares in an area of 130 km2 were reduced from a density in excess of 20 km−2 in the late 1980s to very low density (<1 km−2) by 1998 (Laurenson et al. 2003). Whilst this study provides evidence that localized culling can significantly reduce hare density at this spatial scale, neither the efficacy nor effect of culling on densities nationwide can be assessed until long-term abundance data are available across Scotland (Patton et al. 2010).
Mountain hare populations in Scotland have been shown to exhibit cyclic, or unstable, dynamics (Newey et al. 2007). Associated changes in demographic parameters seen at different phases of the population cycle may affect any underlying density-dependent mechanisms. Knowledge of compensatory mechanisms, and of the intensity at which culling becomes additive to natural mortality, is necessary for understanding the response of mountain hare populations to culling, but is currently lacking.
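The distinction between compensatory and additive culling mortality can be sketched with a deliberately minimal model. The sketch below uses a discrete-time logistic population with a proportional annual cull; all parameter values are hypothetical illustrations, not estimates from hare data.

```python
# Minimal sketch of compensatory vs. additive culling mortality using the
# discrete-time logistic model N[t+1] = N[t] + r*N[t]*(1 - N[t]/K) - c*N[t].
# All parameter values are hypothetical and for illustration only.

def equilibrium_density(r, k, c):
    """Equilibrium density under a proportional annual cull rate c.

    For c < r, density-dependent growth compensates for the cull and the
    population settles at K*(1 - c/r); for c >= r, culling is effectively
    additive and the population is driven towards zero.
    """
    return max(0.0, k * (1.0 - c / r))

def project(n0, r, k, c, years):
    """Project density forward under the same model."""
    n = n0
    for _ in range(years):
        n = n + r * n * (1.0 - n / k) - c * n
        n = max(0.0, n)
    return n

# Hypothetical example: r = 0.5, K = 20 hares per km^2.
# A 25% annual cull is compensated (equilibrium 10 per km^2); a 70% cull is not.
print(equilibrium_density(0.5, 20.0, 0.25))              # 10.0
print(round(project(20.0, 0.5, 20.0, 0.25, 100), 2))     # 10.0
print(round(project(20.0, 0.5, 20.0, 0.70, 200), 2))     # 0.0
```

The point of the sketch is only that the sustainable cull intensity depends on density-dependent parameters that, as noted above, have not been quantified for Scottish mountain hares.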
Does culling mountain hares reduce tick abundance and LIV prevalence?
To test the hypothesis that mountain hares can act as a LIV reservoir in the wild, Laurenson et al. (2003) conducted a study in which mountain hare density was reduced over a number of years on one sporting estate in the Central Highlands where seroprevalence of LIV antibodies in shot young red grouse was high (80%), red deer Cervus elaphus were absent and there were very few roe deer Capreolus capreolus. The subsequent changes in tick abundance, LIV prevalence, brood survival and post-breeding densities were compared to grouse moors where mountain hare density was not manipulated. This experiment demonstrated that reducing mountain hare density resulted in fewer ticks, lower LIV seroprevalence in shot young grouse and increased grouse chick survival (Laurenson et al. 2003). However, this did not result in increased post-breeding grouse densities when compared to control areas (Cope, Iason & Gordon 2004; Laurenson et al. 2004). Interpretation and generalization of these findings, however, requires caution because: (i) the study area was unusual in having no red deer. Red deer do not display symptoms of LIV or contribute to its transmission (Jones et al. 1997); they are, however, important hosts for ticks (Gilbert et al. 2000). Modelling studies predict that reducing mountain hare density in areas with red deer will not reduce ticks or LIV because ticks are maintained by the deer population and LIV is maintained by the grouse population (Gilbert et al. 2001). Most Scottish estates have red deer, which means that the results of the Laurenson et al. (2003) study are unlikely to be applicable to most of upland Scotland; and (ii) concurrent with the reduction in hare density, sheep on the study area were intensively managed to reduce ticks and LIV through treatment with acaricide three times a year, and through vaccination of yearling sheep against LIV. This additional form of tick and LIV control was not implemented on all control sites, confounding the effects of mountain hare culling.
The spatial distribution of each species in this system may also be important when considering the effectiveness of mountain hare culling for tick and LIV control. Both mountain hare and red grouse densities tend to increase with altitude between 400 and 700 m, whilst ticks and red deer show the opposite trend (Gilbert 2010). Ticks are most likely to have impacts on red grouse at lower altitudes where tick density is highest and therefore tick control may be most beneficial in these areas. However, there are fewer mountain hares and more red deer at these lower altitudes (Gilbert 2010) and so culling hares from these areas is unlikely to have a significant impact on red grouse tick burdens as deer will maintain the tick population in the absence of hares.
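The reasoning above, that deer will maintain the tick population in the absence of hares, can be illustrated with a toy host-capacity calculation. The weights and densities below are invented placeholders, not estimates from Gilbert et al.; the sketch simply treats relative tick carrying capacity as a weighted sum of host densities.

```python
# Toy sketch: tick carrying capacity taken as proportional to a weighted sum
# of host densities. Weights and densities are invented placeholders; deer
# are weighted more heavily because an individual deer feeds far more adult
# ticks than an individual hare.

def tick_host_capacity(deer, hares, w_deer=10.0, w_hare=1.0):
    """Relative tick feeding capacity supplied by the host community."""
    return w_deer * deer + w_hare * hares

def hare_cull_effect(deer, hares):
    """Proportional reduction in tick capacity if all hares are removed."""
    before = tick_host_capacity(deer, hares)
    return 1.0 - tick_host_capacity(deer, 0.0) / before

# With deer present, removing every hare cuts tick capacity only modestly;
# with deer absent, the same cull removes it entirely.
print(round(hare_cull_effect(deer=5.0, hares=20.0), 2))  # 0.29
print(hare_cull_effect(deer=0.0, hares=20.0))            # 1.0
```

Under these assumptions, even complete removal of hares leaves most of the tick feeding capacity intact wherever deer are common, which is the qualitative prediction of the modelling studies cited above.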
Do ticks and LIV affect red grouse demography?
Tick abundance and distribution have increased in the UK (Kirby et al. 2004; Scharlemann et al. 2008), possibly due to increasing host densities (Clutton-Brock, Coulson & Milner 2004), or a warmer and wetter climate (Barbett et al. 2006) providing a longer season and more favourable conditions for tick development (Lindgren & Polfeldt 2000).
Grouse chicks in areas of high tick abundance tend to have high tick burdens and increased probability of contracting LIV (Reid 1975). Ticks may also directly affect chicks through anaemia, reduced feeding due to ticks aggregating around the eyes, and secondary infections.
Empirical evidence of tick-induced morbidity is, however, equivocal. Experimental reduction of tick burdens on breeding female grouse through treatment with acaricide, at one site with very high LIV prevalence (up to 75%) in 1995 and 1996, produced contradictory results: in 1995, acaricide treatment neither reduced LIV infection rates nor increased chick survival at 10 weeks; in 1996, however, treatment significantly reduced LIV prevalence in chicks at 35 days of age and increased chick survival (Laurenson et al. 1997). A similar study carried out at sites with low LIV seroprevalence (up to 1·4%) found no significant difference in LIV seroprevalence or brood survival between chicks of treated and control females (Moseley et al. 2007; Mougeot et al. 2008).
Louping ill virus prevalence is spatially patchy (Laurenson et al. 2007) and highly variable, with reported seroprevalence in red grouse ranging from low (1·4%, Moseley et al. 2007; 0% and 7·1–26·1%, Gilbert et al. 2001) to high (75%, Laurenson et al. 1997; 46% and 81·8%, Gilbert et al. 2001). This high variation and spatial patchiness complicate our understanding of the effects of LIV on red grouse demography, and limit the ability to assess and reliably recommend control measures for LIV. Furthermore, although LIV can cause 78% mortality in infected red grouse in the laboratory (Reid 1975), the susceptibility of chicks to LIV in the wild is likely to vary depending on maternal condition, chick health, genetics, weather and stress.
In summary, the evidence that culling mountain hares can reduce tick burdens and LIV seroprevalence of red grouse is provided by Laurenson et al. (2003). This site, however, was unusual, with very high LIV seroprevalence levels in grouse and an absence of red deer. Moreover, the simultaneous management of sheep to act as ‘tick mops’ to reduce tick numbers (Laurenson et al. 2003) is confounding, making it impossible to disentangle the effects of hare culling and sheep management on changes in tick burdens, LIV prevalence and grouse density. In addition, Gilbert et al. (2001) provide evidence that mountain hare culling would not be effective if alternative tick hosts (such as deer) are present. Laurenson et al. (2003) report that grouse densities increased following hare culling, although the increase was not significant when compared to control sites. Grouse populations in Scotland show cyclic dynamics (Haydon et al. 2002), making it difficult to interpret short-term changes in grouse density. Findings on the effect of ticks on LIV prevalence in grouse chicks and chick survival appear equivocal and dependent on the prevalence of LIV at the study site (Laurenson et al. 1997; Mougeot et al. 2008).
Although widely used, culling of wildlife hosts for disease control can be ineffective and may generate unanticipated results. Failure to take account of the possible effects of perturbation on social behaviour, density-dependent fecundity or survival, and the role of alternative hosts may at best render the approach unsuccessful and at worst may exacerbate the situation.
Heather moorland, red grouse and mountain hares are of significant cultural and conservation concern in the UK. Their future management demands a sound evidence base, and we suggest future work is needed to gain comprehensive knowledge of the effects of high tick burdens on red grouse, individually and at the population level. The role of mountain hares as hosts for ticks and in the persistence and transmission of LIV needs greater investigation under a wider variety of circumstances. Research into alternative methods for tick and LIV control, such as treating alternative tick hosts (deer and sheep) with acaricide to act as ‘tick mops’, should be pursued and their efficacy and efficiency investigated. Long-term studies are required to assess the effects of mountain hare culling on local and nationwide abundance and to understand the implications of the unstable population dynamics shown by many mountain hare populations. Furthermore, information on density-dependent and compensatory mechanisms, in addition to dispersal patterns, is critical in making informed predictions of the demographic response of hare populations to culling. Given the economic importance of grouse moor management in Scotland, there are surprisingly few studies investigating the socio-economics of grouse moors (Redpath & Thirgood 2009; Sotherton, Tapper & Smith 2009; Thompson et al. 2009), prohibiting any meaningful assessment of the cost-benefit of mountain hare culling for tick and LIV control. The ways in which economic factors govern and interact with wildlife management and biodiversity are key areas for future research. We conclude that the evidence currently available is insufficient to provide scientific justification for culling of this Annex V species for the purposes of tick and LIV control.
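The cost-benefit criterion invoked above, that increased revenue from reduced disease prevalence must exceed the cost of culling, can be formalized as a simple discounted net-benefit calculation. The sketch below only states the arithmetic of that criterion; all example figures are hypothetical, because the socio-economic data needed to parameterize such a calculation for Scottish grouse moors are, as discussed, not available.

```python
# Hypothetical cost-benefit sketch for a sustained culling programme.
# All inputs are placeholders; no real estate-level data are used.

def net_present_benefit(annual_revenue_gain, annual_cull_cost,
                        years, discount_rate=0.05):
    """Discounted net benefit of culling over a given time horizon.

    annual_revenue_gain is the extra shooting revenue attributable to
    reduced disease prevalence, which is precisely the quantity the
    reviewed evidence cannot yet supply.
    """
    return sum(
        (annual_revenue_gain - annual_cull_cost) / (1.0 + discount_rate) ** t
        for t in range(1, years + 1)
    )

# Illustrative only: a cull costing more per year than it returns fails the
# criterion (negative net present benefit); one returning more passes it.
print(net_present_benefit(4000.0, 6000.0, years=10) < 0)  # True
print(net_present_benefit(8000.0, 6000.0, years=10) > 0)  # True
```

The calculation is trivial; the obstacle identified in this review is that neither the revenue gain nor, in many cases, the true cost of sustained culling has been estimated.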
AH was supported by a NERC CASE studentship and SN, LG and ST by the Scottish Government Rural Environment Research and Advisory Directorate. We thank Pete Goddard, Glenn Iason, Justin Irvine, Steve Redpath, Adam Smith, Des Thompson, and two anonymous referees for comments on the manuscript.