Serological responses to Cryptosporidium in human populations living in areas reporting high and low incidences of symptomatic cryptosporidiosis

Authors


Corresponding author and reprint requests: P. R. Hunter, School of Medicine, Health Policy and Practice, University of East Anglia, Norwich NR4 7TJ, UK
E-mail: paul.hunter@uea.ac.uk

Abstract

One approach to investigating differences in the reported incidence of disease is to measure the extent of exposure to the organism in question by testing for a specific antibody response. IgG responses to Cryptosporidium sporozoite antigens of low molecular size in adults have been shown to be consistent and of sufficient intensity to act as reliable markers of exposure. This study used a western blot procedure to investigate the relative intensity of IgG antibody responses to the 15/17-kDa Cryptosporidium sporozoite antigen complex and the 27-kDa antigen in sera from two cities in north-west England: Liverpool (low numbers of clinical cases reported) and Preston (high numbers reported). The intensity of antibody response to the 15/17-kDa antigen complex was significantly greater in the Liverpool sera, but there was no significant difference in intensity of response to the 27-kDa antigen. The relationship between diagnosed and reported cryptosporidiosis infections and infections identified by serological testing is complex, but could indicate a protective effect resulting from either exposure to non-pathogenic strains or from repeated low-level exposure to pathogenic strains.

Introduction

The protozoan parasite Cryptosporidium is distributed widely in the environment, and is a common cause of gastrointestinal disease in humans [1]. Cryptosporidium infections can cause serious morbidity and mortality in immunocompromised patients, e.g., those living with AIDS [2], and ill-health can persist for weeks after the acute illness, even in immunocompetent patients [3].

Most reports concerning the epidemiology of cryptosporidiosis have dealt with outbreaks of disease, particularly those associated with piped drinking-water supplies [1,4]. However, recognised outbreaks represent only a small proportion of cases, and the majority of reported cases are sporadic, in that they are not linked to other known cases. Multiple sources and routes of transmission contribute to a complex epidemiological picture, and little is known about the level of endemic infection in the population. Therefore, it cannot be assumed that the causes of sporadic cryptosporidiosis are broadly the same as those of outbreaks, or that they occur in roughly the same proportions.

To date, there have been very few studies of sporadic cryptosporidiosis. However, it would appear that the incidence of cryptosporidiosis varies quite markedly from one region to another, and from one health authority to another within the same region, suggesting that the epidemiology of the disease also varies from one district to another (within the UK, each health region is made up of several health authority areas). For example, the mean annual incidence of cryptosporidiosis in the north-west of England between 1993 and 2002 was 12.0 cases/100 000 population, compared with 8.4 cases/100 000 population in England and Wales as a whole (http://www.hpa.org.uk/infections/topics_az/crypto/data_uk_geog2.htm). Within the north-west region, the incidence varies substantially from one health authority area to another, with the incidence in north-west Lancashire during the 1990s being 17.4-fold greater than the incidence in Liverpool [5].

Production of IgG antibodies to low molecular size Cryptosporidium sporozoite antigens in adults has been shown to be consistent and of sufficient intensity to act as a reliable marker of exposure, and correlates better than the results of ELISA with known risk-factors for Cryptosporidium infection [6]. The present study investigated responses to the 15/17-kDa and 27-kDa antigens; antibody responses to the former appear to decline to baseline over a 4–6-month period post-infection, making them a marker of recent infection, while responses to the latter persist for 6–12 months post-infection, providing a marker of less recent infection [7]. Antibodies to these antigens appear to be a reliable marker of exposure to Cryptosporidium, at least in adults [8,9].

To ascertain the prevalence of exposure to Cryptosporidium in two communities in the north-west of England with marked differences in the reported incidence of disease, a seroprevalence study was undertaken using a western blot technique. The high-incidence city was Preston, part of North-West Lancashire Health Authority, and the low-incidence city was Liverpool. Further sera were collected, where possible, from individual patients to investigate the stability of antibody levels over time.

Materials and methods

Sera

Anonymised serum samples, taken from adult patients (aged ≥15 years), were collected from the two Public Health Laboratories in Preston and Liverpool. The sera were residual sera from samples submitted by local general practitioners and local hospital trusts for a variety of clinical reasons unrelated to Cryptosporidium infection, and were tested with the approval of the relevant ethics committees. In order to investigate the stability of test results over time, a second (paired) serum was recovered from each patient, where possible, after a gap of ≥4 months.

Serological tests

The sera were analysed for Cryptosporidium sporozoite antibodies at the Cryptosporidium Reference Unit, Swansea, using a western blot method described previously [10]. In brief, a Cryptosporidium parvum sporozoite antigen preparation was made [11,12] from calf faeces infected with C. parvum IOWA (also known as the Harley Moon isolate [13]), and was separated into component antigen proteins using SDS-PAGE mini-gel electrophoresis. The proteins were then transferred by semi-dry transfer on to nitrocellulose sheets, which were then placed in a multiscreen apparatus that allows isolation of vertical strips of the blot for contact with the test sera.

Test and control sera were prepared as 1:50 v/v dilutions in phosphate-buffered saline containing Tween-20 0.3% v/v. Bound human antibodies in the sera were detected by incubation with a secondary biotinylated mouse anti-human IgG antibody. The bound secondary antibody was then detected using streptavidin–alkaline phosphatase, which was visualised in a colourimetric reaction using 5-bromo-4-chloro-3-indolyl phosphate as the substrate and nitroblue tetrazolium as the chromogen [10]. Intensitometric data were obtained, using a digital camera and KDS1D analysis software (Kodak Digital Science, Hemel Hempstead, UK), for the serological responses to three sporozoite antigens: the 15-kDa and 17-kDa antigens, which mini-gels do not resolve separately and which are hereafter referred to as the 15/17-kDa antigen complex [10], and the 27-kDa antigen. The relative intensity of each antibody response was calculated as a percentage of that obtained for the positive control on each blot. Data analysis was performed using SPSS software (SPSS Inc., Chicago, IL, USA).
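
As an illustration of the normalisation step, the following minimal sketch (in Python, with invented band intensities; the study itself used KDS1D and SPSS) expresses a band intensity as a percentage of the positive control on the same blot.

def relative_intensity(sample_band: float, control_band: float) -> float:
    """Band intensity as a percentage of the positive control on the same blot."""
    if control_band <= 0:
        raise ValueError("control band intensity must be positive")
    return 100.0 * sample_band / control_band

# Example: a test band one-twentieth as intense as its blot's positive control
print(relative_intensity(sample_band=4.2, control_band=84.0))  # 5.0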

Positive control serum from an acute laboratory-acquired and confirmed case of cryptosporidiosis caused by C. parvum (IOWA) [14] was included on each blot. Inter-blot consistency was achieved by reporting all intensities relative to the control serum. Intra-blot consistency was assessed by probing four entire blots with the positive control serum. The relative intensity of each lane compared to the mean was determined, and the overall variance in relative intensity was calculated across all four blots. The variance ratios of the positive control sera compared to the test sera were used to determine the potential contribution of within-batch variation to variance in the calculated sample intensities. In order to demonstrate a dose–response between antibody quantity and the intensity of the response, a doubling-dilution series of the positive control was tested twice.
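
The two quality checks described above can be sketched as follows (Python with SciPy rather than the SPSS package used in the study; all intensity values are hypothetical): a variance-ratio (F) comparison of test-serum intensities against replicate positive-control lanes, and a check that a doubling-dilution series of the positive control falls approximately linearly on log–log axes.

import numpy as np
from scipy import stats

# (i) Variance ratio: replicate positive-control lanes vs. test sera.
# All values below are invented for illustration.
control_reps = np.array([99.4, 100.2, 100.5, 99.9])
test_sera = np.array([1.6, 3.7, 5.0, 12.9, 16.0, 48.0, 80.0])
f_ratio = np.var(test_sera, ddof=1) / np.var(control_reps, ddof=1)
p_value = stats.f.sf(f_ratio, len(test_sera) - 1, len(control_reps) - 1)
print(f"F = {f_ratio:.1f}, p = {p_value:.4g}")

# (ii) Doubling dilutions of the positive control: intensity should fall
# approximately linearly on log-log axes (cf. Fig. 1).
dilution = np.array([50, 100, 200, 400, 800, 1600])         # 1:50, 1:100, ...
intensity = np.array([100.0, 62.0, 36.0, 20.0, 11.0, 6.0])  # illustrative
slope, intercept, r, p, se = stats.linregress(np.log(dilution), np.log(intensity))
print(f"log-log slope = {slope:.2f}, r = {r:.3f}")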

Results

In total, 248 pairs of sera were collected from Preston Public Health Laboratory (the high-incidence area) between July 2000 and September 2002. Of the paired sera, 57 (23%) were from males, 188 (76%) were from females, and the gender was not known for three (1%). The age range at the time of the first specimen was 15–89 years (mean 36, median 32 years). The mean time between collection of the first and second serum samples was 344 days (range 109–750 days, median 318 days).

In total, 84 pairs of sera and 152 single sera were collected from Liverpool Public Health Laboratory (the low-incidence area) between July 1995 and July 2000. Of the paired sera, 27 (32%) were from males, 55 (65%) were from females, and the gender was not known for two (2%). The age range at the time of the first specimen was 17–59 years (mean 33, median 31 years). The mean time between collection of the first and second serum samples was 557 days (range 182–1356 days, median 473 days). Of the single sera, 45 (30%) were from males, with an age range of 19–73 years (mean 32, median 30 years). Differences between the paired sera from Preston and Liverpool in terms of the age of the donor at the time of the first sample were not significant (Mann–Whitney two-sample test z −1.853, p 0.064); similarly, the gender distribution was not significantly different (uncorrected χ2 3.425, p 0.064).
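
For illustration, the two demographic comparisons reported above could be run as follows (Python with SciPy; the age vectors are hypothetical stand-ins, while the gender counts are those quoted in this section, so small differences from the published statistic may reflect the handling of records of unknown gender).

import numpy as np
from scipy import stats

# Age at first sample: hypothetical vectors for illustration only.
preston_ages = np.array([15, 22, 28, 32, 36, 45, 61, 89])
liverpool_ages = np.array([17, 24, 29, 31, 33, 40, 52, 59])
u, p_age = stats.mannwhitneyu(preston_ages, liverpool_ages, alternative="two-sided")
print(f"Mann-Whitney U = {u}, p = {p_age:.3f}")

# Gender split (male, female) by city, using the paired-sera counts quoted
# above; correction=False gives an uncorrected chi-squared statistic.
table = np.array([[57, 188],   # Preston pairs: male, female
                  [27, 55]])   # Liverpool pairs: male, female
chi2, p_sex, dof, expected = stats.chi2_contingency(table, correction=False)
print(f"chi-squared = {chi2:.3f}, p = {p_sex:.3f}")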

For the first or single sera, the mean relative intensities of the 15/17-kDa and the 27-kDa responses were 15.97 and 12.86 (variance 1478 and 585.7), respectively. This compares with variance values of the positive control in the intra-blot reproducibility studies of 0.1087 and 0.1126, respectively. The variance values of the positive control sera were significantly lower than those of the test sera (15/17 kDa, F 13560, p <0.0001; 27 kDa, F 5194, p <0.0001). Consequently, uncertainty in the relative intensity caused by test methodology would be minor compared with natural variation in the test sample intensity relative to the control. Fig. 1 shows the relationship between intensity and dilution of the positive control, which demonstrates an approximately linear log–log relationship. The relative intensity of the response obtained by this method is therefore a good estimate of antibody levels.

Figure 1. Relationship between intensity of the antibody response and dilution of the positive control serum.

There was a significant, albeit weak, positive correlation between relative intensity and the age of the donor for both the anti-15/17-kDa antigen (Spearman's rho 0.162, p <0.0001) and the anti-27-kDa antigen (Spearman's rho 0.134, p 0.003). The median relative intensity of the 15/17-kDa antigen response was significantly higher in the sera from Liverpool than in the sera from Preston (3.68 vs. 1.58; Mann–Whitney U, p 0.044). Although the relative intensities of the 27-kDa antigen response were also higher in the Liverpool sera, the difference was not significant (median 5.04 vs. 3.68; Mann–Whitney U, p 0.427).

When the analysis was restricted to the paired sera, the relative intensities of the antibody responses in the first and second sera from each individual were highly correlated for both the 15/17-kDa (Spearman's rho 0.682, p <0.0001) and the 27-kDa antigens (Spearman's rho 0.750, p <0.0001) (Fig. 2). However, there was a significant decline in the relative intensity of the response in the second sera compared with the first (Wilcoxon's signed-rank test; 15/17-kDa antigen, p 0.0019; 27-kDa antigen, p 0.0108). Upon further analysis, this decline was seen in the sera from Preston, but not in the sera from Liverpool (15/17-kDa antigen, p 0.0136 vs. p 0.0829; 27-kDa antigen, p 0.0002 vs. p 0.5546).
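
A sketch of this paired-sera analysis, using hypothetical relative intensities (Python with SciPy): Spearman's rank correlation between the first and second samples, followed by Wilcoxon's signed-rank test for a systematic decline.

import numpy as np
from scipy import stats

# Hypothetical paired relative intensities (first vs. second sample).
first = np.array([1.2, 3.7, 5.0, 8.4, 12.9, 20.1, 33.0, 48.5])
second = np.array([0.9, 3.1, 4.2, 7.9, 10.5, 18.0, 30.2, 41.0])

rho, p_rho = stats.spearmanr(first, second)
print(f"Spearman's rho = {rho:.3f}, p = {p_rho:.4g}")

w, p_w = stats.wilcoxon(first, second)   # tests for a systematic paired shift
print(f"Wilcoxon signed-rank: W = {w}, p = {p_w:.4f}")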

Figure 2. Relative intensity of antibody responses to (a) the 15/17-kDa antigen complex and (b) the 27-kDa antigen in paired sera from the north-west of England.

If seroconversion is defined as an increase in intensity by >10% of the intensity of the positive control, then seroconversion was observed in 27 (8%) of the 332 serum pairs according to the 15/17-kDa antigen response, and in 31 (9%) of the 332 pairs according to the 27-kDa antigen response. Although the rate of seroconversion was higher in the Liverpool sera for both markers, this was a consequence of the longer period between sample dates in the Liverpool pairs compared with the Preston pairs. The conversion rates/100 person-years did not differ significantly between the two populations for either marker.
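
Because the Liverpool pairs were, on average, collected further apart, a fair comparison requires rates per unit of follow-up time. A minimal sketch of this calculation (Python; the intensities and intervals below are invented):

import numpy as np

def conversions_per_100py(first, second, interval_days):
    """Seroconversion rate per 100 person-years of follow-up."""
    first, second = np.asarray(first, float), np.asarray(second, float)
    converted = np.sum((second - first) > 10.0)  # rise of >10% of the positive control
    person_years = np.sum(np.asarray(interval_days, float)) / 365.25
    return 100.0 * converted / person_years

# Hypothetical pairs of relative intensities and sampling intervals (days)
print(conversions_per_100py([2.0, 5.0, 30.0], [15.0, 6.0, 29.0], [344, 318, 400]))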

Discussion

Cryptosporidium infection elicits an antibody response in most exposed individuals, and the western blot method is regarded as a reliable means of detecting Cryptosporidium infection in sera from adult populations [6,8,9]. The mini-gel assay is a convenient format that offers specificity of detection while maximising the use of antigen preparations. The finding that the relative intensity of the antibody response to the 15/17-kDa antigen was more variable between the first and second samples than that to the 27-kDa antigen is consistent with results from the USA, which revealed that the response to the 15/17-kDa antigen is short-lived (4–6 months), while that to the 27-kDa sporozoite antigen persists for up to 2 years [7,10]. On the basis of these kinetics, it would be expected that the relative intensity of the response to the 27-kDa antigen would be greater in randomly selected population sera than the response to the 15/17-kDa antigen complex, and this was found to be the case in the present study, as well as in most, but not all, previous studies.

Based on knowledge of the kinetics of antibody responses, it might be expected that the prevalence of antibodies to the 27-kDa antigen would also be greater in randomly selected population sera than the response to the 15/17-kDa antigen complex. Frost et al. [15] compared the antibody responses to each antigen in blood donors from two cities in the USA, and demonstrated that positive serological responses (defined as a band intensity >10% that of the positive control) to the 27-kDa antigen were detected in more individuals than were positive responses to the 15/17-kDa antigen. Caputo et al. [16] reported similar results from a cohort study of homosexual and bisexual males with human immunodeficiency virus infection, in that positive serological responses (defined as a band intensity >35% that of the positive control) to the 27-kDa antigen were found in more individuals than were positive responses to the 15/17-kDa antigen. However, the present study did not reveal a clear difference in the prevalence of antibodies to these two antigens, and this was similar to the observations made by Frost et al. [17] in their study of a cryptosporidiosis epidemic in Toronto, Canada. The present study also found that the net intensity of responses to the 27-kDa antigen was lower than that to the 15/17-kDa antigen (data not shown), but Frost et al. [18] reported little difference between the intensities of the two antibody responses in a study of serological responses among users of surface and groundwater sources. However, these findings may be associated with the fact that the mini-gel western blot format underestimates the 27-kDa response, particularly at low levels [6].

Currently, there is no consensus concerning the intensity of antibody response required to define Cryptosporidium seropositivity in a western blot. Frost et al. [6] chose a cut-off relative intensity value of 35% of the positive control, citing evidence from paired sera collected over an unspecified period that individuals may maintain responses of ≤30% for extended periods, while responses >35% decline. Based on this value, seroprevalence among blood donors was 22%, 26% and 48% for the 15-kDa, 17-kDa and 27-kDa antigens, respectively. Using the same cut-off value, a seroprevalence of 26% was detected for the 15/17-kDa antigen complex and of 39% for the 27-kDa antigen among homosexual and bisexual males [16]. Other studies have used cut-off values of >5% [17] and ≥1% [19] of that of the positive control to define a detectable response. In some studies, blots have been assessed by eye for any detectable antibody response [9,20,21], which may equate to 5–10% relative intensity, depending on the intensity of the positive control. However, until further work has defined criteria for positive sera, focusing on changes in the mean relative intensity of antibody responses is perhaps a more useful approach for analysing data and generating information regarding population exposure to Cryptosporidium. Moss et al. [9] explored changes in reactivity using intensitometry, and concluded that increases in reactivity were more likely in experimentally infected volunteers who developed cryptosporidiosis than in those who were infected asymptomatically or who remained oocyst-negative. Variations in mean net intensity have also been correlated with case/non-case status, in that symptomatic infection was associated with consistent changes in antibody responses [8]. Thus, relative intensity is a useful measure for monitoring exposure to Cryptosporidium at the population level, and is a practical proxy for titration in the western blot format.
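
The sensitivity of seroprevalence estimates to the choice of cut-off can be illustrated with a short sketch (Python; the intensity values are hypothetical, while the cut-offs are those used in the studies cited above):

import numpy as np

# Hypothetical relative intensities for a set of sera; cut-offs correspond
# to 35%, >5% and >=1% of the positive control, as in the cited studies.
intensities = np.array([0.5, 1.5, 3.0, 6.0, 12.0, 20.0, 40.0, 55.0])
for cutoff in (35.0, 5.0, 1.0):
    prevalence = 100.0 * np.mean(intensities >= cutoff)
    print(f"cut-off {cutoff:>4.0f}% -> apparent seroprevalence {prevalence:.0f}%")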

Given the differences in the reported incidence of cryptosporidiosis between Preston (high) and Liverpool (low), it is striking that the intensity of response to the 15/17-kDa antigen complex was significantly greater in the Liverpool sera, with no significant difference in response to the 27-kDa antigen, indicating at least comparable levels of exposure to Cryptosporidium in the two cities. During the period of collection of the sera, the mean annual number of reported cases in Liverpool was 1.36/100 000 population, compared with 23.43/100 000 population in Preston. As reporting mechanisms are supposed to be similar in the two locations [22], this finding is difficult to explain. One potential explanation is that seropositivity may indicate infection caused by species other than Cryptosporidium hominis or C. parvum. Most other species are unlikely to cause symptomatic illness, and such species are known to be quite widespread in the environment [23]. It is tempting to speculate that the population of one city is exposed to a different range of species, so that seroconversion is detected in the absence of reportable illness after exposure to non-pathogenic species of Cryptosporidium. This suggestion is supported by a recent study that showed similar antibody responses following infections caused by Cryptosporidium spp. that were and were not associated with clinical symptoms [24]. Liverpool and Preston both use drinking water from surface-derived supplies, albeit from very different catchment areas with different livestock densities. Water for Preston (high levels of reported illness) is more subject to contamination from cattle than from sheep, with the reverse being true for Liverpool (low levels of reported illness), and it is known that sheep may be colonised with a strain that is demonstrated only rarely in human infections [25].

The present findings do not necessarily conflict with an earlier study that reported higher antibody levels in a community following a recent outbreak compared with an area with an overall low reported incidence [26]. It is hypothesised that residents in the low-incidence area in the present study were subject to asymptomatic infection through drinking water contaminated by non-pathogenic species, but that residents in the low-incidence area in the previous study were probably not exposed in this way.

In the present study, there was a positive correlation between age and the relative intensity of the antibody response. This increase with age probably reflects residual antibody following multiple infections [27]. Two studies, one in immunocompetent individuals and the other in individuals infected with human immunodeficiency virus [28,29], have both demonstrated that initial high levels of antibody to the 27-kDa antigen were associated with a subsequently reduced risk of self-reported diarrhoeal infection. This suggests that cryptosporidiosis is responsible for a considerably greater proportion of cases of diarrhoea than previously thought, and that repeated exposure may actually reduce the risk of illness. This may explain the difference in reporting rates between Liverpool and Preston.

A further interesting observation was the decline in relative intensity in the second samples compared with the first in the paired sera. This was observed in sera from Preston, but not in sera from Liverpool. It is tempting to suggest that this reduction is associated with a fall in the incidence of cryptosporidiosis in the area, secondary either to the recent outbreak of foot and mouth disease or to the 'Cryptosporidium in Drinking Water Regulations', which came into force in the UK during the collection of the sera from Preston [30,31].

Acknowledgements

We thank the UK Drinking Water Inspectorate of the Department for the Environment, Food and Rural Affairs for funding this study. PRH is Chair of the Board of Directors of the Institute of Public Health and Water Research, and has received grant funding and undertaken consultancy work for various water utilities.
