Imagine a delicious new dessert that is calorie free and lowers cholesterol. It takes the world by storm. Other desserts lose their appeal to the masses, although not everyone gives up vanilla ice cream. Eventually, evidence emerges that the new dessert is not perfect: some people experience painful leg cramps after eating it. Should the new dessert be pulled from the market? Should only those who developed pain refrain from eating it? And what of consumers who have never tasted the dessert? Should they even tempt themselves?

Like the imaginary dessert, solid phase HLA antibody detection technology took the transplant community by storm several years ago. As a result of the increased sensitivity and specificity of this methodology, the concept of virtual crossmatching became a reality locally, nationally and internationally (reviewed in [1]). Allocation of organs to highly sensitized patients with a calculated PRA (cPRA) >80% increased significantly with the implementation of solid phase antibody detection technology [2]. This has been attributed to more precise identification of HLA antibodies predicted to yield positive crossmatches with targets expressing the corresponding antigens than is possible with cell-based and/or ELISA assays. For these reasons, solid phase antibody detection technology was viewed as a revolutionary development in transplantation [3].

Not surprisingly, blemishes in the technology and its application became apparent with time. First was the observation that not all donor directed antibodies detected by solid phase assays appeared to be clinically relevant [4]. Next came reports that some antibodies detected using microparticle technology were directed against denatured (but not native) HLA antigens, that is, artifacts of microparticle manufacturing [5]. Additional concerns were raised when so-called naturally occurring HLA antibodies were identified in nontransfused male donors [6]. Now, in this issue of the American Journal of Transplantation, Gombos et al. [7] report that, of 534 patients awaiting renal transplantation, nearly 70% possessed one or more HLA antibodies detectable only by solid phase technology. Alone, these data are not particularly concerning; it has long been known that solid phase assays are significantly more sensitive than other tests for antibody identification [1]. More alarming, however, is that 77% of patients with no history of immunization (n = 133) had antibodies with median fluorescence intensity (MFI) values ranging from 1001 to 14 400. In the majority of cases, antibodies were directed against antigens with such low population frequencies that considering these antigens to be unacceptable (even if the antibodies were falsely positive) would have essentially no impact on a patient's access to compatible donors. However, for some antibody specificities, the population frequencies were high enough that listing the corresponding antigens (e.g. HLA-B*08, with a 12.5% frequency) as unacceptable would, perhaps needlessly, restrict a patient's access to compatible donors. What to do?

In an effort to decide which antibodies were "real", Gombos et al. [7] tested selected samples on a second testing platform from the same manufacturer (namely, microparticles coated with HLA phenotypes rather than individual HLA alleles) as well as on a testing platform from a different vendor. But rather than clarifying the data, these supplemental tests raised additional concerns. Specifically, while the additional tests were consistent with the patients' reported history of nonsensitization (i.e. negative for HLA antibodies) for some sera, other sera were apparently "confirmed" to have HLA antibodies. Now what? Is it acceptable to interpret a test as negative because a positive result does not fit preconceived notions? Is a patient's self-reported history of nonsensitization always accurate? If there were 100% certainty that a patient was unsensitized, why perform an antibody screening assay at all? The answer is that transplant programs want to be certain that no potentially hazardous antibodies have been overlooked. Should surrogate crossmatches be performed as a final arbiter when patient history and single antigen bead results do not agree? Not necessarily, since DSA+/crossmatch-negative transplants carry some added risk for early antibody mediated rejection [1]. Moreover, while we would certainly like to believe that antibodies directed against denatured HLA antigens are clinically irrelevant, as has been reported [1, 5], it should be noted that long-term follow-up data among informative patients do not yet exist, meaning that it is premature to consider such antibodies harmless.
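To see why antigen frequency matters when deciding whether to list an antigen as unacceptable, a rough cPRA-style calculation can be sketched. The only figure below taken from the study is the 12.5% HLA-B*08 frequency quoted above; the other frequencies are illustrative, and the formula assumes antigen carrier frequencies are independent, a simplification of real cPRA calculators, which use haplotype frequencies derived from donor panels:

```python
# Sketch: cPRA approximates the fraction of donors carrying at least one
# antigen the patient has listed as unacceptable.
# Assumption: carrier frequencies are independent (real calculators use
# donor-panel haplotype frequencies, so these numbers are only indicative).

def cpra(carrier_frequencies):
    """Probability a random donor expresses >= 1 unacceptable antigen."""
    p_compatible = 1.0
    for f in carrier_frequencies:
        p_compatible *= (1.0 - f)  # donor lacks this particular antigen
    return 1.0 - p_compatible

# A rare antigen (0.1% carrier frequency, illustrative) barely affects access:
print(round(cpra([0.001]), 4))            # -> 0.001
# HLA-B*08 at the 12.5% frequency cited above excludes 1 in 8 donors:
print(round(cpra([0.125]), 4))            # -> 0.125
# Listing several common antigens (illustrative values) compounds quickly:
print(round(cpra([0.125, 0.20, 0.15]), 4))  # -> 0.405
```

The compounding effect is the practical point: each questionable antibody against a common antigen that is listed "to be safe" multiplicatively shrinks the pool of acceptable donors, which is why falsely positive reactions against high-frequency antigens are far more consequential than those against rare ones.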

What is the "perfect" HLA antibody detection methodology? One that detects clinically relevant, donor directed HLA antibodies with 100% specificity and sensitivity. That perfect test, as clearly demonstrated by Gombos et al. [7], does not currently exist. But bear in mind that solid phase antibody detection assays have provided useful information not previously available to the transplant community. While these tests are not perfect, they are still quite good. As Voltaire said, "Le mieux est l'ennemi du bien." The perfect is the enemy of the good.

Disclosure


The authors of this manuscript have no conflicts of interest to disclose as described by the American Journal of Transplantation.

References
