Corneal Transplantation: The Forgotten Graft

Authors

  • A. J. T. George*
    1. Department of Immunology, Division of Medicine, Faculty of Medicine, Imperial College London, Hammersmith Hospital, London, UK
  • D. F. P. Larkin
    1. Department of Immunology, Division of Medicine, Faculty of Medicine, Imperial College London, Hammersmith Hospital, London, UK
    2. Moorfields Eye Hospital, London, UK

*Corresponding author: A. J. T. George, a.george@imperial.ac.uk

Abstract

The most commonly performed transplant is that of the cornea, with 2292 corneal grafts performed in the UK in 2002–03, compared with 1775 renal transplants (1). In the USA approximately 40 000 corneal transplants are performed every year (2). However, this preponderance is not reflected in the attention given to this transplanted tissue by the scientific community: to date, for example, no papers with cornea as a key or title word have been published in the American Journal of Transplantation (as determined by a Medline search in December 2003). There are several reasons for this. The first is that corneal grafting is the province of ophthalmologists, who (with notable exceptions) are isolated from the transplant community. The second is a widespread belief that, because of the existence of immune privilege, corneal grafts are not rejected and so there is no need for further research. As we will discuss, this is incorrect. In this article we seek to show that the study of corneal transplantation is important in its own right, and that it holds lessons for those interested in other forms of allograft.

History

The concept of corneal transplantation was first suggested in 1796 by Erasmus Darwin, grandfather of Charles Darwin, in his influential book Zoonomia (3). However, theory did not translate into practice until 1835, when an Irishman, Samuel Bigger, while a prisoner in Egypt, successfully transplanted an allogeneic cornea into the blind eye of a pet gazelle (4). In 1838, many years before the invention of anaesthesia, Richard Kissam reported the first corneal transplant in a human: a pig cornea was grafted into a human recipient eye and remained transparent for a couple of weeks (5). Almost all donor corneas were xenografts until the late 19th century, when a successful lamellar, or partial-thickness, corneal transplant was performed (6). A significant milestone was the first successful full-thickness human corneal graft, reported in 1906 (7). Since then corneal transplantation has grown rapidly, with the first eye bank being established at Moorfields Eye Hospital, London, in the 1960s (8).

Transplantation Indications and Procedure

Corneal transplantation (keratoplasty) is the only form of therapy for many blinding disorders of the cornea. It can also benefit patients with infection, pain or perforation of the cornea. The clinical indications include corneal opacification following cataract surgery, keratoconus (a condition affecting young adults in which the cornea becomes misshapen), inherited disorders and scarring caused by infections such as herpes simplex virus (9). Trauma or chemical injury to the cornea can also be important indications. Finally, a large proportion of corneal transplants are performed to replace grafts lost to allogeneic rejection or other causes of failure.

The majority of corneal transplants are performed by excising a 7–8-mm diameter disc from the recipient cornea and replacing it with donor cornea of similar diameter (Figure 1). While full-thickness replacement is the most common form of transplantation, partial-thickness (lamellar) transplants can also be used in the management of superficial corneal pathology. One major difference between processing of donor cornea and of other tissues is that cornea does not need to be 'fresh' at transplantation. Typically corneas are harvested from cadaveric donors as late as 24 h after death, and can then be stored for up to 1 month before being used for transplantation. There are three main forms of storage: as whole eyes at 4°C for up to 48 h; as a corneoscleral disc in a chondroitin sulphate-based medium at 4°C for up to 14 days; or as a corneoscleral disc in tissue culture at 37°C for more than 4 weeks. Differences in donor corneal storage method do not significantly influence graft outcome. However, the ability to store corneas provides an enormous advantage, as it allows both scheduling of transplant surgery and quality testing of the donor corneas before grafting. The potential for ex vivo manipulation of the donor cornea before surgery is a further advantage, although one that has not yet reached clinical practice.
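For readers who find a structured summary useful, the three storage options above can be restated as a small data structure. This is purely illustrative: the field names are ours, and the medium for whole-eye storage is not specified in the text.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StorageMethod:
    tissue: str            # what is stored
    temperature_c: int     # storage temperature, degrees Celsius
    medium: Optional[str]  # storage medium, where stated in the text
    max_duration: str      # maximum storage time before grafting

# Parameters restated from the paragraph above; illustrative only,
# not an eye-banking standard.
STORAGE_METHODS = [
    StorageMethod("whole eye", 4, None, "48 h"),
    StorageMethod("corneoscleral disc", 4,
                  "chondroitin sulphate-based medium", "up to 14 days"),
    StorageMethod("corneoscleral disc", 37,
                  "tissue culture", "more than 4 weeks"),
]
```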

Figure 1.

Corneal allograft. A functioning corneal transplant, with the arrow indicating the graft–recipient junction. The continuous 10/0 nylon suture is usually removed 12–24 months postsurgery, reflecting the long wound-healing time in this avascular tissue.

Corneal Endothelial Cell: Functional Unit and Rejection Target

The anatomy of the cornea is relatively simple, consisting of three major layers (Figure 2). The anterior surface of the cornea is an epithelial layer 6–8 cells deep, lying superficial to Bowman's membrane. The main thickness of the cornea is formed by the stroma, consisting of precisely aligned collagen fibres supported by scattered keratocytes. The posterior surface is formed by an endothelial monolayer on Descemet's membrane. As described later, this endothelial monolayer is critical to corneal function (Figure 3). In the human at birth there are >4000 cells/mm2 (10). During life, owing to the nonreplicative nature of the cells, which are arrested in the G1 phase of the cell cycle (11), there is a gradual loss of cells at a rate of 0.6% a year (12,13). Corneal transparency can be maintained at densities as low as 500 cells/mm2, so in most cases there is sufficient reserve that loss of endothelial cells does not result in corneal blindness within a lifetime. However, where there is trauma to the endothelium (for example following cataract surgery), cell density falls at an accelerated rate. This can result in endothelial decompensation and the need for corneal replacement by a transplant.
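The arithmetic behind the claim that physiological attrition rarely causes blindness is easy to check. A minimal sketch, assuming the quoted 0.6% annual loss behaves as simple exponential decay (our assumption, made here only for illustration):

```python
import math

# Years for endothelial density to fall from a starting value to the
# transparency threshold, assuming exponential decay (the quoted rate
# is an average figure, not a fitted model).
birth_density = 4000.0   # cells/mm2 at birth (">4000" in the text)
threshold = 500.0        # cells/mm2, minimum for corneal transparency
annual_loss = 0.006      # 0.6% per year

years = math.log(birth_density / threshold) / annual_loss
print(f"~{years:.0f} years")   # ~347 years, far beyond a human lifespan
```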

Figure 2.

Anatomy of the cornea and the anterior chamber. The cornea is the transparent front surface of the eye and forms the anterior boundary of the anterior chamber; it comprises several layers. The anterior surface is formed by epithelial cells resting on Bowman's membrane. The majority of the thickness of the cornea is formed by the stroma, consisting of collagen fibres precisely arrayed so as to preserve transparency, together with scattered keratocytes. The posterior surface is formed by the endothelial monolayer, which is attached to Descemet's membrane.

Figure 3.

Corneal endothelial cells. The endothelial cells are readily visible using in vivo specular microscopy, and form a monolayer of roughly hexagonal cells. If endothelial cells die, the denuded space is covered by migration and enlargement of surrounding cells.

Donor corneas for transplantation show a marked fall in endothelial cell density during ex vivo storage. In one large corneal bank, approximately 30% of all corneas were not used for transplantation because endothelial cell numbers had fallen below the acceptable density of 2200 cells/mm2 (14). Following transplantation there is a rapid loss in cell number, presumably owing to surgical trauma and early postoperative inflammation (15). If there is a rejection episode, there is a further acute fall in cell density. In some cases this will reduce cell density to a level at which transparency is lost and the corneal graft fails.

The acute loss during rejection episodes reflects immune damage. However, there is also an increased rate of endothelial loss following transplantation (4.2% a year) that is not associated with any overt rejection episode; this can lead to loss of the graft through decompensation. It is not known whether this increased attrition of endothelial cells is the result of low-grade chronic rejection, of multiple undetected rejection episodes, or of endothelial cells damaged or stressed by the transplant procedure and previous rejection episodes dying more frequently without any specific stimulus (12). It is of interest that a similar accelerated loss of cell density occurs in autologous transplants, indicating that allogeneic inflammation is not the only factor in cell loss (C. Hartman, personal communication, 2001). One important goal of future research should be to understand the reasons for this accelerated loss of endothelial cells, as different strategies will be needed depending on whether the aim is to prevent immunological damage or to 're-programme' endothelial cells to prevent their death.
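Under the same illustrative exponential-decay assumption, the post-transplant attrition rate makes a striking difference. Taking as a starting point the 2200 cells/mm2 banking threshold mentioned above (our choice of starting density, purely for illustration):

```python
import math

def years_to_threshold(start: float, annual_loss: float,
                       threshold: float = 500.0) -> float:
    """Years until density falls below the transparency threshold,
    assuming exponential decay at the given fractional annual loss."""
    return math.log(start / threshold) / annual_loss

# Post-transplant attrition (4.2%/year) versus physiological (0.6%/year):
print(f"{years_to_threshold(2200, 0.042):.0f} years")  # ~35 years
print(f"{years_to_threshold(2200, 0.006):.0f} years")  # ~247 years
```

On these simplified numbers a quiet graft can still outlast many recipients, but young recipients, or grafts that also suffer acute rejection-related losses, could plausibly reach decompensation within the recipient's lifetime, which is why this attrition matters.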

An alternative approach is to allow the cells to undergo a (limited) number of divisions, increasing their number and so offsetting this increased death; a doubling in cell number would have a considerable effect on corneal graft survival (see the worked figure below). The checkpoints controlling the arrest of the endothelial cells have been described: the cells are arrested in the G1 phase of the cycle in part owing to the expression of p27kip1 (16), and strategies are being developed to allow the cells to undergo division (16,17). One difficulty in this area of research is that endothelial cells in rodent corneas are capable of division, limiting work either to human material, which is in short supply, or to more expensive animal models, for which fewer reagents are available.
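To quantify the benefit of even a single round of division: under the exponential-decay assumption used above, doubling the cell density always buys ln(2)/rate additional years before any fixed threshold is crossed, regardless of the starting density (again an illustrative calculation, not a figure from the cited studies).

```python
import math

# Extra years gained by one doubling of endothelial density, assuming
# exponential loss at the post-transplant rate of 4.2%/year.
print(f"~{math.log(2) / 0.042:.0f} extra years")   # ~17 years
```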

Corneal Transplantation Outcomes

As mentioned earlier, corneal transplants are often considered not to suffer significant rates of failure or rejection. However, this belief rests largely on historical data from a period when corneal transplants showed a low rejection rate compared with other organs. With the advent of improved immunosuppression, the failure rates for renal, cardiac and hepatic transplantation have fallen to figures similar to that of corneal transplantation, with approximately 25% of corneas being lost by 4–5 years (18). The commonest reason for failure of corneal grafts is acute allograft rejection (and other forms of failure, such as endothelial decompensation, may well be a form of chronic rejection). In general corneal transplants are performed with topical steroids as the sole rejection prophylaxis, without systemic immunosuppression. The majority of patients recognized to be at high risk of allograft rejection have a normal contralateral cornea, and the disease for which surgery is undertaken is not sight-threatening. However, there is a smaller group of patients with bilateral blinding corneal disease, many of whom have lost corneal transplants to rejection and in whom graft survival at even 2 years is very low. What are the outcomes of immunosuppression in these graft recipients? There is virtually no information from prospective randomized trials. Case series and case–control studies of single-agent calcineurin antagonist immunosuppression have shown unconvincing benefits on graft survival (19–21). By comparison with solid organ transplantation this may seem an alarming deficit in clinical research, and it should be addressed. It is explained, though hardly justified, by a historically greater interest in studies of HLA matching and by a lack of coordinated multicentre immunosuppression trials. This patient group is important not only because they need improved prophylaxis, but also because they represent patients in whom new forms of intervention might ethically be pioneered.

HLA matching between donor and recipient is not routinely performed. The two large multicentre matching trials in patients at high risk of rejection have together yielded no clear justification for MHC antigen matching at least. One large study showed no benefit of HLA class I or class II-DR matching on graft survival (22), while a second showed a small, but significant, benefit of class I matching and an increased risk of rejection with class II matching (23).

Corneal Allograft Rejection and Its Pathogenesis

Rejection can occur in any of the three corneal layers: epithelium, stroma or endothelium. Destruction of the epithelium is relatively unimportant in itself (though it may serve to immunize the host against the donor), as the donor epithelium can be replaced by recipient epithelium derived from the limbus. Stromal rejection is relatively common but in most recipients does not progress to endothelial rejection, as it is easily reversed with intensive application of a topical steroid. In terms of clinical corneal function, the form of rejection of greatest significance is that of the endothelium. The endothelial monolayer functions to pump water out of the corneal stroma. In the absence of this pump function the cornea swells, the collagen fibres are disorientated and the cornea loses transparency. Human endothelial cells are nonreplicative (see earlier), so donor cell loss is irreversible. During endothelial rejection it is possible to visualize directly both linear or multifocal deposits of leucocytes adhering to the endothelium and the loss of clarity resulting from oedema (Figure 4). A variable but irreversible decrease in cell density therefore follows endothelial rejection, jeopardizing graft survival. Thus, although intensive treatment with topical steroid reverses the acute inflammation in most patients (24), the goal is reversal of the rejection episode as early as possible to minimize endothelial cell loss. In those graft recipients in whom rejection cannot be reversed, endothelial cell density falls below the level required to control corneal swelling and maintain transparency: end-stage graft failure.

Figure 4.

Corneal allograft rejection. This picture shows an endothelial rejection episode. Leucocytes adherent to donor endothelium are visible as pale punctate or linear aggregates. A graft recipient at this stage would notice disturbance of visual acuity and discomfort. [Reproduced from reference (69), with permission of the BMJ Publishing Group.]

Much of our information on the mechanisms of rejection derives from animal models of transplantation. Most descriptive human observations are obtained from replaced grafts, usually some months after rejection onset and treatment with topical steroid, and so are of limited value (25). There are problems in relating data from animal models to the human setting. As already mentioned, endothelial cells from rodent corneas are capable of division and thus of repairing immune-mediated damage. In addition, immunological rejection of corneas in mice and rats is more aggressive than in humans, with rejection occurring in a relatively short timeframe (weeks). Larger animals, such as the rabbit, show a slower rate of rejection. However, in most cases, in order to obtain a model that can be studied, the rate and incidence of rejection are increased, for example by inducing vascularization in the recipient bed with a suture. This results in good models of 'high-risk' corneal transplantation, but may not represent the situation seen in a low-risk human graft. Nevertheless, animal models, as for other grafts, provide invaluable data on the mechanisms of graft rejection and allow the development of improved therapies to prevent rejection.

During rejection a wide range of cytokines and chemokines is up-regulated (26–28). Interestingly, in one study using competitive RT-PCR to measure cytokine levels during rejection in a rat model, a range of cytokines was up-regulated in both allografts and autografts, with the data in the two types of graft being indistinguishable at early time points following transplantation (28). Only later, in the period before and after observed rejection onset, were higher cytokine levels seen in allografts. This presumably reflects the release of chemokines and cytokines in response to the danger signals produced by surgical trauma. In a rabbit model, high levels of bioactive TNF could be isolated from the anterior chamber; sequential samples from the same animal showed 'spikes' of TNF activity, with high levels in one sample followed by low levels in the next (27). Mathematical modelling of cytokine production indicated that this could be explained by positive and negative feedback pathways controlling TNF production, rather than by any unique feature of corneal transplantation (29). These cytokines and chemokines serve to recruit and activate a range of leucocytes in the cornea. During rejection in animal models, T cells, macrophages, granulocytes and NK cells are found in the anterior chamber and adhering to the endothelial surface of the cornea (30). These cells migrate out of the iris blood vessels and through the anterior chamber before attaching to the donor endothelium. During stromal rejection, cells move out of blood vessels either in the limbus or in new vessels that have formed in the graft and surrounding recipient cornea. Depletion experiments show that CD4 T cells are vital for graft rejection, with no clear role for CD8 cells (31).
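The cited modelling work (29) is not reproduced here, but a generic coupled positive/negative feedback system shows how such 'spikes' can arise without any cornea-specific mechanism. The sketch below uses a FitzHugh–Nagumo-type relaxation oscillator as a stand-in; the variable names, parameter values and mapping to TNF are our illustrative assumptions, not the published model.

```python
import numpy as np
from scipy.integrate import solve_ivp

def feedback(t, y, drive=0.5, a=0.7, b=0.8, eps=0.08):
    """Toy oscillator: fast positive feedback on 'tnf' coupled to a
    slow negative-feedback inhibitor. Not the model of reference (29)."""
    tnf, inhib = y
    d_tnf = tnf - tnf**3 / 3 - inhib + drive   # fast activation
    d_inhib = eps * (tnf + a - b * inhib)      # slow inhibition
    return [d_tnf, d_inhib]

sol = solve_ivp(feedback, (0.0, 300.0), [0.0, 0.0], max_step=0.1)

# Count local maxima of the 'tnf' trace: the system settles into a
# limit cycle, so sequential 'samples' alternate between high and low
# activity, qualitatively matching the spikes seen in sequential
# anterior chamber samples.
y = sol.y[0]
peaks = np.flatnonzero((y[1:-1] > y[:-2]) & (y[1:-1] > y[2:]))
print(f"{peaks.size} spikes in the simulated interval")
```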

Data from animal studies indicate that graft rejection occurs predominantly via the indirect pathway of allorecognition, in which recipient antigen-presenting cells (APCs) take up donor antigen and present donor-derived peptides to the immune system. For example, in rodent models minor antigen mismatches are more important than major antigen mismatches in determining graft survival (32). This is consistent with the low level of class II expression in the cornea (33,34) and the paucity of dendritic cells (but see later) to stimulate a direct alloresponse. It may also explain the data described earlier suggesting that class II matching may be detrimental: sharing class II between donor and recipient would allow indirect pathway-specific T cells to recognize donor tissue directly.

The role of allospecific antibody is unclear, with some evidence that the appearance of alloantibody correlates with graft rejection (35) and other studies showing no link (36,37). In high-risk grafts there is a small advantage to ABO-compatible grafts (22). Experimental studies using B-cell-deficient recipient mice showed that the absence of antibody led to a slight but significant prolongation of graft survival, and that administration of antibody to T-cell-deficient mice caused corneal opacity but no conventional rejection (38). This suggests a minor role for antibody in allograft rejection.

In addition to the adaptive arm, other components of the immune system are clearly important. Local depletion of macrophages by subconjunctival injection of clodronate liposomes is highly effective at blocking graft rejection when given early enough after transplantation (39). This may reflect the role of these cells in initiating the immune response against the graft by recruiting other leucocytes to the site. However, the presence of macrophages in the graft infiltrate suggests that they may also have an effector role.

One of the questions that arises is how the immune effector cells recognize and kill their target tissue. Leucocytes trafficking through the anterior chamber adhere selectively to donor tissue: the endothelial rejection lines do not extend from donor into recipient cornea. However, if recognition of alloantigen is via the indirect pathway, and in those circumstances where donor and recipient are mismatched at class II, it is difficult to see how leucocytes bind specifically to donor tissue. It may be that recipient APCs present donor antigen only at the local site, and do not migrate once they have taken up antigen. However, this is not yet known, and it is clearly an important area for future research.

Immune Privilege

In large cohorts corneal graft rejection episodes have been found to occur in 18–20% of recipients, in most cases in the absence of systemic immunosuppression. One reason for this low rate of rejection is that transplantation is undertaken in an immune-privileged site, and that the cornea is itself an immune-privileged tissue. The mechanisms underlying this immune privilege are reviewed in detail elsewhere (40,41). It is important to note that immune privilege is not an absolute bar on immune responses, and can be overstated (42). But it is clear that mechanisms exist that prevent or attenuate the allogeneic response to donor cornea. These include mechanisms that keep the immune system ignorant of the presence of a graft, mechanisms that deviate the immune response into a nondestructive pathway and mechanisms that blunt immune effectors that do manage to reach the graft.

Immune ignorance of grafts was first recognized by Medawar, who showed that corneal tissue or skin grafted into the anterior chamber of the eye underwent reduced rejection compared with the same tissues transplanted into a skin site (43). The privilege of the corneal site was abolished if the normally avascular recipient corneal bed had been vascularized; hence the lack of blood and lymphatic vessels is important in privilege. Allied to this is the paucity of conventional dendritic cells in normal cornea: if the number of dendritic cells in the graft is increased, for example by experimental induction of APC migration into donor cornea before transplantation (44) or by transplantation of peripheral donor cornea, which contains more APCs, then rejection is more rapid. However, the situation is more complex, as recent data indicate that atypical (class II-negative) dendritic cells are present in the cornea which, upon grafting, migrate to the lymph nodes and express MHC class II (45,46). Cells of the monocyte/macrophage lineage can also be identified in the cornea (47). The possible role of these donor APCs in allorecognition and graft rejection remains to be defined. In clinical practice, however, immune ignorance is very important in determining graft survival: recipient corneal vascularization is the most significant determinant of graft failure on multivariate analysis in outcome studies.

Immune deviation, frequently termed anterior chamber-associated immune deviation (ACAID), describes the phenomenon whereby a systemic immune response is deviated from a cytotoxic, inflammatory response to a noncytotoxic one if the antigen has previously been introduced into the anterior chamber of the eye (40,48,49). Similar responses are seen in other immune-privileged sites. The mechanisms underlying ACAID have become better understood: cytokines such as TGFβ2 and α-melanocyte-stimulating hormone act on APCs within the anterior chamber, which subsequently migrate to the spleen and deviate the immune response. This involves collaboration between the APCs, T cells and NK T cells (50,51). Experimentally, it can be shown that interventions that interfere with the induction of ACAID reduce graft survival.

Finally, the cornea expresses a number of molecules that block immune effectors. These include soluble molecules that prevent complement activation, as well as Fas ligand (FasL, CD95L), which is expressed on corneal epithelium and endothelium (52). Fas ligand interacts with Fas on infiltrating leucocytes, inducing their apoptosis. Transplants using FasL−/− donors or Fas−/− recipients show reduced graft survival, probably owing to loss of this mechanism of down-regulating the allogeneic response (53,54).

Methods to Prevent Rejection

The cornea affords a very useful model for the development of novel strategies aimed at preventing graft rejection. Its transparency makes inflammatory processes visible to the observer at a very early stage (and, quite literally from the other perspective, patients in the clinical setting are able to report reduced visual acuity very early in a rejection episode). In addition, the feasibility of maintaining the donor cornea in culture for long periods allows manipulation of the tissue before transplantation. Finally, the existence of a group of patients at high risk of rejection and without an alternative therapeutic option justifies clinical trials of novel treatment strategies.

There have been no reported trials of immunomodulatory proteins in patients with corneal rejection. However, in animal models the administration of soluble IL-1 receptor antagonist, soluble TNF receptor, CTLA4-Ig and/or anti-CD40 ligand antibody has shown some benefit in preventing rejection (55–57). In the case of costimulatory molecule blockade, the increase in survival is less dramatic than that seen for other organs in the same strain combinations. This might reflect either differences in the pathways of rejection or the susceptibility of the cornea to low levels of damage; in other words, rejection is scored at an earlier stage in corneal transplantation than in other organs, where considerable loss of function must occur before rejection is noted. It is of note that in one study in which anti-CD40L and CTLA4-Ig produced a significant but modest prolongation of graft survival, the degree of rejection seen histologically in treated animals was considerably less than in controls (57). The failure to obtain clear-cut, long-term graft survival in animal models is one reason for the lack of clinical trials.

An attractive alternative is gene therapy, which can be used to modify the corneal endothelial cells. Gene delivery is facilitated by the relatively simple anatomy of the cornea and the ability to maintain grafts in culture for long periods. This is currently an important area of research, and one in which data obtained in corneal transplantation may be useful in other settings, including transplantation of other organs and autoimmune diseases of the eye, such as uveitis. Ex vivo transduction of the cornea with gene transfer vectors has been shown to be feasible using adenoviral, lentiviral, herpes simplex and adeno-associated virus vectors in a range of species (58–62). These vectors have been used to express various molecules that can prolong graft survival, including IL-10 (63), soluble TNF receptors (64), CTLA4-Ig (56) and IL-4 (65). Other strategies include the expression of genes that block cytotoxicity. While viral vectors are highly efficient, they can be immunogenic and pro-inflammatory, and in at least one study vector controls shortened graft survival compared with untreated corneal grafts (64,65). We have therefore developed a range of nonviral vectors, based on targeted liposomes or dendrimers, which have shown increasingly efficient gene transfer to the cornea (66–68). At present these are not as efficient as adenovirus, but significant levels of gene expression are obtained.

Conclusions

Corneal transplantation is relatively understudied compared with other types of graft, and there has been little attempt to translate findings in basic transplantation science into clinical corneal transplantation. Yet it is of considerable clinical importance, given the large number of patients receiving allogeneic grafts. In addition, corneal transplantation offers a good system in which to test novel forms of therapy, such as gene therapy, because of its simple anatomy, the feasibility of maintaining the cornea ex vivo and the ability to visualize directly the transplanted tissue and any rejection response. There are differences in the nature of rejection compared with other organs, stemming from the immune-privileged nature of the anterior chamber and the lack of direct allorecognition. However, the differences are probably less important than the similarities, and more attention should be paid to this form of graft.

Acknowledgments

Andrew George is a BBSRC Research Leave Fellow. Current work on the cornea in our laboratory is funded by the MRC, Wellcome Trust and Action Research. We would like to thank all members of our laboratory, past and present, who have contributed to the studies on corneal transplantation.
