Anthony J. Demetris MD, UPMC-Montefiore, Room E741, 3459 5th Avenue, Pittsburgh, PA 15213, USA. Tel.: 412-647-2072; fax: 412-647-2084; e-mail: firstname.lastname@example.org
Several factors acting together have recently enabled clinicians to seriously consider whether chronic immunosuppression is needed in all solid organ allograft recipients. This has prompted a dozen or so centers throughout the world to prospectively wean immunosuppression from conventionally treated liver allograft recipients. The goal is to lessen the impact of chronic immunosuppression and empirically identify occasional recipients who show operational tolerance, defined as a gross phenotype of tolerance in the presence of an immune response and/or immune deficit that has little or no significant clinical impact. Rare operationally tolerant kidney allograft recipients have also been identified, usually in single case reports, but only a couple of prospective weaning trials in conventionally treated kidney allograft recipients have been attempted and reported. Pre- and postweaning allograft biopsy monitoring of recipients adds a critical dimension to these trials, not only for patient safety but also for determining whether events in the allografts can contribute to a mechanistic understanding of allograft acceptance. The following is based on a literature review and personal experience regarding the practical and scientific aspects of biopsy monitoring of potential or actual operationally tolerant human liver and kidney allograft recipients where the goal, intended or attained, was complete withdrawal of immunosuppression.
Allograft biopsy evaluation plays a critical role in the emerging field devoted to minimization or complete weaning of immunosuppression from human solid organ allograft recipients. The immediate practical goal of this field is to improve the quality of life and outcomes for allograft recipients by minimizing exposure to the high cost and serious side-effects of chronic immunosuppression, such as hypertension, diabetes, hyperlipidemia, kidney damage, and increased susceptibility to malignancies. Presumably, this can be achieved without sacrificing allograft structure and function to uncontrollable acute or even indolent chronic rejection. A secondary, but equally important, goal is to use allografts as probes to understand cellular and molecular mechanisms associated with immunologic tolerance. The hope is that treatment algorithms might then be devised to routinely induce tolerance to allografts in a large percentage of recipients. These concepts might also be transferable to the related fields of autoimmunity and cancer immunosurveillance.
Since the advent of solid organ transplantation, two general approaches have been used to study clinical allograft acceptance/tolerance: (i) so-called ‘spontaneous operational tolerance (SOT)’, a term borrowed from Ashton-Chess et al., refers to rare noncompliant recipients, and others deliberately removed from immunosuppression, who do not develop rejection even long after the event. SOT recipients are usually identified by trial and error. This approach was pioneered by Starzl, who realized that acute rejection was reversible with temporarily increased immunosuppression and that the need for such immunosuppression was significantly diminished afterward; and (ii) tolerance can also be induced intentionally via hematopoietic macrochimerism, using bone marrow or hematopoietic stem cell transplantation combined with simultaneous [3–5] or delayed kidney transplantation [6–10]. The hematopoietic chimerism approach was based on the original experimental animal observations of Billingham et al. It was matured in further experimental animal studies and then successfully applied to humans by Sachs, Sykes, and Cosimi, using preconditioning with a nonmyeloablative regimen and major histocompatibility complex (MHC)-matched [3–5] or -mismatched simultaneous bone marrow and living-donor kidney transplantation.
Distinguishing between these approaches has meaning beyond the purpose of understanding how the histopathology literature developed in this field. It also provides insights into the predominant immunologic mechanisms involved in allograft acceptance/tolerance. In experimental animals, tolerance achieved through hematopoietic chimerism is robust, mediated predominantly by deletion, and organ-independent. In the first author’s experience as a clinical and experimental transplant pathologist, this approach leads to the ‘cleanest’, or most normal-appearing, allografts. Stable macrochimerism, however, is very difficult to achieve in mismatched humans without graft-versus-host disease. Comparatively, SOT is metastable, probably mediated by a combination of deletion, ignorance, and regulation, and organ-dependent. Liver allografts exhibit SOT more frequently than other allografts (see below). SOT allografts are usually not as clean or free from inflammation as allografts in chimerically tolerant recipients. Rather than being totally different, however, the two approaches are qualitatively similar, differing quantitatively in the underlying mechanisms, such as deletion and (micro-)chimerism, that contribute to long-term allograft survival [12,14].
These approaches also fit well with tolerance as recently defined by Girlanda and Kirk. ‘True tolerance’ refers to the absence of any detectable detrimental immune response as well as the absence of immunocompromise. ‘Operational tolerance’ refers to the gross phenotype of tolerance in the presence of an immune response and/or immune deficit that has no significant clinical impact. For the histopathologist, the difficult phrase in the operational tolerance definition is ‘…that has no significant clinical impact’. This is not so easily determined and will be discussed in greater detail subsequently. For this review, we excluded pathology material from so-called prope tolerance studies [16,17] and reports of late rejection occurring in patients with low immunosuppression levels, because it was difficult to determine what exactly constituted low-level or minimal immunosuppression.
Instead, this review is based on a literature survey and on personal observations from studies in which complete weaning from immunosuppression was the intended goal or which provided some insight into the weaning process. It focuses primarily on studies of human SOT; detailed mechanistic studies in experimental animals are beyond the intended scope, except where they serve to illustrate a point relevant to human material. We apologize in advance that many of the histopathologic observations discussed are, owing to trial design and the material available, anecdotal and descriptive. But currently, that is the state of the field.
All allografts are not created equal
Spontaneous operational tolerance in conventionally treated recipients is, by far, most commonly observed in liver allograft recipients. Most clinical trials that attempt to prospectively wean human recipients from immunosuppression are conducted in liver allograft recipients, and these also show the highest rate of success (see below). The traditional, and probably the most accurate and authoritative, explanation for this success is so-called ‘hepatic tolerogenicity’ (reviewed in Benseler et al. and Crispe et al.). This refers to the liver’s unique role as an immunologic organ. Examples include: (i) oral tolerance, or the observation that systemic immune responses to any particular antigen are significantly less robust if the antigen is fed orally beforehand; (ii) spontaneous acceptance of fully MHC-mismatched liver allografts without immunosuppression in many animal species, excepting humans and primates; (iii) the ability of liver allografts to protect other, extrahepatic, allografts from rejection if the latter are derived from the same donor; and (iv) the ability of the liver to protect central immune organs from overstimulation by gut bacteria, bacterial products, and other antigens that normally leak through the intestinal barrier. A detailed discussion of the various mechanisms of hepatic tolerogenicity is beyond the scope of this review. These include the release of soluble MHC antigens; migratory passenger leukocytes and activation of recipient lymphocytes in secondary lymphoid tissues; microchimerism; hepatic dendritic cell immaturity; activation of naïve T cells and purging of cytotoxic cells within the liver; and stimulation of regulatory T cells (reviewed in Benseler et al. and Crispe et al.).
There are, however, other reasons why liver allograft recipients are better candidates for weaning studies than other conventionally treated solid organ allograft recipients. First, the vast majority of acute cellular rejection episodes, regardless of severity, are not life- or allograft-threatening, do not produce significant morbidity, and are easily reversible with current immunosuppressive medications [20,21]. Second, reversal of rejection is usually complete: the allografts heal without significant fibrosis, architectural distortion, or loss of function because of robust hepatic regeneration [21–23]. Even the early phases of chronic rejection are reversible in the liver. Therefore, if a liver allograft recipient develops acute or early chronic rejection during or after weaning, the process is likely to be completely reversible without significant sequelae [21,24]. But there are exceptions, and weaning is not risk-free (Table 1). This is in contrast to cardiac allografts, where severe acute cellular rejection might be lethal. In pancreas, lung, and renal allografts, significant acute rejection more frequently results in irreversible scarring, architectural distortion, and permanent loss of function. Intestinal allografts can also heal without significant fibrosis, but severe acute rejection is usually more difficult to reverse and is accompanied by significant morbidity. Third, liver injury test parameters are more sensitive indicators of injury than are standard function tests for other organs, for example, serum creatinine in kidneys, pulmonary function tests in lung allografts, or symptoms of decreased cardiac output in heart allografts. Finally, liver allografts are more resistant to antibody-mediated rejection than are other solid organ allografts.
Table 1. Pre- and postweaning duration, and histopathologic diagnosis in follow-up liver tissue samples from liver allograft recipients withdrawn from immunosuppression.
No. of patients off IS/attempted
Histopathologic diagnoses in postweaning biopsy specimens
*Most frequently consists of mild ‘nonspecific’ portal inflammation and steatosis.
†Vast majority of acute rejection episodes were Banff mild to moderate. Two patients from the study of Tryphonopoulos et al. developed chronic rejection; one required re-transplantation. Three patients in the study of Mazariegos et al. developed early chronic rejection that stabilized after a return to immunosuppression. Three patients in the study of Sandborn et al. developed CR and two died. One patient in the studies of Devlin and Girlanda et al. required re-transplantation because of CR.
‡Pathology results were not available for all patients in this study because some samples were submitted for immunofluorescence and PCR analysis.
§Not all patients were routinely subjected to follow-up biopsies after withdrawal of immunosuppression.
¶Overlaps with the study of Ramos et al., but with longer follow-up.
**Seven patients were treated for rejection without biopsy.
††Four patients developed ‘portal inflammation’ with elevated liver injury test parameters, not necessarily diagnostic of rejection, but were returned to immunosuppression.
‡‡Forty rejection episodes were clinically suspected and 30 were biopsy-proven.
§§Overlapping patient populations.
***Focal ductopenia involving <20% of portal tracts was observed in occasional recipients, but criteria for chronic rejection were felt not to be present.
†††Yoshitomi et al. reported a decrease in size and an increase in number of bile ducts, together with fibrosis, which they attributed to a possible variant of chronic rejection.
The reported experience of SOT in conventionally treated liver- and kidney allograft recipients is shown in Tables 1–3. One study from our center that included 50 kidney-, 17 liver-, 14 pancreas-, and 11 intestinal allograft recipients treated with leukocyte-depleting antibodies was not included in these tables because long-term follow-up has not yet been tabulated and the patients were not entirely immunosuppression-free at the time of publication. Spaced weaning leading to a significant reduction in immunosuppression, however, was achievable in a majority of surviving recipients. The kidney cohort in that study overlaps with a series reported subsequently by Shapiro et al. with longer follow-up.
Table 2. Retrospective studies of ‘spontaneous’ kidney allograft acceptance/tolerance.
Three patients subjected to Bx: one tolerant and two patients on minimal steroids showed mild patchy interstitial lymphocytic inflammation forming occasional small clusters, but without tubulitis or vascular injury. Mild arterial nephrosclerosis, focal glomerular lobular accentuation and global glomerulosclerosis involving up to 10% of glomeruli
In testable cases, lymphoid infiltrates were of recipient origin; endothelium and tubules were primarily of donor origin; one case showed recipient-type cells in the mesangium
Preweaning biopsy showed focal infiltrate and fibrosis, but no tissue damage; F/U biopsy 7 years later showed focal infiltrates but no tubulitis or evidence of acute rejection. Biopsy at ca. 8 years showed acute rejection; details not given
Biopsies obtained from two SOT patients after F/U of 10.3–18 years. Both showed lymphoid aggregates and scattered interstitial mononuclear cell infiltrates without tubulitis, allograft vasculopathy, interstitial fibrosis, or tubular atrophy
Numerous CD4+/TGF-β1+/CD25+/−/FoxP3− cells in the interstitium; TGF-β1−/FoxP3+/CD25+ cells mainly in lymphoid aggregates
Roussey-Kesler et al. and Brouard et al.
Biopsies from two recipients: (i) after 13 years of SOT renal dysfunction prompted a biopsy that showed grade I CAN with mild nephroangiosclerosis without significant lymphoid infiltration or specific changes suggestive of chronic rejection (C4d-) and (ii) after 7 years of SOT renal dysfunction prompted dialysis and a biopsy that was negative for acute rejection, but showed focal fibro-edema associated with mild mononuclear infiltration, and double contours in GBM and moderate arteriolar hyalinosis (not shown) diagnosed as grade Ib CAN with allograft glomerulopathy (C4d-)
Table 3. Prospective studies of ‘induced’ kidney allograft acceptance/tolerance.
Allograft findings/histology, if available
IPEX and other allograft tissue studies
BMTx, bone marrow transplant; GBM, glomerular basement membrane; IPEX, immunoperoxidase tissue staining; IS, immunosuppression; MHC, major histocompatibility complex; TLI, total lymphoid irradiation; Tx, transplant; m, MHC matched donors; mm, MHC mismatched donors.
Kidney transplants after MHC-identical* or parent-to-offspring (haploidentical)† BMT from the same donor.
MHC-mismatched cadaveric renal Tx (Strober et al.) or living-related kidney Tx (Scandling et al.) after TLI, lymphoid depletion, and donor hematopoietic cell infusion
ca. 3 years
An obstructed ureter prompted a Bx at 10 m after IS withdrawal, which showed normal glomeruli and blood vessels and an occasional focus/cluster of interstitial mononuclear cells. A subsequent episode of obstruction was followed by increased creatinine, and a second biopsy showed a diffuse mononuclear-cell infiltrate consistent with either chronic obstruction or rejection. Another patient had a ‘normal’ Bx 20 m after withdrawal of IS. The most recent patient had normal function at 28 months, but no biopsy was reported
Simultaneous MHC-matched (Spitzer et al., Buhler et al., Fudaba et al.) and MHC-mismatched (Kawai et al.) living-related renal and BMTx using a nonmyeloablative regimen
6 m + 4/5 mm
MHC-matched donors: not reported in any detail. MHC-mismatched donors: one allograft lost to antibody-mediated rejection; one developed anti-donor HLA class II antibodies 2 months after complete IS withdrawal, with C4d deposits in the allograft and segmental duplication of the GBM in some glomeruli. Biopsies from the three other grafts were reported as normal and/or showing transient mononuclear cell infiltrates after IS withdrawal
MHC mismatched donors: intragraft levels of FoxP3 mRNA were about six times higher in the stable IS-free group than in the stable-with-IS group, whereas the granzyme B mRNA levels were similar. Therefore, the ratio of FoxP3:granzyme B might be important
Delayed living-related renal Tx after successful BMTx from the same donor: Sayegh et al.*, Butcher et al.*, Helg et al.*, Jacobsen et al.†, Sorof et al.†
Alemtuzumab depletion without (Kirk et al.) or with deoxyspergualin (Kirk et al.) in living-donor Tx
7 + 5
Atypical rejection developed in all 12 patients in both studies, characterized by macrophage-rich infiltrates that preceded T-cell extravasation; the infiltrates initially distended interstitial capillaries but subsequently became diffuse during clinical rejection, involving most of the interstitial capillaries, the interstitium, and tubules
IPEX staining for CD3, CD4, CD8, CD20, CD45RO, CD68, CD56, perforin, granzyme B, and HLA-DR showed macrophage-rich infiltrates during rejection with few CD45RO+ cells, no increase in NK cells, and upregulation of HLA-DR on tubules; transcripts associated with macrophage/monocyte function and chemokines were increased. C4d staining was negative in the second study
Thymoglobulin or Campath pretreatment followed by post-Tx tacrolimus monotherapy and spaced weaning (Shapiro et al.) in cadaveric Tx
Forty-five per cent of recipients developed rejection during weaning. Weaning was unsuccessful in about one-third of Thymoglobulin-treated recipients; two-thirds were on reduced/spaced tacrolimus therapy after a follow-up of 24–39 months. Seventy-four per cent of the Campath-treated patients were on spaced weaning after 12–18 months of follow-up. Protocol biopsies were not performed, but patient and graft survival and the rate of CAN progression were similar to those of historic, conventionally treated controls
Much of the data used to construct these tables are difficult to verify because individual tolerant recipients are often reported more than once and the same patients are not easily traced among the studies. Even so, the relative ease with which liver allograft recipients can be completely weaned from immunosuppression compared with kidney allograft recipients is obvious, especially if one compares the ratio of SOT to total transplants. SOT has been reported in the global literature in at least 49 kidney allograft recipients versus 148 liver allograft recipients (Tables 1–3). These numbers, however, are probably significantly lower than the actual number of SOT recipients, many of whom remain unidentified and/or unreported.
Weaning trial designs
Most prospective ‘weaning trials’ have been conducted in liver allograft recipients. The various trials were similar in design, comprising primarily conventionally treated and immunologically stable liver allograft recipients more than 2 years after transplantation, without technical complications, evidence of rejection, or significant allograft pathology (Table 4). The clinical perspective, including the details of initial immunosuppression, which differed somewhat among the studies, has been expertly reviewed elsewhere. Most trials weaned immunosuppression slowly over a period of months. Attempts at weaning earlier after transplantation were reported in recipients treated with lymphocyte-depleting antibodies at the time of transplantation.
Table 4. Design of immunosuppression withdrawal trials after liver transplantation and time until rejection, if it occurred.
Time until rejection
*Overlaps with the population of Ramos et al.
†Overlaps with the population of Takatsuki et al.
Adult, cadaveric donors; >5 years post-transplant; >2 years without rejection; history of medical compliance; immunosuppression-related complications; primary physician cooperation; absence of rejection or severe necro-inflammatory disease on liver biopsy
Adult, cadaveric donors; compared BM infusion group with controls to determine whether infusion augmented chimerism and whether augmented chimerism increased tolerance; >3 years post-Tx; stable liver function tests; rejection-free for 12 months; recipients with autoimmune disorders excluded
Pediatric, living-related donor; >2 years s/p Tx; >1 year rejection-free; normal liver function; parental permission
It is difficult to contest the premise of weaning trials that less immunosuppression without rejection is desirable. But only the study of Sandborn et al., who attempted to wean cyclosporine, included contemporaneous matched controls maintained on conventional immunosuppression to determine whether withdrawal from immunosuppression was indeed beneficial overall. One study from our center included a comparison with a historic control group. Other studies compared immunosuppressant-dependent (failed weaning) with immunosuppressant-free (successful weaning) recipients [31–33]. In general, no specific molecular mechanistic hypothesis was tested in these weaning trials other than that microchimerism and long-term allograft acceptance under immunosuppression are conducive to immunosuppression-free allograft acceptance. Consequently, the data collected differed somewhat among the studies. Ideally, in conventionally treated recipients, one would compare a ‘weaning’ group with a maintenance immunosuppressive therapy group and include both potentially positive and negative endpoints, such as the incidence of acute and chronic rejection and development of graft fibrosis over time, the incidence and severity of immunosuppression-related complications (renal failure, diabetes, cardiovascular disease, malignancies), and the cost of medications.
The data on the majority of SOT kidney allograft recipients have been derived from anecdotal reports of individual patients who were either noncompliant or had anti-rejection medication withdrawn because of immunosuppression-related complications (Table 2). Again, no specific hypothesis was tested in these reports other than the possibility that immunosuppression weaning might be feasible. In contrast, studies attempting to induce tolerance through macrochimerism were all conducted prospectively and tested the hypothesis that hematopoietic chimerism would lead to allograft tolerance in outbred humans (Table 3). The approaches included: (i) a nonmyeloablative preparatory regimen and simultaneous MHC-matched [3–5] or -mismatched bone marrow and living-related kidney transplants; (ii) delayed renal transplantation after successful bone marrow transplantation from the same living-related donor using myeloablative therapy [6–10]; and (iii) MHC-mismatched renal transplantation after total lymphoid irradiation, lymphoid depletion, and donor hematopoietic stem cell infusion [35,36].
Three other prospective kidney trials are included in Table 3. Two by Kirk et al. [37,38] used alemtuzumab leukocyte depletion both without and with coexistent deoxyspergualin therapy, but without other immunosuppressants, in related and unrelated living donors. Shapiro et al. used Thymoglobulin or alemtuzumab depletion plus tacrolimus monotherapy with fully mismatched cadaveric donors. These studies differed in the timing and dosage of alemtuzumab. Kirk et al. [37,38] did not use any baseline immunosuppression except deoxyspergualin in their second study, whereas Shapiro et al. relied on tacrolimus monotherapy, which was weaned shortly after transplantation. They were also, however, testing the hypotheses that depletion of the recipient immune system would create favorable conditions for the development of tolerance [37,38] and that donor leukocyte migration might contribute positively to this process through the induction of chimerism.
Clinical and detailed histopathologic observations
This section will follow the observations during enrollment and follow-up of patients participating in immunosuppression minimization trials.
Preweaning clinical profiles and biopsies
In most, but not all, prospective SOT liver and kidney allograft immunosuppression minimization trials, ‘preweaning’ biopsies are obtained. The liver injury test parameters and serum creatinine are usually normal or near-normal, but minor abnormalities are not uncommon. The purposes of the biopsy are to: (i) exclude any histopathologic rejection-related activity or other findings, such as significant fibrosis, that might exclude the patient from the trial and (ii) document any baseline inflammatory and/or structural changes present before withdrawal so that they can be compared with findings in subsequent biopsies. The rationale for these biopsy-based exclusions is as follows. Low-level subclinical rejection is likely to significantly worsen after weaning, and any additional insult to an already structurally compromised allograft would likely lead to failure. In addition, any subsequent change in allograft structure might represent either a heretofore unrecognized manifestation of rejection or a beneficial effect of immunosuppression withdrawal.
Most ‘preweaning’ liver allograft biopsies are obtained several years after transplantation and show changes that are typical of protocol biopsies obtained at that time. These biopsies are often difficult to interpret and were the subject of a recent Banff consensus document. Nearly 75% of biopsies obtained from adult recipients surviving more than 1 year ‘with abnormal liver tests’ will show histopathologically significant abnormalities [40–45], which are usually attributable to recurrent disease or biliary tract strictures [40–45]. The percentage is significantly less in pediatric recipients because recurrent disease is much less common. However, unexplained chronic hepatitis/inflammation is seen in a high percentage of pediatric recipients at some centers, and this might represent a form of late rejection [46,47]. In addition, nearly 25% of biopsies from long-surviving ‘asymptomatic adult recipients with normal liver tests’ will show significant abnormalities if the original disease is one that commonly recurs, such as hepatitis C virus (HCV) infection, steatohepatitis, primary biliary cirrhosis, or autoimmune hepatitis [40–45], and in up to 11% of recipients the pathology findings were judged to be of clinical significance.
Other minor histopathologic abnormalities occur in about two-thirds of long-term biopsies, even without recurrent disease, in asymptomatic recipients with normal or near-normal liver tests [40–45]. Common findings are portal venopathy and nodular regenerative hyperplasia; thickening and hyalinization of small hepatic artery branches [43,49]; ‘nonspecific’ portal and lobular inflammation [43–45,50]; and Ito cell hyperplasia. A higher percentage of split and living donor allografts also show architectural changes compared with whole cadaveric organs (A. Demetris, unpublished observation). The pathogenesis, significance, long-term consequences, and impact of weaning on these otherwise unexplained, long-term histopathologic findings need further study. It is important in drug minimization trials that changes associated with long-term engraftment not be confused with variants of rejection after weaning, which reinforces the need for preweaning biopsies.
It is worth emphasizing that original disease recurrence is a significant problem in adult liver transplantation, accounting for about 50% of all episodes of allograft dysfunction occurring more than 1 year after transplantation. In contrast, biliary atresia is the indication for the vast majority of pediatric liver transplants. As this disease does not recur after transplantation, interpretation of the results of pediatric weaning studies, such as those from the Kyoto group [32,51,52], is less complicated from the perspective of recurrent disease.
In kidney allografts, interpretation of baseline pathology changes is generally much less complicated, and the findings are usually attributable to age and hypertension, calcineurin inhibitor toxicity, and/or diabetes-related changes, such as patchy interstitial fibrosis/tubular atrophy, mild mononuclear interstitial inflammation, and arterial and arteriolonephrosclerosis of varying severity. In contrast to liver allografts, recurrence of the original disease is much less common and rarely impacts the decision to continue with weaning. Theoretically, however, for weaning studies the preweaning biopsy should show no evidence of active tubulitis or other obvious rejection-related changes and should have negative C4d stains. Weaning has nevertheless been attempted in recipients showing borderline changes when serum creatinine levels were near-normal and/or stable. It is very difficult, however, on the basis of routine light microscopy, to absolutely and reliably distinguish between borderline changes and nonspecific inflammation associated with aging and arterial and arteriolonephrosclerosis. Some of these patients did not develop more severe rejection after weaning, but characterization of the infiltrates, with formal long-term follow-up studies, is needed to determine the impact of this decision.
Clinicopathologic observations during and shortly after weaning
In most weaning trials, allograft biopsies were obtained only when there was an elevation in liver injury test parameters for liver allografts or serum creatinine for kidney allografts. However, at the liver sessions of the 2007 Banff consensus meeting, which were devoted to late allograft dysfunction and weaning of immunosuppression, all participating hepatologists and surgeons agreed that protocol follow-up biopsies should be mandatory, or at least strongly encouraged, in such trials. Arguments raised by Roussey-Kesler et al. against conducting protocol biopsies in stable SOT kidney allograft recipients included the possibility that minimal histopathology findings might be misleading and result in an unjustified return to immunosuppression. In addition, the need to conduct serial biopsies to monitor possible progression of subtle findings carries a risk of morbidity, and weaned, stable allograft recipients might not want to undergo biopsy evaluation. Counter-arguments to these reasonable concerns are that biopsy findings and interpretation should be viewed like any other laboratory test result and incorporated into the entire clinicopathologic profile. In addition, much can be learned from the biopsy material, particularly as there is evidence to suggest that in humans the allograft plays an important role in the maintenance of tolerance.
Elevation of the liver injury test parameters in liver allograft recipients or serum creatinine in kidney allograft recipients is not uncommon and usually occurs within the first several months during or after drug withdrawal [22,23,27,52,54–57]. In liver allograft recipients, however, elevated liver injury test parameters did not distinguish between those who developed acute rejection and those who did not [22,23,57]. This is because biopsies obtained for elevated liver injury test parameters showed a variety of changes, including recurrence and/or exacerbation of underlying chronic viral or autoimmune hepatitis, primary biliary cirrhosis, biliary tract complications, steatohepatitis, nodular regenerative hyperplasia, and nonspecific ‘lobular reactive changes’ (Table 1). This illustrates that immunosuppression also prevents immunologically mediated liver injury other than rejection. In addition, mildly elevated liver injury test parameters occasionally returned to normal without therapeutic intervention after a biopsy had largely excluded acute rejection as the cause of the dysfunction [22,23].
Persistent significant elevations of liver injury test parameters (3× baseline values), however, usually signal the development of a clinically relevant problem. Of the various enzyme measurements comprising the standard liver injury test profile, gamma-glutamyl transpeptidase (GGTP) elevations were felt to be the most specific and sensitive for rejection in two studies from different centers [23,52]. In contrast to its value in the early post-transplant period, total serum bilirubin was a relatively insensitive marker for acute rejection developing after weaning [22,23,52,57].
When acute rejection was identified as a cause of liver allograft dysfunction during or after weaning, in our experience, the histopathologic appearance of most, but not all, cases was typical of that reported for late onset acute rejection (reviewed in ). Most of these episodes were graded as mild, but occasional moderate to severe episodes were reported (Table 1). In several liver studies, however, even when biopsy analysis had excluded other causes of allograft dysfunction associated with a clinical diagnosis of acute rejection, the histopathologic findings were not always typical of those reported for acute rejection [22,23,57,58]. There are at least four possible reasons for this observation: (i) acute rejection occurring late after transplantation (>1 year) in liver allografts differs from that occurring earlier, within the first several months of transplantation, even in conventionally immunosuppressed recipients (reviewed in ); (ii) the composition of both the allograft and the recipient immune system differs at early versus late time points after transplantation; (iii) immunosuppressive regimens used before weaning, such as lymphocyte-depleting antibodies, can alter the histopathologic appearance of acute rejection after weaning; and (iv) understandable clinical anxiety about elevated liver injury test parameters occurring in conjunction with weaning might trigger therapeutic intervention before characteristic histopathologic changes have time to develop.
Portal and/or perivenular inflammation is almost invariably observed. The major histopathologic differences between acute liver allograft rejection occurring after weaning and ‘typical early acute rejection’ include less inflammatory bile-duct damage and more interface and lobular necro-inflammatory activity in the former. These differences cause the biopsies obtained after weaning to resemble hepatitis, which, in turn, creates some diagnostic difficulties for the pathologist [22,23,57,58]. Increased interface and disease activity is also seen after weaning in patients whose original disease was autoimmune hepatitis  or primary biliary cirrhosis , some of whom develop new-onset autoimmune hepatitis. In addition, early and rapid weaning of immunosuppression in HCV+ recipients treated with lymphocyte-depleting antibodies can ‘re-arm’ the immune system . This manifests histopathologically as an aggressive hepatitis with rapid progression of fibrosis . The important message from the above observations is that rejection is not the only cause of allograft dysfunction after weaning (Table 1).
No standardized reliability studies have been conducted on biopsy samples obtained in the setting of immunosuppression weaning to determine whether pathologists agree in their interpretation, because of the rarity of such samples and the anxiety associated with the clinical decision-making process. Such studies, however, are important and will be needed, particularly to define changes that might signal a need to return to immunosuppression versus ones that are probably benign and nonprogressive. In the meantime, in the first author’s experience, reliance on standardized criteria is suggested .
Acute and chronic kidney allograft rejection occurring during or after weaning, in our experience, has not differed significantly enough from that usually seen in conventionally immunosuppressed patients to cause diagnostic difficulties for the histopathologist. Development of anti-donor antibodies in the peripheral circulation and C4d deposits in the kidney, however, usually signals a need to return to immunosuppression if the deposits are accompanied by histopathological evidence of significant tissue injury. Recurrence of the original disease can also be the primary cause of kidney allograft dysfunction after weaning, but the incidence is much lower than in liver allografts (unpublished observation).
The leukocyte-depleting alemtuzumab studies of Kirk et al. [37,38] nicely illustrate how the treatment strategy can influence the histopathologic findings. In the first study, no other baseline immunosuppression was used . The histopathologic findings during the clinical rejection that subsequently developed in all recipients, several weeks after transplantation, were not typical of early acute cellular rejection in conventionally treated renal allograft recipients. Instead, chemokine- and macrophage function-rich transcripts were detected in needle biopsies early after transplantation, accompanied by margination of macrophages in interstitial capillaries on day 14. This occurred before the onset of significant T-cell infiltration and clinically evident rejection. Macrophage infiltrates became more diffuse at the onset of clinical dysfunction, involving most of the interstitium, capillaries, and tubules. Small numbers of CD45RO+ (memory) T cells were limited to the areas of macrophage infiltration, and tubulitis, when seen, was macrophage-predominant. Treatment with corticosteroids, OKT3, and sirolimus monotherapy reversed these episodes. A follow-up study added deoxyspergualin to the regimen in an attempt to inhibit macrophage function . However, neither the clinical results nor the histopathologic features of rejection were significantly different from the study using alemtuzumab alone , and C4d stains to monitor for antibody-mediated rejection were negative in the second study .
A general consensus in most kidney- and liver-weaning studies is that biopsy monitoring to determine the cause of dysfunction after weaning is an absolute necessity. Close clinicopathologic correlation, however, is even more important. In our experience, unbalanced emphasis on either the histopathologic findings or the clinical profile can adversely impact the scientific validity of the study and/or patient safety. The usual situation is that the pathologist is the worrier, whereas the clinicians are more reassured by stable liver injury test parameters or creatinine, although the reverse can also occur. As the truth is often somewhere in between the two viewpoints, one should override the other only when findings are obvious, or there is evidence of a clear trend over a period of time. Long-term follow-up often provides a clear indication of whether the chosen approach was correct or not.
Correlation between preweaning biopsy findings and outcome
Baseline biopsy findings in some liver studies proved to be associated with successful weaning when compared with biopsies from unsuccessfully weaned patients. Significant variables included: (i) less portal inflammation, overall; (ii) less CD3+ and CD8+ but more CD45RO+ lymphocytes within the lobules ; (iii) more advanced portal fibrosis in HCV+ recipients ; and (iv) an increase of potentially regulatory FoxP3+ T cells within the allografts of pediatric recipients [32,60]. Unsuccessful weaning, conversely, was associated with significantly more chronic portal inflammation on hematoxylin and eosin (H&E) stains and decreased CD45RO+ as well as increased CD8+ lymphocytes in the lobules . These observations suggested that chronic portal inflammation and lobular CD8+ cells might represent a latent form of rejection held in check by medications, which manifests itself clinically after the removal of immunosuppression.
A worry is that seemingly ‘tolerant’ patients might actually be experiencing low-grade chronic rejection. For example, in one liver series, five patients were categorized initially as showing SOT . During longer follow-up, however, one recipient developed acute rejection requiring reinstitution of immunosuppression, another required re-transplantation for chronic rejection, and a third resumed immunosuppression because of a kidney transplant . We appear to have observed similar occurrences in liver allograft recipients, but more characterization of tissue samples is needed (A. Demetris, unpublished observations). Similar findings have been reported in SOT kidney allograft recipients, so it is prudent to continue to closely follow seemingly tolerant patients.
Studies examining associations between preweaning kidney biopsy findings and postweaning outcome have not been reported.
Clinicopathologic observations in stable SOT recipients
Many centers do not sample the allografts of SOT recipients if they are ‘clinically stable’; instead, biopsies are obtained only when indicated by elevated liver injury test parameters or serum creatinine. Consequently, the number of reported protocol tissue samplings in stable SOT recipients who remain immunosuppression-free after biopsy is exceedingly small, considerably smaller than the total number of SOT recipients. In total, ‘more or less’ protocol biopsies were obtained from eight SOT kidney recipients and from six chimeric bone marrow plus kidney recipients and reported in the literature (Tables 2 and 3). The total number of liver allograft biopsies from SOT recipients is more difficult to tabulate because different reports frequently contain overlapping patient populations (Table 1); the number appears to be between 100 and 200. This is somewhat disappointing because protocol biopsies from SOT patients can provide clinically and scientifically useful information.
There have been three studies, two in liver [14,58,62] and one in kidney transplantation [63,64], that have characterized the donor/recipient phenotype of cells infiltrating and comprising SOT allografts. In liver allografts, the vast majority of hepatocytes, bile duct cells, and large vessel endothelia remain of donor origin, as do the tubular epithelial cells and endothelial cells of kidney allografts [63,64]. The majority of infiltrating leukocytes, however, were of recipient origin [14,58,62–64], although donor hematopoietic cells can also be detected amidst the interstitial inflammation in some nearly SOT kidney allografts on low-dose immunosuppression . The significance of persistent donor hematopoietic cells within the allograft, and whether they predict subsequent acceptance, has not been studied in any detail. In SOT liver allografts, some replacement of sinusoidal lining cells can be seen, but it is difficult to distinguish between Kupffer cells and endothelial cells, and the level of sinusoidal cell replacement did not correlate with the ability to wean immunosuppression .
No long-term follow-up biopsies were conducted in the SOT kidney allografts, and because of the small numbers, it is difficult to draw any conclusions regarding an association with weaning. At this time, however, the evidence suggests that recipient replacement of donor epithelial or endothelial cells within the allograft is not a substantial mechanistic contributor to the development of SOT. Whether persistence of donor hematopoietic cells, including dendritic cells (DC), in the interstitium of allografts is associated with acceptance, as in experimental animals , is being actively investigated in our SOT tissue samples.
The Kyoto group conducted protocol biopsies in 14 pediatric living-donor liver allograft recipients who had been weaned from all immunosuppression; these biopsies were compared with biopsies from control liver allograft recipients maintained on chronic immunosuppression . The SOT immunosuppression-free recipients showed more extensive portal fibrosis, more prominent ductular reactions, more CD8+ cells, and decreased luminal diameter of bile ducts [32,51]. The authors worried that the changes observed in the SOT recipients might represent a subtle, heretofore unrecognized, variant of chronic rejection [32,51]. Some of their concern is warranted because the significant fibrosis and increased CD8+ lymphocytes are similar to the findings reported by Wong et al. , above. The mean follow-up in the Kyoto SOT group, however, was several years longer than that in their control group. This raises some questions about the etiology of these changes, which are not entirely typical of either early or late, acute or indolent chronic rejection. As the influence of longer-term engraftment, regardless of immunosuppression, needs to be considered, more follow-up and detailed characterization of the changes are needed in this cohort.
Tisone et al.  studied the effect of immunosuppression weaning in HCV+ recipients and conducted protocol biopsies at 1 month after completion of weaning and yearly thereafter. Interestingly, successfully weaned patients initially showed more advanced fibrosis in baseline biopsies than immunosuppression-dependent HCV+ recipients. After weaning, fibrosis failed to progress significantly, or actually regressed, in patients removed from immunosuppression . In contrast, the immunosuppression-dependent HCV+ recipients showed fibrosis progression typical of conventionally treated HCV+ recipients. Thus, complete weaning of immunosuppression showed a beneficial effect on HCV-induced fibrosis progression in one patient subset . They also mentioned that focal ductopenia, a histopathologic finding of concern for early chronic rejection, was occasionally observed in protocol biopsies from the SOT patients. It was, however, always limited to less than 20% of the portal tracts, which is of uncertain significance. Once again, however, longer follow-up is needed in this cohort to make sure that early chronic rejection does not occur. But it is reassuring that this group did not show significantly elevated GGTP levels (a biochemical marker of ductopenia) as compared with the immunosuppression-dependent controls .
A common finding reported in SOT kidney allografts is that of patchy interstitial inflammation that is often arranged into small nodular aggregates [63,64,66,67] (Tables 2 and 3). Some biopsies have been reported as normal. Other findings include mild arterial nephrosclerosis, focal global glomerulosclerosis, grade 1 chronic allograft nephropathy with mild ‘nephroangiosclerosis’, moderate arteriolar hyalinosis and double contours of the glomerular basement membrane indicative of transplant glomerulopathy (Table 2). Most of these findings, however, are largely nonspecific from a light microscopic perspective and are commonly encountered in aged and/or hypertensive or diabetic kidneys and those with calcineurin toxicity. A possible exception is some of the transplant glomerulopathic changes, which might signal a form of antibody-mediated injury.
Xu et al.  characterized the patchy tubulointerstitial lymphocytic infiltrates in two SOT kidney allografts after 10.3 and 18 immunosuppression-free years. They found the interstitial infiltrates to be enriched with CD4+/transforming growth factor (TGF)-β1+/CD25±/FoxP3− adaptive regulatory T cells (Treg) and the lymphoid aggregates to be enriched with TGF-β1−/FoxP3+/CD25+ natural Treg. Several years earlier, Burlingham et al.  reported a SOT kidney allograft recipient who showed similar findings (i.e. patchy interstitial infiltrates without damage) in a preweaning biopsy. The patient remained stable during follow-up, and a biopsy after 7 immunosuppression-free years was unchanged. The serum creatinine, however, gradually increased from 1.6 to 1.8 to 2.0 mg/dl and the patient eventually developed biopsy-confirmed acute rejection 9.7 years after transplantation [68,69].
Roussey-Kesler et al.  reported 10 SOT kidney allograft recipients after 9.4 ± 5.2 immunosuppression-free years. Most of these patients had interrupted weaning of immunosuppression over a long period of time, and the donors were younger than those used in the general transplant population. One patient, after 13 years of SOT, developed renal dysfunction. A biopsy showed grade I chronic allograft nephropathy with mild nephroangiosclerosis without significant lymphoid infiltration or specific changes suggestive of chronic rejection. C4d staining was negative and no anti-HLA antibodies were detected in the circulation. Renal function also deteriorated progressively in another patient, requiring dialysis. An allograft biopsy in this patient, performed after 7 immunosuppression-free years, showed grade Ib chronic allograft nephropathy with allograft glomerulopathy, but without C4d staining.
The most impressive and carefully documented series of biopsies from tolerant kidney allograft recipients was reported by Kawai et al. . They induced tolerance using combined bone marrow and kidney transplants from MHC single-haplotype mismatched living-related donors with a nonmyeloablative preparative regimen. Of the five patients enrolled in that trial, one allograft was lost to antibody-mediated rejection. Another developed anti-donor HLA class II antibodies 2 months after complete immunosuppression withdrawal; biopsies from this patient showed C4d deposits and segmental duplication of the glomerular basement membrane in some glomeruli. The patient was not returned to immunosuppression because of uncertainty about the significance of the relatively minor changes, which did not worsen over a period of time. Protocol biopsies from the three other grafts, obtained between 666 and 1135 days after transplantation and about 400 to 1000 days after withdrawal of all immunosuppression, were reported as normal and/or showing transient mononuclear cell infiltrates; C4d stains were negative. Intra-graft levels of FoxP3 mRNA were about six times higher in the stable immunosuppression-free group than in the stable-with-immunosuppression group, whereas the granzyme B mRNA levels were similar. Therefore, the FoxP3:granzyme B ratio might be an important marker of a favorable Treg–Teffector ratio and allograft acceptance.
Lessons learned and common characteristics of spontaneously/operationally tolerant allografts
Common clinical characteristics of successful weaning that emerge from this review of SOT liver and kidney allografts include living-related allografts, immunologically stable/noninflamed allografts, and long survival, in situ, under conventional immunosuppression with gradual weaning of immunosuppression over months to years (Tables 1–3). Conversely, recipients weaned early and abruptly, and those with nonrelated cadaveric allografts or previously inflamed allografts, are more likely to experience rejection after weaning. The ‘take home’ messages reported in the liver trials are shown in Table 5. These observations/lessons are beginning to point toward immunologic processes associated with graft acceptance, and eventually, these will translate into molecular pathways; currently, however, the field is in its infancy.
Table 5. ‘Take home’ messages of the liver immunosuppression minimization trials.
Micro-chimerism is frequently observed in long-term liver allograft survivors
Not all recipients require long-term maintenance immunosuppression
‘Tolerant’ recipients/accepted allografts can show inflammation/hepatitis not attributable to rejection
Enzyme elevations typically occurred about 150 days into the weaning process, but not all were associated with rejection
Close monitoring needed; liver injury test parameters not an adequate monitor, but weaning is safe: no allograft failures or permanent damage
Weaning should not be attempted until 5–10 years after transplantation; micro-chimerism not necessarily associated with acceptance
Acute/chronic rejection had typical presentation; sometimes preceded by ‘nonspecific lobular changes’
Close physician surveillance during weaning with frequent assessment of liver function; weaning should not be abrupt/quick
LFTs not a good discriminator of rejectors versus tolerant patients; the patient should be biopsied and returned to immunosuppression, if needed
Cyclosporine-treated recipients more resistant to weaning than those treated with tacrolimus or azathioprine
Devlin et al. , Wong et al. , and Girlanda et al. 
Close physician surveillance is required; transient rise in liver injury test parameters not always indicative of rejection – can spontaneously resolve
Acute rejection that develops does not always show histopathologic features of ‘classic’ acute cellular rejection
Microchimerism not statistically associated with graft acceptance
Successful drug withdrawal correlated with nonimmune-mediated liver diseases, HLA matching, low incidence of early rejection
Ability to wean associated with less portal inflammation, fewer CD8+ lymphocytes and more lobular CD45RO+ lymphocytes
Weaning can be attempted in a majority of recipients; successful in up to 38.1% of living-related donor liver recipients
Liver injury test parameters were not significantly different in the rejection versus weaned groups
Mechanisms of graft acceptance unclear
‘Tolerance’/graft acceptance observed in 33% of recipients
Sinusoidal endothelial cell chimerism was frequent, but not necessary for graft acceptance
Portal inflammation without endothelialitis or bile duct damage might represent either ‘latent’ rejection or ‘immunologic activation’ associated with graft acceptance
Bone marrow infusion increases the level of microchimerism, but does not significantly increase the percentage of patients that can be weaned from immunosuppression
About 20% of stable liver allograft recipients can be weaned from all immunosuppression
Clinical ‘tolerance’/graft acceptance can be achieved in a minority of recipients
Weaning from immunosuppression can be risky
Tisone et al.  and Martinez-Llordella et al. 
Univariate analysis: longer F/U after Tx, treatment with ribavirin, less steroids, more advanced architectural distortion/fibrosis on entry biopsy, and lower first-week cyclosporine blood levels associated with ability to wean
Multivariate analysis: low cyclosporine trough levels during the first post-transplant week and initial steroid-free immunosuppression independently associated with ability to wean
‘Tolerance’/graft acceptance associated with lower fibrosis progression/regression after weaning
Differential expression of genes in circulating blood mononuclear cells associated with: (i) IL-2 signaling; (ii) pro-inflammatory, oxidative stress, apoptosis, etc. associated with HCV; (iii) upregulation of Vδ1γδ, NK receptors and TGF-β signaling; and (iv) increased percentage of FoxP3+ CD4+/CD25+/CD62Lhigh cells and increased Vδ1γδ:Vδ2γδ ratio
Koshiba et al. , Yoshitomi et al. , and Li et al. 
Recipients of living-related donors can be successfully weaned more frequently than recipients of mismatched cadaveric allografts
Baseline biopsies show increased infiltration by CD4+/CD25high cells, and peripheral blood shows an increased Vδ1/Vδ2 ratio as compared with normal individuals
Graft acceptance resembles successful pregnancy in that Vδ1γδ T cells express very high IL-10 levels
Tolerant grafts showed more portal fibrosis, ductular reactions, and decreased luminal diameter of bile ducts as compared with those maintained on immunosuppression; might be a variant of late-onset rejection
Problems with early abrupt weaning and the advantage of relatively long allograft residence under immunosuppression and slow weaning are all probably related to the immunologic interface between the donor and recipient. Early after transplantation, in conventionally treated recipients, this interface is an activated and contentious one because: (i) the massive migration of donor hematolymphoid cells and cellular debris (danger-associated molecules) from the allograft floods the recipient lymphoid tissues [70,71] and (ii) tissue damage from preservation-related injury  fosters recipient leukocyte migration and retention within the allograft. The migration of donor leukocytes and debris, particularly from liver allografts, provides innate activation signals that can have both positive and negative effects, such as activation and partial deletion of donor-reactive lymphocytes and/or development of allospecific memory cells [14,34,73,74]. This probably explains why more immunosuppression is needed to prevent rejection early after transplantation and why it is more difficult to wean immunosuppression at this time [2,14].
The immunologic barrier is overcome or subverted, however, while using the combined bone marrow or hematopoietic stem cell and kidney transplant approach . Part of the early success in this pioneering trial is likely related to the relatively harsh conditioning regimen; but it is also nonmyeloablative, and weaning from immunosuppression has been rapid and deliberate. As compared with other trials using more conventional immunosuppression, this approach also shows a higher overall rate of success, but currently it can be applied to only a limited subset of patients. Nevertheless, the high rate of success, convincing demonstration of donor-specific nonreactivity, and ‘cleanliness’ of the allografts , in our opinion, suggest that deletion has occurred in these patients, at least early after transplantation. And deletion results in more robust tolerance. As macrochimerism was only observed transiently in these patients , it will be interesting to determine whether the deletion, donor-specific nonreactivity and allograft cleanliness persist long-term.
Preservation injury eventually heals, donor passenger leukocyte migration diminishes, and most, but not all, hematolymphoid cells within the allograft are eventually replaced by recipient ones. And the recipient immune system is no longer the same as it was before transplantation. In SOT, however, the allograft also contributes significantly to acceptance because the organ (liver versus kidney) and prolonged exposure under treatment enhance the ability to ultimately wean immunosuppression. The role of the allograft in SOT, though, is not well understood and is evolving. Speculations include: (i) provision of a stromal niche for donor hematopoietic stem cells  and maintenance of microchimerism ; (ii) provision of donor antigen needed to stimulate adaptive Treg cells, which causes them to locate there [67,76]; (iii) a unique micro-environment, in the case of the liver, that variably dampens a number of different immune responses [18,19,77]; (iv) a sink for alloreactive cells slowly mediating chronic rejection; or (v) some combination of the above.
Other nonrejection-related insults, such as recurrence of the original disease and technical complications associated with inflammation, can either sustain or re-activate the contentious allograft/recipient immunologic interface. This, in turn, can predispose to rejection, even in seemingly SOT allografts. Examples include diminished ability to wean immunosuppression in patients with autoimmune hepatitis or primary biliary cirrhosis in liver allografts [22,23] and triggering of apparent rejection after an episode of obstructive uropathy  (Table 3). Also, HCV-negative liver allografts that are inflamed at the time of weaning are more prone to rejection. This is probably related to the alterations of leukocyte trafficking through the organ, which diminishes immunologic ignorance.
It is not surprising that recipients of living-related allografts are more easily weaned from immunosuppression than nonrelated cadaveric organs. They are usually better MHC-matched than cadaveric organs and generally experience less severe ischemic/preservation-related injury. And if the donor is the mother, oral exposure to maternal antigens through breast feeding might positively contribute to tolerance induction. Clearly, more work is needed in studying the relationship between innate and adaptive immunity in triggering rejection in stable SOT allografts.
Several studies showed the presence of mononuclear infiltrates in SOT kidney and liver allografts (Tables 1–3). Many completely normal nonallograft kidneys and livers show similar findings. But most transplant pathologists intuitively react with some level of concern because inflammation is so frequently associated with tissue damage, and formation of aggregates and/or germinal centers in tissues is a time-tested marker of chronic inflammation. Yet Xu and Burlingham  have reported, in humans, how these infiltrates might represent a ‘protective’ response in the allograft. Their observation of a Treg-rich infiltrate supports the hypothesis that peripheral allograft tolerance involves Treg dominance in the Treg–Teffector ratio homeostasis, as in experimental animals [76,78,79]. Their observation is also consistent with the finding that Treg localize in allograft tissue and at sites of inflammation . A higher Treg–Teffector ratio was also observed in the tolerant kidney allografts studied by Kawai et al. , and increased Treg were noted in the liver allografts of tolerant pediatric recipients, although a Treg–Teffector ratio was not reported .
It should also be noted, however, that nodular lymphoid aggregates have been used to distinguish chronically rejecting organs from seemingly tolerant ones in experimental animal studies [65,80]. But perhaps the quantity, composition or function of the lymphocytes/nodules differ between tolerance and chronic rejection, as in the peripheral circulation [81,82]. Or perhaps the two processes, tolerance and chronic rejection, are closely related and differ only in the severity and pace of the response in relation to the lifespan of the recipient: a 65-year-old liver allograft recipient who is slowly developing chronic rejection over a period of 20 years might be better regarded as tolerant than returned to maintenance immunosuppression. Regardless, better characterization of these infiltrates, and comparison with similar infiltrates in normal nontransplant tissues and in stable allograft recipients on immunosuppression, is needed. These seemingly benign infiltrates in tolerated organs appear to be related to the well-recognized affinity of adaptive Treg for allograft tissue and sites of inflammation [76,78,79]. But as TGF-β secretion plays an important role in their function , it will be important to determine whether regulation itself might produce pathology/fibrosis. T lymphocytes showing a regulatory phenotype, and producing significant TGF-β, were recently shown to be associated with IgG4-cholangiopathy, a fibrosing condition of bile ducts  that can affect other organs.
Another common characteristic of SOT in liver and kidney allografts is that it appears to be meta-stable and to evolve over a period of time. Seemingly minor perturbations can trigger clinically significant acute rejection episodes, even in patients who have been off all immunosuppression for many years. At least one study, however, suggests that the instability decreases with time . In addition, it is not entirely surprising that some apparently well-tolerated human allografts show features of chronic rejection after longer follow-up; this occurred in several renal allografts and at least one liver allograft recipient (Tables 1–3). And because we already know that liver injury test parameters and serum creatinine are not sensitive markers of tissue injury, some method of follow-up by protocol will benefit patient management and contribute to an understanding of mechanisms associated with allograft acceptance. The first author would certainly advocate protocol biopsies, even in stable SOT patients, at least until we understand the process better.
Roles of the pathologist, features of interest within tolerated allografts, and sampling/testing recommendations
The pathologist will be asked to play two roles in this emerging field of immunosuppression minimization. The first, and most important, will be a clinical one in monitoring allograft acceptance and ‘helping in decision-making, but not unilaterally deciding,’ whether a particular recipient needs to be returned to immunosuppression. To successfully play this role, the pathologist has to be able to distinguish all of the variants of antibody- and cell-mediated rejection that might require a return to immunosuppression from changes associated with long-term engraftment, recurrent disease, and technical complications where immunosuppression might not be indicated. Furthermore, there are likely to be findings of uncertain significance, and these will require follow-up over a period of time. As with any new pathology endeavor, limiting biopsy analysis and interpretation to one or a small group of pathologists with a specific interest in immunosuppression minimization will decrease observer variation.
Thus, at a minimum, samples that should be obtained in any weaning study include: (i) indicated biopsies to determine the cause of any allograft dysfunction before weaning; (ii) protocol biopsies immediately before weaning in stable recipients; (iii) indicated biopsies in recipients who develop any significant evidence of graft dysfunction after weaning; and (iv) protocol biopsies in stable recipients after weaning. The schedule for, and even whether to obtain, protocol biopsies in stable recipients off all immunosuppression is controversial. In the first author’s opinion, at least one sample after 1, 3, and 5 immunosuppression-free years is reasonable, although defensible arguments can be made for more or less frequent sampling. It is ideal to also have donor and/or postreperfusion biopsies available to determine whether early events, such as significant donor disease or preservation/reperfusion injury, affect the ability to wean subsequently.
Routine light and histochemical microscopic examination, appropriate to the organ, is mandatory: it provides a wealth of information, is fast and inexpensive, and rests on abundant empirical data. We attempt to preserve as much tissue as possible for future experimental studies and routinely obtain H&E stains alone in liver allografts and H&E, methenamine–silver–trichrome (MST) combination, PAS, and C4d stains in kidney allografts. Fibrosis can be reliably assessed by polarization microscopy. Beyond these tests, more sophisticated (experimental) analyses must balance various limitations, such as sample size, the potential yields of the testing modalities, and the research interests of the investigator and the field.
Beyond basic general diagnostic considerations, the major histopathologic features of focus should include the overall tissue architecture, the severity and composition of inflammatory infiltrates, the development and/or progression of fibrosis and parenchymal cell atrophy, and obliterative arteriopathy. The latter features are more easily followed in kidney than in liver allografts and are important, albeit not entirely specific, histopathologic markers of chronic allograft rejection. Routine tissue monitoring for C4d deposition, in conjunction with testing for circulating anti-donor antibodies, is an absolute necessity in kidney allografts. In liver allografts, C4d deposits are infrequent and their clinical significance is much less certain unless there is sinusoidal deposition, which is rare. Most centers do not routinely obtain C4d stains for liver allograft recipients, but it is probably prudent to do so for any unexplained allograft dysfunction or when anti-donor antibodies are detected in the circulation.
Any noticeable progression of routine histopathologic findings over time, such as interstitial fibrosis and parenchymal cell atrophy, especially if accompanied by laboratory-validated deterioration of function, should prompt a thorough re-evaluation of the immunosuppression management policy. This recommendation carries the caveats of intra- and interobserver variation, sampling issues, and whether the rate of deterioration is relevant to the clinical setting. For example, minimal or very slow progression of allograft fibrosis over time without immunosuppression might be the result of sampling issues, or might be an acceptable trade-off for a diabetes-prone elderly allograft recipient.
The second, scientific, role of the pathologist complements and extends the clinical one. Immunostaining, in situ hybridization, and various nucleic acid and protein expression arrays can be used to gain a functional understanding of the routine histopathologic findings. Specific areas of interest include evidence of injury, and of a response to injury, in endothelial and parenchymal cells, and the phenotype and activation/maturational status of various leukocyte populations, including organ-resident DC and various T-cell subsets. Assay selection, however, should be balanced against tissue availability and the potential significance and impact of any result(s). The recent development of multiplex staining of tissue sections has helped to conserve tissue by enabling staining for multiple antigens in the same section (Fig. 1).
For example, normal livers and kidneys (and allografts) usually show a relatively low rate of cellular stress and regeneration, as determined by immunohistochemical staining, and deviations from normal controls might be a reason for concern. No experimental result, however, should significantly influence the clinical decision-making process unless it has been scientifically validated. Evidence of injury and response to injury in endothelial and parenchymal cells might be monitored using markers of caspase activation, apoptosis, proliferation, and senescence-related changes, such as Ki67, PCNA, TUNEL, caspase 3, p16, p21, heme oxygenase-1, and increased expression of DNA repair enzymes. Beyond C4d deposits, one might look for immunohistochemical evidence of subtle endothelial injury. This might manifest as upregulation of the anti-apoptotic molecules bcl-2 and bcl-xl; stress-induced heme oxygenase-1 (HO-1) [85–89]; CD46 [90,91]; the complement regulatory proteins CD55 and CD59 [93,94]; pAKT [85,95,96] and phospho-S6 ribosomal protein (Ser235/236) [86,97]; or as reduced expression of ICAM-1 and VCAM-1 [85–89], complement component 3 receptor-alpha [91,98], and complement component 5a receptor [91,98].
Our group is particularly interested in the donor versus recipient origin and the activation/maturational status of organ-resident DC, as they occupy an important niche within the immune system as monitors of the environment and translators of innate into adaptive immunity (Figs 1 and 2). In the first author’s experience, well-tolerated allografts almost invariably contain residual donor DC, and DC are especially adept at triggering both rejection and tolerogenic pathways. At a very basic level, however, we do not know whether the composition and maturational/activation status of interstitial leukocytes and DC in tolerated allografts resembles that of normal organs; answering this question will likely provide information about the mechanisms of allograft acceptance. Considering previous studies on the importance of naïve and memory T cells and γδ-T cells [32,33], the composition of resident allograft leukocytes will certainly be of interest, as will expression of immunomodulatory cytokines such as TGF-β and interleukin (IL)-10.
The position of the liver within the body, immediately downstream of the intestines, also appears to be an important contributor to its tolerogenic properties and might help explain why liver allograft recipients can be more easily withdrawn from immunosuppression and display SOT. Our group has been interested in hepatic STAT3 activity (pSTAT3), which is higher in the liver than in other commonly transplanted organs. Bacteria and bacterial products normally leak through the intestines into the portal venous blood and stimulate Kupffer cells to produce IL-6, which, in turn, upregulates hepatic STAT3 activity. Activated (phosphorylated) STAT3 inhibits hepatic myeloid and plasmacytoid dendritic cell maturation. The critical role of IL-6 is illustrated by the livers of normal IL-6-deficient mice, which harbor DC that are significantly more mature than those in normal wild-type mouse livers. Depletion of gut commensal bacteria in wild-type mice decreased hepatic pSTAT3 levels and caused hepatic dendritic cell maturation. Activated STAT3 has also recently been recognized as a key modulator of tumor immunity, being involved in several aspects of tumor immunology, including inhibition of DC maturation and expansion of Treg within neoplasms. Thus, the normal physiologic state of the liver might resemble a tumor microenvironment [77,101], and this, in turn, might enable recipients to be weaned from immunosuppression without triggering a rejection reaction. Molecular mechanisms beyond STAT3 are clearly likely to be involved in the complex process of liver allograft acceptance, but pre-existing mechanisms that prevent an over-reaction to gut-derived antigens likely contribute significantly to the process.
Tolerance in humans induced via hematopoietic macrochimerism, even if transient, appears to be deletional and robust, at least early after transplantation, but might subsequently evolve toward relatively less stable regulatory pathways. SOT in conventionally treated allograft recipients can be studied in more patients and appears to rely less on deletion and more on active regulation. Therefore, study of the regulatory characteristics of lymphocytes within SOT allografts has gained, and will continue to gain, popularity. Because many of these studies will likely involve FoxP3 expression, it is worth noting that most human T cells express FoxP3 during the early stages of T-cell activation. Studies using this marker alone to define Tregs should therefore be interpreted with caution. Expression of the IL-7 receptor (CD127) might be helpful in this regard, as CD4+CD25+CD127low cells include threefold more FoxP3+ T cells than the classic CD4+CD25hi subset, yet show equivalent regulatory activity.
Perspective and future studies
One of the most remarkable observations made during compilation of this article was the realization that tissue samples from SOT liver and kidney allograft recipients are scarce. This is attributable not only to the infrequency with which SOT patients are identified, but also to the fact that clinicians are hesitant to perform biopsies on otherwise seemingly stable SOT recipients. As mentioned before, clinicians might be misled by insignificant histopathologic curiosities. In addition, biopsies are invasive and not without risk of morbidity and, rarely, even mortality. In our opinion, the benefits of protocol biopsies in this situation outweigh their risks. It is crucial, however, that the tissue samples be used wisely, and how best to use them is not always an easy decision.
The choice of controls for SOT tissue studies can be problematic, especially for liver allografts, because of the high incidence of recurrent disease. Possible comparison groups for SOT patients include normal age-matched control liver tissue, stable allograft recipients on immunosuppression, stable allograft recipients on immunosuppression with the same recurrent disease, and recipients who fail weaning attempts. Each has advantages and drawbacks.
The advent of array technology often pits those who practice ‘discovery’ science against those who practice ‘mechanistic/hypothesis-testing’ science. Both approaches have advantages and shortcomings. The essence of hypothesis testing is to associate a specific cell, pathway, or system with a specific phenomenon. Key interventional experiments that change the potentially critical component are then conducted to determine whether the relationship holds up as expected/hypothesized. The major problem, however, is identifying the critical cell, pathway, or system that ultimately controls complex biologic phenomena such as immunologic tolerance to human allografts: one could expend significant resources studying an unimportant cell, pathway, or system. In addition, interventional experiments in humans are usually delayed until the final stage of hypothesis testing, and they are expensive and often difficult to interpret.
‘Discovery science’, in contrast, recognizes that array technology and bioinformatics are reducing biologic phenomena to ‘closed’, albeit very complex, systems. No assumptions are made about the particular importance of one cell, molecule, or signaling system over another. Instead, expression array analyses are conducted on populations that exhibit a phenomenon, and prominent genes, proteins, pathways, or systems emerge. Single nucleotide polymorphism arrays also have the potential to contribute significantly to our understanding of tolerance: genetic tendencies certainly contribute to the development of certain original diseases that lead to transplantation and are likely also to contribute to the ability to wean immunosuppression. The discovery approach, too, has drawbacks. Not all components of biologic systems are amenable to array analyses. For example, mRNA expression arrays measure only transcript abundance, and some protein arrays do not account for the activation/phosphorylation status of proteins (e.g. STAT3), which can significantly affect function. Nor is it a trivial task to identify the nodal points in complex systems that ultimately control, or significantly influence, the phenomenon being observed: a particular gene or protein might be among the most up- or downregulated quantitatively during the process, yet not be an important nodal regulator.
In the end, it is our opinion that the best approach to the study of tolerance in tissue samples will be a combination of both approaches. The ‘shotgun’ criticism currently applied to many array studies will eventually give way, through hypothesis testing, to ‘targeted’ or focused arrays that measure only the key parameters associated with the biologic process of interest.