We undertook this study to assess the rate of poor early graft function (EGF) after laparoscopic live donor nephrectomy (lapNx) and to determine whether poor EGF is associated with diminished long-term graft survival. The study population consisted of 946 consecutive lapNx donor/recipient pairs at our center. Poor EGF was defined as receiving hemodialysis on postoperative day (POD) 1 through POD 7 (delayed graft function [DGF]) or serum creatinine ≥ 3.0 mg/dL at POD 5 without need for hemodialysis (slow graft function [SGF]). The incidence of poor EGF was 16.3% (DGF 5.8%, SGF 10.5%), and it was stable in chronologic tertiles. Poor EGF was independently associated with worse death-censored graft survival (adjusted hazard ratio (HR) 2.15, 95% confidence interval (CI) 1.34–3.47, p = 0.001), worse overall graft survival (HR 1.62, 95% CI 1.10–2.37, p = 0.014), worse acute rejection-free survival (HR 2.75, 95% CI 1.92–3.94, p < 0.001) and worse 1-year renal function (p = 0.002). Even SGF independently predicted worse renal allograft survival (HR 2.54, 95% CI 1.44–4.44, p = 0.001). Risk factors for poor EGF included advanced donor age, high recipient BMI, sirolimus use and prolonged warm ischemia time. In conclusion, poor EGF following lapNx has a deleterious effect on long-term graft function and survival.
Since the advent of laparoscopic live donor nephrectomy (lapNx) in 1995, the procedure has proliferated and supplanted the open technique as the procedure of choice at a majority of transplant centers worldwide (1,2). The procedure's history has been dominated by reports of the benefits of this less invasive procedure for the donor (3–7) and for organ supply (8–12), and of excellent recipient renal outcomes (13–36). However, we previously showed impaired early renal allograft function and higher rates of delayed graft function (DGF) in our center's early cohort of recipients of renal allografts procured by lapNx as compared to historic controls procured by the open approach (36), and other groups have also found worse early allograft function in laparoscopically procured allografts (13,37–44). In fact, a United Network for Organ Sharing (UNOS) database analysis of pediatric live donor transplant recipients found that DGF rates and acute rejection (AR) rates were higher for organs procured by laparoscopic as compared to open donor nephrectomy (45). Additionally, animal and human studies have shown that the pneumoperitoneum used during the lapNx procedure impairs renal cortical blood flow (45–50), and a study of zero-hour biopsies demonstrated concerning histopathologic findings (subcapsular cortical damage with capsular lesions) in laparoscopically procured kidneys approached with hand-assistance or via the retroperitoneum, which were not present in those procured by the open technique (51). Based on these observations, it has been postulated that the mechanical manipulation and warm ischemia sustained during the less invasive but technically more challenging laparoscopic procurement could cause early graft dysfunction.
Whether and to what degree such early insults in this setting may impact long-term allograft function and survival is not entirely clear. Certainly, there is clear evidence in deceased donor renal transplant recipients that DGF (52–56) or slow graft function (SGF) (53) imparts higher immunologic risk and worse renal allograft survival. However, prior studies have been less clear on the effect of early graft dysfunction in the setting of living donor (LD) kidney transplantation (57–60), where the insults that could cause early graft dysfunction are qualitatively and quantitatively different than those in the setting of deceased donor renal transplantation, and where the transplanted organ is expected to be of superior quality and therefore probably more capable of recovering from brief perioperative insults.
Given the rapid proliferation of lapNx, we feel it is essential to accurately characterize the rate of poor early graft function (EGF) and to determine its effects on allograft outcome. Understanding the ultimate impact of early dysfunction is crucially important in determining how much injury to the allograft can be sustained during procurement in exchange for minimizing the invasiveness and morbidity to the donor. Therefore, we undertook this study of recipient renal function outcomes in our singularly large lapNx cohort to assess the incidence of poor EGF and to test the hypothesis that poor EGF impairs graft survival and other important later outcomes.
Nine hundred ninety-seven patients received grafts procured by lapNx between March 1996, when our center adopted this as the procurement procedure of choice, and November 2005. Because of the high immunologic risk of recipients who had a positive pretransplant crossmatch with their donor and who therefore underwent pretransplant desensitization, we excluded these recipients (n = 39) from the study. Among the remaining group, 946 had sufficient data to grade EGF and were included in the study. The lapNx operative technique was described previously (3). In 2003 we switched from University of Wisconsin (UW) solution to Custodiol HTK as our preservation solution.
During the study period, our immunosuppression protocol of choice evolved. Lymphocyte-depleting agents (LDAs), including OKT3 or rabbit antithymocyte globulin (Thymoglobulin®), were utilized as induction in recipients who had a prior transplant or a panel reactive antibody (PRA) >40%. In others, basiliximab was routinely used for induction since February 2002. The initial maintenance immunosuppression regimen (IMISR) included microemulsion cyclosporine-A (CSA), mycophenolate and prednisone. In October 1997, tacrolimus (FK) replaced CSA. In the absence of prior transplant or PRA >40%, corticosteroids were tapered off within 3 weeks in non-black recipients since February 2002 and in African American recipients since August 2005. Sirolimus was used sporadically since 2002. Percutaneous renal allograft biopsies were performed in recipients with poor graft function every 7–14 days in the early posttransplant period, and later biopsies were done as clinically indicated to evaluate graft dysfunction. AR was treated with high-dose corticosteroids or with a course of LDAs.
After receiving approval from our center's Institutional Review Board, donor and recipient data were retrieved for study subjects. Demographic, clinical and laboratory data as well as graft and patient survival status were compiled primarily from the transplantation database, with review of transplant clinic records and hospital records where appropriate. Hemodialysis unit billing records for the first postoperative week were reviewed for all recipients.
Outcomes and analyses
DGF was defined as the need for hemodialysis on POD 1–7. Slow graft function (SGF) was defined as POD 5 serum creatinine ≥ 3.0 mg/dL without need for hemodialysis on POD 1–7. An allograft was considered to have poor early graft function (poor EGF) if it experienced DGF or SGF, and conversely it had immediate graft function (IGF) if it did not experience DGF or SGF. Average serum creatinine levels were calculated as the mean of all values falling within a progressively widening range around dates of interest: day 5 ± 1 day, 1 month ± 5 days, 1 year ± 20 days, 3 years ± 6 months, and 5 years ± 6 months. Estimated glomerular filtration rate (eGFR) was calculated with the abbreviated MDRD equation (61). AR was defined as biopsy-proven acute cellular or humoral rejection, as per prevailing Banff criteria. Warm ischemia time was defined as the time from clamping of the renal arterial supply until submersion in iced saline (and therefore it does not include warm ischemia time accumulated during the recipient procedure). Failure of the renal allograft was defined as return to another form of renal replacement therapy (dialysis or repeat kidney transplant) or patient death with a functioning allograft. Follow-up time and survival analyses were censored at the time of the most recent follow-up with our center. To assess temporal trends in poor EGF, the 946 cases were divided into three chronologic tertiles: tertile 1 included cases 1 through 315 (performed between March 27, 1996 and April 21, 1999), tertile 2 included cases 316 through 630 (April 23, 1999 to July 25, 2001), and tertile 3 included cases 631 through 946 (July 26, 2001 to August 31, 2005).
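As a concrete restatement of these definitions, the grading rule and the abbreviated (4-variable) MDRD equation (61) can be sketched as follows. This is an illustrative Python sketch; the function names and interface are our own and do not come from the study database.

```python
def grade_egf(dialysis_pod1_7: bool, scr_pod5: float) -> str:
    """Grade early graft function per the study definitions.
    DGF: any hemodialysis on POD 1-7; SGF: POD-5 serum creatinine
    >= 3.0 mg/dL without dialysis; otherwise IGF."""
    if dialysis_pod1_7:
        return "DGF"
    if scr_pod5 >= 3.0:
        return "SGF"
    return "IGF"


def egfr_mdrd(scr_mg_dl: float, age_yr: float, female: bool, black: bool) -> float:
    """Abbreviated (4-variable) MDRD study equation, mL/min per 1.73 m^2."""
    egfr = 186.0 * scr_mg_dl ** -1.154 * age_yr ** -0.203
    if female:
        egfr *= 0.742  # female correction factor
    if black:
        egfr *= 1.212  # African American correction factor
    return egfr
```

For example, a recipient with a POD-5 creatinine of 3.2 mg/dL who never required dialysis would be graded SGF, and hence counted in the poor EGF group.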
Our primary outcome was death-censored renal allograft survival, and our primary analysis of interest was the comparison of the group with poor EGF (which included both the SGF and the DGF groups) to the IGF group. Secondary outcomes and analyses are detailed in the Results section. In an attempt to eliminate cases in which poor EGF was likely due to immunologic rather than mechanical/ischemic insults, all analyses of rejection-free survival as well as selected secondary analyses of other outcomes were performed after excluding cases of very early AR, which was defined as AR on or before POD 10.
Continuous variables were reported as mean ± standard deviation and were compared using analysis of variance (ANOVA) and Student's t-tests. Categorical variables were reported as absolute number of patients and/or percentage of the particular group and were compared using chi-square tests. Adjustments for multiple covariates were made using logistic regression for binary outcomes and linear regression for continuous outcomes. The assumption of linearity of the relationship was confirmed. Survival analyses were performed using Kaplan–Meier techniques, compared with log-rank tests, and adjusted for potential confounders using Cox proportional hazard regression. The proportionality assumptions were confirmed. Values of p ≤ 0.05 were considered statistically significant. Potential confounding variables were chosen a priori for inclusion in the multivariate analysis if data were available in the data base on a sufficient number of subjects (>94%) and if an independent effect on the outcomes was felt to be reasonably expected, even if a statistically significant effect was not demonstrated in univariate analysis. The following covariates were used in regression analyses: very early AR; recipient age, sex, African American race, prior transplant, pretransplant diabetes mellitus (DM); donor age, sex, African American race, and MDRD eGFR; zero human leukocyte antigen (HLA) mismatches between donor and recipient; utilization of LDA induction; inclusion of mycophenolate in IMISR, steroid-sparing IMISR, tacrolimus in IMISR, and sirolimus in IMISR. SPSS Version 8.0 (SPSS, Inc., Chicago, IL) and Stata SE 9.1 (Stata Corporation, College Station, TX) software were used for statistical analyses. No funding sources were used in the completion of this study.
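For readers unfamiliar with the product-limit method, the Kaplan–Meier estimate used in the survival analyses can be sketched in a few lines. This is a minimal pure-Python illustration of the technique only; the actual analyses were performed with the SPSS and Stata routines named above.

```python
from collections import Counter


def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate of survival.
    times: follow-up times; events: 1 = failure observed, 0 = censored.
    Returns a list of (time, S(t)) pairs at each distinct failure time."""
    deaths = Counter(t for t, e in zip(times, events) if e)
    removed = Counter(times)  # everyone leaves the risk set at their time
    s, at_risk, curve = 1.0, len(times), []
    for t in sorted(set(times)):
        d = deaths.get(t, 0)
        if d:
            s *= 1.0 - d / at_risk  # multiply in this interval's survival
            curve.append((t, s))
        at_risk -= removed[t]  # failures and censorings exit the risk set
    return curve
```

Censored subjects (e.g. those lost to follow-up or, for death-censored graft survival, those dying with a functioning graft) contribute to the risk set until their censoring time but never to the failure counts, which is what distinguishes this estimator from a naive failure fraction.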
Among the 946 subjects, 792 enjoyed IGF while 154 (16.3%) experienced poor EGF — 99 (10.5%) with SGF and 55 (5.8%) with DGF. Baseline demographic and clinical parameters of the recipients and donors are detailed in Table 1. As can be seen, subjects who experienced poor EGF were slightly older and more likely to be African American, and their donors were slightly older, more likely to be African American and more likely to be genetically unrelated. Subjects in the poor EGF group were more likely to experience very early AR, were more likely to receive a right kidney, were less likely to receive a simultaneous pancreas transplant, and had longer warm ischemia times. The initial maintenance immunosuppression regimens were similar except that the poor EGF group was more likely to receive sirolimus, and a similar proportion of patients received LDA induction.
Table 1. Baseline demographic and clinical data based on early graft function group
As shown in Figure 1A, the death-censored renal allograft survival was worse in the poor EGF group, and the difference was clinically substantial and highly statistically significant. Furthermore, the difference in allograft survival persisted after adjustment for the potential confounders listed above (adjusted hazard ratio (HR) 2.15 with 95% confidence interval (CI) 1.34–3.47, p = 0.001).
Figure 1B shows the death-censored graft survival curves with the poor EGF group separated into SGF and DGF groups. When SGF (the less severe poor EGF group) was compared directly to IGF, there was still a significant difference between these groups, even after adjusting for the same potential confounders (adjusted HR 2.54 with 95% CI 1.44–4.44, p = 0.001). The DGF and SGF groups were not statistically different (adjusted HR 1.36 for DGF compared to SGF with 95% CI 0.66–2.82, p = 0.4), but the curves visually demonstrate that a majority of graft losses in the DGF group were early while the graft losses in the SGF group were spread out over the next several years. Also, overall graft survival (not death-censored) was worse in the poor EGF group as compared to the IGF group (curve not shown, p = 0.001); and this difference persisted after adjusting for the same potential confounders (adjusted HR 1.62 with 95% CI 1.10–2.37, p = 0.014). The causes of graft loss are detailed in Table 2, and the only statistically significant difference was a higher rate of thrombosis in the poor EGF group. Among the allografts that failed by POD 30 in the poor EGF group, the cause of graft loss was attributed to thrombosis in six of seven cases for which a specific cause was identified.
Table 2. Causes of graft losses in poor EGF and IGF groups
1The thromboses in the poor EGF group occurred on PODs 1, 1, 2, 2, 6 and 15, while the thromboses in the IGF group occurred on PODs 4, 15, 256 and 2740.
Other tabulated causes of graft loss included chronic rejection or chronic allograft nephropathy, polyoma virus nephropathy and recurrent primary renal disease.
When cases that were diagnosed with very early AR were excluded, these survival differences persisted, and the death-censored graft survival curve for poor EGF versus IGF is shown in Figure 2. Similarly, after eliminating cases of very early AR, the death-censored renal allograft survival was also worse for SGF compared to IGF (curves not shown, log-rank p = 0.0048).
Patient survival curves for poor EGF versus IGF are shown in Figure 3. Although the survival visually appears worse in the poor EGF group over the first several years, the curves eventually cross and the log-rank p-value is nonsignificant over the whole period of follow-up. Because of lower numbers of subjects and widening confidence intervals during the later years of follow-up, we compared survival during the first 5 years of follow-up and found that it was significantly worse during this period (p-value = 0.028, unadjusted HR 1.72 with 95% CI 1.05–2.81). The association of poor EGF with worse patient survival was no longer significant when adjusted for the above covariates (adjusted HR 1.50 with 95% CI 0.84–2.65, p = 0.18). At 5 years posttransplantation, 21 subjects had died and 36 subjects were still known to be living in the poor EGF group, while 67 subjects had died and 254 were known to be alive in the IGF group (63.2% vs. 79.1% survival, chi-square p = 0.010).
Figure 4 shows that the rejection-free survival during first posttransplant year was worse for poor EGF as compared to IGF. As discussed above, cases with very early AR were excluded from this analysis because these events were felt more likely to be the cause of poor EGF rather than an effect of it. Figure 4 demonstrates that the difference in the curves is produced by acute rejection episodes that occurred within the first 3–4 postoperative months, and thereafter the curves appeared to be parallel. This difference in rejection-free survival persisted after adjusting for the potential confounders (adjusted HR 2.75, 95% CI 1.92–3.94, p < 0.001).
Table 3 demonstrates modestly higher mean 1-month and 1-year serum creatinines and a lower 1-year eGFR in the poor EGF group. These differences persisted after adjusting for the same potential confounders: the adjusted difference in the 1-year eGFR for the poor EGF group compared to the IGF group was −6.71 mL/min per 1.73 m2, with 95% CI of −11.0 to −2.43 and p-value 0.002. There were no statistically significant differences in mean serum creatinines at 3 years and 5 years.
Table 3. Renal function for early graft function groups
The incidences of DGF, SGF and poor EGF through the three tertiles are detailed in Table 4. As can be seen, there were no statistically significant changes in the rate of these events through the tertiles, although the rate in the third tertile was numerically higher. We noted also that the rate of early acute rejection was much higher in the third tertile (6.2% in tertile 1, 5.1% in tertile 2 and 13.9% in tertile 3); and when we excluded the patients with very early AR from analyses, the poor EGF rates were 13.6% in tertile 1, 14.0% in tertile 2 and 12.5% in tertile 3. The tertiles remained nonpredictive of these outcomes on regression analysis that included the covariates listed above: for poor EGF, relative risk (RR) 1.06 per tertile, 95% CI 0.78–1.42, p-value NS.
We compared the rates of poor EGF among the surgeons who were credited as the first attending surgeon of record for the cases, and we found no statistically significant differences among the group as a whole (chi-square p = 0.183). Additionally, we compared the poor EGF rate for the first 10 lapNx cases versus the later cases among the surgeons who had not accumulated prior unsupervised experience with the lapNx procedure, and we found that the incidence of poor EGF was not statistically different (10.4% for the first 10 cases and 19.1% for the later cases in this group of surgeons, p = 0.86).
We performed a univariate and multivariate analysis of parameters that we suspected could be risk factors for poor EGF, and the details of this analysis are shown in Table 5. Only those variables that achieved statistical significance in univariate analysis were included in the multivariate model. The only covariates that achieved statistical significance as predictors of poor EGF in this multivariate model were higher recipient BMI, older donor age, longer warm ischemia time and utilization of sirolimus in IMISR.
Table 5. Univariate and multivariate analysis of risk factors for poor early graft function (EGF)
Columns give the OR (95% CI) for the univariate and multivariate models. Variables examined: recipient age (per decade), recipient male sex, recipient African American race, recipient BMI (per 5 kg/m2), donor age (per decade), donor male sex, donor African American race, donor BMI (per 5 kg/m2), donor eGFR (per 10 mL/min per 1.73 m2), genetically unrelated donor, HLA mismatch (per mismatch), simultaneous pancreas transplant, procurement of the donor's right kidney, number of renal arteries (per artery), donor's time in operating room (per h), warm ischemia time (per min), lymphocyte-depleting agent induction, tacrolimus in IMISR and sirolimus in IMISR.
Only those variables that achieved statistical significance in univariate analysis were included in the multivariate model.
OR = odds ratio; CI = confidence interval; BMI = body mass index; HLA = human leukocyte antigen; IMISR = initial maintenance immunosuppression regimen.
This retrospective cohort study demonstrates that poor early graft function following lapNx complicates a significant proportion of transplants and has substantial effects on important survival metrics. We herein show that renal allograft survival is substantially worse in recipients who require dialysis in the first posttransplant week or whose serum creatinine is at or above 3.0 mg/dL at postop day 5, with more than double the risk for graft failure in these subjects compared to those with unimpaired initial graft function. We also demonstrate inferior renal function at 1 year and worse rejection-free survival in this group. The effect of poor EGF on these outcomes appears to be independent of other clinical predictors of graft dysfunction, as demonstrated by the persistence of a major detrimental effect after adjusting for a multitude of potential confounders. Additionally, we showed that a recipient who experiences poor EGF is less likely to survive 5 years after transplant, although this effect on patient survival may not be an independent effect of poor EGF. We also showed that simply having a serum creatinine at or above 3.0 mg/dL at POD 5 without a need for early hemodialysis (i.e. SGF) was enough to impair long-term graft survival and increase first-year rejection risk, suggesting that even less dramatic insults may be sufficient to mar outcomes. These findings clearly show that early injuries to the allograft are ultimately not inconsequential.
Consistent with our conclusions of worse long-term outcomes after poor EGF is the clear evidence in deceased donor renal transplant recipients that DGF (53–56) or SGF (53) imparts higher immunologic risk and worse renal allograft survival. With regard to LD kidney transplant recipients, our study findings should be compared with the findings of a very similar study by Brennan et al., a retrospective study of 469 recipients of LD kidney transplantation from the University of California San Francisco (57). They found that poor EGF (defined as in our study) was associated with a significantly lower 1-year graft survival but similar longer-term survival (mean follow-up of 31 months) and that poor EGF strongly predisposed to AR, which in turn significantly impaired graft survival. They did not note differences in 1-year renal allograft function based on early graft function. We found it counterintuitive that 1-year survival would be worse while longer-term survival was not worse, and we speculate that the study may not have had the power to show a difference because of insufficient numbers of subjects with longer follow-up durations. We also found three other studies that supported our conclusion that early dysfunction impairs long-term graft outcomes after live donor nephrectomy. First, Matas et al. found in their large cohort of recipients of living donor kidneys that DGF is associated with worse renal allograft function and worse renal allograft survival (60). Second, Hawley et al. found that DGF predicted a lower Nankivell glomerular filtration rate at 6 months among 209 recipients of live donor kidney transplants (59). Third, the OPTN/SRTR 2005 Annual Report provides reason for profound concern about living donor renal allografts that experience DGF, with 1-year graft survival being 65% if dialysis is needed within the first posttransplant week as compared to 97% if not (58).
Our study contributes important data to this body of literature in that it includes a large number of subjects with accurate internal data. Additionally, by eliminating patients with very early AR from many of our analyses, our study provides the first strong evidence that nonimmune-mediated early graft dysfunction after live kidney donation leads to higher risk of later rejection and to ultimate graft loss. Prior studies did not apply such techniques and therefore could not make this assessment.
Importantly, our study includes a robust analysis of risk factors for poor EGF and provides findings that may be useful in guiding efforts to reduce rates of poor EGF in this setting. The univariate analysis of covariates yielded many expected predictors of poor EGF, such as older donor age, higher number of HLA mismatches, lack of genetic relationship between donor and recipient, more prolonged warm ischemia times and utilization of sirolimus in the IMISR. Other predictors were more intriguing and elude simple explanations, such as recipient male sex, recipient and donor African American race, lack of prior transplant and simultaneous transplant of a deceased donor pancreas allograft. It should be noted that procurement of the right donor kidney was predictive of poor EGF in this univariate analysis while a greater number of arteries was not, and this finding may be of practical use for surgeons who face the decision of whether to procure a left kidney with multiple arteries or a right kidney with a single artery. Although more study is needed, our data suggest that the former may be a better choice.
Our multivariate analysis of potential risk factors yielded a few covariates that demonstrated an independent association with poor EGF. Our finding of higher risk with longer warm ischemia time is consistent with the findings of Brennan et al., who also found that this factor was independently predictive of poor EGF (57). Our multivariate analysis also found associations with older donor age and higher recipient BMI, which are also similar to associations noted in the Brennan study. It should also be noted that Brennan et al. noted a strong association of recipient diabetes and poor EGF after living donor renal transplantation, but our data did not show this association. Finally, our finding of higher risk for poor EGF in recipients receiving sirolimus is consistent with the well-known association of sirolimus use with DGF (62–64). On the other hand, it is possible that sirolimus could have been used preferentially in patients who had early graft dysfunction because of the impression that the drug is less nephrotoxic than calcineurin inhibitors (although it is not our standard protocol to do so), and thus one should be very cautious about concluding causation in a retrospective study.
Our data also provide some insights into the possible pathophysiologic connection of early dysfunction to later graft deterioration. The inferior subsequent rejection-free survival associated with poor EGF among those grafts that did not suffer very early AR suggests that a major component of this process may be that ischemic and/or mechanical injury initiates an immune-mediated inflammatory response that leads to later graft dysfunction, even after mechanical insults are no longer active. This finding is consistent with prior observations that renal ischemia induces increased MHC class II expression (65,66). Furthermore, ischemia and reperfusion of allografts have been observed to induce a strong inflammatory response that results in cellular apoptosis, release of reactive oxygen metabolites, induction of proinflammatory cytokines and potent leukocyte chemoattractants by ischemic vascular endothelium and possibly parenchymal cells (66,67). As hypothesized by Fairchild et al., such chemoattractant cytokines and adhesion molecules may serve as ‘signposts' that traffic T cells into allografts and may be involved in the ultimate development of graft vasculopathy and fibrosis (68). Additionally, nonimmune and noninflammatory processes may be at play. It is certainly plausible that loss of nephrons from perioperative insults may contribute to systemic hypertension and/or may induce adaptive single nephron hyperfiltration that leads to eventual atrophy and drop out of remaining nephrons and accompanying interstitial and glomerular fibrosis. Given that we had only modest numbers of graft losses, our study did not provide sufficient statistical power to make conclusions about why the allografts with poor EGF are more likely to fail, except that the poor EGF group demonstrated a higher rate of thromboses (which occurred very early after transplantation).
One could hypothesize that accumulated local and international experience with the procedure has resulted in improvements in technique that lessened or eliminated the problem of poor EGF, thus rendering the above findings less important. Arguing against this scenario, the incidence of early graft dysfunction has remained stable through the three chronologic tertiles of our center's first nearly 1000 lapNx cases, even after adjusting for potential confounders. (Although there appeared to be a trend toward a higher incidence of poor EGF in the most recent tertile, we suspect that this was due to a markedly higher rate of very early AR during that tertile rather than procurement-related issues.) Also, we did not find evidence of a 'learning curve' for surgeons who were new to the procedure when we compared their first 10 cases to all later cases, and in fact we surprisingly found numerically lower rates during the first 10 cases (though not statistically significant). It should be noted that this analysis is very problematic because of the difficulties in determining what experience a surgeon had prior to performing their first laparoscopic nephrectomy as the attending surgeon of record at our center. Many of the surgeons had substantial experience as trainees and/or had performed critical parts of the procedure under supervision prior to taking on the role of first surgeon. Furthermore, critical parts of the procedure may have been performed by surgical trainees or attendings who were not the first surgeon of record. Thus, although we did not find evidence of a 'learning curve' for individual surgeons, a more detailed analysis would be needed to adequately address this important issue.
Additionally, this study does not adequately address the culpability of lapNx as the cause of the poor EGF. Several prior studies have indicated that poor EGF after LD nephrectomy is not unique to lapNx, as it complicates a nonnegligible proportion of LD kidney transplants done by open nephrectomy. For example, open cohorts in studies by Ratner (12) and by Derweesh (69) and national data from the era prior to laparoscopic nephrectomy (70) reported DGF incidence rates of approximately 5–6%, which is very similar to the DGF rate in our lapNx cohort. Additionally, in a large UCSF cohort of LD renal transplant recipients whose organs were procured primarily with open nephrectomy (88%), Brennan et al. reported poor EGF in 15% (57), which is similar to our rate of poor EGF. Obviously, a multitude of factors in addition to the organ retrieval procedure can cause early graft dysfunction, including early rejection and recipient hemodynamic and vascular factors. In fact, we found that two recipient factors—higher recipient BMI and sirolimus use—were among the strongest independent risk factors for poor EGF, arguing that recipient factors may be even more important than procurement insults in the development of poor EGF in this setting.
Limitations of this study are important to appreciate. First, it is possible that patients who had experienced early graft dysfunction may have been judged by the managing clinicians to be at higher risk for rejection, such that they may have been more likely to have a biopsy done. Thus, subclinical rejection or incidental inflammation would be more likely to be identified in a graft that had poor EGF than in one with IGF. Second, incomplete follow-up of subjects may have introduced bias. Patients who had planned to get their follow-up transplant care outside of our center would likely be more likely to return for care at our tertiary care transplant program if they were doing poorly, thus increasing the likelihood that we could identify patients with poor outcomes. Finally, it is possible that poor EGF may have been an early marker of recipients who were at higher risk for poor outcomes, rather than being a causative factor for the poor outcomes. By eliminating positive-crossmatch recipients who underwent desensitization from the study and by eliminating subjects diagnosed with very early AR from subanalyses, we feel that we removed most of the cases in which early graft dysfunction was simply a marker of high immunologic risk.
Our findings highlight the urgent need to find ways to lessen the rate of poor EGF after LD kidney transplantation. Based on our experience, our team employs the following measures: aggressive volume loading of the donor, limiting pneumoperitoneum to 15 mmHg, and the routine use of papaverine as a topical renal artery antispasmodic. Additionally, the surgeons monitor intraoperative perfusion using kidney appearance and renal vein turgor, and they maintain ongoing communication with the anesthesiologists to ensure adequate intraoperative hemodynamics and vigorous urine output (with use of diuretics as necessary) throughout the case. Our findings in this study provide strong rationale for donor surgeons to attempt to limit warm ischemic time. Further studies to better define procedure-related and patient-related risk factors for poor EGF after lapNx are clearly needed. Studies that assess the impact of lapNx donor surgeon training and competency measures on allograft outcomes may help centers prevent the loss of precious grafts to the purported 'learning curve'. Also, our data suggest that maintaining vigilance and possibly higher levels of immunosuppression in recipients who experience poor EGF may lessen the incidence and/or clinical impact of future AR episodes in their grafts. Additionally, recent data have suggested the exciting possibility that early pharmacologic interventions may attenuate perioperative CD4+ T-cell infiltration and thereby interrupt the progression to transplant arterial vasculopathy and fibrosis (66), and such advances may be pertinent to LD renal allografts. Finally, hemodynamic interventions such as renin–angiotensin system blockers could possibly diminish potential nonimmune injury.
In conclusion, this study of our large single-center cohort of recipients of lapNx transplants shows that (1) this group continues to be plagued by substantial rates of SGF and DGF (which may not be higher than in those whose graft was procured by the open technique) and that (2) those who experience poor early graft function demonstrate worse long-term graft survival, a higher risk of AR during the first year and inferior 1-year renal function as compared to those recipients who experienced immediate graft function. Based on our findings, transplant teams should strive to limit the rate of poor EGF in their live donor renal transplant recipients. We are hopeful that refinements in lapNx technique and training and improvements in patient selection and management will bring about a reduction in rates of poor EGF in LD renal transplant recipients and thereby help to optimize the quality and longevity of this 'gift of life' to our patients.