In citing this document, the following format should be used: Kidney Disease: Improving Global Outcomes (KDIGO) Transplant Work Group. KDIGO clinical practice guideline for the care of kidney transplant recipients. American Journal of Transplantation 2009; 9(Suppl 3): S1–S157.
Special Issue: KDIGO Clinical Practice Guideline for the Care of Kidney Transplant Recipients
Version of Record online: 14 OCT 2009
© 2009 The Authors. Journal compilation © 2009 The American Society of Transplantation and the American Society of Transplant Surgeons
American Journal of Transplantation
Volume 9, Issue Supplement s3, pages S1–S155, November 2009
How to Cite
(2009), Special Issue: KDIGO Clinical Practice Guideline for the Care of Kidney Transplant Recipients. American Journal of Transplantation, 9: S1–S155. doi: 10.1111/j.1600-6143.2009.02834.x
- Issue online: 14 OCT 2009
Keywords: kidney transplant recipient care; graft monitoring; infectious diseases; cardiovascular disease; mineral and bone disorder; hematological complications; sexual function; mental health
The 2009 Kidney Disease: Improving Global Outcomes (KDIGO) clinical practice guideline on the monitoring, management, and treatment of kidney transplant recipients is intended to assist the practitioner caring for adults and children after kidney transplantation. The guideline development process followed an evidence-based approach, and management recommendations are based on systematic reviews of relevant treatment trials. Critical appraisal of the quality of the evidence and the strength of recommendations followed the Grades of Recommendation Assessment, Development, and Evaluation (GRADE) approach. The guideline makes recommendations for immunosuppression and graft monitoring, as well as prevention and treatment of infection, cardiovascular disease, malignancy, and other complications that are common in kidney transplant recipients, including hematological and bone disorders. Limitations of the evidence, especially the lack of definitive clinical outcome trials, are discussed and suggestions are provided for future research.
Since the first successful kidney transplantation in 1954, there has been an exponential growth in publications dealing with the care of kidney transplant recipients (KTRs). In addition, the science of conducting and interpreting both clinical trials and observational studies has become increasingly controversial and complex. Caring for KTRs requires specialized knowledge in areas as varied as immunology, pharmacology, nephrology, endocrinology and infectious disease. The last two comprehensive clinical practice guidelines on the care of KTRs were published in 2000 by the American Society of Transplantation and the European Best Practices Guidelines Expert Group. Both of these guidelines were based primarily on expert opinion, not rigorous evidence review. For these reasons, the international consortium of kidney guideline developers, Kidney Disease: Improving Global Outcomes (KDIGO), concluded that a new comprehensive evidence-based clinical practice guideline for the care of KTRs was necessary.
It is our hope that this document will serve several useful purposes. Our primary goal is to improve patient care. We hope to accomplish this in the short term by helping clinicians know and better understand the evidence (or lack of evidence) that determines current practice. By making this guideline broadly applicable, our purpose is to also encourage and enable the establishment and development of transplant programs worldwide. Finally, by providing comprehensive evidence-based recommendations, this guideline will also help define areas where evidence is lacking and research is needed. Helping to define a research agenda is an often neglected, but very important function of clinical practice guideline development.
We used the GRADE system to rate the strength of evidence and the strength of recommendations. In all, there were only 4 (2%) recommendations in this guideline for which the overall quality of evidence was graded ‘A,’ whereas 27 (13.6%) were graded ‘B,’ 77 (38.9%) were graded ‘C,’ and 90 (45.5%) were graded ‘D.’ Although there are reasons other than quality of evidence to make a grade 1 or 2 recommendation, in general, there is a correlation between the quality of overall evidence and the strength of the recommendation. Thus, there were 50 (25.3%) recommendations graded ‘1’ and 148 (74.7%) graded ‘2.’ There were 3 (1.5%) recommendations graded ‘1A,’ 16 (8.1%) were ‘1B,’ 18 (9.1%) were ‘1C,’ and 13 (6.6%) were ‘1D.’ There was 1 (0.5%) graded ‘2A,’ 11 (5.6%) were ‘2B,’ 59 (29.8%) were ‘2C,’ and 77 (38.9%) were ‘2D.’ There were 45 (18.5%) statements that were not graded.
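The tallies above can be cross-checked arithmetically. A minimal sketch follows (counts are transcribed directly from this paragraph; note that the percentages are relative to the 198 graded recommendations, except the not-graded figure, which uses all 243 statements):

```python
# Recommendation counts as quoted in the text above.
by_evidence = {"A": 4, "B": 27, "C": 77, "D": 90}
by_strength = {"1": 50, "2": 148}
combined = {"1A": 3, "1B": 16, "1C": 18, "1D": 13,
            "2A": 1, "2B": 11, "2C": 59, "2D": 77}

# All three breakdowns should describe the same 198 graded recommendations.
graded = sum(by_evidence.values())
assert graded == sum(by_strength.values()) == sum(combined.values()) == 198

# Adding the 45 not-graded statements gives the full set of 243 statements.
total = graded + 45
assert total == 243

# Percentages in the text use 198 as the denominator for graded items...
assert round(100 * by_evidence["A"] / graded) == 2       # 'A': 2%
# ...but 243 for the not-graded share.
assert round(100 * 45 / total, 1) == 18.5                # not graded: 18.5%
```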
Some argue that recommendations should not be made when evidence is weak. However, clinicians still need to make clinical decisions in their daily practice, and they often ask, 'What do the experts do in this setting?' We opted to give guidance, rather than remain silent. These recommendations were often rated with a low strength of recommendation and a low strength of evidence, or were not graded. It is important for the users of this guideline to be cognizant of this (see Disclaimer). In every case these recommendations are meant to be a place for clinicians to start, not stop, their inquiries into specific management questions pertinent to the patients they see in daily practice.
We wish to thank Martin Zeier, Co-Chair, along with all of the Work Group members who volunteered countless hours of their time developing this guideline. We also thank the Evidence Review Team members and staff of the National Kidney Foundation who made this project possible. Finally, we owe a special debt of gratitude to the many KDIGO Board members and individuals who volunteered time reviewing the guideline, and making very helpful suggestions.
Kai-Uwe Eckardt, MD KDIGO Co-Chair
Bertram L. Kasiske, MD KDIGO Co-Chair
Guideline Scope and Intended Users
This guideline describes the prevention and treatment of complications that occur after kidney transplantation. We do not include pretransplant care. Specifically, we do not address issues pertinent to the evaluation and management of candidates for transplantation, or the evaluation and selection of kidney donors.
Although many of the issues that are pertinent to KTRs are also pertinent to recipients of other organ transplants, we intend this guideline to be for KTRs only. We cover only those aspects of care likely to be different for KTRs than for patients in the general population. For example, we deal with the diagnosis and treatment of acute rejection, but not with the diagnosis and treatment of community-acquired pneumonia. We also make recommendations pertinent to the management of immunosuppressive medications and their complications, including infections, malignancies, and cardiovascular disease (CVD).
This guideline ends when the kidney transplant fails, whether by death of the recipient with a functioning graft, return to dialysis, or retransplantation. We do not deal with the preparation of KTRs for return to dialysis or retransplantation.
This guideline was written for doctors, nurses, coordinators, pharmacists, and other medical professionals who directly or indirectly care for KTRs. It was not developed for administrative or regulatory personnel per se. For example, no attempts were made to develop clinical performance measures. Similarly, this guideline was not written for patients directly, although carefully crafted explanations of guideline recommendations could potentially provide useful information for patients.
This guideline was written for transplant-care providers throughout the world. As such, it addresses issues that are important to the care of KTRs in both developed and developing countries, but nowhere was the quality of care compromised for utilitarian purposes. Nevertheless, we recognize that, in many parts of the world, treatment of end-stage kidney disease (chronic kidney disease [CKD] stage 5) with dialysis is not feasible, and transplantation can only be offered as a life-saving therapy if it is practical and cost-effective. Therefore, in providing a comprehensive, evidence-based guideline for the care of KTRs, we were cognizant of the fact that programs in some areas of the world may need to adopt cost-saving measures in order to make transplantation possible.
Section I: Immunosuppression
Kidney transplantation is the treatment of choice for CKD stage 5. The risk of death for KTRs is less than half of that for dialysis patients (1). Any differences in patient survival attributable to different immunosuppressive medication regimens are substantially smaller than the survival difference between dialysis and transplantation. Put another way, even a marginally inferior immunosuppressive medication regimen will still result in substantially better patient outcomes than dialysis. Thus, it is better to perform kidney transplantation even with an inferior immunosuppressive regimen than to avoid transplantation altogether.
Recommendations for immunosuppressive medications are necessarily complex, because combinations of multiple classes of drugs are used and because the choices among different regimens are determined by the tradeoffs between benefits and harm. Typically, a greater degree of immunosuppression may reduce the risk of rejection, but may also increase the risk of infection and cancer. Decision analysis with patient-based utilities may be needed to correctly assess the tradeoffs between benefits and harm, but this has not usually been done.
Rating Guideline Recommendations
Within each recommendation, the strength of recommendation is indicated as Level 1, Level 2, or Not Graded, and the quality of the supporting evidence is shown as A, B, C, or D.
|Level 1|'We recommend'|
|Level 2|'We suggest'|

|Grade for quality of evidence|Quality of evidence|
|A|High|
|B|Moderate|
|C|Low|
|D|Very low|
Chapter 1: Induction Therapy
- 1.1: We recommend starting a combination of immunosuppressive medications before, or at the time of, kidney transplantation. (1A)
- 1.2: We recommend including induction therapy with a biologic agent as part of the initial immunosuppressive regimen in KTRs. (1A)
- 1.2.1: We recommend that an IL2-RA be the first-line induction therapy. (1B)
- 1.2.2: We suggest using a lymphocyte-depleting agent, rather than an IL2-RA, for KTRs at high immunologic risk. (2B)
IL2-RA, interleukin 2 receptor antagonist; KTRs, kidney transplant recipients.
Except perhaps for transplantation between identical twins, all kidney transplant recipients (KTRs) need immunosuppressive medications to prevent rejection. Induction therapy is treatment with a biologic agent, either a lymphocyte-depleting agent or an interleukin 2 receptor antagonist (IL2-RA), begun before, at the time of, or immediately after transplantation. The purpose of induction therapy is to deplete or modulate T-cell responses at the time of antigen presentation. Induction therapy is intended to improve the efficacy of immunosuppression by reducing acute rejection, or by allowing a reduction of other components of the regimen, such as calcineurin inhibitors (CNIs) or corticosteroids. Available lymphocyte-depleting agents include antithymocyte globulin (ATG), antilymphocyte globulin (ALG) and muromonab-CD3. Basiliximab and daclizumab, the two IL2-RAs that are currently available in many parts of the world, bind the CD25 antigen (interleukin-2 [IL2] receptor α-chain) on the surface of activated T-lymphocytes and thereby competitively inhibit IL2-mediated lymphocyte activation, a crucial step in the cellular immune response of allograft rejection.
- • There is high-quality evidence that the benefits of IL2-RA vs. no IL2-RA (or placebo) outweigh harm in a broad range of KTRs with variable immunological risk and concomitant immunosuppressive medication regimens.
- • There is moderate-quality evidence that a lymphocyte-depleting agent vs. no lymphocyte-depleting agent (or placebo) reduces acute rejection and graft failure in high-immunological-risk patients.
- • There is moderate-quality evidence across a broad range of patients with different immunological risk and concomitant immunosuppressive medication regimens, which shows that (compared to IL2-RA) lymphocyte-depleting agents reduce acute rejection, but increase the risk of infections and malignancies.
- • Economic evaluations for IL2-RA demonstrate lower cost and improved graft survival compared with placebo.
- • Although there are sparse data in KTRs <18 years old, there is no biologically plausible reason why age is an effect modifier of treatment, and the treatment effect of IL2-RA appears to be homogeneous across a broad range of patient groups.
- • Induction therapy with a lymphocyte-depleting antibody reduces the incidence of acute rejection compared with IL2-RA, but has not been shown to prolong graft survival.
- • Induction therapy with a lymphocyte-depleting antibody increases the incidence of serious adverse effects.
- • For KTRs ≥18 years old, who are at high risk for acute rejection, the benefits of induction therapy with a lymphocyte-depleting antibody outweigh the harm.
In a large number of long-term, randomized controlled trials (RCTs) in adults, it has been consistently shown that induction therapy with either lymphocyte-depleting agents or IL2-RA reduces acute rejection in patients treated with ‘double therapy’ (calcineurin inhibitor [CNI] and prednisone), or ‘triple therapy’ (CNI, an antiproliferative agent [e.g. mycophenolate or azathioprine], and prednisone). Lymphocyte-depleting antibody induction also reduces the risk of graft failure while, in more recent studies, IL2-RA reduced the risk of death-censored graft failure, but not overall graft loss. Oral maintenance therapy may not produce immediate effects on the immune response when it is most needed, that is at the time of transplantation and antigen presentation. Pharmacokinetic and pharmacodynamic properties of oral maintenance agents may delay their full effect on immune cells.
The efficacy and safety of IL2-RA (compared to placebo or no treatment) have been confirmed in the most recent Cochrane review of 30 RCTs and 4670 patients followed to 3 years (2). In this review, IL2-RA consistently reduced the risk of acute rejection (e.g. for biopsy-proven acute rejection: 14 RCTs, 3861 patients, relative risk [RR] 0.77, 0.64–0.92) and graft loss (censored for death: 16 RCTs, n = 2973 patients, RR = 0.74, 0.55–0.99). IL2-RA did not affect all-cause mortality (24 RCTs, n = 4468, RR 0.73, 0.50–1.07), malignancy (14 RCTs, n = 3363, RR 0.70, 0.38–1.29) or cytomegalovirus (CMV) infection (17 RCTs, n = 3767, RR 0.90, 0.76–1.06), although all point estimates favor IL2-RA (all outcomes are at 1 year). The use of IL2-RA has also been found to be cost-effective compared to placebo (3).
The evidence for safety and efficacy of lymphocyte-depleting antibodies is more limited than that for IL2-RA. A meta-analysis of seven RCTs (N = 794) comparing lymphocyte-depleting agents with placebo or no treatment reported a reduction in graft failure (RR 0.66, 0.45–0.96) (4). In an individual patient meta-analysis of five of these same trials (N = 628), the reduction in graft loss at 2 years was greater in patients with high panel-reactive antibody (PRA) levels (RR 0.12, 0.03–0.44), compared to the reduction in risk for patients without high PRA (RR 0.74, 0.50–1.09) (5).
Since publication of these meta-analyses, there have been other trials comparing lymphocyte-depleting agents with placebo or no depleting agent. In a single-center RCT, sensitized patients were randomized to induction with ATG or no induction. Patients treated with ATG had a reduction in acute rejection and improvement in graft survival (6). In a three-arm RCT, the incidence of biopsy-proven acute rejection at 6 months was highest in deceased-donor KTRs receiving tacrolimus, azathioprine and prednisone without induction (25.4%, N = 185) compared to a group receiving tacrolimus, azathioprine, prednisone and ATG (15.1%, N = 184) and a group receiving cyclosporine A (CsA), azathioprine, prednisone and ATG (21.2%, N = 186) (7). However, CMV infection occurred in 16%, 24% and 28% of the patients in these groups, respectively (p = 0.012). Similarly, leukopenia, thrombocytopenia, fever and serum sickness were all more common in the two groups receiving antithymocyte induction (7). There is high-quality evidence for a net benefit of IL2-RA compared to placebo for some patient outcomes (graft survival) but not all (all-cause mortality); and high-quality evidence of a net benefit to prevent acute rejection (see Evidence Profile and accompanying evidence in Supporting Tables 1–4 at http://www3.interscience.wiley.com/journal/118499698/toc).
There have been a number of RCTs comparing IL2-RA with lymphocyte-depleting agents. Most of these trials have been small and of low quality. A meta-analysis of nine RCTs (N = 778) found no difference in clinical acute rejection at 6 months (2). There were no differences in graft survival or patient survival (2). Since this meta-analysis, there have been other RCTs. The largest (N = 278), and arguably highest-quality, RCT compared ATG with daclizumab in deceased-donor KTRs selected to be at high risk for delayed graft function (DGF) and/or acute rejection (8). This RCT found no difference in the primary composite end-point, but the ATG induction group had fewer biopsy-proven acute rejections and more overall infections compared to the daclizumab group (8). In an updated Cochrane review, the risk of acute rejection was higher with IL2-RA compared with lymphocyte-depleting agents (nine RCTs, n = 1166, RR 1.27, 1.00–1.61), but the risks of graft loss (12 RCTs, n = 1430, RR 1.10, 0.73–1.65) and mortality (13 RCTs, n = 1670, RR 1.28, 0.74–2.20) were not significantly different. Compared with lymphocyte-depleting agents, the risks of CMV infection (13 RCTs, n = 1480, RR 0.69, 0.49–0.97) and malignancy (six RCTs, n = 840, RR 0.23, 0.06–0.93) are lower with IL2-RA. Thus, there is moderate-quality evidence for trade-offs between IL2-RA and depleting antibodies; depleting antibodies are superior to prevent acute rejection, but there is uncertainty whether this corresponds to improved graft outcomes. Depleting antibodies are associated with more infections (see Evidence Profile and accompanying evidence in Supporting Tables 5–7).
There have been few head-to-head comparisons of different lymphocyte-depleting agents. Thus, it is unclear whether any one of these agents is superior to any other. In meta-analyses, there do not appear to be obvious differences in the effects of different lymphocyte-depleting agents on acute rejection or graft survival.
Alemtuzumab (Campath 1H) is a humanized anti-CD52 monoclonal antibody that depletes lymphocytes. In the United States, it has been approved by the Food and Drug Administration (FDA) for use in patients with B-cell lymphomas. There have been a few small RCTs examining the use of alemtuzumab as an induction agent in KTRs. All of these RCTs lack statistical power to examine the effects of alemtuzumab on patient survival, graft survival or acute rejection. In many of the RCTs, there were differences between the comparator groups other than alemtuzumab, making it difficult to discern the effects of alemtuzumab alone. For example, in a single-center RCT, 65 deceased-donor KTRs received alemtuzumab induction with delayed tacrolimus monotherapy and were compared to 66 KTRs treated with no induction, mycophenolate mofetil (MMF) and corticosteroids. At 12 months, the rate of biopsy-proven acute rejection was 20% vs. 32% in the two groups, respectively (p = 0.09) (9). In 21 high-immunological-risk KTRs randomized to alemtuzumab plus tacrolimus vs. four doses of ATG (plus tacrolimus, MMF and steroids), there were two vs. three acute rejections, respectively (10). Among 20 patients randomized to alemtuzumab plus low-dose CsA vs. 10 patients on CsA plus azathioprine and prednisone, there were biopsy-proven acute rejections in 25% vs. 20%, respectively (11). Ninety deceased-donor KTRs were randomly allocated to ATG, alemtuzumab or daclizumab induction, with those receiving alemtuzumab also receiving a lower tacrolimus target, MMF 500 mg twice daily and no maintenance prednisone, while those in the other two groups received MMF 1000 mg twice daily and prednisone. After 2 years of follow-up, acute rejections occurred in 20%, 23% and 23% in the three groups, respectively, but there was borderline worse death-censored graft survival in the alemtuzumab group (p = 0.05), and more chronic allograft nephropathy (CAN) (p = 0.008) (12,13). 
Altogether, these small studies fail to clearly demonstrate that the benefits outweigh the harm of alemtuzumab induction in KTRs.
For KTRs treated with an IL2-RA, the reduction in the incidence of acute rejection and graft loss, without an increase in major adverse effects, makes the balance of benefits vs. harm favorable in most patients. However, it is possible that in some KTRs at low risk for acute rejection and graft loss, the benefits of induction with IL2-RA may be too small to outweigh even minor adverse effects (especially cost in developing countries) and so, in this setting, not administering an IL2-RA is reasonable.
In contrast to IL2-RA, induction therapy with lymphocyte-depleting antibodies increases the incidence of serious adverse effects. For KTRs treated with lymphocyte-depleting antibodies, a reduction in the incidence of acute rejections must be balanced against an increase in major infections. This balance may favor the use of depleting agents in some, but not all, patients. Logic would suggest that the chances of a favorable balance between benefits and harm could be maximized by limiting the use of lymphocyte-depleting agents to patients at increased risk for acute rejection.
In an individual-patient meta-analysis of five RCTs comparing lymphocyte-depleting antibody induction with no induction (or placebo), the reduction in graft failure was greater in patients with a high PRA (5). Unfortunately, there are few, if any, studies comparing the relative effectiveness of lymphocyte-depleting agents vs. IL2-RA in subgroups of patients at increased immunological risk. Nevertheless, observational data can be used to quantify the risk for acute rejection and graft failure, and thereby define patients who are most likely to benefit from lymphocyte-depleting agents compared to an IL2-RA.
Risk factors for acute rejection include (Table 1):
Table 1: Observational studies of risk factors for acute rejection

|Study characteristic|United States (14)|Spain (15)|North America (16)|Portugal (17)|Netherlands (18)|Norway (19)|UK (20)|Norway (21)|
|Number analyzed (N)|27 377|3365|2779 children|866|790|739|518|451|
|Living donors (%)|33%|0%|100%|1.4%|0%|100%|0%|33%|
|Transplant years included|1997–99|1990, 94, 98|1987–97|1985–99|1983–96|1994–2004|1991–99|1994–97|

Risk factors for acute rejection examined across these studies included: deceased (vs. living) donor; younger recipient age (per 10 years, or thresholds of <2, <45, <50 and <60 years); older donor age (per 10 years, or thresholds of ≥60 and ≥65 years); female (vs. male) recipient; deceased-donor cause of death (cerebrovascular vs. other; trauma vs. nontrauma); US black (vs. white) and Hispanic (vs. non-Hispanic) recipient ethnicity; recipient diabetes; HLA mismatch (any ABDR, AB or DR mismatch vs. 0; per ABDR mismatch; 4–6 vs. 1–3 ABDR mismatches); panel-reactive antibody status (>0%, >15% or >50%); cold ischemia time >24 h; DGF; CMV disease or infection; and BMI ≥35 kg/m².
- • The number of human leukocyte antigen (HLA) mismatches (A)
- • Younger recipient age (B)
- • Older donor age (B)
- • African-American ethnicity (in the United States) (B)
- • PRA >0% (B)
- • Presence of a donor-specific antibody (B)
- • Blood group incompatibility (B)
- • Delayed onset of graft function (B)
- • Cold ischemia time >24 hours (C)
where A indicates universal agreement among studies, B majority agreement, and C a single study.
Retrospective observational studies have identified a number of risk factors for acute rejection after kidney transplantation (Table 1). Younger recipients are at substantially higher risk than older recipients, although there is no clear age threshold for the risk of acute rejection. Younger recipients may also be better able to tolerate serious adverse effects of additional immunosuppressive medication, making it more compelling to treat younger recipients with a lymphocyte-depleting antibody rather than an IL2-RA. Kidneys from older donors may impart increased risk for acute rejection to the recipient, but a distinct age threshold has not been clearly defined.
The number of HLA mismatches between the recipient and donor is associated with the risk of acute rejection, but few studies have agreed on the number or type of mismatches (Class 1 [AB] or Class 2 [DR]) that increase the risk for acute rejection. In the United States, African-American ethnicity has been linked to an increased risk of acute rejection. For deceased-donor recipients, the duration of cold ischemia, for example longer than 24 hours, has been associated with acute rejection. DGF has also been associated with acute rejection, although by the time it is apparent that graft function is delayed, it is likely too late to decide whether or not to use a lymphocyte-depleting agent or an IL2-RA. However, induction with a lymphocyte-depleting agent could be used when there is an increased risk for DGF, such as in cases with expanded-criteria donation or prolonged cold ischemia time. Finally, the presence of antibodies to a broad panel of potential donors has been associated with an increased risk of acute rejection.
Chapter 2: Initial Maintenance Immunosuppressive Medications
- 2.1: We recommend using a combination of immunosuppressive medications as maintenance therapy including a CNI and an antiproliferative agent, with or without corticosteroids. (1B)
- 2.2: We suggest that tacrolimus be the first-line CNI used. (2A)
- 2.2.1: We suggest that tacrolimus or CsA be started before or at the time of transplantation, rather than delayed until the onset of graft function. (2D tacrolimus; 2B CsA)
- 2.3: We suggest that mycophenolate be the first-line antiproliferative agent. (2B)
- 2.4: We suggest that, in patients who are at low immunological risk and who receive induction therapy, corticosteroids could be discontinued during the first week after transplantation. (2B)
- 2.5: We recommend that if mTORi are used, they should not be started until graft function is established and surgical wounds are healed. (1B)
CNI, calcineurin inhibitor; CsA, cyclosporine A; mTORi, mammalian target of rapamycin inhibitor(s).
Maintenance immunosuppressive medication is a long-term treatment to prevent acute rejection and deterioration of graft function. Treatment is started before or at the time of transplantation, and the initial medication may or may not be used with induction therapy. Agents are used in combination to achieve sufficient immunosuppression, while minimizing the toxicity associated with individual agents. Since the risk for acute rejection is highest in the first 3 months after transplantation, higher doses are used during this period, and then reduced thereafter in stable patients to minimize toxicity. In these guidelines, antiproliferative agents refer specifically to azathioprine or mycophenolate (either MMF or enteric-coated mycophenolate sodium [EC-MPS]).
Corticosteroids have traditionally been a mainstay of maintenance immunosuppression in KTRs. However, adverse effects of corticosteroids have led to attempts to find maintenance immunosuppression regimens that do not include corticosteroids. Terminology has often been confusing, but ‘steroid avoidance’ is used here to refer to protocols that call for the initial use of corticosteroids, which are then withdrawn sometime during the first week after transplantation. In contrast, ‘steroid-free’ protocols do not routinely use corticosteroids as initial or maintenance immunosuppression. ‘Steroid withdrawal’ refers to protocols that discontinue corticosteroids after the first week posttransplant. Similar definitions have been applied to the use of CNIs.
- • Used in combination and at reduced doses, drugs that have different mechanisms of action may achieve additive efficacy with limited toxicity.
- • The earlier that therapeutic blood levels of a CNI can be attained, the more effective the CNI will be in preventing acute rejection.
- • There is no reason to delay the initiation of a CNI, and no evidence that delaying the CNI prevents or ameliorates DGF.
- • Compared to CsA, tacrolimus reduces the risk of acute rejection and improves graft survival during the first year of transplantation.
- • Low-dose tacrolimus minimizes the risk of new-onset diabetes after transplantation (NODAT) compared to higher doses of tacrolimus.
- • Compared with placebo and azathioprine, mycophenolate reduces the risk of acute rejection; there is some evidence that mycophenolate improves long-term graft survival compared with azathioprine.
- • Avoiding the use of maintenance corticosteroids beyond the first week after kidney transplantation reduces adverse effects without affecting graft survival.
- • Mammalian target of rapamycin inhibitors (mTORi) have not been shown to improve patient outcomes when used either as replacement for antiproliferative agents or CNIs, or as add-on therapy, and they have important short- and long-term adverse effects.
Timing of initiation
In theory, the earlier that therapeutic blood levels of a CNI can be attained, the more effective the CNI is likely to be in preventing acute rejection. However, there are also theoretical reasons that the early use of CNIs might increase the incidence and severity of DGF. As a result, RCTs have compared early vs. delayed CNI initiation after transplantation. In three RCTs (N = 338), there was no difference in the incidence of DGF with early vs. delayed CsA initiation. In five RCTs (N = 620), there were no differences in acute rejection, graft failure or kidney function in early vs. delayed CsA initiation. Altogether, these RCTs suggest that there is no reason to delay the initiation of CsA. There are no similar studies using tacrolimus, but it is suggested that, with a regimen including induction and reduced-dose tacrolimus, the risk for early CNI nephrotoxicity is minimized and optimal prevention of acute rejection can be achieved. There is moderate-quality evidence that, in CsA-containing regimens, there is no net benefit or harm of early vs. delayed CsA; the evidence is of low quality for CNIs in general, because of a lack of data for tacrolimus-containing regimens (see Evidence Profile and accompanying evidence in Supporting Tables 11–13 at http://www3.interscience.wiley.com/journal/118499698/toc).
Tacrolimus vs. cyclosporine
A meta-analysis of RCTs reported reduced acute rejection and better graft survival with tacrolimus compared to CsA (22). For every 100 patients treated for the first year with tacrolimus rather than CsA, 12 would be prevented from having acute rejection, two would be prevented from having graft failure, but five would develop NODAT. The RCTs in the meta-analysis combined studies of patients receiving the original CsA preparation and cyclosporine A microemulsion (CsA-ME). This study also showed that lower tacrolimus levels were associated with a higher relative risk of graft loss, while higher levels of tacrolimus were associated with an increased risk for NODAT.
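The per-100-patient trade-off quoted above maps directly onto number-needed-to-treat and number-needed-to-harm figures. A quick sketch, using only the numbers stated in the text:

```python
# Per 100 patients treated for 1 year with tacrolimus rather than CsA
# (from the meta-analysis quoted above): 12 fewer acute rejections,
# 2 fewer graft failures, 5 additional cases of NODAT.
def number_needed(events_per_100: float) -> float:
    """NNT (for a benefit) or NNH (for a harm) from an absolute
    difference expressed per 100 patients."""
    return 100 / events_per_100

nnt_rejection = number_needed(12)      # ~8 treated to prevent 1 rejection
nnt_graft_failure = number_needed(2)   # 50 treated to prevent 1 graft failure
nnh_nodat = number_needed(5)           # 20 treated per extra case of NODAT

print(round(nnt_rejection, 1), nnt_graft_failure, nnh_nodat)  # 8.3 50.0 20.0
```

Framed this way, tacrolimus prevents rejection fairly efficiently (NNT ≈ 8), but one extra case of NODAT is incurred for every 20 patients treated, which is the trade-off the text goes on to discuss.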
Randomized controlled trials comparing tacrolimus with CsA-ME using concomitant azathioprine and corticosteroids, but no induction, have shown reduced acute rejection with tacrolimus; for example, 22% vs. 42% at 12 months, respectively (p < 0.001) (23). The difference in acute rejection between the two CNIs could no longer be observed with concomitant induction and MMF instead of azathioprine; for example 4% vs. 6%, for tacrolimus vs. CsA-ME, respectively (24) or 7% vs. 10% at 6 months, respectively (25) when C2 monitoring of CsA was also employed. Furthermore, there is evidence that subclinical rejection (acute rejection changes in protocol biopsy not indicated by a change in kidney function) is more effectively prevented by tacrolimus and MMF compared to CsA and MMF; 15% vs. 39% (p < 0.05) (26).
A very large multicenter RCT in de novo KTRs (n = 1645; the Symphony study) showed superior graft function, better prevention of acute rejection (12.3%) and superior graft survival (96.4%) at 12 months with daclizumab induction and low-dose tacrolimus (C0 3–7 ng/mL). The comparator groups included low-dose CsA and low-dose sirolimus, both with daclizumab induction, and standard-dose CsA without induction. All patients received MMF (2 g/day) and corticosteroids (27).
There is no uniform definition of NODAT used in the literature. Therefore, the reported incidences of NODAT vary to a great extent. Studies reporting a difference between tacrolimus and CsA in the incidence of NODAT, impaired glucose tolerance, or the use of antidiabetic treatment, favor CsA; for example 17% vs. 9% (p < 0.01; tacrolimus vs. CsA) (25). Others have found lower incidences and no significant difference (24,28). One reason for the variation in findings may be differences in the use of corticosteroids as maintenance medication and treatment of acute rejection. Indeed, use of a steroid-free regimen has been associated with a lower incidence of NODAT (29).
Overall, there is moderate-quality evidence for a net benefit of tacrolimus vs. CsA (see Evidence Profile and accompanying evidence in Supporting Tables 8–10). There is no clear evidence of differences in terms of patient mortality, incidence of malignancy, infection, delayed graft function or blood pressure. There is evidence that cholesterol, low-density lipoprotein cholesterol (LDL-C) (but not high-density lipoprotein cholesterol [HDL-C]), acute rejection and graft loss are higher with CsA vs. tacrolimus. However, there is also evidence that NODAT is more common with tacrolimus than CsA, so that there is a clear trade-off in the different patient-relevant outcomes between these two CNIs.
Dosing of CNI
Dosing of CNI is important, but is a relatively under-researched area. There are few trials comparing different doses or target levels of the same drug with baseline immunosuppression kept constant across both arms. Indirect comparisons and case series have shown that high doses might increase adverse events and low doses might increase acute rejection. Standard-dose tacrolimus may be defined as the dose recommended by the manufacturer (Astellas Pharma, Tokyo, Japan), that is, the dose achieving 12-h trough levels (C0) of 10 (5–15) ng/mL. Low-dose tacrolimus was recently introduced in the Symphony study and was defined as achieving C0 of 5 (3–7) ng/mL (27). Standard-dose CsA may be defined as the dose achieving C0 of 200 (150–300) ng/mL (30), or C2 of 1400–1800 ng/mL early and 800–1200 ng/mL later after transplantation (25). Low-dose CsA has been used in some recent clinical studies (27,30) and was defined as achieving C0 of 75 (50–100) ng/mL.
MMF vs. placebo
Randomized controlled trials have shown that MMF (2 or 3 g, but not 1 g daily) is significantly better than placebo at preventing acute rejection. This was seen in studies using steroids as concomitant medication and either tacrolimus or CsA (31,32). For example, acute rejection at 6 months was reduced from 55% with placebo to 30% and 26% with MMF 2 and 3 g daily doses, respectively (31). There were 5–7% improvements in graft survival at 12 months with MMF, but the studies were not powered to evaluate this difference. There were no significant differences in patient survival, graft function, malignancy, NODAT, infection rates or gastrointestinal adverse events such as diarrhea, although there is some evidence that higher doses of MMF cause more diarrhea than lower doses. More bone marrow suppression was seen with MMF compared to placebo. Overall, there is moderate-quality evidence of a net benefit of MMF over placebo in preventing acute rejection, but low-quality evidence for all graft and patient outcomes overall (see Evidence Profile and accompanying evidence in Supporting Tables 14–15).
MMF vs. azathioprine
Randomized controlled trials comparing outcomes between MMF and azathioprine have shown some important inconsistencies. In a recent meta-analysis of 19 trials and 3143 patients, MMF was associated with less acute rejection (RR 0.62, 95% confidence interval [CI] 0.55–0.87) and improved graft survival (RR 0.76, 0.59–0.98) (33). However, there were no differences in patient survival or kidney function (33). There were also no differences in major adverse effects (e.g. infections, CMV, leucopenia, anemia and malignancies) between MMF and azathioprine, but diarrhea was more common with MMF (RR 1.57; 95% CI 1.33–28.6) (33). In several RCTs, MMF reduced the incidence of acute rejection at 6 months; for example, from 36% with azathioprine (100–150 mg/day) to 20% with MMF (2 g/day) using CsA and steroids as concomitant medication (34), and from 38% to 20% with the addition of concomitant induction (35). Also, a reduction in acute rejection from 29% to 7% was seen with concomitant tacrolimus, steroids and induction using MMF 2 g, but not 1 g (36). Conversely, another study showed a smaller reduction in acute rejection at 6 months, from 23% with azathioprine (100–150 mg/day) to 18% with MMF (2 g/day), a difference that was not statistically significant (37). These patients were also treated with CsA-ME and steroids. However, using the same concomitant medication, including CsA-ME, other investigators found a significant reduction in acute rejection at 12 months, from 27% with azathioprine to 17% with MMF 2 g (38). In a third arm of this latter study, patients received MMF from day 0 to day 90 and thereafter azathioprine, and the acute rejection rate was the same (17%) as for those receiving MMF for the entire 12-month study period. Thus, high-quality evidence finds a net benefit of MMF over azathioprine to prevent acute rejection, but only moderate-quality evidence exists for patient-level outcomes.
Because of the substantially increased cost and side effects of MMF compared with azathioprine, there is no clear net benefit; rather, a decision based upon trade-offs is required (see Evidence Profile and accompanying evidence in Supporting Tables 16–18).
Analyses of observational registry data have shown either a small 4% improvement in graft survival with MMF vs. azathioprine (39) or, more recently, no improvement in graft survival (40). However, for a number of reasons, the results of retrospective analyses of observational registry data need to be interpreted cautiously (41).
MMF Compared to EC-MPS
One RCT compared MMF 2 g daily vs. EC-MPS 1.44 g daily, with CsA-ME and steroids, with or without induction (42). There were no significant differences in acute rejection (24% vs. 23%), patient or graft survival, or rates of malignancy or infection. There was also no difference in rates of gastrointestinal disorders (80% vs. 81%), even though the potential for fewer gastrointestinal adverse events was the incentive for developing EC-MPS. Another study (43) tested crossover between the two formulations and also found no differences in any of the outcome parameters. A summary of the RCTs on MMF vs. EC-MPS is available in Supporting Tables 25–26.
Steroid avoidance or withdrawal
The rationale for minimizing corticosteroid exposure is compelling and provided by well-established risks of osteoporosis, avascular necrosis, cataracts, weight gain, diabetes, hypertension and dyslipidemia. Such risk is not constant, and varies with comorbidities such as preexisting metabolic syndrome and age. On the other hand, corticosteroids have been the mainstay of immunosuppression for KTRs for decades, and trial data evaluating minimization of steroid exposure are sparse compared to the large number of trials that have included steroids in the regimens being evaluated. In addition, many of the adverse effects attributed to corticosteroids were observed with high doses. Whether or not low doses (e.g. 5 mg prednisone per day) that are commonly used for long-term maintenance immunosuppression are associated with major adverse effects is less clear.
Randomized controlled trials have shown that the withdrawal of corticosteroids from maintenance immunosuppressive medication regimens, when carried out weeks to months after transplantation, is associated with a high risk of acute rejection (44,45). More recent studies have examined whether steroid avoidance (discontinuing corticosteroids within the first week after transplantation) can be done safely. These studies have generally shown higher rates of acute rejection, but lower rates of long-term adverse effects (12,29,46–48). Unfortunately, these trials have had design limitations that make the interpretation of their results difficult.
Overall, there is moderate-quality evidence for trade-offs between steroid avoidance or withdrawal compared to steroid maintenance, with a higher rate of steroid-sensitive acute rejections but avoidance of steroid-related adverse effects (see Evidence Profile and accompanying evidence in Supporting Tables 19–21).
Mammalian target of rapamycin inhibitor(s)
Regimens using the mTORi sirolimus and everolimus have been compared to a number of different regimens in clinical trials in KTRs, for example as replacement for azathioprine, MMF or CNIs, and in combination with CNIs (both at high and low dose). The use of mTORi in the setting of chronic allograft injury (CAI) is described in Chapter 7. mTORi have a number of adverse effects that limit their use, including dyslipidemia and bone marrow suppression (49–56). Although they have been compared with many other regimens in RCTs, in none of these RCTs was there an improvement in graft or patient survival.
mTORi as replacement for antiproliferative agents
In a meta-analysis of 11 RCTs with 3966 KTRs evaluating mTORi as replacement for azathioprine or MMF, there were no differences in graft or patient survival (57). mTORi appear to reduce the risk of acute rejection (RR 0.84, 95% CI 0.71–0.99; p = 0.04), but graft function and LDL-C outcomes were generally better with azathioprine or MMF (57).
mTORi as replacement for CNIs
In a meta-analysis of eight RCTs with 750 patients evaluating mTORi as replacement for CNIs, there were no differences in acute rejection, CAN, graft survival or patient survival (57). mTORi were associated with higher glomerular filtration rate (GFR), but also with increased risk of bone marrow suppression and dyslipidemia (49,57).
mTORi in combination with CNIs
The combined use of mTORi and CNIs should be avoided, because these agents potentiate nephrotoxicity, particularly when used in the early post-transplant period (57). When used as long-term maintenance, mTORi have been combined with CNIs in two different regimens. Eight RCTs involving 1360 patients have evaluated low-dose mTORi with standard-dose CNI compared with standard-dose mTORi with low-dose CNI (57). Overall, the low-dose CNI, standard-dose mTORi regimen is associated with a 30% increased risk of rejection, with no difference in graft survival. An additional 10 RCTs involving 3175 patients have evaluated the effects of high- vs. low-dose mTORi in combination with fixed-dose CNI, showing less rejection but lower GFR with higher-dose therapy, and no improvement in patient outcomes.
Moderate-quality evidence for sirolimus finds net harm without improved graft or patient survival; CNI toxicity is potentiated when used in combination with sirolimus (see Evidence Profile and accompanying evidence in Supporting Tables 22–24).
- • A long-term RCT that has adequate statistical power to detect differences in acute rejection and major adverse events is needed to determine whether the benefits of steroid avoidance outweigh the harm.
Chapter 3: Long-Term Maintenance Immunosuppressive Medications
- 3.1: We suggest using the lowest planned doses of maintenance immunosuppressive medications by 2–4 months after transplantation, if there has been no acute rejection. (2C)
- 3.2: We suggest that CNIs be continued rather than withdrawn. (2B)
- 3.3: If prednisone is being used beyond the first week after transplantation, we suggest prednisone be continued rather than withdrawn. (2C)
CNI, calcineurin inhibitor.
Using high doses of immunosuppressive medications early after transplantation when the risk of acute rejection is highest, but then reducing doses later when the risk of acute rejection is lower, has been used empirically as the mainstay of long-term immunosuppressive medication management since the advent of kidney transplantation. However, there are no randomized trials testing this therapeutic strategy.
- • If low-dose CNI was not implemented at the time of transplantation, CNI dose reduction >2–4 months after transplantation may reduce toxicity yet prevent acute rejection.
- • RCTs show that CNI withdrawal leads to increased acute rejection, without altering graft survival.
- • RCTs show that steroid withdrawal more than 3 months after transplantation increases the risk of acute rejection.
- • Different immunosuppressive medications have different toxicity profiles and patients vary in their susceptibility to adverse effects.
CNI dose reduction
Although there are no RCTs comparing dose reduction with maintaining initial high doses and target levels, this dose reduction strategy has been successfully adopted in most RCTs. The assumption is that the immune system gradually adapts to the foreign antigens in the graft, and that the need for immunosuppression is thereby reduced. There is great individual variation, and some patients with a high risk for immunological complications (acute and chronic rejection) may need to continue on higher doses of immunosuppression compared to the majority of patients.
A range of trial designs have directly and indirectly compared the effects of different CNI doses, usually as measured by different target levels. In RCTs in which a CNI has been combined with an mTORi (eight RCTs, 1178 patients), as either low-dose mTORi with standard-dose CNI or higher-dose mTORi with lower-dose CNI, standard-dose CNI was associated with lower rates of acute rejection (RR 0.67) but a GFR that was lower by 9 mL/min/1.73 m2. Such trials are clearly confounded, but do suggest that variable CNI exposure leads to competing benefits and harms. Graft function may be improved by minimizing CNI, leading to reduced CAI, but may be worsened if acute rejection occurs.
The strongest evidence comes from RCTs that have directly compared low vs. high CNI doses (four RCTs, 1256 patients). In these trials, there were no differences in outcomes (including graft survival) except for GFR, which favored low CNI in two of the four studies. Low-quality evidence suggests no net benefit or harm of low- vs. standard-dose CNI (see Evidence Profile and accompanying evidence in Supporting Tables 27–29 at http://www3.interscience.wiley.com/journal/118499698/toc).
Using indirect comparisons of trials of different CNI doses, the risk of diabetes and graft loss was reduced with lower doses. However, there are sparse data on the relative effects of specific CNI target values from head-to-head trials, apart from the broad category of high vs. low.
Low-dose CNI maintenance
The notion of complete CNI withdrawal after the peak period for immunologically mediated complications (3 months) is attractive, considering the long-term complications of CNI exposure. However, RCTs of complete CNI withdrawal (eight RCTs, 1891 patients) show that, although there is some small benefit in graft function, the risk of acute rejection is significantly increased without a clear improvement in graft survival. As described above, CNI toxicity can be minimized by administering low-dose CNI, while ensuring that sufficient immunosuppression is provided. Moderate-quality evidence shows a net harm of CNI withdrawal (see Evidence Profile and accompanying evidence in Supporting Tables 30–32).
Long-term steroid administration may lead to hypertension, NODAT, osteoporosis, fractures and dyslipidemia, all of which may affect graft survival. However, long-term steroid administration prevents acute rejection and immunologically mediated graft loss. In six RCTs of 1519 KTRs, steroid withdrawal led to increased acute rejection, without a clear benefit for improved patient or graft outcomes, except for a reduction in total cholesterol levels in the steroid-withdrawal group. Low-quality evidence suggests net harm of steroid withdrawal (see Evidence Profile in Supporting Table 33).
Individual tailoring of immunosuppressive medication to the patient's risk profile
Although tailoring immunosuppressive therapies to the individual patient's risk profile (both risk for acute rejection and risk for adverse effects) is considered standard practice, there are few studies that suggest how this should be done. There are some data on the relative incidence and severity of adverse effects, collected in clinical trials and observational studies (Table 2). However, standard definitions have not been used to define adverse effects of immunosuppressive medications. Data collection has generally relied on spontaneous investigator reporting, which can lead to serious under-reporting. For these and other reasons, the quality of data on adverse drug effects is very low.
Adverse effects of immunosuppressive medications summarized in Table 2 include new-onset diabetes mellitus, anemia and leucopenia, and delayed wound healing.
Withdrawal of a specific drug in an individual patient with an adverse drug effect may or may not result in clinical improvement. Nevertheless, drug withdrawal or substitution is a logical course of action if the benefits (reducing symptoms) appear to outweigh the harm (acute rejection).
- • NODAT may be caused or exacerbated by corticosteroids, tacrolimus, mTORi and, to a lesser extent, by CsA. In patients with impaired glucose tolerance or NODAT, steroid reduction or withdrawal may be beneficial. If this is not sufficient, a switch from tacrolimus to CsA-ME may be considered.
- • Dyslipidemia may be caused or exacerbated by corticosteroids, CsA and especially by mTORi. Patients with significant dyslipidemia before or after transplantation should probably avoid mTORi.
- • Hypertension may be caused or exacerbated by corticosteroids, CsA and, to a lesser extent, tacrolimus. In patients who are not normotensive after transplantation despite adequate antihypertensive treatment, reduction or withdrawal of the steroid or CNI may be beneficial.
- • Osteopenia may be caused or exacerbated by corticosteroids, and possibly CsA and tacrolimus. Steroid reduction or withdrawal may be helpful.
- • Bone marrow suppression may be caused or exacerbated by MMF, azathioprine and mTORi. Monitoring of the mycophenolic acid (MPA) area under the concentration–time curve (AUC), and probably reduction of the dose of MMF or azathioprine, are the first suggested actions in case of anemia or leucopenia.
- • Delayed wound healing may be caused or exacerbated by mTORi. Patients who have delayed wound healing on an mTORi may benefit from switching the mTORi to a CNI.
- • Diarrhea, nausea and vomiting may be caused or exacerbated by MMF and tacrolimus. Monitoring the MPA AUC and tacrolimus C0 levels may help to reduce these complications. However, it is important to rule out treatable underlying causes other than the immunosuppressive medication. In a recent study, about half of the patients were cured by treatment of an infection (58). Only after ruling out other underlying causes should reducing the MMF dose, or changing MMF to azathioprine, be considered.
- • Proteinuria may be caused or exacerbated by mTORi. Consider avoiding an mTORi in a patient with persistent urinary protein excretion of more than 500–1000 mg/day.
- • Decreased kidney function may be caused or exacerbated by CsA and tacrolimus. See Chapter 7 regarding treatment of chronic CNI nephrotoxicity.
Chapter 4: Strategies to Reduce Drug Costs
- 4.1: If drug costs block access to transplantation, a strategy to minimize drug costs is appropriate, even if use of inferior drugs is necessary to obtain the improved survival and quality of life benefits of transplantation compared with dialysis. (Not Graded)
- 4.1.1: We suggest strategies that may reduce drug costs include:
- • limiting use of a biologic agent for induction to patients who are high-risk for acute rejection (2C);
- • using ketoconazole to minimize CNI dose (2D);
- • using a nondihydropyridine CCB to minimize CNI dose (2C);
- • using azathioprine rather than mycophenolate (2B);
- • using adequately tested bioequivalent generic drugs (2C);
- • using prednisone long-term. (2C)
- 4.2: Do not use generic compounds that have not been certified by an independent regulatory agency to meet each of the following criteria when compared to the reference compound (Not Graded):
- • contains the same active ingredient;
- • is identical in strength, dosage form, and route of administration;
- • has the same use indications;
- • is bioequivalent in appropriate bioavailability studies;
- • meets the same batch requirements for identity, strength, purity and quality;
- • is manufactured under strict standards.
- 4.3: It is important that the patient, and the clinician responsible for the patient's care, be made aware of any change in a prescribed immunosuppressive drug, including a change to a generic drug. (Not Graded)
- 4.4: After switching to a generic medication that is monitored using blood levels, obtain levels and adjust the dose as often as necessary until a stable therapeutic target is achieved. (Not Graded)
CCB, calcium-channel blocker; CNI, calcineurin inhibitor.
A number of cost-saving strategies may offer access to transplantation when the cost of immunosuppressive medication is otherwise prohibitive. The use of generic medications can substantially reduce cost. A generic immunosuppressive medication is a medication that is manufactured and distributed without patent protection, but is structurally identical to the brand-name medication. However, manufacturing, distribution and quality control may differ among pharmaceutical companies. Regulatory authorities generally do not require that the efficacy and safety of generic medications be tested in RCTs. Manufacturers of generic drugs must only prove that their preparation is bioequivalent to the existing drug in order to gain regulatory approval.
However, generic drugs approved by the US FDA have met rigid standards. To gain FDA approval (http://www.fda.gov/cder/ogd; last accessed March 30, 2009), a generic drug must:
- • contain the same active ingredients as the brand drug (inactive ingredients may vary);
- • be identical in strength, dosage form and route of administration;
- • have the same use indications;
- • be bioequivalent;
- • meet the same batch requirements for identity, strength, purity and quality;
- • be manufactured under the same strict standards of the FDA's good manufacturing practice regulations.
Similarly, the European Agency for the Evaluation of Medicinal Products, also known as the European Medicinal Agency (http://www.emea.europa.eu/htms/human/raguidelines/datagenerics/biosimilars.htm; last accessed March 30, 2009) defines a generic medicinal product as a medicinal product that has:
- • the same qualitative and quantitative composition in active substances as the reference product;
- • the same pharmaceutical form as the reference medicinal product;
- • bioequivalence with the reference medicinal product demonstrated by appropriate bioavailability studies.
Tacrolimus, CsA, mTORi, MMF, and azathioprine are all available as generics (loosely defined) in many countries around the world. However, the efficacy and the safety of these generics may not always be firmly established by local regulatory authorities charged with approving these agents.
- • Lack of dialysis facilities may make kidney transplantation the only life-saving therapy available for some patients with CKD stage 5.
- • Kidney transplantation is the therapy of choice to treat CKD stage 5, since overall costs are lower, and outcomes and quality of life are better compared to dialysis.
- • Cost savings that do not compromise patient safety are beneficial.
- • Use of cytochrome P-450 inhibitors, such as ketoconazole and diltiazem, allows therapeutic blood levels of CsA to be achieved at a lower dose, thereby reducing cost.
- • Azathioprine can be used to achieve most of the efficacy and safety of MMF, but at a much lower cost.
- • An adequately tested bioequivalent generic formulation can lower cost without compromising safety and efficacy of the originally patented formulation.
Chronic maintenance dialysis is not available for many patients in a number of developing countries in Asia, Africa, and South America (59). Patients living in remote areas may not have access to dialysis. Kidney transplantation, especially preemptive transplantation (before the need for chronic dialysis), may be the only viable option for long-term renal replacement therapy in many areas of the world. Transplantation is the most cost-effective form of renal replacement therapy, and offers a superior quality of life compared to dialysis (60). For all of these reasons, there is a growing demand for kidney transplantation in the developing world, and it is imperative that kidney transplantation be affordable. Even where immunosuppressive drugs are available, their high cost may preclude their use if adequate health insurance coverage is not available (61).
Calcineurin inhibitors currently form the backbone of immunosuppressive regimens, but their cost imposes a long-term financial burden on patients in developing countries. Forced discontinuation of CsA due to cost increases the risk of acute rejection and may result in poor long-term outcomes (62).
Calcineurin inhibitors and mTORi (sirolimus and everolimus) are metabolized through the hepatic cytochrome P-450 microsomal oxidase enzyme system. Commonly used drugs such as the antifungal ketoconazole and the nondihydropyridine calcium-channel blocker (CCB) diltiazem are known inhibitors of this enzyme system and increase blood levels of these immunosuppressive drugs. This, in turn, reduces the dose necessary to maintain therapeutic blood levels (63,64).
A number of studies (Table 3) have shown that ketoconazole, when used in a dose of 50–200 mg/day, allows substantial reduction in the daily dose of CsA, tacrolimus and sirolimus, while maintaining therapeutic blood levels (65–76). In an RCT (69), 51 patients received 100 mg/day of ketoconazole along with CsA and 49 served as controls. The dose reduction was highest at 1 month (76.5%) and was maintained at 10 years (64.6%). The cost of CsA decreased by 73% at 1 year, 69% at 5 years and 63% at 10 years in the intervention group, while the decrease in cost was 13% and 20% in the control group at 1 and 10 years, respectively.
|Study|CNI|Keto (N)|Control (N)|Mean follow-up (months)|Ketoconazole (mg/day)a|Estimated cost reduction (%)|
|---|---|---|---|---|---|---|
|Carbajal (71)|CsA|14|17|29|54 ± 17|60|
In another study (73), 70 patients on a tacrolimus-based immunosuppression regimen were randomly allocated to receive ketoconazole (n = 35) or no ketoconazole (controls, n = 35). The tacrolimus dose reduction was 58.7% at 6 months and 53.8% at 2 years, leading to cost reduction of 56.9% and 52.2%, respectively. None of the studies has reported any adverse effect of this approach on graft function.
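Note that the cost reduction in this study (56.9% at 6 months) is slightly smaller than the dose reduction (58.7%), because ketoconazole itself carries a small cost. A back-of-the-envelope sketch of this relationship, using made-up prices and assuming daily CNI cost scales linearly with dose:

```python
def net_cost_reduction(cni_daily_cost, dose_reduction_frac, keto_daily_cost):
    """Fractional reduction in total daily drug cost after adding ketoconazole.

    cni_daily_cost: baseline daily cost of the CNI alone (hypothetical value)
    dose_reduction_frac: fraction by which the CNI dose (and hence cost) falls
    keto_daily_cost: daily cost of the added ketoconazole (hypothetical value)
    Cost is assumed proportional to dose; real pricing varies by country.
    """
    new_total = cni_daily_cost * (1 - dose_reduction_frac) + keto_daily_cost
    return 1 - new_total / cni_daily_cost

# With a hypothetical 100-units/day CNI, a 58.7% dose reduction and
# 1.8 units/day of ketoconazole, the net saving is close to, but below,
# the dose reduction itself.
saving = net_cost_reduction(100.0, 0.587, 1.8)
```

The gap between dose reduction and cost reduction widens as the inhibitor's own cost rises relative to the CNI's, which is one reason cheap ketoconazole is attractive for this strategy.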
Ketoconazole requires an acidic milieu in the stomach for its absorption; hence, concomitant use of agents that inhibit gastric acid secretion should be avoided.
In comparison to ketoconazole, the dose reduction achieved with diltiazem is modest (67,77). Hence, some would suggest that a nondihydropyridine CCB, such as diltiazem, be used only in situations where ketoconazole is contraindicated. On the other hand, if patients discontinue ketoconazole abruptly, the levels of immunosuppressive drugs may drop precipitously and result in acute rejection. A precipitous drop is less likely with nondihydropyridine CCBs, and the risk of acute rejection may therefore be less. In addition, most KTRs have hypertension that requires treatment, and nondihydropyridine CCBs may serve the dual purpose of treating hypertension and reducing cost. The choice between ketoconazole and a CCB should be adapted to the patient's situation and preference.
The use of 2-h CsA concentration (C2) monitoring for adjusting drug dose is not suitable for patients receiving ketoconazole or diltiazem. Metabolic inhibitors interfere with the elimination, but not the absorption, of CsA or tacrolimus, and therefore flatten the concentration–time curve. In this situation, the CsA AUC correlates better with C0 than with C2, and dose adjustments based on C2 levels may lead to CsA toxicity (78). Trough concentration (C0) monitoring should therefore be used to adjust drug dosage.
Although MMF is considered the preferred antimetabolite for KTRs, the Mycophenolate Steroid Sparing follow-up study showed that azathioprine-treated patients experienced similar long-term outcomes compared to those receiving MMF after a median 5.4 years (37). CsA-ME was the CNI used in this study. The length of hospital stay, incidence of acute rejections, and the likelihood of return to dialysis were also similar in the two groups. In a cost-minimization analysis, MMF was found to be 15 times more expensive than azathioprine. This study (and the lack of large differences in outcomes in other studies comparing MMF with azathioprine) suggests that it may be acceptable to use azathioprine in place of MMF when cost is an important consideration.
A number of generic formulations of CsA, tacrolimus, mTORi and MMF are now available around the world. Generic formulations vary from country to country. Most countries require evidence of bioequivalence in only a small number of patients before marketing is permitted. In many countries, however, generic formulations have been available for over 10 years and their efficacy has been established in real-life situations. Head-to-head data comparing efficacy and toxicity are generally not available for most generics (79–81). Caution should therefore be exercised in choosing a generic formulation for use in KTRs. Ideally, a generic formulation should be used only after its safety and efficacy have been established in KTRs.
Chapter 5: Monitoring Immunosuppressive Medications
- 5.1: We recommend measuring CNI blood levels (1B), and suggest measuring at least:
- • every other day during the immediate post-operative period until target levels are reached (2C);
- • whenever there is a change in medication or patient status that may affect blood levels (2C);
- • whenever there is a decline in kidney function that may indicate nephrotoxicity or rejection. (2C)
- 5.1.1: We suggest monitoring CsA using 12-h trough (C0), 2-h post-dose (C2) or abbreviated AUC. (2D)
- 5.1.2: We suggest monitoring tacrolimus using 12-h trough (C0). (2C)
- 5.2: We suggest monitoring MMF levels. (2D)
- 5.3: We suggest monitoring mTORi levels. (2C)
AUC, area under concentration–time curve; CNI, calcineurin inhibitor; CsA, cyclosporine A; MMF, mycophenolate mofetil; mTORi, mammalian target of rapamycin inhibitor(s).
Cyclosporine A has a narrow therapeutic window and variable absorption characteristics, even with the microemulsion formulation (CsA-ME). Therefore, the CsA dosage must be individualized to find a balance between high levels that may be toxic and low levels that may be insufficient to prevent rejection. Variability in absorption is greatest during the first 4 h after dosing, and during the first few weeks after transplantation. There are no RCTs comparing monitoring with no monitoring; however, the fact that different target levels influence efficacy and toxicity strongly suggests that monitoring is beneficial (82).
The C0 is the measured concentration after the dosing interval (e.g. 12 h after dosing if the dosing interval is every 12 h), C2 the concentration 2 h after dosing and AUC0–4 is the AUC during the first 4 h after dosing. Fewer data are available to guide blood-level monitoring of tacrolimus compared to CsA. MPA is the active metabolite of MMF and the molecule generally used for monitoring of MMF. The half-lives of mTORi are greater than 48 h, making anything but monitoring of C0 unlikely to be useful. There are no clinical methods for monitoring corticosteroid blood levels.
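The relationship between these single time-point measures and overall drug exposure can be illustrated by approximating the AUC over a sampling window with the trapezoidal rule. The sketch below uses hypothetical sampling times and concentrations purely for illustration; it is not a clinical tool and the values are not targets:

```python
# Illustrative only: trapezoidal estimate of the area under the
# concentration-time curve (AUC) from sparse CsA samples.
# Times are in hours after dosing, concentrations in ng/mL.
# The sample values below are hypothetical, not clinical targets.

def auc_trapezoid(times, concentrations):
    """Approximate the AUC by the trapezoidal rule over the sampled interval."""
    auc = 0.0
    for (t0, c0), (t1, c1) in zip(zip(times, concentrations),
                                  zip(times[1:], concentrations[1:])):
        auc += (t1 - t0) * (c0 + c1) / 2.0
    return auc

# Hypothetical 0-4 h profile after a morning CsA dose:
times = [0, 1, 2, 3, 4]            # h after dosing
conc = [150, 900, 1100, 800, 600]  # ng/mL
auc_0_4 = auc_trapezoid(times, conc)  # ng*h/mL over 0-4 h (AUC0-4)
```

Because absorption is most variable in the first 4 h after dosing, abbreviated AUC0–4 sampling of this kind captures exposure variability that a single trough (C0) may miss.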
There continues to be widespread interest in pharmacodynamic assays for monitoring immunosuppressive medication and adjusting dosing accordingly. However, there are insufficient data demonstrating the efficacy of pharmacodynamic monitoring.
Cyclosporine A absorption may increase substantially during the first 1–2 weeks after transplantation. In KTRs, absorption stabilizes by approximately the end of the first month. Common factors that might change CsA blood levels are the use of other drugs affecting cytochrome P450 3A4 (CYP3A4) and/or P-glycoprotein, diet and intestinal motility. There are no studies comparing one schedule of monitoring vs. another; however, tailoring the monitoring schedule to the expected absorption variability is a reasonable, empirical approach. There are no data to suggest whether monitoring blood levels in stable patients beyond the first few weeks after transplantation is beneficial.
There are few RCTs to define optimal target blood levels. Target levels should generally reflect the overall immunosuppressive medication regimen, and therefore target levels may vary accordingly. For example, it may be prudent to use lower early posttransplant target blood levels when an induction antibody is used. In any case, blood-level monitoring with predetermined targets can be effectively used to balance the risk for rejection with the risk for toxicity.
Cyclosporine A C0 has often been used for therapeutic drug monitoring, but C0 does not correlate closely with AUC0–4. Blood levels at 2 h after drug administration (C2), instead of at 12 h (C0 if the dosing interval is 12 h), have been used to monitor CsA therapy with the CsA-ME formulation. Although C2 levels appear to correlate more closely with AUC0–4, no differences have been observed in two RCTs in the incidence of acute rejection, graft loss or adverse events whether patients were monitored by AUC0–4, C2 or C0 levels (83). Overall, a very low strength of evidence suggests uncertain trade-offs between using C0 or C2 (see Evidence Profile and accompanying evidence in Supporting Tables 34–36 at http://www3.interscience.wiley.com/journal/118499698/toc); therefore, either C0 or C2 blood levels are acceptable.
There have been fewer studies with blood-level monitoring for tacrolimus than for CsA. However, available evidence suggests that the benefits and harm of therapeutic drug monitoring for these two CNIs are similar. Tacrolimus C0 is correlated with the AUC of tacrolimus (generally r > 0.8) (84,85). This relationship appears to be better during the first few months after transplant than later; however, there is high inter- and intrapatient variability. As is the case for CsA, there are no studies comparing one schedule of monitoring tacrolimus vs. another; however, tailoring the monitoring schedule to the expected absorption variability is a reasonable, empirical approach. Target levels for tacrolimus should reflect the patient's overall immunosuppressive drug regimen and risk for rejection, with higher targets early after transplantation, and lower targets later.
The AUC is widely regarded as the best measure of overall drug exposure of MPA. Pharmacokinetic studies have demonstrated poor correlation of C0 with the full AUC (86). The inability of single-point sampling strategies, particularly those in the early postdose period, to effectively predict the AUC has resulted in a number of studies investigating the use of limited sampling strategies. These strategies use a number of sampling points, usually between 2 and 4 h, to predict the AUC (87).
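Limited sampling strategies of this kind are typically linear regressions of the full AUC on a few timed levels. The sketch below shows only the general form of such a model; the coefficients, intercept and sampling times are hypothetical placeholders, not a validated strategy (published strategies derive their coefficients from population pharmacokinetic studies):

```python
# Sketch of a limited-sampling estimate of MPA AUC0-12 from a few
# early post-dose levels (mg/L). The intercept and coefficients here
# are hypothetical placeholders for illustration only.

def mpa_auc_estimate(c0, c1, c2, intercept=8.0, b0=4.3, b1=1.1, b2=1.9):
    """Linear limited-sampling model of the general published form:
    AUC0-12 ~ intercept + b0*C0 + b1*C1 + b2*C2 (ug*h/mL)."""
    return intercept + b0 * c0 + b1 * c1 + b2 * c2

# Hypothetical trough, 1-h and 2-h MPA levels:
est = mpa_auc_estimate(c0=2.0, c1=10.0, c2=6.0)  # ug*h/mL
```

The design rationale is that early post-dose points, which correlate poorly with the AUC individually, jointly predict it much better than C0 alone.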
Mycophenolate mofetil has conventionally been administered at a fixed dose without routinely monitoring MPA blood levels. Therapeutic drug monitoring during MMF therapy remains controversial. Available studies have serious limitations and report conflicting results. Early after transplantation, MPA AUC might be correlated with a lower risk of acute rejection than C0, but this is supported by only a single RCT (88). There are two RCTs showing that targeting different MPA AUC resulted in different rates of acute rejection (89,90). Several observational studies have also shown that MPA AUC early after transplantation correlates with acute rejection (91–93). Most studies showed little correlation between MPA pharmacokinetic parameters and adverse effects (89–93). In addition, there is an important intrapatient variability of MPA pharmacokinetics and an increasing number of different drug combinations, which may affect MPA bioavailability. The proposed therapeutic window of the MPA AUC0–12 (30–60 μg·h/mL) is restricted to the early posttransplant period and when MMF is used in combination with CsA. In general, MPA C0 1.0–3.5 mg/L correlates with MPA AUC0–12 (30–60 μg·h/mL) in patients treated with CsA. A summary of the RCTs about MPA monitoring is provided in Supporting Table 37.
The pharmacokinetics of mTORi sirolimus and everolimus differ substantially (94). Although the time to peak concentration is similar between the two mTORi, the half-life of sirolimus is about 60 h in adults (10–24 h in children), while that of everolimus is 28–35 h (95,96). In general, C0 correlates well with AUC0–12 (95,97). Therefore, C0 is probably adequate for monitoring mTORi levels. There are limited observational data suggesting that mTORi C0 correlate with adverse effects (98). There are no RCTs demonstrating that monitoring mTORi C0 reduces acute rejection or adverse effects.
- • RCTs with adequate statistical power are needed to determine the cost-effectiveness of therapeutic drug monitoring for all immunosuppressive agents with measurable blood levels.
Chapter 6: Treatment of Acute Rejection
- 6.1: We recommend biopsy before treating acute rejection, unless the biopsy will substantially delay treatment. (1C)
- 6.2: We suggest treating subclinical and borderline acute rejection. (2D)
- 6.3: We recommend corticosteroids for the initial treatment of acute cellular rejection. (1D)
- 6.3.1: We suggest adding or restoring maintenance prednisone in patients not on steroids who have a rejection episode. (2D)
- 6.3.2: We suggest using lymphocyte-depleting antibodies or OKT3 for acute cellular rejections that do not respond to corticosteroids, and for recurrent acute cellular rejections. (2C)
- 6.4: We suggest treating antibody-mediated acute rejection with one or more of the following alternatives, with or without corticosteroids (2C):
- • plasma exchange;
- • intravenous immunoglobulin;
- • anti-CD20 antibody;
- • lymphocyte-depleting antibody.
- 6.5: For patients who have a rejection episode, we suggest adding mycophenolate if the patient is not receiving mycophenolate or azathioprine, or switching azathioprine to mycophenolate. (2D)
OKT3, muromonab (anti–T-cell antibody).
An acute rejection episode is the consequence of a host immune response directed against the graft. It may be of cellular (lymphocyte) and/or humoral (circulating antibody) origin. Acute rejection is clinically suspected in patients experiencing an increase in serum creatinine, after the exclusion of other causes of graft dysfunction (generally with biopsy). We know from the early days of transplantation, before there were effective antirejection treatments, that untreated acute rejection inevitably results in graft destruction. Therefore, it is strongly recommended that acute rejection episodes be treated, unless the treatment is expected to be life-threatening or to cause harm severe enough to preclude treatment.
Acute rejection is characterized by a decline in kidney function accompanied by well-established diagnostic features on kidney allograft biopsy. Subclinical acute rejection is defined by the presence of histological changes specific for acute rejection on screening or protocol biopsy, in the absence of clinical symptoms or signs. Acute cellular rejections are acute T-cell–mediated rejections and respond to treatment with corticosteroids. Borderline acute rejection is defined by histopathological changes that are only ‘suspicious for acute rejection’ according to the Banff classification schema (99). A rejection episode is said to be unresponsive to treatment when graft function does not return to baseline after the last dose of treatment.
An antibody-mediated rejection is defined by histological changes caused by a circulating, anti-HLA, donor-specific antibody. The following criteria are generally used to determine whether an acute rejection is caused by a donor-specific antibody:
- i) staining of peritubular capillaries with C4d (fourth complement fraction);
- ii) the presence of a circulating, anti-HLA, donor-specific antibody and
- iii) histological changes consistent with an antibody-mediated rejection including (but not limited to) the presence of polymorphonuclear cells in peritubular capillaries.
- • Several causes of decreased kidney function can only be distinguished from acute rejection by biopsy.
- • Treatment of decreased kidney allograft function that is not caused by acute rejection with additional immunosuppressive medication may be harmful.
- • Treating subclinical acute rejection discovered on protocol biopsy may improve graft survival.
- • Most acute cellular rejection responds to treatment with corticosteroids.
- • Treating acute cellular rejection that is unresponsive to corticosteroids or recurs with an anti–T-cell antibody may prolong graft survival.
- • Increasing the amount of immunosuppressive medication after an acute cellular rejection may help prevent further rejection.
- • Treating borderline rejection may prolong graft survival.
- • A number of measures may be effective in treating antibody-mediated rejections, including plasma exchange, intravenous immunoglobulin, anti-CD20 antibody and anti–T-cell antibodies.
Although there are no RCTs to establish that obtaining a biopsy improves outcomes of suspected acute rejection, there are alternative diagnoses that might mimic an acute rejection episode. BK polyomavirus (BKV) nephropathy would generally be treated differently than acute rejection, for example with a reduction in immunosuppressive medication. Therefore, logic dictates that, whenever possible, biopsy confirmation should be obtained to avoid inappropriate treatment.
Some centers use protocol biopsies to detect and treat subclinical acute rejection. In a RCT, the detection and treatment of subclinical acute rejection in patients (N = 72) on CsA, MMF and corticosteroids resulted in better graft function (100,101). However, in a larger (N = 218) multicenter RCT in patients on tacrolimus, MMF and corticosteroids, protocol biopsies and treatment of subclinical acute rejection were not beneficial (102). Finally, in a single-center RCT of 102 recipients of living-donor kidneys (treated with CsA [N = 96] or tacrolimus [N = 6], MMF [N = 55] or azathioprine [N = 47] and corticosteroids) protocol biopsies and treatment of subclinical acute rejection resulted in improved graft function (103). Uncontrolled data suggest that, when the incidence of clinical acute rejection is low, the number of patients with subclinical acute rejection may be too small to warrant the inconvenience and cost of protocol biopsies (104).
Corticosteroid therapy is the most commonly used first-line treatment for acute cellular rejection episodes. Although most patients respond to corticosteroids, the dose and duration of treatment have not been well defined by RCTs. Treatment starting with intravenous methylprednisolone 250–500 mg daily for 3 days is a common practice.
Treatment of acute cellular rejection with an anti–T-cell antibody (muromonab [OKT3], ATG or ALG) is more effective in restoring kidney function and preventing graft loss than treatment with corticosteroids (105). This systematic review concluded that treatment with an antibody is associated with more adverse effects, but whether the overall benefits of antibody treatment vs. corticosteroids outweigh harm is uncertain (105). There are no RCTs examining whether anti–T-cell antibodies vs. corticosteroids should be the initial treatment of Banff IIA or IIB (vascular) rejection. A low strength of evidence suggests no net benefits or harm between antibodies or steroids alone (see Evidence Profile in Supporting Table 39 at http://www3.interscience.wiley.com/journal/118499698/toc).
Studies suggest that steroid-resistant or recurrent T-cell–mediated rejection responds to treatment with polyclonal or monoclonal anti–T-cell antibodies (105). It is also possible that the addition of MMF to the postrejection maintenance immunosuppressive medication regimen, or replacement of azathioprine with MMF, will help to prevent subsequent acute rejection. A RCT (N = 221) compared MMF to azathioprine in the treatment of first acute rejection (106). Patients receiving MMF had fewer subsequent rejections, and among the 130 who completed the trial, at 3 years graft survival was better in the MMF group (106). A summary of the RCTs on replacement of azathioprine by MMF in the setting of rejection is provided in Supporting Tables 40–41.
Whether or not to treat borderline acute rejection is controversial. There are no RCTs addressing whether treatment of borderline acute rejection prolongs graft survival, and whether overall benefits outweigh harm.
If function does not return to baseline, or if there is a new decline in function after successful treatment of an acute rejection, a biopsy should be considered to rule out additional rejection, BKV nephropathy and other causes of graft dysfunction.
Anti–T-cell antibodies (OKT3, ATG, ALG) can be used when corticosteroids have failed to reverse rejection or for treatment of a recurrent rejection. In such circumstances, benefits generally outweigh harm. However, there is inadequate evidence from RCTs to conclusively establish the best treatment for steroid-resistant or recurrent acute cellular rejection (see Evidence Profile in Supporting Table 38). Most studies comparing OKT3 to ATG or ALG did not have adequate statistical power to show a difference in efficacy. However, in one RCT, ATG was better tolerated than OKT3 (107). When a steroid-resistant rejection or a recurrent rejection does not respond to a lymphocyte-depleting antibody or OKT3, a new biopsy should be considered to rule out alternative causes of graft dysfunction.
Therapeutic strategies that include combinations of plasma exchange to remove donor-specific antibody, and/or intravenous immunoglobulins and anti-CD20 monoclonal antibody (rituximab) to suppress donor-specific antibody production, have been used to successfully treat acute humoral rejection. However, the optimal protocol to treat acute humoral rejection remains to be determined. Indeed, there are no RCTs with adequate statistical power to compare the safety and efficacy of these different therapeutic strategies. In a RCT in 20 children, rituximab was associated with better function and improved postrejection biopsy scores compared to treatment with anti–T-cell antibody and/or corticosteroids (108). Clearly, additional studies to define the optimal treatment of acute humoral rejection are needed.
Additional RCTs are needed to determine:
- • whether treating borderline acute rejection improves outcomes;
- • when protocol biopsies and treatment of subclinical acute rejection are cost-effective;
- • the optimal treatment for antibody-mediated acute rejection.
Chapter 7: Treatment of Chronic Allograft Injury
- 7.1: We recommend kidney allograft biopsy for all patients with declining kidney function of unclear cause, to detect potentially reversible causes. (1C)
- 7.2: For patients with CAI and histological evidence of CNI toxicity, we suggest reducing, withdrawing, or replacing the CNI. (2C)
- 7.2.1: For patients with CAI, eGFR >40 mL/min/1.73 m2, and urine total protein excretion <500 mg/g creatinine (or equivalent proteinuria by other measures), we suggest replacing the CNI with an mTORi. (2D)
CAI, chronic allograft injury; CNI, calcineurin inhibitor; CsA, cyclosporine A; eGFR, estimated glomerular filtration rate; mTORi, mammalian target of rapamycin inhibitor(s).
Historically, KTRs with gradually declining kidney allograft function associated with interstitial fibrosis and tubular atrophy (IF/TA) have been said to have ‘chronic rejection,’ or ‘chronic allograft nephropathy.’ However, these diagnoses are nonspecific and the Banff 2005 workshop suggested using ‘chronic allograft injury’ to avoid the misconception that the pathophysiology and treatment of this entity are understood (109). Causes of CAI include hypertension, CNI toxicity, chronic antibody-mediated rejection and others. Overall, death with a functioning graft accounts for up to 50% of graft failures. However, of those who return to dialysis or require retransplantation, the most common cause is CAI, followed by acute rejection and recurrent primary kidney disease (110,111). Moderate to severe CAI is present in about one quarter of KTRs at 1 year after transplant, and in about 90% by 10 years (112–114). CAI is a diagnosis of exclusion characterized by the progressive reduction in graft function not due to recurrence of disease or other recognized causes. Histologically, CAI is defined by IF/TA (109,114). Other features may include subclinical rejection, transplant glomerulopathy or transplant vasculopathy.
Graft function 6–12 months after kidney transplantation is an outcome reported in most RCTs of immunosuppressive medications. These are described in the relevant sections of these guidelines. Similarly, the use of other medications (antihypertensive agents, lipid-lowering agents, antiproteinuric agents) to prevent CAI or prevent the progression of CAI are also discussed in other sections of these guidelines.
Some causes of CAI may be reversible. Patients found to have acute rejection, BKV nephropathy or recurrent kidney disease, for example, may respond to appropriate treatments. Therefore, it is important that patients suspected of having CAI undergo biopsy, if possible. Most commonly, when there are no reversible causes of graft dysfunction, the biopsy will show IF/TA with or without other features consistent with CAI. In other words, the diagnosis of CAI is a diagnosis of exclusion. The roles of CNI toxicity, chronic antibody-mediated rejection and other immune and nonimmune mechanisms of injury are unclear. The treatment of CAI has been controversial (115).
CNI withdrawal and/or replacement
Although there are a large number of uncontrolled studies describing the effects of withdrawing CNIs in KTRs with CAI (116), there are only two RCTs. In both RCTs, the CNI was replaced with an alternative immunosuppressive agent. In the ‘Creeping Creatinine’ study of 143 KTRs, MMF was substituted for CsA, and outcomes were reported at 12 months (117). There were no differences in mortality, graft loss, acute rejection, infection or blood pressure between the two groups. Those randomized to MMF had a small improvement in their creatinine clearance (+5.0 mL/min [+0.8 mL/s] vs. –0.7 mL/min [–0.01 mL/s]) at 12 months, but creatinine clearance was not measured in 20%, and the long-term importance of this outcome is uncertain. The ‘Chronic Renal Allograft Failure’ study replaced CsA with tacrolimus in 186 KTRs (2:1 randomization) with moderate CKD. Baseline creatinine was 220 μmol/L and outcomes were reported at 5 years (118). There was no difference in death, graft loss, acute rejection, treatment discontinuations, NODAT, hypertension, infections or cancer between the two arms. However, incident cardiac events favored tacrolimus. Over 5 years, serum creatinine increased in the CsA group by about 60 μmol/L compared with the tacrolimus group. Overall, the quality of evidence evaluating the effects of replacing a CNI in patients with CAI is low, and there is uncertainty regarding benefit–harm trade-offs (see Evidence Profile and accompanying evidence in Supporting Tables 42–44 at http://www3.interscience.wiley.com/journal/118499698/toc).
CNI replacement with mTORi
No RCTs have examined whether switching KTRs with established CAI from a CNI to an mTORi is beneficial. However, a RCT randomly allocated 830 KTRs with estimated glomerular filtration rate (eGFR) ≥20 mL/min/1.73 m2 to continuation of CNI (N = 275) vs. converting to sirolimus (N = 555) (119). Patients were stratified into two groups based on eGFR 20–40 mL/min/1.73 m2 (N = 87) and eGFR >40 mL/min/1.73 m2 (N = 743). The Data Monitoring and Safety Board stopped the trial for patients with eGFR 20–40 mL/min/1.73 m2 when the primary safety end point (acute rejection, graft failure or death at 12 months) occurred in 8 of 48 sirolimus patients vs. 0 of 25 CNI patients (p = 0.045). In the stratum eGFR >40 mL/min/1.73 m2, the primary end point (change in eGFR baseline to 12 months) was not different in the two groups, but there was more proteinuria in the sirolimus group (119). Thus, this post hoc subgroup analysis suggested that converting patients with eGFR 20–40 mL/min/1.73 m2 from CNI to sirolimus may be harmful, and that converting patients with eGFR >40 mL/min/1.73 m2 may not be beneficial. However, the patients in this trial were not selected to have CAI per se, and it is possible that patients with CAI, preserved kidney function and low levels of proteinuria may still benefit from conversion. Additional study is needed.
Section II: Graft Monitoring and Infections
Rating Guideline Recommendations
Within each recommendation, the strength of recommendation is indicated as Level 1, Level 2, or Not Graded, and the quality of the supporting evidence is shown as A, B, C, or D.
|Level 1||‘We recommend’|
|Level 2||‘We suggest’|
|Grade for quality of evidence||Quality of evidence|
|A||High|
|B||Moderate|
|C||Low|
|D||Very low|
Chapter 8: Monitoring Kidney Allograft Function
- 8.1: We suggest measuring urine volume (2C):
- • every 1–2 hours for at least 24 hours after transplantation (2D);
- • daily until graft function is stable. (2D)
- 8.2: We suggest measuring urine protein excretion, (2C) at least:
- • once in the first month to determine a baseline (2D);
- • every 3 months during the first year (2D);
- • annually, thereafter. (2D)
- 8.3: We recommend measuring serum creatinine, (1B) at least:
- • daily for 7 days or until hospital discharge, whichever occurs sooner (2C);
- • two to three times per week for weeks 2–4 (2C);
- • weekly for months 2 and 3 (2C);
- • every 2 weeks for months 4–6 (2C);
- • monthly for months 7–12 (2C);
- • every 2–3 months, thereafter. (2C)
- 8.3.1: We suggest estimating GFR whenever serum creatinine is measured, (2D) using:
- • one of several formulas validated for adults (2C); or
- • the Schwartz formula for children and adolescents. (2C)
- 8.4: We suggest including a kidney allograft ultrasound examination as part of the assessment of kidney allograft dysfunction. (2C)
GFR, glomerular filtration rate.
Some tests need to be performed routinely to detect abnormalities that may lead to treatment or prevention of complications that are common in KTRs (Table 4). The frequency of screening is based on the incidence of the complication being screened for, because there are no other data to determine the best interval for screening. Serum creatinine is easily measured and readily available in most laboratories. Screening tests for urine protein excretion include dipstick tests for total protein or albumin, as well as randomly collected ‘spot’ urine to measure protein-to-creatinine or albumin-to-creatinine ratios.
|Screening test||Screening intervals by time after transplantation|
|Creatinine||Daily for the first week; 2–3 times per week during weeks 2–4; weekly for months 2–3; every 2 weeks for months 4–6; monthly for months 7–12; every 2–3 months thereafter|
|Urine protein||Once within the first month; every 3 months during the first year; annually thereafter|
|Complete blood count||Daily for the first week; 2–3 times per week during weeks 2–4; weekly for months 2–3; monthly for months 4–12; annually thereafter|
|Diabetes||Weekly for the first month; every 3 months during the first year; annually thereafter|
|Tobacco use||Prior to discharge; annually thereafter|
|BKV NAT||Monthly for the first 3–6 months; every 3 months until the end of the first year; not routinely thereafter|
|EBV NAT (seronegative)||Once in the first week; monthly for the first 3–6 months; every 3 months until the end of the first year|
|Blood pressure, pulse, height, body weight||Each clinic visit|
- • Detecting kidney allograft dysfunction as soon as possible will allow timely diagnosis and treatment that may improve outcomes.
- • Urine output that is inappropriately low, or inappropriately high, is an indication of possible graft dysfunction.
- • Serum creatinine and urine protein measurements are readily available and are useful for detecting acute and chronic allograft dysfunction.
- • Ultrasound is relatively inexpensive and reasonably accurate for diagnosing treatable causes of kidney allograft dysfunction.
Urine volume is an easily measured parameter of early kidney allograft function (120). The recovery of kidney function, measured as a decrease in serum creatinine and blood urea nitrogen, is generally preceded by an increase in urine volume (120). Rarely, excessive urine volume may indicate the presence of a saline diuresis or a water diuresis caused by tubular damage. In addition to its role in assessing early allograft dysfunction, measuring the urine volume is an important part of overall fluid and electrolyte management.
Urine protein excretion
Proteinuria is an early and sensitive marker of kidney damage in CKD (121). Many causes of proteinuria are potentially reversible with appropriate treatment (Table 5) (122), and detection of proteinuria can therefore improve graft outcomes (113,122–132). Patients with proteinuria generally have lower kidney function compared to patients without proteinuria (122,129). Proteinuria is also associated with mortality and CVD events in KTRs (130–132).
- Persistent disease in the native kidneys
- Allograft rejection and drug toxicity
- De novo and recurrent glomerular diseases
- Minimal change disease
- Thrombotic thrombocytopenic purpura
- Systemic lupus erythematosus
- Light- and heavy-chain deposition diseases
Proteinuria includes albuminuria as well as other proteins. The urinary excretion rate for albumin and total protein can be estimated from the ratio of albumin or total protein to creatinine concentration in a casual urine specimen (133–136). Creatinine excretion is higher in men than in women. Therefore, the values in the general population and cut-off values for abnormalities in urine albumin-to-creatinine ratio are lower for men than women (137,138) (Table 6). For details, see Kidney Disease Outcomes Quality Initiative (KDOQI) Guidelines for Chronic Kidney Disease, Part 5, Assessment of Proteinuria (http://www.kidney.org/professionals/kdoqi/guidelines_ckd/p5_lab_g5.htm; last accessed March 30, 2009).
|Measure||Urine collection method||Normal||Microalbuminuria||Albuminuria or clinical proteinuria|
|Total protein||24-h excretion||<300 mg/day (adults); <4 mg/m2/h (children)||NA||≥300 mg/day (adults); ≥4 mg/m2/h (children)|
|Total protein||Dipstick||<30 mg/dL (adults and children)||NA||≥30 mg/dL (adults and children)|
|Total protein||Spot protein-to-creatinine ratio||<200 mg/g (adults); <0.2 mg/mg (children 2 years or older); <0.5 mg/mg (6–24 months old)||NA||≥200 mg/g (adults)|
|Albumin||24-h excretion||<30 mg/day||30–300 mg/day||>300 mg/day|
|Albumin||Dipstick||<3 mg/dL||≥3 mg/dL||NA|
|Albumin||Spot albumin-to-creatinine ratio||<17 mg/g (men); <25 mg/g (women); <30 mg/g (children)||17–250 mg/g (men); 25–355 mg/g (women)||>250 mg/g (men); >355 mg/g (women)|
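The adult sex-specific spot albumin-to-creatinine cut-offs in Table 6 can be expressed as a simple classification. This is only an illustrative sketch of the table's thresholds (in mg/g), not a diagnostic tool:

```python
# Classify a spot urine albumin-to-creatinine ratio (ACR, mg/g)
# using the adult sex-specific cut-offs from Table 6:
#   men:   normal <17, microalbuminuria 17-250, albuminuria >250
#   women: normal <25, microalbuminuria 25-355, albuminuria >355

def classify_acr(acr_mg_per_g, sex):
    """Return 'normal', 'microalbuminuria' or 'albuminuria'
    for an adult spot ACR, per the Table 6 cut-offs."""
    lower, upper = (17, 250) if sex == "male" else (25, 355)
    if acr_mg_per_g < lower:
        return "normal"
    if acr_mg_per_g <= upper:
        return "microalbuminuria"
    return "albuminuria"
```

Because creatinine excretion is higher in men, the same albumin excretion rate yields a lower ratio in men, which is why the male cut-offs are lower.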
Causes of kidney allograft dysfunction that require rapid intervention for treatment to be effective include acute rejection, obstruction, urine leak, vascular compromise and some recurrent diseases, for example focal segmental glomerulosclerosis (FSGS). These causes are more common in the first few days to weeks after kidney transplantation than in subsequent months to years. Therefore, it is important to closely monitor kidney function early after transplantation.
Measurement of the serum creatinine concentration is a simple, inexpensive and universally available method for estimating GFR, and it is reliable for detecting acute changes of kidney function (142,143). The level of serum creatinine at year 1 after transplantation is a risk factor for subsequent outcomes, and may help guide care, for example the frequency of visits (144,145).
A gradual increase in serum creatinine after the first year may be due to acute rejection, but more often is caused by CAI, recurrence of the original kidney disease, or de novo kidney disease. Unfortunately, serum creatinine is less reliable for detecting chronic changes (over months to years) in kidney function.
As is true in the general population, measurement of GFR with inulin, iothalamate, iohexol or other suitable markers of GFR, either with urinary or plasma clearance techniques, provides the most accurate measure of allograft function in KTRs. Although these tests are appropriate for clinical use, the Work Group did not recommend their use in routine clinical practice due to cost, low patient acceptance, and lack of availability outside of academic medical centers. Measurement of cystatin C has also been used to monitor kidney function. The advantage of cystatin C is its independence from body weight. However, at present, there is a paucity of validation studies for cystatin C estimates of GFR in KTRs (146–148).
Formulas to estimate GFR have been tested in KTRs, but no formula has been consistently shown to be superior to any other formula (149–156). It is unlikely that these formulas will improve the ability of serum creatinine to estimate acute changes in kidney function since, in most formulas, the only component of the formula that changes significantly is serum creatinine. It is similarly unclear whether formulas improve the ability of serum creatinine to measure chronic changes in kidney transplant function, especially when serum creatinine may change due to changes in muscle mass due to an improved nutritional status after kidney transplantation (157–159).
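As an illustration, two commonly published estimating equations, the four-variable MDRD study equation for adults and the ‘bedside’ Schwartz formula for children, can be computed directly from serum creatinine. This is a sketch of the published coefficients; laboratories may use recalibrated versions, and no endorsement of one formula over another is implied:

```python
# Illustrative GFR-estimating formulas (result in mL/min/1.73 m^2).
# Serum creatinine (scr) in mg/dL, height in cm, age in years.
# Coefficients are the commonly published versions; local
# laboratories may use recalibrated variants.

def egfr_mdrd(scr, age, female=False, black=False):
    """Four-variable MDRD study equation (IDMS-traceable form)."""
    egfr = 175.0 * scr ** -1.154 * age ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

def egfr_schwartz(height_cm, scr, k=0.413):
    """'Bedside' Schwartz formula for children and adolescents."""
    return k * height_cm / scr
```

Note that serum creatinine is the only input that changes acutely in either formula, which is why these estimates add little to raw creatinine for detecting acute changes, as discussed above.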
Kidney allograft ultrasound examination
Many of the most common causes of allograft dysfunction, other than rejection, can be diagnosed by ultrasound. These include arterial occlusion, venous thrombosis, urinary obstruction, a urine leak (large fluid collection), compressing perinephric hematoma and arteriovenous fistula from a kidney biopsy (160–163). Ultrasound is also useful in guiding a kidney allograft biopsy, so it is often obtained at the time of biopsy. In the kidney allograft, mild to moderate calyceal distension can be normal, so a baseline ultrasound examination when kidney function is normal may be useful to compare to subsequent ultrasound examinations for allograft dysfunction.
Chapter 9: Kidney Allograft Biopsy
- 9.1: We recommend kidney allograft biopsy when there is a persistent, unexplained increase in serum creatinine. (1C)
- 9.2: We suggest kidney allograft biopsy when serum creatinine has not returned to baseline after treatment of acute rejection. (2D)
- 9.3: We suggest kidney allograft biopsy every 7–10 days during delayed function. (2C)
- 9.4: We suggest kidney allograft biopsy if expected kidney function is not achieved within the first 1–2 months after transplantation. (2D)
- 9.5: We suggest kidney allograft biopsy when there is:
- • new onset of proteinuria (2C);
- • unexplained proteinuria ≥3.0 g/g creatinine or ≥3.0 g per 24 hours. (2C)
Kidney allograft biopsies are performed for specific clinical indications, or as part of a surveillance program (or protocol). An ‘indicated biopsy’ is one that is prompted by a change in the patient's clinical condition and/or laboratory parameters. A ‘protocol biopsy’ is one obtained at predefined intervals after transplantation, regardless of kidney function. In both cases, the biopsy is obtained to find histological changes prompting treatment to improve outcomes. Delayed graft function (DGF) is graft function low enough to require dialysis in the first week after kidney transplantation, or a lack of improvement in pretransplant kidney function.
New-onset proteinuria (defined in Table 6) may indicate treatable causes of graft dysfunction, including acute rejection and thrombotic microangiopathy. In patients who already have proteinuria, an increase exceeding a threshold usually defined as ‘nephrotic range’ proteinuria, for example ≥3.0 g/g creatinine or ≥3.0 g/24 h, may indicate treatable causes of graft dysfunction.
- • Increased serum creatinine that is not explained by dehydration, urinary obstruction, high calcineurin inhibitor (CNI) levels or other apparent causes is most likely due to an intragraft parenchymal process, such as acute rejection, chronic allograft injury (CAI), drug toxicity, recurrent or de novo kidney disease or BK virus (BKV) nephropathy.
- • The optimal diagnosis and treatment of intragraft parenchymal causes of allograft dysfunction require an adequate biopsy.
- • In patients with DGF, change in serum creatinine is not useful for ruling out acute rejection, and protocol biopsies are needed to rule out acute rejection.
- • Proteinuria, or a substantial increase in proteinuria, may indicate a potentially treatable cause of graft dysfunction.
Biopsies for an increase in serum creatinine
Although serum creatinine has many limitations for estimating GFR (see Chapter 8), an unexplained rise in serum creatinine is generally indicative of a decline in GFR. Some fluctuation in creatinine can result from normal laboratory or physiological variability. Hence, only a persistent increase that is outside this normal, but poorly defined, range is clinically relevant. A 25–50% increase over baseline is often arbitrarily used in studies. At least one study suggested that a persistent 30% rise in serum creatinine was an excellent predictor of subsequent graft failure (144,145). The Acute Kidney Injury Network (164) has proposed a definition and classification scheme for evaluating acute kidney injury (Table 7).
|Criteria|An abrupt (within 48 h) reduction in kidney function, currently defined as an absolute increase in serum creatinine of ≥0.3 mg/dL (≥26.4 μmol/L), a percentage increase in serum creatinine of ≥50% (1.5-fold from baseline), or a reduction in urine output (documented oliguria of less than 0.5 mL/kg/h for more than 6 h).|
|Notes|The above criteria include both an absolute and a percentage change in creatinine to accommodate variations related to age, gender and BMI, and to reduce the need for a baseline creatinine, but they do require at least two creatinine values within 48 h.|
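The thresholds in Table 7 amount to a simple decision rule. The sketch below is purely illustrative and not clinical software; the function and parameter names are invented for the example, and only the thresholds themselves come from the table:

```python
def meets_akin_criteria(baseline_scr_mg_dl, current_scr_mg_dl,
                        urine_ml_per_kg_per_h=None, oliguria_hours=0.0):
    """Check the AKIN acute kidney injury definition from Table 7.

    Assumes the two serum creatinine values were obtained within 48 h of
    each other (the definition requires this; the sketch does not check it).
    """
    # Absolute rise of >=0.3 mg/dL (>=26.4 umol/L)
    absolute_rise = current_scr_mg_dl - baseline_scr_mg_dl >= 0.3
    # Relative rise of >=50% (1.5-fold from baseline)
    relative_rise = current_scr_mg_dl >= 1.5 * baseline_scr_mg_dl
    # Documented oliguria (<0.5 mL/kg/h) for more than 6 h
    oliguria = (urine_ml_per_kg_per_h is not None
                and urine_ml_per_kg_per_h < 0.5
                and oliguria_hours > 6)
    return absolute_rise or relative_rise or oliguria
```

For example, a rise from 1.0 to 1.4 mg/dL meets the absolute criterion even though it falls short of the 50% relative criterion, which is precisely why the definition includes both.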
Causes of acute, reversible declines in GFR, including dehydration, urinary obstruction and acute CNI toxicity (demonstrated by high blood levels), should be ruled out before a biopsy is performed. If there are no apparent causes of a decline in GFR, then an allograft biopsy is generally warranted to determine the nature of potentially treatable causes of kidney injury, including rejection, infections such as BKV nephropathy, recurrent or de novo kidney disease, or infiltration with posttransplant lymphoproliferative disease (PTLD). Since any of these conditions can develop in the setting of preexisting graft pathology, additional biopsies may be required when an abrupt change in the rate of progression is observed.
Biopsies can determine both the type and severity of immunologic damage (109). Different types of acute rejection may require different treatment approaches. For example, acute cellular rejection is usually treated with steroid pulses, but acute antibody-mediated rejection may prompt the use of specific treatments in addition to steroids.
Biopsies for a lack of improvement in graft function
When acute rejection does not respond to first-line treatment with steroids, additional treatment (e.g. with a lymphocyte-depleting antibody) may be successful (105,165). Alternatively, a failure of function to return to baseline could be due to a new pathological process, such as coexistent acute tubular necrosis, drug toxicity or BKV nephropathy, that would require a different treatment approach. Therefore, a biopsy is indicated to determine the correct treatment.
Patients should always be assessed for their suitability for biopsy before undertaking the procedure. Biopsies may be hazardous in those with a bleeding diathesis, or in the presence of large fluid collections or infection.
Biopsies for DGF
Observational studies have shown that the incidence of acute rejection during DGF is higher than in patients without DGF (166–168). Kidney function cannot be used as an indication for biopsy to diagnose superimposed acute rejection while the patients are already being treated with dialysis due to DGF, or when the serum creatinine does not fall from pretransplant values. It is therefore prudent to obtain periodic biopsies of the kidney during DGF to diagnose acute rejection. There are few data to determine when and how often biopsies during DGF should be obtained. However, studies in which biopsies have been obtained every 7–10 days, while patients are receiving dialysis for DGF, have shown that acute rejection can be present for the first time on the second, third or even fourth biopsy (167).
In centers that have a very low overall incidence of acute rejection, the incidence of acute rejection during DGF could also be low enough to obviate the need for biopsies during DGF. A biopsy may no longer be needed when there are signs that DGF is resolving, for example when urine output is increasing rapidly or serum creatinine is declining.
Acute rejection, CAI and CNI toxicity can occur in the absence of a measurable decline in kidney function. Several studies have shown that protocol biopsies can detect clinically inapparent (subclinical) acute rejection, CAI and CNI nephrotoxicity. The reported prevalence of subclinical rejection (Banff grade 1A or higher) varies from 13% to 25% at 1–2 weeks, 11–43% at 1–2 months, 3–31% at 2–3 months and 4–50% at 1 year (169–175).
Data from observational studies indirectly suggest that detecting and treating subclinical acute rejection with protocol biopsies may be beneficial. Subclinical rejection is associated with CAI (170,173,176,177) and reduced graft survival (176–179).
In another study, subclinical acute rejection in 14-day protocol biopsies was associated with poorer 10-year graft survival (179). Graft survival rates with subclinical rejection, borderline subclinical rejection or no rejection were 88%, 99% and 98% at 1 year (p < 0.05), and 62%, 94% and 96% at 10 years (p < 0.05), respectively. In a pediatric study, subclinical rejection was associated with progressive CAI, reduced creatinine clearance and shorter graft survival (177).
Treatment of subclinical rejection may improve outcomes. In a RCT, 72 patients were randomly allocated to undergo protocol biopsies and treatment of subclinical rejection at 1, 2, 3, 6 and 12 months (biopsy group), or protocol biopsies without treatment at 6 and 12 months only (control group) (100). Patients in the biopsy arm of the study had a significant decrease in acute rejection episodes, a reduced 6-month chronic tubulointerstitial score and a lower 2-year serum creatinine. Interstitial fibrosis was less in those treated for subclinical rejection (100). In another trial, 52 living-donor KTRs were randomized to undergo protocol biopsies and 50 controls had only indicated biopsies (103). At 1 and 3 months, protocol biopsies revealed borderline changes in 11.5% and 14% of patients, acute rejection in 17% and 12% and CAI in 4% and 10%, respectively. The incidence of clinically evident acute rejection episodes was similar in the two groups, but the biopsy group had lower serum creatinine at 6 months (p = 0.0003) and 1 year (p < 0.0001).
Baseline immunosuppression is likely important in determining the incidence of subclinical rejection and thereby the benefit of protocol biopsies. Tacrolimus- and MMF-treated patients generally have a lower rate of acute rejection than patients treated with CsA and azathioprine, and tacrolimus is associated with a reduced incidence of subclinical rejection (104,113,176,180,181), lower acute Banff scores (182,183) and lower 1-year serum creatinine (181). In a RCT, 121 patients were randomly allocated to biopsies at 0, 1, 2, 3 and 6 months, and 119 to biopsies at 0 and 6 months (102). At 6 months, 35% of the biopsy-arm and 20.5% of the control-arm patients had interstitial fibrosis and tubular atrophy (ci + ct) scores ≥2 (p = 0.07). Of note, the frequency of clinical acute rejection episodes was only 10% in the biopsy arm and 7% in the control arm (p > 0.05). The prevalence of subclinical rejection in the biopsy arm was 4.6%. Creatinine clearance at 6 months did not differ between the two groups (p > 0.05). Protocol biopsies for the diagnosis of subclinical rejection may therefore not be appropriate in tacrolimus- and MMF-treated patients.
Other conditions that can be detected on protocol biopsies include CNI toxicity, recurrent disease, transplant glomerulopathy, CAI and BKV nephropathy. However, it is unclear whether the detection of these conditions by protocol biopsy improves outcomes.
The safety of biopsies has been documented in several series (180,184). The reported risk of major complications from protocol biopsy, including substantial bleeding, macroscopic hematuria with ureteric obstruction, peritonitis or graft loss, is approximately 1% (185–187). The reported incidence of graft loss from protocol biopsy is 0.03%. Protocol biopsies can be done safely as an outpatient procedure. Data collected on 1705 protocol kidney transplant biopsies at one center showed that all of the complications became evident in the first 4 h after the biopsy (188).
Protocol biopsies, however, may be expensive. The Mayo Clinic reported that protocol biopsies cost US$ 3000 per biopsy, and it cost US$ 114 000 to detect one case of acute subclinical rejection (104). Therefore, decisions on whether or not to perform protocol biopsies should take these and other factors, including patient preferences, into account. Altogether, based on very-low-quality evidence, the benefit of performing protocol biopsies in CsA/azathioprine-treated patients without induction therapy may outweigh the harm (see Evidence Profile and accompanying evidence in Supporting Tables 45–47 at http://www3.interscience.wiley.com/journal/118499698/toc).
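For context, the two Mayo Clinic figures quoted above imply how many protocol biopsies were performed per case detected; the per-biopsy yield below is our arithmetic inference, not a number reported in the study:

```python
cost_per_biopsy_usd = 3_000    # reported cost per protocol biopsy (104)
cost_per_case_usd = 114_000    # reported cost per subclinical rejection detected (104)

biopsies_per_case = cost_per_case_usd / cost_per_biopsy_usd
print(biopsies_per_case)               # 38.0 biopsies per case detected
print(f"{1 / biopsies_per_case:.1%}")  # ~2.6% implied yield per biopsy
```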
- • RCTs are needed to determine when the benefits of protocol biopsies outweigh harm.
Chapter 10: Recurrent Kidney Disease
- 10.1: We suggest screening KTRs with primary kidney disease caused by FSGS for proteinuria (2C) at least:
- • daily for 1 week (2D);
- • weekly for 4 weeks (2D);
- • every 3 months, for the first year (2D);
- • every year, thereafter. (2D)
- 10.2: We suggest screening KTRs with potentially treatable recurrence of primary kidney disease from IgA nephropathy, MPGN, anti-GBM disease, or ANCA-associated vasculitis for microhematuria (2C), at least:
- • once in the first month to determine a baseline (2D);
- • every 3 months during the first year (2D);
- • annually, thereafter. (2D)
- 10.3: During episodes of graft dysfunction in patients with primary HUS, we suggest screening for thrombotic microangiopathy (e.g. with platelet count, peripheral smear for blood cell morphology, plasma haptoglobin, and serum lactate dehydrogenase). (2D)
- 10.4: When screening suggests possible treatable recurrent disease, we suggest obtaining an allograft biopsy. (2C)
- 10.5: Treatment of recurrent kidney disease:
- 10.5.1: We suggest plasma exchange if a biopsy shows minimal change disease or FSGS in those with primary FSGS as their primary kidney disease. (2D)
- 10.5.2: We suggest high-dose corticosteroids and cyclophosphamide in patients with recurrent ANCA-associated vasculitis or anti-GBM disease. (2D)
- 10.5.3: We suggest using an ACE-I or an ARB for patients with recurrent glomerulonephritis and proteinuria. (2C)
- 10.5.4: For KTRs with primary hyperoxaluria, we suggest appropriate measures to prevent oxalate deposition until plasma and urine oxalate levels are normal (2C), including:
- • pyridoxine (2C);
- • high calcium and low oxalate diet (2C);
- • increased oral fluid intake to enhance urinary dilution of oxalate (2C);
- • potassium or sodium citrate to alkalinize the urine (2C);
- • orthophosphate (2C);
- • magnesium oxide (2C);
- • intensive hemodialysis to remove oxalate. (2C)
ACE-I, angiotensin-converting enzyme inhibitor; ANCA, antineutrophil cytoplasmic autoantibody; ARB, angiotensin II receptor blocker; FSGS, focal segmental glomerulosclerosis; GBM, glomerular basement membrane; HUS, hemolytic-uremic syndrome; IgA, immunoglobulin A; KTRs, kidney transplant recipients; MPGN, membranoproliferative glomerulonephritis.
The primary kidney disease is generally documented by pretransplant biopsy of the native kidney, or of a previous kidney transplant. Recurrence of the primary kidney disease is usually established when there is biopsy-documented involvement of the kidney allograft with the primary kidney disease.
- • Some recurrent kidney diseases cause allograft failure.
- • Treatment of some recurrent kidney diseases may prevent, or delay, the onset of graft failure.
- • Screening for treatable recurrent kidney disease may result in early diagnosis and treatment that may be beneficial.
Recurrence of primary kidney diseases is an important cause of morbidity and graft loss following kidney transplantation, in both adults and children. In a study of 1505 cases with both native kidney and kidney allograft biopsies documenting recurrent glomerular disease, graft loss due to recurrent glomerulonephritis was the third most frequent cause of graft failure 10 years after kidney transplantation (110). Recurrence may present as increased serum creatinine (reduced GFR), new-onset or increased proteinuria and/or hematuria. The impact of recurrence varies according to the primary kidney disease. Not all diseases recur with equal frequency. The risk of recurrence is particularly increased in FSGS, immunoglobulin A (IgA) nephropathy, membranoproliferative glomerulonephritis (MPGN), hemolytic-uremic syndrome (HUS), oxalosis and Fabry's disease and, to a lesser extent, in lupus nephritis, anti-glomerular basement membrane (GBM) disease and vasculitis (189). The timing of recurrence and manner of presentation also vary by disease. FSGS, HUS and oxalosis may recur in the first few days to weeks after transplantation, whereas the timing is variable in the others (127).
In a majority of instances, proteinuria and/or reduced GFR provide the initial basis for suspecting disease recurrence. Since these parameters are periodically assessed in KTRs as part of their routine monitoring, a separate strategy for detection of disease recurrence is not warranted.
The modality of screening for some of these diseases, however, may differ from the usual posttransplant monitoring if timely detection is not achieved by the routine posttransplant monitoring strategies (Table 8). For example, FSGS can recur early; hence, screening for FSGS recurrence requires early and frequent monitoring for proteinuria. HUS recurrence requires looking for evidence of microangiopathic hemolysis. Screening for recurrent IgA nephropathy, MPGN, anti-GBM disease and vasculitis requires examination of the urinary sediment to detect microhematuria and/or casts, in addition to screening for proteinuria. It is appropriate to perform dipstick testing for proteinuria, followed by quantitation using a spot urine protein-to-creatinine ratio or a timed urine collection. Depending on the primary disease, biopsy evaluation may require immunofluorescence and electron microscopy in addition to light microscopy to confirm recurrence and to rule out other causes of proteinuria, hematuria or graft dysfunction (190).
|Disease|Screening (in addition to serum creatinine)|Minimum screening frequency|Diagnostic tests (in addition to kidney biopsy)|Potential treatment|
|---|---|---|---|---|
|FSGS|Proteinuria|Daily for 1 week, weekly for 4 weeks, every 3 months for 1 year, then annually| |Plasmapheresis|
|IgA nephropathy|Proteinuria, microhematuria|Once in the first month, every 3 months in the first year, then annually| | |
|MPGN|Proteinuria, microhematuria|Once in the first month, every 3 months in the first year, then annually|Serum complement levels| |
|Anti-GBM disease|Proteinuria, microhematuria|Once in the first month, every 3 months in the first year, then annually|Anti-GBM antibodies|Plasmapheresis|
|Pauci-immune vasculitis|Proteinuria, microhematuria|Once in the first month, every 3 months in the first year, then annually|ANCA|Cyclophosphamide and corticosteroids|
|HUS|Proteinuria, platelet count|During episodes of graft dysfunction|Platelet count, peripheral blood smear, LDH|Plasmapheresis|
There is also weak evidence (uncontrolled case studies and case reports) that disease-specific treatment may be beneficial for some recurrent diseases.
Idiopathic, or primary, FSGS is characterized by sclerosis in a segment of the glomerular tuft, along with foot-process fusion on electron microscopy. Sclerosis may not be evident in early recurrence, and light microscopy may show normal glomerular architecture. Recurrence is suspected when a patient with documented primary FSGS in the native kidneys or a prior kidney allograft develops proteinuria and/or an increase in serum creatinine, typically soon after transplantation (127).
Idiopathic FSGS recurs in 20–50% of KTRs (up to 80% if it has recurred in a prior kidney transplant) (191). It is important to distinguish idiopathic from secondary causes of FSGS that generally do not recur. Recurrence of familial FSGS has also been documented, if the donor is an obligate carrier (191). Putative risk factors for recurrence include age of onset of FSGS in native kidneys between 6 and 15 years (192), rapid course of the original disease (e.g. less than 3 years from diagnosis to CKD stage 5), diffuse mesangial proliferation on histology and non-African American ethnicity. The strongest risk factor is recurrence in a previous transplant.
The demonstration that sera from patients with recurrent FSGS increase the albumin permeability of isolated rat glomeruli offers the possibility of more accurate prediction of the risk of recurrent disease (193). However, this assay is still experimental.
Idiopathic FSGS can recur at any time after transplantation, but recurrence is more common early after transplantation. Recurrent disease presents with proteinuria, which is usually heavy. About 80% of cases recur in the first 4 weeks (193). Proteinuria screening therefore needs to be more frequent in the early posttransplant period in those with CKD stage 5 due to FSGS, especially those with risk factors for recurrence. The optimal screening frequency has not been established. Interpretation of proteinuria, especially in the early posttransplant period, requires knowledge of pretransplant proteinuria. Although proteinuria from the native kidneys declines after transplantation (194), the time taken for its disappearance is variable. Posttransplant proteinuria therefore should be interpreted in light of the pretransplant values.
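The front-loaded schedule of Recommendation 10.1 can be laid out as a calendar. The sketch below is illustrative only; the function name is invented, and quarters and years are approximated as fixed numbers of days:

```python
from datetime import date, timedelta

def fsgs_proteinuria_schedule(transplant_date, years=2):
    """Minimum proteinuria screening dates per Recommendation 10.1:
    daily for 1 week, weekly for 4 weeks, every 3 months for the
    first year, then annually (quarters approximated as 91 days)."""
    d = transplant_date
    dates = [d + timedelta(days=i) for i in range(1, 8)]                 # daily, week 1
    dates += [d + timedelta(weeks=w) for w in range(2, 6)]               # weekly, weeks 2-5
    dates += [d + timedelta(days=91 * q) for q in range(1, 5)]           # quarterly, year 1
    dates += [d + timedelta(days=365 * y) for y in range(2, years + 1)]  # annually thereafter
    return sorted(set(dates))

schedule = fsgs_proteinuria_schedule(date(2009, 1, 1))
```

This yields 16 visits over the first 2 years, most of them in the first month, mirroring how heavily early recurrence dominates the risk.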
There have been no RCTs of therapy for recurrent idiopathic FSGS. However, there have been individual cases, and uncontrolled series, reporting that patients with recurrent idiopathic FSGS may have a substantial reduction in urine protein excretion after plasma exchange (195,196). This probably occurs by removing circulating factors that alter glomerular permeability to protein. Predictors of response to plasma exchange include early initiation of treatment after recurrence, and possibly an early recurrence of disease (196). Unfortunately, proteinuria may recur after treatment, and may require additional plasma exchange, or even periodic, ongoing treatments. The presumption is that reducing protein excretion with plasma exchange will help preserve allograft function, but no studies have examined this.
It is unclear how many plasma-exchange treatments are required to reduce protein excretion, but one review found a median of nine treatments before there was a remission in proteinuria (195). In small case series, prophylactic plasma exchange has been reported, but the data are not convincing that this is effective in preventing recurrent FSGS (197,198).
High-dose CsA may induce remission of proteinuria. In one series, 14 of 17 children entered lasting remission (199). The rationale behind maintaining a high CsA blood level is to overcome the effect of high serum cholesterol often seen in patients with recurrent FSGS (lipoproteins bind CsA and reduce free CsA levels). High-dose CsA may be combined with plasmapheresis. A study concluded that plasmapheresis alone was not sufficient to induce remission except when combined with high-dose CsA (200).
For patients who do not respond to plasma exchange, or for patients who have non-nephrotic proteinuria, a reduction in proteinuria with an angiotensin-converting enzyme inhibitor (ACE-I) and/or an angiotensin II receptor blocker (ARB) may be beneficial.
IgA nephropathy is the most common type of glomerulonephritis worldwide and is the primary cause of CKD stage 5 in 20% of KTRs in many parts of the world. Recurrent IgA nephropathy is common after transplantation. Reported incidence of recurrence varies from 13% to 53% according to differences in duration of follow-up and biopsy policy of different transplant centers, with the highest rates in centers that perform routine protocol biopsies (201). Latent IgA deposits in the donor kidney (identified on preimplantation biopsies) are responsible for ‘recurrence’ in some cases transplanted for kidney failure due to IgA nephropathy in areas with high disease prevalence (202). Single-nucleotide polymorphisms in the interleukin-10 and TNF-alpha genes have been shown to predict recurrence risk (203,204). The estimated 10-year incidence of graft loss due to recurrence was 9.7% (CI = 4.7–19.5%) (110). Recurrence risk in retransplants is increased if the first graft was lost due to recurrent IgA nephropathy in less than 10 years (205). There is no effective therapy for preventing recurrent IgA nephropathy. ACE-Is and ARBs have been shown to reduce proteinuria and possibly preserve kidney function in recurrent IgA nephropathy (206). In a study of 116 KTRs with IgA nephropathy, use of ATG as induction therapy was associated with a reduction in recurrence risk from 41% to 9% when compared to IL2 receptor antagonists (207).
Secondary causes of MPGN, such as hepatitis C, should be ruled out. The histological recurrence rate in idiopathic type I MPGN is 20–30% and exceeds 80% in type 2 disease (192). Manifestations include microhematuria, proteinuria and deterioration of kidney function. Risk factors for recurrence include severity of histological lesions in native kidneys, HLA-B8DR3, living related donors and previous graft loss from recurrence (208,209). There are reports of response to long-term cyclophosphamide (210), plasmapheresis (211–213) and CsA (214).
Hemolytic-uremic syndrome is defined histopathologically by intimal cell proliferation, wall thickening and necrosis, thrombi, and narrowed lumens in glomerular capillaries, arterioles or interlobular arteries. The severity can range from endothelial swelling to complete cortical necrosis. It manifests clinically with microangiopathic hemolytic anemia and rapid worsening of kidney function, with or without involvement of other organs. HUS is often classified as diarrhea-associated (D+) HUS (typical) and non-diarrheal (D−) HUS (atypical).
Hemolytic-uremic syndrome recurs commonly in adults and in children in whom the original kidney disease was the D− variant. The overall recurrence risk is less than 10% in the pediatric population; D+ HUS usually does not recur, while idiopathic D− or familial HUS may recur in 21–28% of children (215). Recurrence occurs in about 80–100% of patients with a factor H or factor I mutation, while patients with a mutation in membrane cofactor protein do not have recurrence (216,217). The risk is higher in adults: 33–56% (218–220) show clinical manifestations, and an additional 16–20% of patients demonstrate clinically silent recurrence. Recurrence is particularly frequent in adults with autosomal recessive or dominant HUS (215). Recurrence develops within 4 weeks in most cases. Most patients show microangiopathic anemia, thrombocytopenia and kidney dysfunction, whereas others present with rapidly progressive graft dysfunction without the classic hematologic manifestations. A platelet count should be performed during episodes of graft dysfunction in KTRs with HUS as the original cause of CKD stage 5. In those with falling counts, additional tests are warranted, such as examination of a peripheral blood smear for fragmented cells (schistocytes), and haptoglobin and lactate dehydrogenase measurements to document hemolysis. Long-term graft survival is lower (about 30%) in those with recurrence.
Treatment strategies have included plasmapheresis, intravenous immunoglobulin and rituximab. Aggressive plasmapheresis using fresh frozen plasma (40–80 mL/kg per session) increases the levels of the deficient factors and has provided encouraging results, even in those with factor H and factor I mutations (221–223). As factor H is synthesized in the liver, combined liver and kidney transplantation (together with preoperative and intraoperative plasmapheresis using fresh frozen plasma and low-molecular-weight heparin) could reduce the risk of recurrence (222,224–226). Intravenous immunoglobulin and rituximab have been reported to rescue recurrent HUS resistant to multiple courses of plasma exchange (227,228). There is no evidence that avoidance of CNIs, mTORi and OKT3 (which may themselves cause thrombotic microangiopathy) reduces the recurrence risk.
ANCA-associated vasculitis and anti-GBM disease
Both antineutrophil cytoplasmic antibody (ANCA)-associated vasculitis and anti-GBM disease may present with rapidly progressive kidney failure and crescentic glomerulonephritis. Recurrence rates are low if the disease is quiescent at the time of transplantation. In an analysis of pooled data from 127 patients with ANCA-associated vasculitis, 17% of patients had a recurrence, with kidney manifestations in 57.1% of these; kidney dysfunction occurred in 33% of those with recurrence (229). More recent studies (230) report lower (7%) recurrence rates, mostly beyond the first posttransplant year and with no direct or indirect impact on allograft function. ANCA-associated vasculitis relapses in the kidney allograft usually manifest as pauci-immune necrotizing glomerulonephritis, but graft function can also be affected by acute arteritis, ureteral stenosis and obstructive uropathy due to granulomatous vasculitis.
Pretransplantation disease course, disease subtype, ANCA type or titer, time of transplantation and donor type do not predict recurrence. Kidney ANCA-associated vasculitis generally responds well to high-dose prednisolone and cyclophosphamide (231–233). Other treatment modalities that have been tried include MMF, plasmapheresis with or without intravenous immunoglobulin, and rituximab (234–240).
Histological evidence of anti-GBM disease can be found in 15–50% of biopsies, but clinical recurrence is rare and has been reported only in isolated cases (201,241). Graft failure due to recurrence is rare (110). The incidence of recurrence may be higher in those with circulating anti-GBM antibody at the time of transplantation. Treatment of clinically active anti-GBM disease may include pulse steroids, cyclophosphamide and plasma exchange, particularly if there is potentially life-threatening pulmonary involvement (241).
Primary hyperoxaluria is caused by deficiency of hepatic peroxisomal alanine:glyoxylate aminotransferase, leading to increased synthesis and urinary excretion of oxalate, recurrent calcium oxalate urolithiasis, irreversible nephrocalcinosis and eventually CKD. In CKD, insoluble oxalates accumulate throughout the body, especially in bone and arteries. Because the enzyme defect in primary hyperoxaluria is not corrected by isolated kidney transplantation, oxalate overproduction persists, leading to recurrence of calcium oxalate deposits in over 90% of transplanted kidneys, and eventually leading to graft loss (242), unless the enzyme is replaced through a simultaneous liver transplant (243). The total body oxalate burden is very high in CKD stage 5 patients, and the urinary oxalate excretion increases greatly as soon as graft function is established. Plasma and urine oxalate levels may remain high for some period of time even in patients undergoing simultaneous kidney and liver transplantation. High urinary oxalate concentration promotes precipitation of calcium oxalate crystals first in the distal tubules, leading to graft dysfunction. This secondarily results in deposition in the parenchyma of the graft, leading to allograft failure. This risk is obviously increased further in those with primary nonfunction of the graft. Transplant protocols designed to minimize complications of recurrent disease include early posttransplant urinary dilution through aggressive fluid administration, and early and frequent dialysis in those with DGF.
Although isolated kidney transplantation is not recommended in primary hyperoxaluria, it is sometimes carried out in developing countries where liver transplantation is not available. Primary hyperoxaluria invariably recurs in those who receive a kidney transplant alone and leads to graft loss. Patients with the Gly170Arg mutation are pyridoxine-sensitive, and should be given high-dose pyridoxine if they receive a kidney transplant alone (244).
The disease is sometimes diagnosed for the first time after kidney transplantation, when oxalate deposits are detected on biopsy in patients with graft dysfunction. Whenever possible, these patients should be referred to specialized centers for liver transplantation. In the immediate postoperative phase, extra dialysis sessions may be necessary to control blood oxalate levels until the transplanted liver is fully functional (245).
Specific measures designed to increase oxalate excretion and reduce its production help to minimize recurrence, and should be in place for all patients during the first months or years after kidney or combined liver–kidney transplantation (246). These include maintenance of a urine output >3.0–3.5 L/day, and the use of alkaline citrate, neutral phosphate and magnesium oxide. Severe dietary oxalate restriction is of limited benefit (247), but intake of foods extremely rich in oxalate, and of ascorbic acid, a precursor of oxalate, should be discouraged. Pharmacological doses of pyridoxine may reduce hyperoxaluria in some patients, especially those with a Gly170Arg mutation (244). Pyridoxine responsiveness can be assessed by observing a >30% reduction in urinary oxalate excretion in response to pyridoxine 10 mg/kg/day (248); if this was not assessed at the predialysis stage, it can be tested in the patient's siblings with less severe kidney disease. Urinary alkalinization with citrate reduces the risk of urinary calcium oxalate supersaturation by forming a soluble complex with calcium, which reduces the likelihood of its binding and precipitation with other substances, such as oxalate (249). The dosage is 0.1–0.15 g/kg body weight of a sodium or sodium/potassium citrate preparation. The adequacy of therapy and patient compliance can be verified by measuring urinary pH and citrate excretion. Orthophosphate (20–60 mg/day), along with pyridoxine, has also been shown to reduce urinary calcium oxalate crystallization (250).
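As a worked example of the figures above, the sketch below (illustrative only, not a dosing tool; the 70-kg body weight is hypothetical) computes the citrate dose range and applies the >30% oxalate-reduction criterion for pyridoxine responsiveness:

```python
# Citrate at the cited 0.1-0.15 g/kg body weight, for a hypothetical 70-kg patient
weight_kg = 70
citrate_low_g = 0.1 * weight_kg
citrate_high_g = 0.15 * weight_kg
print(f"citrate: {citrate_low_g:.1f}-{citrate_high_g:.1f} g")  # 7.0-10.5 g

def pyridoxine_responsive(baseline_uox, on_treatment_uox):
    """>30% fall in urinary oxalate on pyridoxine 10 mg/kg/day (248)."""
    return (baseline_uox - on_treatment_uox) / baseline_uox > 0.30

print(pyridoxine_responsive(1.0, 0.6))  # True: a 40% reduction
```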
Fabry disease is a rare, X-linked inherited disease characterized by a deficiency of alpha-galactosidase A (alpha-Gal-A), resulting in progressive systemic accumulation of glycosphingolipids. Transplantation is the treatment of choice for most patients with CKD stage 5 due to Fabry disease (251). Although patients with Fabry disease may have histological recurrence of the disease in the allograft, how often recurrence causes graft failure is not clear. In a recent US Organ Procurement and Transplantation Network registry study, 197 KTRs with Fabry disease had 74% 5-year graft survival, compared to 64% in KTRs with other kidney diseases (252). Two formulations of recombinant human alpha-Gal A are currently available: agalsidase alfa (Replagal, Transkaryotic Therapies, Cambridge, MA) and agalsidase beta (Fabrazyme, Genzyme, Cambridge, MA). In non-KTRs, treatment with recombinant human alpha-Gal A has been shown to reduce the rate of decline in kidney function. However, it is unclear whether treatment improves graft survival, or reduces other complications of Fabry disease in KTRs. Treatment appears to be safe in KTRs (253,254); however, it is very expensive, and whether it is cost-effective for improving KTR outcomes is not known.
Chapter 11: Preventing, Detecting, and Treating Nonadherence
- 11.1: Consider providing all KTRs and family members with education, prevention, and treatment measures to minimize nonadherence to immunosuppressive medications. (Not Graded)
- 11.2: Consider providing KTRs at increased risk for nonadherence with increased levels of screening for nonadherence. (Not Graded)
KTRs, kidney transplant recipients.
Adherence is ‘the extent to which the patient's behavior matches the agreed-upon prescriber's recommendations’ (255). At a recent consensus conference, this definition was modified to take into account the threshold of the effect of nonadherence on the therapeutic outcome. We have adopted this definition of nonadherence as ‘deviation from the prescribed medication regimen sufficient to adversely influence the regimen's intended effect’ (255). Nonadherence encompasses primary (at initiation) and secondary (subsequent) nonadherence, partial and/or total nonadherence, as well as the timing of medication use (256–260).
- • Nonadherence is associated with a high risk of acute rejection and allograft loss.
- • Nonadherence may occur early and/or late after transplantation.
- • The transition from pediatric to adult nephrology care may be a time when nonadherence is particularly common.
- • Measures can be taken to reduce nonadherence and thereby improve outcomes.
Nonadherence is common in the first months after kidney transplantation and increases with duration of follow-up. The level of adherence affects clinical outcomes: nonadherence is associated with early and late allograft rejection, which reduces graft function and graft survival (261–263). Graft loss is sevenfold more likely in nonadherent than in adherent individuals (264). In another study, nonadherence (missed appointments, fluctuating drug concentrations) accounted for over half of kidney allograft losses (265).
Nonadherence is multidimensional (255), although we have focused primarily on adherence with immunosuppressive medication use. Additional areas of nonadherence include the prescribed diet; exercise; tobacco, alcohol and drug use; self-monitoring of vital signs (e.g. blood pressure and body weight); and attendance at clinic appointments.
Satisfactory adherence to medication use is achieved when the gaps between the dosing history and the prescribed regimen have no effect on therapeutic outcome. This pharmacoadherence definition emphasizes therapeutic outcome rather than specific medication intake or drug levels. Measurement of outcomes and drug levels is commonly used in the transplant population. Measurable parameters of pharmacoadherence are acceptance (whether the patient accepts the recommended treatment), execution (how well the patient executes the recommended regimen) and discontinuation (when the patient stops taking the medication) (264,266). Adherence can be measured by direct observation that medication was consumed, by indirect measures that medication had been consumed, or by self-reporting (Table 9). Indirect measures include serum drug levels, biological markers, electronic monitoring, pill counts and refill/prescription records. Since there is no perfect measure of adherence, consideration should be given to using more than one approach (267–270).
|Self-reporting medication use by patient|
|Collateral reporting of medication use by relatives, friends or caretakers|
|Laboratory tests (drug and metabolite levels)|
|Medical record review, outcomes|
|Monitored pill counts|
|Electronic monitoring devices|
In organ transplant recipients, average nonadherence rates were highest for diet (25 cases per 100 people per year), followed by immunosuppressive medication (22.6 cases per 100 people per year), monitoring of vital signs (20.9 cases per 100 people per year) and exercise (19.1 cases per 100 people per year) (264). Among KTRs, nonadherence with immunosuppressive medications was highest (35.6 cases per 100 people per year). Nonadherence to long-term medication is as high as 50% in developed countries, and even higher rates have been reported in developing countries (264). Meta-analysis showed that the odds of a good outcome are 2.9 times higher if the patient is adherent (271).
Risk factors for nonadherence include long duration of treatment (with decline in rates of adherence over time), poor communication and lack of social support (Table 10). Risk factors for nonadherence can be categorized into four interrelated areas: patient/environment, caregiver, disease and medication. The patient/environment is central and interrelates with the other three categories. The primary patient–medication factors are side effects, regimen complexity, costs and poor access. Negative beliefs about medication and lack of medication knowledge have a moderate impact. Patient–caregiver factors include poor communication and poor aftercare/discharge planning (272–274). Patient–disease factors are primarily poor disease knowledge and insight, disease duration and comorbid psychiatric disease. A meta-analysis of 164 studies in the nonpsychiatric literature reported factors associated with adherence, including: age (adolescents less adherent), sex (girls more adherent than boys among pediatric patients), education level (positively associated with adherence in chronic disease) and socioeconomic status (positively correlated with adherence in adults) (275–277).
|Nonadherence behavior prior to transplantation|
|Poor social support|
|Substance abuse and other high-risk behavior|
|High education level|
|Time since transplantation (higher earlier)|
|Lack of adequate follow-up with transplant specialists|
|Inadequate pretransplant education|
|Multiple adverse effects from medications|
|Complex medication regimens|
A team approach consisting of education, monitoring, recognition and intervention is essential to secure the benefit of transplantation. A combination of educational, behavioral and social support interventions provides the best results (Table 11) (271,278). Simplified drug regimens, pillboxes to organize medications, individualized instructions (particularly for travelers and night-shift workers), combining medication administration with daily routine activities and electronic devices can contribute to improved adherence.
|Education and medical intervention|
|Ensure that patients know their medications by name, dosage and reason for prescription; reinforce these points during every clinic visit.|
|Inform patients about the adverse effects of drugs.|
|Provide written instructions for each change in medication dose or frequency.|
|Reduce the number and frequency of medications. Where possible, medications should be given either once or, at most, twice daily.|
|Ensure that patients understand that they need to continue taking immunosuppressive agents even if the transplanted organ is functioning well.|
|Teach patients that chronic rejection is insidious in onset, hard to diagnose in its early stages and often not reversible once established.|
|Attempt to treat adverse effects by means other than dose reduction.|
|Inquire about problems during every clinic visit, and address specific patient concerns.|
|Monitor compliance with laboratory work, clinic visit and prescription refills.|
|Behavioral and psychosocial approaches|
|Provide positive support to encourage adherent behaviors during preparation for transplant.|
|Encourage patient to demonstrate a track record of medication adherence and knowledge.|
|Encourage individual team members to develop rapport with patient.|
|Identify and involve a backup support system (family or friends).|
|Treat depression, anxiety or other psychological issues.|
|Elicit a personal promise of adherence (e.g. a written contract).|
|Use a nonjudgmental approach to the discussion of adherence.|
|Address social problems such as insurance changes or difficulties at school or work.|
|Tailor interventions for nonadherence to its root cause.|
|Integrate taking medication into the daily routine.|
|Consider reminders such as digital alarms or alerts.|
|Provide ongoing education, discussion and easily accessible counseling.|
Simply forgetting to take their pills is one of the most common reasons that patients give for missing doses of their medication (268). Patients should be counseled about ways to integrate their medication administration into their daily routine. Pillboxes may be helpful for complex regimens consisting of multiple drugs with multiple daily dose-administration schedules. Electronic compliance devices, including alarms, are also available for improving medication adherence. The disease-management assistance system is a device that delivers a programmed voice message reminder at set times and has been applied in patients on antiretroviral therapy (279). Finally, an online pager or mobile phone system may improve adherence to medication regimens (280). However, except for the observation method, which can be onerous, all measures have significant disadvantages, primarily related to their lack of accuracy. Because there is no perfect measure of nonadherence, consideration should be given to using more than one approach to measure adherence, and the overall strategy should be individualized.
The number of prescribed medications and the dosing frequency have an effect on adherence rates (280,281). When a regimen is extremely complex, forgetfulness becomes a contributing factor to nonadherence (282). The complexity of a medication regimen is inversely proportional to the rate of adherence, with an increasing number of prescribed medications favoring nonadherence (283). Medications requiring twice-daily administration have resulted in greater adherence than those administered more than twice daily (284). Simplification strategies apply to immunosuppressive as well as nonimmunosuppressive medications (e.g. antihypertensives). In addition, steroid- or CNI-sparing protocols should be considered for the benefit of reducing both the number of drugs and adverse events. Involving a clinical pharmacist may help provide comprehensive patient education regarding the benefits of medications and the importance of adherence. A significantly greater proportion of patients were adherent with their immunosuppressive medications at 1 year after transplant when a pharmacist was involved (284,285).
Behavioral change strategies have been applied in the clinical setting. Behavior modifications have been incorporated in six adherence-improvement RCTs in KTRs (286,287). The methods included behavioral contracting, education, skills training, feedback and reinforcement. These data indicated that such behavioral intervention is a very individualized process and adherence motivation needs to be patient-specific and updated continuously. Using the medication event-monitoring system to monitor monthly azathioprine adherence during a 6-month period in KTRs demonstrated a significant correlation between adherence and rejection-free survival in the first 6 months after transplantation (288).
- • Additional prospective cohort studies are needed to establish the best measures of adherence and the association between adherence and outcomes.
- • RCTs are needed to test interventions to improve adherence in KTRs.
Chapter 12: Vaccination
- 12.1: We recommend giving all KTRs approved, inactivated vaccines, according to recommended schedules for the general population, except for HBV vaccination. (1D)
- 12.1.1: We suggest HBV vaccination (ideally prior to transplantation) and HBsAb titers 6–12 weeks after completing the vaccination series. (2D)
- 12.1.2: We suggest annual HBsAb titers. (2D)
- 12.1.3: We suggest revaccination if the antibody titer falls below 10 mIU/mL. (2D)
- 12.2: We suggest avoiding live vaccines in KTRs. (2C)
- 12.3: We suggest avoiding vaccinations, except influenza vaccination, in the first 6 months following kidney transplantation. (2C)
- 12.3.1: We suggest resuming immunizations once patients are receiving minimal maintenance doses of immunosuppressive medications. (2C)
- 12.3.2: We recommend giving all KTRs, who are at least 1-month post-transplant, influenza vaccination prior to the onset of the annual influenza season, regardless of status of immunosuppression. (1C)
- 12.4: We suggest giving the following vaccines to KTRs who, due to age, direct exposure, residence or travel to endemic areas, or other epidemiological risk factors are at increased risk for the specific diseases:
- • rabies, (2D)
- • tick-borne meningoencephalitis, (2D)
- • Japanese B encephalitis—inactivated, (2D)
- • Meningococcus, (2D)
- • Pneumococcus, (2D)
- • Salmonella typhi—inactivated. (2D)
- 12.4.1: Consult an infectious disease specialist, a travel clinic or public health official for guidance on whether specific cases warrant these vaccinations. (Not Graded)
KTRs, kidney transplant recipients; HBsAb, antibody to hepatitis B surface antigen; HBV, hepatitis B virus.
Recommended vaccinations are those approved and suggested by local and national health authorities for their constituent populations. These may vary by country of origin and geographic location. The efficacy of hepatitis B vaccination is determined by the prevention of hepatitis B infection, which is indirectly measured by the development of antibody to hepatitis B surface antigen (HBsAb) titers >10 mIU/mL. Individuals who are at increased risk include those with direct exposure, or residence in or travel to an endemic geographic area. In the case of meningococcal infection, patients who have undergone splenectomy are at increased risk.
- • The harm of different infections, and thereby the potential benefits of vaccinations, vary by geographic region.
- • Little or no harm has been described with the use of licensed, inactivated vaccines in KTRs.
- • Most vaccines produce an antibody response, albeit diminished, in immunocompromised individuals, including KTRs.
- • The potential benefits outweigh the harm of immunization with inactivated vaccines in KTRs.
- • Serious infection can result from live vaccines in immunocompromised patients, including KTRs.
- • In the absence of adequate safety data to the contrary, it should be assumed that the harm of live vaccines outweighs their benefits in KTRs.
- • Vaccinations are most likely to be effective when immunosuppression is lowest, when KTRs are receiving the lowest possible doses of immunosuppressive medication.
- • Influenza vaccination needs to be provided on an annual basis in advance of the onset of the annual influenza season. Even while KTRs are receiving high levels of immunosuppression, the benefits of timely vaccination outweigh the risks of delaying vaccination.
- • Some KTRs are at increased risk to develop disease attributable to one or more (rare) pathogens based upon direct exposure from residence in, or travel to, endemic areas. Although limited efficacy data are available for these inactivated vaccines to rare pathogens, potential benefits likely outweigh harm.
The American Society for Transplantation's Guidelines for the Prevention and Management of Infectious Complications of Solid Organ Transplantation provides guidance on immunizations relevant to their patient populations (289). While these recommendations may be appropriate for North America, they may not apply to KTRs worldwide.
Although only a limited number of studies evaluating the safety and efficacy of inactivated vaccines have been performed in solid-organ transplant recipients in general, and in KTRs in particular, available evidence suggests that inactivated vaccines are safe. There is no evidence that vaccinations lead to an increased risk of rejection.
Unfortunately, data on the efficacy of individual inactivated vaccines are limited. In general, existing data suggest that the response to vaccination in KTRs is diminished compared to immunization prior to transplantation. Accordingly, the optimal timing for immunizing KTRs is prior to transplantation. However, this is not always possible and, in some cases, repeated vaccinations after transplantation are necessary. A number of studies have been performed in organ transplant recipients that demonstrate immunogenicity of several inactivated vaccines after solid-organ transplantation. Influenza vaccination is among the most thoroughly evaluated in organ transplant recipients. Although response to influenza vaccination may vary among KTRs and from year to year, 30–100% of immunized KTRs will achieve protective hemagglutination-inhibiting serum antibody titers. Of note, the efficacy of influenza vaccination appears to be superior in pediatric compared to adult KTRs (290). Data are also available supporting the use of the 23-valent polysaccharide pneumococcal vaccine for KTRs >2 years of age. In contrast, hepatitis B vaccine has significantly diminished immunogenicity in organ transplant recipients compared to organ transplant candidates (291). Specific data regarding the immunogenicity of most of the remaining inactivated vaccinations are not available for solid-organ transplant recipients. Although data are lacking, most experts agree that the benefits outweigh the risks of immunization with inactivated vaccines (289).
There are sufficient data in KTRs indicating that the risk of vaccination with inactivated vaccines is minimal. The risk of infection, on the other hand, is higher in KTRs than in the general population. Therefore, vaccination with inactivated vaccines is warranted (Table 12).
|Haemophilus influenza B|
|Influenza types A and B (administer annually)|
|Meningococcus: administer if recipient is at high risk|
The currently licensed live vaccines use either attenuated viral strains that have been manipulated to reduce their virulence while attempting to maintain their immunogenicity, or, as in the case of Bacillus Calmette-Guérin (BCG), substitute a related bacterium that is thought to be less pathogenic, but still able to provide cross-reacting immunity to the target pathogen. While data are limited, significant concern exists for the use of live vaccines in immunocompromised patients. To date, only a limited number of studies have evaluated the use of live viral vaccines in organ transplant recipients (292). The high incidence of infections in KTRs is ample cause for concern that live vaccinations may cause infection in KTRs. While limited published experience is available describing the use of some live viral vaccines in organ transplant recipients (292), the limited number and small sample sizes included in these studies raise concerns about both the safety and efficacy of these vaccines in KTRs. Accordingly, most experts agree that, in general, the risks outweigh the potential benefits of using live vaccines in KTRs (293).
A number of live vaccinations licensed for use in the general population are contraindicated in KTRs (Table 13).
|Live oral typhoid Ty21a and other newer vaccines|
|Measles (except during an outbreak)|
|Live Japanese B encephalitis vaccine|
The reduced antibody response to different vaccines in KTRs is most likely due to immunosuppressive medication. Although there are no RCTs, it is reasonable to assume that giving vaccines when the amount of immunosuppressive medication is lowest is most likely to maximize the response to the vaccine (289).
Immunosuppressive medication amounts are usually highest in the first few months after transplantation, when the risk of acute rejection is also the greatest. Some time during the first 6–12 months, the amount of immunosuppressive medication is generally reduced to the lowest maintenance levels, if there is no acute rejection, and this is likely to be the best time for vaccination. This time of minimal maintenance immunosuppressive medication, and optimal time for vaccination, may be different in patients treated for acute rejection.
Influenza infection is a potentially important cause of morbidity and mortality in KTRs. The use of influenza vaccination has been demonstrated to be safe and generally effective in organ transplant recipients, including KTRs (294,295). In particular, it is worth noting that there is no proven association between the use of influenza vaccination in organ transplant recipients and the development of rejection. Accordingly, annual use of influenza vaccination is recommended for both KTRs and their household contacts. Because acquisition of influenza will occur during annual seasonal epidemics, it may not be possible to delay giving this vaccine until the patient is out far enough from transplant or on low levels of immunosuppression. Given that this is an inactivated viral vaccine, the major consequence of using this too early is that the immunization will not work. Given the potential benefit of providing the vaccine, it is recommended to give this vaccine prior to the onset of the annual influenza season, as long as the recipient is at least 1-month posttransplant. This timing is chosen as the vaccine is least likely to work during the first month after transplant, especially if the KTR has received induction therapy.
Hepatitis B revaccination
The need for hepatitis B vaccination booster is controversial and the practice varies from country to country. Patients with impaired immune function tend to have lower peak HBsAb levels compared to immunocompetent individuals. There are few data on durability of immunologic memory in immunocompromised hosts. However, there have been reports of clinically significant infection due to hepatitis B virus (HBV) in previously immunized dialysis patients in whom production of HBsAb was no longer measurable (296).
Serial measurements of HBsAb levels to inform the use of a booster dose of hepatitis B vaccine has been recommended for dialysis patients by the US Advisory Committee on Immunization Practices (296). In addition, the European Consensus Group on Hepatitis B immunity has expanded this recommendation to include patients with impaired immune function (297). Immunological memory wanes faster in immunocompromised renal transplant recipients. A level above 10 mIU/mL is generally taken to be protective, but transplant recipients with titers less than 100 mIU/mL tend to lose them rapidly. The potential for low anti-HBs levels to mask significant infection (indicated by hepatitis B surface antigen (HBsAg)) and the rapid decline led a European Consensus Group to suggest booster vaccination at titers below 100 mIU/mL. Although there is no clear evidence to support this recommendation, given the relative risk–benefit ratio of hepatitis B vaccine, it seems prudent to assess annually the need for a booster dose of this immunization.
Kidney transplant recipients may be at increased risk for vaccine-preventable pathogens through residence or travel to endemic areas, or due to inadvertent exposure. Recommendations for individuals traveling to certain geographic locations frequently include receipt of one or more immunizations against these pathogens. These recommendations would logically apply to KTRs, as long as the recommended vaccinations are inactivated, for example salmonella typhi Vi polysaccharide vaccine, or meningococcal vaccine. Consultation with an infectious disease specialist, travel clinic or public health official is recommended to clarify appropriate use of vaccinations for scenarios where travel or exposure may warrant use of these additional vaccinations.
Although efficacy data may not be available in KTRs, inactivated vaccines are generally safe. In contrast, some immunizations typically recommended for travelers are available only as live-attenuated vaccines. The use of these vaccines cannot be recommended, as neither safety nor efficacy data are available in this patient population.
Studies are needed to determine:
- • the optimal timing of immunization in KTRs;
- • the durability of immunologic response in KTRs vaccinated before and after transplantation.
Chapter 13: Viral Diseases
- 13.1: BK POLYOMA VIRUS
- 13.1.1: We suggest screening all KTRs for BKV with quantitative plasma NAT (2C) at least:
- • monthly for the first 3–6 months after transplantation (2D);
- • then every 3 months until the end of the first post-transplant year (2D);
- • whenever there is an unexplained rise in serum creatinine (2D); and
- • after treatment for acute rejection. (2D)
- 13.1.2: We suggest reducing immunosuppressive medications when BKV plasma NAT is persistently greater than 10 000 copies/mL (10⁷ copies/L). (2D)
BKV, BK polyoma virus; KTRs, kidney transplant recipients; NAT, nucleic acid testing.
BK polyoma virus (BKV) is a member of the polyoma family of viruses. BKV can cause nephropathy, which is diagnosed by kidney biopsy. Reduction of immunosuppression is defined as a decrease in the amount and intensity of immunosuppressive medication. Nucleic acid testing (NAT) is defined as one or more molecular methods used to identify the presence of DNA or RNA (e.g. polymerase chain reaction).
- • The use of NAT to detect BKV in plasma provides a sensitive method for identifying BKV infection and determining KTRs who are at increased risk for BKV nephropathy.
- • Early identification of BKV infection may allow measures to be taken that may prevent BKV nephropathy.
- • When NAT is not available, microscopic evaluation of urine for the presence of decoy cells is an acceptable, albeit nonspecific, alternative screening method for BKV disease and the risk for BKV nephropathy.
- • Fifty percent of patients who develop BK viremia do so by 3 months after kidney transplantation.
- • Ninety-five percent of BKV nephropathy occurs in the first 2 years after kidney transplantation.
- • BKV plasma NAT >10 000 copies/mL (10⁷ copies/L) has a high positive predictive value for BKV nephropathy.
- • Reduction of immunosuppressive medication may result in reduced BKV load and decreased risk of BKV nephropathy.
- • Histologic evidence of BKV nephropathy may be present in the absence of elevated serum creatinine.
- • Reduction in maintenance immunosuppressive medication is the best treatment for BKV nephropathy.
Whether to screen KTRs with NAT of plasma or urine has been controversial. A negative urine NAT for BKV has almost a 100% negative predictive value (298). By testing urine, one can avoid performing BKV testing of blood on those patients with negative urine studies. Based on this, some experts recommend urine as the definitive specimen for BKV surveillance (298). However, the presence of a positive NAT for BKV in urine, in the absence of an elevated BKV load in the plasma, is not associated with an increased risk for BKV disease (298). Hence, the use of urine screening requires performance of NAT on the blood of those patients whose level of BK viruria exceeds established thresholds. This requires patients to return to the clinic for the additional test. Accordingly, it is suggested that NAT be performed on plasma, and not the urine, of KTRs.
When NAT is not available, microscopic evaluation of the urine for the presence of decoy cells is an acceptable, albeit nonspecific, alternative screening method for BKV disease and the risk for BKV nephropathy. A negative screening test rules out BKV nephropathy in most cases (high negative predictive value). However, a positive screening test has a very low positive predictive value for BKV nephropathy (298,299). Thus, many patients with urine decoy cells will not develop BKV nephropathy. It may be inappropriate to change therapy in such patients based on the presence of urine decoy cells alone.
Emerging data suggest that BKV nephropathy can be prevented if immunosuppressive medications are reduced in patients with BKV detected by a high viral load in plasma (determined by NAT) (300).
Timing of BKV NAT
The presence of BKV can be identified prior to the onset of clinical symptoms at a time when only subclinical infection is present, or in association with clinically apparent BKV nephropathy. Evidence to date suggests that the presence of BK viremia precedes BKV nephropathy by a median of 8 weeks. Approximately 50% of patients who will develop BK viremia will do so by 3 months after transplant (298).
Most BKV nephropathy occurs in the first 2 years after transplant with only 5% of cases occurring between 2 and 5 years after transplant (298). Accordingly, the timing and frequency of testing in recommended screening algorithms should reflect these data and balance the cost of screening with the potential to prevent BKV nephropathy. The proposed screening algorithm is most intense early after kidney transplantation, with decreasing frequency as patients are out longer from the transplant. Although we have not recommended screening beyond the first year after transplant, an international consensus conference suggested continued annual screening for patients between 2 and 5 years after kidney transplantation (298). Centers with higher frequency of BKV might follow this approach. Screening for the presence of BKV should also be performed for patients with unexplained rises in serum creatinine, as this may be attributable to BKV nephropathy. Finally, screening should be considered for those patients who have undergone a major increase in immunosuppressive medication, as they may be at risk of developing BKV nephropathy.
Rising BKV load
There is increased risk of BKV nephropathy associated with a rising BKV load in plasma (298,299). Although plasma NAT assays for BKV lack standardization, a threshold plasma BKV level of >10 000 copies/mL (10⁷ copies/L) is associated with a 93% specificity for the presence of BKV nephropathy. In the absence of evidence of clinical disease, KTRs with BKV levels in excess of this threshold are considered to be at risk of progression to BKV nephropathy (298,299). Histologic evidence of early BKV nephropathy may be present prior to detection of elevated serum creatinine (298).
The risk of BKV nephropathy appears to be correlated with the intensity of immunosuppression, and reduction of immunosuppression can result in a decrease in BKV load and a concomitant reduction of risk of development of BKV nephropathy (301). A RCT reported that withdrawal of the antimetabolite resulted in clearance of viremia without progression to BKV nephropathy (300). Although some would use antiviral therapy (including cidofovir, leflunomide and/or ciprofloxacin) as treatment, to date there are no definitive data confirming the effectiveness of these agents for either treatment or prevention of BKV nephropathy (298,299).
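For illustration only, and not as a clinical decision tool, the screening cadence of Recommendation 13.1.1 (monthly through the early months, then every 3 months until the end of the first post-transplant year) and the action threshold of Recommendation 13.1.2 (plasma load persistently >10 000 copies/mL) can be sketched as follows; the function names are invented for this sketch:

```python
# Illustrative sketch only (not clinical advice); schedule and threshold
# are those quoted in the recommendations above.

BKV_THRESHOLD_COPIES_PER_ML = 10_000  # Recommendation 13.1.2

def bkv_nat_due(months_post_transplant):
    """True if routine plasma BKV NAT is due at this month post-transplant."""
    m = months_post_transplant
    if m <= 6:
        return True        # monthly during the first 3-6 months
    if m <= 12:
        return m % 3 == 0  # then every 3 months until end of first year
    return False           # routine screening not suggested beyond year 1

def reduce_immunosuppression(recent_plasma_loads):
    """Persistently elevated load (all recent results above threshold)
    suggests reducing immunosuppressive medication."""
    return all(v > BKV_THRESHOLD_COPIES_PER_ML for v in recent_plasma_loads)

print(bkv_nat_due(4))                              # month 4 -> True
print(bkv_nat_due(8))                              # month 8 -> False
print(bkv_nat_due(9))                              # month 9 -> True
print(reduce_immunosuppression([15_000, 22_000]))  # persistently high -> True
```

NAT for BKV would also be performed outside this routine schedule for an unexplained rise in serum creatinine or after treatment for acute rejection, as the recommendations note.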
Some centers may choose different treatment strategies for patients with elevated BKV loads in the absence of any histologic changes, compared to patients with findings of BKV nephropathy in the absence of serum creatinine elevation. The international consensus group recommended performance of kidney biopsy for these patients (298). When a kidney biopsy is obtained, it should be evaluated for the presence of BKV using the cross-reacting antibody for simian virus 40. However, other experts have not recommended the performance of a kidney biopsy for asymptomatic patients with an elevated BKV load (300).
Treating biopsy-proven BKV nephropathy
The treatment of BKV nephropathy is unsatisfactory. Although there are some centers that would use antiviral therapy (including cidofovir, leflunomide and/or ciprofloxacin) as treatment, to date there are no definitive data confirming their effectiveness. However, reduction of immunosuppression does appear to have some impact on BKV nephropathy, though variable rates of graft loss attributable to BKV nephropathy have been reported even when reduction of immunosuppression has been employed (Table 14). A common practice of immunosuppressive dose reduction is withdrawal of antimetabolite (azathioprine or MMF) and reduction in CNI dosage by 50%. An algorithm for the treatment of BKV nephropathy through modification of baseline immunosuppression has been proposed (298). Switching from the antimetabolite MMF or EC-MPS to leflunomide (an immunosuppressive agent with antiviral activity) has been associated with declining BKV load in blood and improving histology (302), although convincing evidence of the efficacy of this, or other antiviral agents, is lacking.
Table 14: Modification of baseline immunosuppression in BKV nephropathy (adapted from 298)

| Switch | Reduce | Discontinue tacrolimus or MMF (maintain or switch to dual-drug therapy) |
|---|---|---|
| Tacrolimus → CsA (trough levels 100–150 ng/mL) (B-III) | Tacrolimus (trough levels <6 ng/mL) (B-III) | CsA/prednisone (B-III) |
| MMF → azathioprine (dosing ≤100 mg/day) (B-III) | MMF dosing ≤1 g/day (B-III) | Tacrolimus/prednisone (B-III) |
| Tacrolimus → sirolimus (trough levels <6 ng/mL) | CsA (trough levels 100–150 ng/mL) | Sirolimus/prednisone (C-III) |
| MMF → sirolimus (trough levels <6 ng/mL) (C-III) | | MMF/prednisone (C-III) |
| MMF → leflunomide (C-III) | | |
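As a bookkeeping illustration only (not clinical guidance), the common practice described in the text — withdrawal of the antimetabolite and a 50% CNI dose reduction — can be sketched as follows; the function name and the dictionary representation of a regimen are hypothetical:

```python
def reduce_immunosuppression_for_bkv(regimen):
    """Illustrative sketch of the common practice described in the text:
    withdraw the antimetabolite (azathioprine or MMF) and reduce the
    calcineurin inhibitor (CNI) dose by 50%. `regimen` maps drug name to
    daily dose; this representation is hypothetical."""
    antimetabolites = {"azathioprine", "mmf"}
    cnis = {"tacrolimus", "cyclosporine", "csa"}
    adjusted = {}
    for drug, dose in regimen.items():
        name = drug.lower()
        if name in antimetabolites:
            continue                     # withdraw the antimetabolite entirely
        if name in cnis:
            adjusted[drug] = dose * 0.5  # 50% reduction in CNI dosage
        else:
            adjusted[drug] = dose        # other agents (e.g. prednisone) unchanged
    return adjusted
```

Actual modification of immunosuppression is individualized and follows the alternatives tabulated above.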
Studies are needed to determine:
- • the most cost-effective strategies for screening for BKV in different populations;
- • the efficacy of altering immunosuppressive medication regimens and of antiviral agents in the prevention and treatment of BKV nephropathy.
- 13.2: CYTOMEGALOVIRUS
- 13.2.1: CMV prophylaxis: We recommend that KTRs (except when donor and recipient both have negative CMV serologies) receive chemoprophylaxis for CMV infection with oral ganciclovir or valganciclovir for at least 3 months after transplantation, (1B) and for 6 weeks after treatment with a T-cell–depleting antibody. (1C)
- 13.2.2: In patients with CMV disease, we suggest weekly monitoring of CMV by NAT or pp65 antigenemia. (2D)
- 13.2.3: CMV treatment:
- 13.2.3.1: We recommend that all patients with serious (including most patients with tissue-invasive) CMV disease be treated with intravenous ganciclovir. (1D)
- 13.2.3.2: We recommend that CMV disease in adult KTRs that is not serious (e.g. episodes associated with mild clinical symptoms) be treated with either intravenous ganciclovir or oral valganciclovir. (1D)
- 13.2.3.3: We recommend that all CMV disease in pediatric KTRs be treated with intravenous ganciclovir. (1D)
- 13.2.3.4: We suggest continuing therapy until CMV is no longer detectable by plasma NAT or pp65 antigenemia. (2D)
- 13.2.4: We suggest reducing immunosuppressive medication in life-threatening CMV disease, and CMV disease that persists in the face of treatment, until CMV disease has resolved. (2D)
- 13.2.4.1: We suggest monitoring graft function closely during CMV disease. (2D)
CMV, cytomegalovirus; KTRs, kidney transplant recipients; NAT, nucleic acid testing.
Cytomegalovirus disease is defined by the presence of clinical signs and symptoms attributable to CMV infection, together with the presence of CMV in plasma by NAT or pp65 antigenemia. CMV disease may manifest as a nonspecific febrile syndrome (e.g. fever, leukopenia and atypical lymphocytosis) or as tissue-invasive infection (e.g. hepatitis, pneumonitis and enteritis). Tissue-invasive CMV disease is defined as CMV disease with CMV detected in tissue by histology, NAT or culture. CMV-seronegative status is defined by the absence of CMV immunoglobulin G (IgG) and immunoglobulin M; CMV-seropositive status is defined as being CMV IgG-positive. Interpretation of CMV serologies may be confounded by the presence of passive antibody acquired from blood or body-fluid exposure. Chemoprophylaxis is defined as the use of an antimicrobial agent in the absence of evidence of active infection, to prevent the acquisition of infection and the development of disease.
- • CMV disease is an important cause of morbidity and mortality.
- • There are strategies for preventing CMV infection and disease that result in marked improvements in outcomes.
- • Risk for CMV after transplantation is strongly dependent on donor (D) and recipient (R) serology, with patients who are D+/R−, D+/R+ or D−/R+ at risk for developing CMV infection and disease, and D+/R− at highest risk for severe CMV disease.
- • The incidence of CMV disease in D−/R− is <5%.
- • Chemoprophylaxis with ganciclovir or valganciclovir for at least 3 months after transplantation reduces CMV infection and disease in high-risk patients.
- • Chemoprophylaxis is associated with improved graft survival compared to preemptive antiviral therapy initiated in response to increased CMV load.
- • The use of a T-cell–depleting antibody is a risk factor for CMV disease.
- • Chemoprophylaxis with ganciclovir for patients receiving a T-cell–depleting antibody protects against the development of CMV disease.
- • A detectable CMV load at the end of antiviral therapy is associated with an increased risk of disease recurrence.
- • CMV infection is associated with acute rejection.
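The serology-based risk groupings listed above can be summarized in a small sketch (the labels and function name are illustrative, not part of the guideline):

```python
def cmv_risk_category(donor_seropositive, recipient_seropositive):
    """Stratify CMV risk by donor (D) and recipient (R) IgG serostatus,
    following the groupings in the text: D+/R- carries the highest risk of
    severe CMV disease; D+/R+ and D-/R+ are at (moderate) risk; D-/R- has a
    <5% incidence of CMV disease. Category labels are illustrative."""
    if donor_seropositive and not recipient_seropositive:
        return "high"      # D+/R-: highest risk for severe CMV disease
    if donor_seropositive or recipient_seropositive:
        return "moderate"  # D+/R+ or D-/R+: at risk for CMV infection and disease
    return "low"           # D-/R-: incidence <5%; chemoprophylaxis not recommended
```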
Cytomegalovirus is a frequent and important cause of clinical disease in KTRs. In the absence of antiviral prophylaxis, symptomatic CMV disease can be seen in approximately 8% of KTRs (303), although older estimates placed it at 10–60% of KTRs (304). In addition to directly attributable morbidity, CMV may also have an immunomodulatory effect, and active CMV disease has been associated with infectious complications as well as acute rejection and CAI (305). Accordingly, strategies that can prevent CMV infection and disease should lead to improved outcomes following kidney transplantation.
Randomized controlled trials have demonstrated that the incidence of CMV disease can be reduced by prophylaxis and preemptive therapies in solid-organ transplant recipients (306–308). In trials of KTRs alone, there is low-quality evidence, largely due to sparse data, that prophylaxis results in less acute rejection and CMV infection, with no clear evidence of increased adverse events (see Evidence Profile and accompanying evidence in Supporting Tables 48–49 at http://www3.interscience.wiley.com/journal/118499698/toc). However, there is high-quality evidence from a large systematic review of CMV prophylaxis in solid-organ transplant recipients (307) that prophylaxis significantly reduces all-cause mortality, CMV-disease mortality and CMV disease, but not acute rejection or graft loss. In most of these trials, the majority of organ recipients received kidneys. Thus, the Work Group concluded that overall there is moderate-quality evidence to support this recommendation. Observational data suggest that D+/R− KTRs are at the highest risk of developing severe CMV disease compared to all other KTRs (306). Studies in this high-risk population have shown that antiviral chemoprophylaxis reduces the incidence of CMV disease by about 60% (306). The use of antiviral chemoprophylaxis has also been shown to reduce the incidence of CMV-associated mortality and all-cause mortality, as well as clinically important disease due to opportunistic infections (306). Chemoprophylaxis has also been shown to be effective in KTRs at moderate risk for CMV disease (e.g. CMV D+/R+ or D−/R+).
In contrast to the situation for antiviral chemoprophylaxis, the number of studies evaluating the efficacy of viral load monitoring to inform preemptive therapy in high-risk patients is limited (308). While results of these studies are encouraging, they have only demonstrated a reduction in CMV disease, and this strategy has not yet been shown to reduce CMV-related mortality (306). At the present time, the use of viral load monitoring to prompt preemptive therapy is not recommended for these high-risk KTRs (307). The basis for this concern is threefold: a lack of data in CMV D+/R− KTRs; the implications of a failure to comply with the preemptive monitoring approach (an important potential limitation of this strategy); and the relative safety and efficacy of universal chemoprophylaxis in high-risk organ transplant recipients.
The use of CMV viral load monitoring to inform preemptive antiviral treatment with ganciclovir in patients at moderate risk for developing CMV disease has been shown to be effective (308) and has several potential advantages compared to the use of universal chemoprophylaxis. Primary among these is limiting exposure to antiviral agents to those KTRs who have demonstrated evidence of subclinical CMV infection. Based upon this, a consensus has existed to limit this approach to patients at moderate (but not high) risk for CMV disease (305,307). However, a recently published RCT comparing oral ganciclovir prophylaxis to CMV surveillance monitoring to inform preemptive ganciclovir therapy demonstrated an advantage in long-term graft survival in those KTRs randomized to receive ganciclovir chemoprophylaxis (309). Accordingly, while many experts have previously felt that both strategies (universal chemoprophylaxis or viral load monitoring to inform preemptive antiviral therapy) were acceptable for the prevention of CMV disease in this population (305,308), the newer data, if confirmed, may provide evidence that all KTRs at risk for the development of CMV should receive chemoprophylaxis rather than a preemptive therapy approach. Nonetheless, some experts still recommend the use of viral load monitoring to inform preemptive antiviral treatment in this cohort of KTRs at moderate risk for developing CMV disease.
A number of observational studies have shown that the incidence of CMV disease is very low (<5%) in CMV seronegative recipients of CMV seronegative donors (D−/R−) (307). Although there are no cost–benefit studies in this low-risk population, the very low incidence of CMV disease makes it very unlikely that the benefits of preventive strategies outweigh their harm. The latter include adverse effects of medication and costs.
There is strong evidence linking the use of antibody treatment of rejection with increased risk of CMV infection and disease. The use of these agents results in activation of CMV from latency to active infection.
A variety of potential antiviral agents have been evaluated. RCTs demonstrated that ganciclovir, valganciclovir, acyclovir and valacyclovir were each effective in preventing CMV infection and disease (307). However, head-to-head comparisons demonstrated that ganciclovir was more effective than acyclovir in preventing both CMV infection and CMV disease. Oral valganciclovir was as effective as intravenous ganciclovir in the prevention of both CMV infection and disease, and oral and intravenous ganciclovir yielded similar results. The use of acyclovir and valacyclovir should be restricted to situations where ganciclovir/valganciclovir cannot be used.
Most recent RCTs evaluating oral antiviral agents for the prevention of CMV disease have treated patients for 3 months after transplantation (307). A recent meta-analysis did not find a difference in efficacy between courses of less than and more than 6 weeks of therapy. The impetus behind prolonged treatment is the increasing recognition of late CMV disease. An RCT evaluating 3 vs. 6 months of prophylaxis is currently being conducted.
Three studies have evaluated prophylaxis of CMV disease in KTRs treated for acute rejection. Two studies evaluating ganciclovir in patients receiving antilymphocyte antibody therapy demonstrated a reduction in CMV disease (310). A third study evaluated the use of intravenous immunoglobulin followed by acyclovir prophylaxis in patients receiving OKT3 (311); this latter study failed to demonstrate a protective effect against CMV compared with no therapy. Accordingly, the use of intravenous ganciclovir or oral valganciclovir has been recommended for CMV prophylaxis during antilymphocyte antibody therapy (305). The use of oral ganciclovir should be avoided for patients with high-level CMV viremia (305). The use of acyclovir or famciclovir is not recommended, given the absence of data supporting the efficacy of these agents. It is also suggested that CMV serologies be repeated in patients who were CMV-seronegative prior to transplantation and who require antibody therapy as treatment for rejection, in order to determine their current risk status.
The presence of CMV in plasma, detected by NAT or pp65 antigenemia, at the end of treatment is a major predictor of recurrent CMV disease (305). Recent evidence suggests that oral valganciclovir is effective in the treatment of CMV disease (312). Although the results of this study are encouraging, the determination of what level of disease is appropriate for oral therapy in the ambulatory setting vs. treatment with intravenous ganciclovir (at least initially) remains unclear. At this point, most experts would be willing to use oral therapy to treat adult KTRs with mild CMV disease. A consensus does not exist as to which patients with tissue-invasive disease might be candidates for oral therapy. Clearly, patients with more severe disease, including those with life-threatening disease, should be hospitalized and treated with intravenous ganciclovir.
It is worth noting that similar data are not available for pediatric KTRs or other children undergoing solid-organ transplantation. Accordingly, while the use of oral valganciclovir may be appropriate for some adult KTRs experiencing mild to moderate CMV disease, all pediatric KTRs should receive intravenous ganciclovir for the treatment of CMV disease. Further, concern also exists with regards to the use of oral valganciclovir in patients in whom there are questions regarding adequate absorption of this medication.
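The route-of-therapy logic of Recommendations 13.2.3.1–13.2.3.3 can be sketched as follows; this is purely an illustration of the decision structure (names and return values are hypothetical), not a substitute for clinical judgment:

```python
def initial_cmv_treatment_route(pediatric, serious_disease):
    """Sketch of the CMV treatment recommendations in the text: pediatric
    KTRs and all serious (including most tissue-invasive) CMV disease are
    treated with intravenous ganciclovir; non-serious disease in adult KTRs
    may be treated with either IV ganciclovir or oral valganciclovir.
    Illustrative only."""
    if pediatric or serious_disease:
        return ["iv ganciclovir"]
    # Non-serious disease in adults: either route is recommended (1D)
    return ["iv ganciclovir", "oral valganciclovir"]
```

Per Recommendation 13.2.3.4, whichever route is chosen, therapy continues until CMV is no longer detectable by plasma NAT or pp65 antigenemia.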
CMV viral load testing
While resolution of clinical signs and symptoms is critical in the management of CMV disease, measurement of the CMV viral load provides additional useful information. The use of viral load monitoring identifies both the virologic response (guiding duration of therapy) and the possible presence of antiviral resistance. The presence of a detectable CMV load at the end of therapy is associated with an increased rate of recurrent disease (313). The time to clearance of CMV in plasma as measured by NAT may be prolonged compared to pp65 antigenemia, and may be associated with an increased risk of recurrent CMV disease (314).
Immunosuppression and graft function monitoring during CMV disease
The reduction of immunosuppression used as part of the treatment of CMV disease places patients at some risk for the development of rejection. The presence of CMV infection and disease has been associated with the development of rejection independent of reduction of immunosuppression. Accordingly, careful monitoring of kidney allograft function is warranted during treatment of CMV disease to guide the use of immunosuppression.
Randomized controlled trials are needed to determine:
- • the benefits and harm of CMV chemoprophylaxis vs. preemptive antiviral therapy informed by CMV viral load monitoring;
- • the optimal duration of antiviral chemoprophylaxis.
- 13.3: EPSTEIN-BARR VIRUS AND POST-TRANSPLANT LYMPHOPROLIFERATIVE DISEASE
- 13.3.1: We suggest monitoring high-risk (donor EBV seropositive/recipient seronegative) KTRs for EBV by NAT (2C):
- • once in the first week after transplantation (2D);
- • then at least monthly for the first 3–6 months after transplantation (2D);
- • then every 3 months until the end of the first post-transplant year (2D); and
- • additionally after treatment for acute rejection. (2D)
- 13.3.2: We suggest that EBV-seronegative patients with an increasing EBV load have immunosuppressive medication reduced. (2D)
- 13.3.3: We recommend that patients with EBV disease, including PTLD, have a reduction or cessation of immunosuppressive medication. (1C)
EBV, Epstein-Barr virus; KTRs, kidney transplant recipients; NAT, nucleic acid testing; PTLD, post-transplant lymphoproliferative disease.
Epstein-Barr virus (EBV) disease is defined by signs and symptoms of active viral infection and increased EBV load. The EBV viral load is defined as the amount of viral genome that is detectable in the peripheral blood by NAT. PTLD are clinical syndromes associated with EBV and lymphoproliferation, which range from self-limited, polyclonal proliferation to malignancies containing clonal chromosomal abnormalities (315). The World Health Organization (WHO) has developed a histological classification for PTLD (323).
- • There is a 10- to 50-fold increased risk for EBV disease (including PTLD) in EBV-seronegative compared to EBV-seropositive KTRs.
- • The EBV viral load measurement is sensitive, but not specific, for EBV disease and PTLD, particularly in previously seronegative KTRs.
- • The EBV viral load becomes positive before the development of EBV disease.
- • Early identification of primary infection and viral load monitoring allows therapeutic interventions to prevent progression to EBV disease.
- • Reducing immunosuppressive medication may prevent EBV disease and PTLD.
- • Reducing immunosuppressive medication is an effective treatment for many patients with EBV disease and PTLD.
- • EBV viral load is detectable and elevated in many patients experiencing EBV disease, including PTLD, but can also be elevated in asymptomatic patients.
- • The presence of EBV-negative PTLD has been reported, and these lesions may behave differently than EBV-positive PTLD lesions.
Primary EBV (human herpes virus 4) infection is associated with an increased incidence of PTLD in KTRs. An EBV-negative KTR from an EBV-positive donor is at increased risk for developing PTLD (316,317). A newly detectable or rising EBV load often precedes EBV disease and PTLD (318). Identification of seronegative patients with a rising EBV load offers the opportunity to intervene preemptively and potentially prevent progression to EBV disease, including PTLD (319). While this has been observed most frequently in pediatric KTRs, there is no reason to assume that EBV-seronegative adult KTRs who receive a kidney from an EBV-seropositive donor are not also at increased risk of developing EBV disease, and they are likely to benefit from EBV load monitoring.
Primary EBV infection in EBV-seronegative organ transplant recipients occurs most frequently in the first 3–6 months following organ transplantation (320), most likely because the source of the EBV infection is either the donor organ or blood products received by the patient at or near the time of transplant. Serial measurement of EBV loads in previously seronegative patients allows identification of the onset of infection (318). Continued observation of EBV loads in newly infected patients identifies those with rapidly rising viral loads who are likely to be at greatest risk of progressing to EBV disease. Because the most likely sources of EBV infection in KTRs are either passenger leukocytes from the donor allograft or blood-product exposure (both more likely at or near the time of transplantation), the likelihood of developing primary EBV infection diminishes with time after transplantation. Accordingly, EBV load monitoring should be performed most frequently during the first 3–6 months after transplant. Because the risk of developing EBV infection after this period is diminished, but not eliminated, continued surveillance of EBV load is recommended, albeit at less frequent intervals.
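The monitoring cadence suggested in Recommendation 13.3.1 (once in the first week, then at least monthly for the first 3–6 months, then every 3 months to the end of the first post-transplant year, plus additional testing after rejection treatment) can be sketched as a schedule generator. The function name, parameters and the 4-weeks-per-month approximation are hypothetical conveniences:

```python
def ebv_monitoring_weeks(months=12, early_phase_months=6):
    """Sketch of the EBV NAT monitoring schedule suggested for high-risk
    (D+/R-) KTRs: once in the first week, then monthly through the early
    phase (3-6 months), then every 3 months to the end of `months`.
    Returns approximate post-transplant weeks (1 month ~ 4 weeks);
    additional testing after treatment for acute rejection is not modeled."""
    timepoints = [1]  # once in the first week after transplantation
    # at least monthly for the first 3-6 months
    timepoints += [m * 4 for m in range(1, early_phase_months + 1)]
    # then every 3 months until the end of the first post-transplant year
    m = early_phase_months + 3
    while m <= months:
        timepoints.append(m * 4)
        m += 3
    return timepoints
```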
EBV-seronegative patients with an increasing EBV viral load
The development of primary EBV infection after kidney transplantation is associated with a marked increased risk for the development of EBV disease and PTLD (316,317). High EBV loads have been found at the time of diagnosis of PTLD. Because the EBV load becomes positive 4–16 weeks prior to development of PTLD (318), the presence of a rising EBV load identifies patients in whom intervention may prevent PTLD.
The potential role of antiviral therapy as a preemptive response to a rising viral load is controversial. Children undergoing liver transplantation had a reduction in the risk for EBV PTLD with reduced immunosuppressive medication (tacrolimus) without concomitant use of antiviral therapy (321). In contrast, evidence is lacking for the efficacy of preemptive antiviral therapy (e.g. acyclovir, ganciclovir) in response to an elevated or rising EBV load in the absence of reduction of immunosuppression.
EBV disease diagnosis
Epstein-Barr virus disease can present with varied manifestations, including nonspecific febrile illness, gastroenteritis, hepatitis and other manifestations that may also be attributable to CMV or other pathogens. Although biopsy to detect the presence of EBV infection within affected tissue is the most definitive way to confirm the diagnosis of EBV disease, histological confirmation may not be feasible for patients with nonspecific clinical syndromes that do not localize to a specific tissue (e.g. febrile syndromes). Because the EBV viral load is detectable and elevated in the vast majority of KTRs with EBV disease, including PTLD, the combination of a compatible clinical syndrome and a high EBV load provides a sensitive and specific approach to the diagnosis of EBV disease (322). However, it is still necessary to be cautious in making this diagnosis, as many patients may have asymptomatic elevations of EBV load. Such patients may be misdiagnosed as having EBV disease if they develop intercurrent infections due to an alternative pathogen at a time when they are having an asymptomatic elevation in their EBV load. In these patients, a tissue diagnosis may be the only method of confirming the presence or absence of EBV disease.
The term PTLD describes a broad category of EBV-related diseases that have distinct histological appearances (Table 15) (323). The approach to the management of PTLD can vary according to the PTLD disease classification. Furthermore, EBV-negative PTLD lesions have been reported; these lesions may behave differently than EBV-positive lesions and may warrant alternative therapeutic options. In addition, lesions with a characteristic clinical appearance on physical examination or imaging studies may be due to alternative pathogens (e.g. pulmonary nodules attributable to fungal pathogens). Because of all these concerns, it is imperative that suspected PTLD lesions be biopsied and undergo histopathologic evaluation by a pathologist experienced in the diagnosis of PTLD (315).
Table 15: WHO histological classification of PTLD (323)

| Category | Lesions |
|---|---|
| 1: Early lesions | Reactive plasmacytic hyperplasia |
| 2: PTLD—polymorphic | Polyclonal (rare); monoclonal |
| 3: PTLD—monomorphic (classify according to lymphoma classification) | B-cell lymphomas: diffuse large B-cell lymphoma (immunoblastic, centroblastic, anaplastic); plasma cell myeloma. T-cell lymphomas: peripheral T-cell lymphoma, not otherwise categorized; other types (hepatosplenic, gamma-delta, T/NK) |
| 4: Other types (rare) | Hodgkin's disease-like lesions (associated with methotrexate therapy) |
Observational studies have suggested that KTRs with EBV disease are at high risk of developing PTLD (324). Observational studies have also shown that mortality from EBV-associated PTLD is over 50% (325,326). Immunosuppression is a major risk factor for the development of EBV disease, including PTLD, in KTRs (317,327). In most cases, the progression of clinical symptoms is a consequence of the inability to mount an adequate EBV-specific cytotoxic T-cell response because of the immunosuppressive medications. It is therefore logical to assume that reduction of immunosuppression may result in resolution of EBV disease. As many as two thirds of patients presenting with EBV-associated PTLD will respond to reduction or withdrawal of immunosuppressive medication (315,328). This is less likely to be the case for patients presenting more than 1 year after transplantation, or with EBV-associated lymphoma; in these cases, there is an increased tendency for the lesions to behave in a truly malignant fashion. However, because some patients presenting late after transplant with biopsy evidence of lymphoma have responded to reduction of immunosuppression, this strategy may still be considered even in these patients, though expectations of efficacy will be reduced.
Epstein-Barr virus disease and PTLD are important causes of morbidity and mortality following kidney transplantation. Rates of PTLD are higher in pediatric KTRs and those patients who are EBV-seronegative prior to transplant who experience primary infection after transplant. While EBV disease and PTLD may be more common among pediatric KTRs, adult EBV-seronegative recipients of kidneys from an EBV-seropositive donor are also felt to be at increased risk for the development of these complications. Because of the complexity of this disease and its management, involvement of infectious diseases specialists, oncologists and transplant physicians in a team approach will likely maximize therapeutic outcomes.
- 13.4: HERPES SIMPLEX VIRUS 1, 2 AND VARICELLA ZOSTER VIRUS
- 13.4.1: We recommend that KTRs who develop a superficial HSV 1, 2 infection be treated (1B) with an appropriate oral antiviral agent (e.g. acyclovir, valacyclovir, or famciclovir) until all lesions have resolved. (1D)
- 13.4.2: We recommend that KTRs with systemic HSV 1, 2 infection be treated (1B) with intravenous acyclovir and a reduction in immunosuppressive medication. (1D)
- 13.4.2.1: We recommend that intravenous acyclovir continue until the patient has a clinical response, (1B) then switch to an appropriate oral antiviral agent (e.g. acyclovir, valacyclovir, or famciclovir) to complete a total treatment duration of 14–21 days. (2D)
- 13.4.3: We suggest using a prophylactic antiviral agent for KTRs experiencing frequent recurrences of HSV 1,2 infection. (2D)
- 13.4.4: We recommend that primary VZV infection (chicken pox) in KTRs be treated (1C) with either intravenous or oral acyclovir or valacyclovir; and a temporary reduction in amount of immunosuppressive medication. (2D)
- 13.4.4.1: We recommend that treatment be continued at least until all lesions have scabbed. (1D)
- 13.4.5: We recommend that uncomplicated herpes zoster (shingles) be treated (1B) with oral acyclovir or valacyclovir (1B), at least until all lesions have scabbed. (1D)
- 13.4.6: We recommend that disseminated or invasive herpes zoster be treated (1B) with intravenous acyclovir and a temporary reduction in the amount of immunosuppressive medication (1C), at least until all lesions have scabbed. (1D)
- 13.4.7: We recommend that prevention of primary varicella zoster be instituted in varicella-susceptible patients after exposure to individuals with active varicella zoster infection (1D):
- • varicella zoster immunoglobulin (or intravenous immunoglobulin) within 96 hours of exposure (1D);
- • if immunoglobulin is not available or more than 96 h have passed, a 7-day course of oral acyclovir begun 7–10 days after varicella exposure. (2D)
HSV, herpes simplex virus; KTRs, kidney transplant recipients; VZV, varicella zoster virus.
Superficial herpes simplex virus (HSV) infection is defined as disease limited to the skin or mucosal surfaces without evidence of dissemination to visceral organs.
Systemic HSV infection is defined by disease involving visceral organs.
Primary varicella zoster virus (VZV) infection is infection in a patient who is immunologically naive to VZV. In general, primary VZV presents as ‘chickenpox,’ which most frequently manifests as multiple crops of cutaneous lesions that evolve through macular, papular, vesicular and pustular stages. The lesions tend to erupt over the entire body and will be in different stages simultaneously. Disseminated VZV can develop in immunocompromised individuals, with involvement of the lungs, liver, central nervous system and other visceral organs.
Uncomplicated herpes zoster (shingles) is defined as the presence of cutaneous zoster limited to no more than three dermatomes.
Disseminated or invasive herpes zoster is defined as the presence of cutaneous zoster in more than three dermatomes, and/or evidence of organ system involvement.
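The two zoster definitions above reduce to a simple classification rule; the sketch below is illustrative only (names are hypothetical) and does not capture clinical nuance:

```python
def classify_zoster(dermatomes_involved, organ_system_involvement):
    """Sketch of the definitions in the text: cutaneous zoster limited to no
    more than three dermatomes is 'uncomplicated'; more than three dermatomes
    and/or evidence of organ-system involvement is 'disseminated or invasive'."""
    if organ_system_involvement or dermatomes_involved > 3:
        return "disseminated or invasive"
    return "uncomplicated"
```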
The definition of a clinically significant exposure to an individual with active VZV infection varies by whether the infected individual presents with varicella (chickenpox) or zoster (shingles). Varicella may be spread to a susceptible individual by either airborne exposure or direct contact with a lesion. In contrast, an infectious exposure to someone with zoster requires direct contact with a lesion. Accordingly, a significant exposure to varicella is defined by face-to-face contact with someone with chickenpox, while a significant exposure to someone with zoster requires direct contact with a lesion. The minimum duration of airborne exposure necessary to allow transmission is not known. In general, most experts consider the minimum to be somewhere in the range of 5–60 min.
- • Superficial HSV infections are typically self-limited in immunocompetent patients, but immunosuppressive medication in KTRs increases the risk for invasive and disseminated HSV infection; treatment of superficial HSV infections with oral acyclovir or valacyclovir is safe and effective.
- • Systemic HSV infections represent a potentially life-threatening complication to immunosuppressed KTRs. Intensive treatment of systemic HSV infection with intravenous acyclovir and a reduction in the amount of immunosuppressive medication is warranted to prevent progression and further dissemination of HSV.
- • Primary VZV infection is potentially life-threatening to KTRs. Treatment with intravenous acyclovir is safe and effective.
- • Herpes zoster infection is potentially life-threatening to KTRs. Treatment with oral acyclovir or valacyclovir is safe and effective.
- • Disseminated or invasive herpes zoster is life-threatening to KTRs. Treatment with intravenous acyclovir and a temporary reduction in the amount of immunosuppressive medication is safe and effective.
- • The use of varicella zoster immunoglobulin or commercial intravenous immunoglobulin products within 96 h of exposure to VZV prevents or modifies varicella in susceptible individuals.
- • Oral acyclovir begun within 7–10 days after varicella exposure and continued for 7 days appears to be a reasonable alternative to immunoglobulin to prevent or modify primary varicella in susceptible individuals (329,330).
Superficial HSV infection
Serologic evidence of HSV1 and HSV2 is common in the general population. Although periodic reactivation of HSV1 and HSV2 infection occurs, these episodes tend to be self-limited in immunocompetent individuals. However, episodes of invasive or disseminated HSV may occur in KTRs receiving immunosuppressive medications, and indeed the incidence of invasive HSV is higher in KTRs than in the general population (331,332).
The highest incidence of HSV reactivation occurs early after transplantation, with the greatest risk occurring during the first month following transplantation (333). While presentation later after transplant is associated with a lower risk of dissemination, treatment of superficial infection with oral acyclovir, valacyclovir or famciclovir is still recommended, given the safety and efficacy of these medications (333). To prevent dissemination, it seems prudent to continue treatment until there are no new, active lesions.
Systemic HSV infection
In contrast to superficial HSV infection, systemic HSV infection involving the lungs, liver, central nervous system or other visceral organs represents a potentially life-threatening complication. Because systemic HSV is life-threatening, hospitalization and treatment with intravenous acyclovir is warranted (333). If possible, immunosuppressive medications should be reduced or withdrawn until the infection has resolved.
Intravenous acyclovir should be continued until there is demonstrable evidence of clinical improvement, as measured by resolution of fever, hypoxia and signs or symptoms of hepatitis. Once the patient has reached this level of improvement, therapy may be completed using oral acyclovir or valacyclovir.
Primary varicella zoster infection
Varicella zoster infection can be life-threatening in KTRs (334,335). Although some centers have begun to institute the use of oral acyclovir in the outpatient setting for KTRs, there is little evidence to confirm the safety and efficacy of this approach. Careful selection of patients with assurance of close clinical follow-up is necessary if oral acyclovir is to be used in these patients.
Uncomplicated herpes zoster
Although herpes zoster can be seen in immunocompetent patients, the presence of immunosuppression is associated with an increased risk for the development of both uncomplicated and complicated herpes zoster infection. Patients with only skin disease, but who have lesions involving more than three dermatomes, are considered to have disseminated cutaneous zoster. Similarly, patients with visceral involvement in addition to skin disease are considered to have disseminated zoster.
Uncomplicated zoster is a clinical syndrome characterized by cutaneous clustering of vesicular lesions in a dermatomal distribution of one or more adjacent sensory nerves. An important complication of herpes zoster in immunocompetent adults is the potential development of postherpetic neuralgia. RCTs in healthy adults have demonstrated that the use of acyclovir, valacyclovir or famciclovir is associated with more rapid healing of the skin, as well as a decreased incidence of both acute neuritis and postherpetic neuralgia (336,337). In immunocompromised hosts, patients are at risk not only of postherpetic neuralgia but also of severe local dermatomal infection (334). Similarly, immunosuppressed patients are at increased risk for the development of disseminated cutaneous zoster and visceral dissemination. The more severe the level of immunosuppression, the greater the risk of dissemination. Accordingly, prompt initiation of antiviral therapy with close follow-up is warranted for these patients, even if they have only superficial skin infection (333).
Disseminated or invasive herpes zoster
Treatment with intravenous acyclovir and temporary reduction in the amount of immunosuppressive medication is efficacious (333,338). Although specific evidence is not available to guide which immunosuppressive agent should be reduced, it would seem logical, whenever possible, to reduce the dosage of CNIs as well as steroids. In the absence of any evidence of intercurrent rejection, an effort should be made to maintain the reduced level of immunosuppression for a minimum of 3–5 days and until there is evidence of clinical improvement.
Prevention of primary varicella zoster infection
The use of varicella zoster immunoglobulin has been demonstrated to prevent or modify varicella in immunosuppressed individuals exposed to varicella (330,333,339). If varicella zoster immunoglobulin is not available, or if >96 h have passed since the exposure, some experts recommend prophylaxis with a 7-day course of oral acyclovir (80 mg/kg/day administered in four divided doses with a maximum of 800 mg per dose) beginning on day 7–10 after varicella exposure (330,339). The use of varicella vaccine is not recommended as a postexposure prophylactic strategy in KTRs.
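The weight-based regimen above involves a per-dose cap that is easy to apply incorrectly. The sketch below (a hypothetical helper for illustration only, not part of the guideline and not a clinical tool) applies the stated rule of 80 mg/kg/day in four divided doses with an 800 mg per-dose maximum:

```python
def acyclovir_postexposure_dose(weight_kg: float) -> dict:
    """Illustrate the 7-day oral acyclovir postexposure regimen
    described above: 80 mg/kg/day in four divided doses, capped at
    800 mg per dose. Hypothetical helper, not a clinical tool."""
    per_dose = min(80 * weight_kg / 4, 800)  # mg per dose, with cap
    return {
        "mg_per_dose": per_dose,
        "doses_per_day": 4,
        "mg_per_day": per_dose * 4,
        "duration_days": 7,
    }

# A 30 kg child: 80 * 30 / 4 = 600 mg per dose (below the cap)
print(acyclovir_postexposure_dose(30))
# A 70 kg adult: 80 * 70 / 4 = 1400 mg, so the 800 mg cap applies
print(acyclovir_postexposure_dose(70))
```

Note that for any patient above 40 kg the cap dominates, so the regimen converges to the adult fixed dose of 800 mg four times daily.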
- 13.5: HEPATITIS C VIRUS
- 13.5.1: We suggest that HCV-infected KTRs be treated only when the benefits of treatment clearly outweigh the risk of allograft rejection due to interferon-based therapy (e.g. fibrosing cholestatic hepatitis, life-threatening vasculitis). (2D)[Based on KDIGO Hepatitis C Recommendation 2.1.5.]
- 13.5.2: We suggest monotherapy with standard interferon for HCV-infected KTRs in whom the benefits of antiviral treatment clearly outweigh the risks. (2D)[Based on KDIGO Hepatitis C Recommendations 2.2.4 and 4.4.2.]
- 13.5.3: We suggest that all conventional current induction and maintenance immunosuppressive regimens can be used in HCV-infected patients. (2D)[Based on KDIGO Hepatitis C Recommendation 4.3.]
- 13.5.4: Measure ALT in HCV-infected patients monthly for the first 6 months and every 3–6 months, thereafter. Perform imaging annually to look for cirrhosis and hepatocellular carcinoma. (Not Graded)[Based on KDIGO Hepatitis C Recommendation 4.4.1.] (See Recommendation 19.3.)
- 13.5.5: Test HCV-infected patients at least every 3–6 months for proteinuria. (Not Graded)[Based on KDIGO Hepatitis C Recommendation 4.4.4.]
- 13.5.5.1: For patients who develop new-onset proteinuria (either urine protein/creatinine ratio >1 or 24-hour urine protein >1 g on two or more occasions), perform an allograft biopsy with immunofluorescence and electron microscopy. (Not Graded)[Based on KDIGO Hepatitis C Recommendation 4.4.4.]
- 13.5.6: We suggest that patients with HCV-associated glomerulopathy not receive interferon. (2D)[Based on KDIGO Hepatitis C Recommendation 4.4.5.]
ALT, alanine aminotransferase; HCV, hepatitis C virus; KDIGO, Kidney Disease: Improving Global Outcomes; KTRs, kidney transplant recipients.
The Work Group reviewed the KDIGO Hepatitis C Guidelines (340) that were applicable to KTRs, and ultimately agreed with the pertinent guideline statements. Only minor modifications (to guideline statement 4.4.1 in the KDIGO Hepatitis C Guidelines) were made, resulting in recommendation statement 13.5.4. The Transplant Work Group did not conduct a systematic review, but relied on the evidence reviewed by the Hepatitis C Work Group. A brief synopsis of the rationale for the KDIGO Hepatitis C Guidelines that are pertinent to KTRs is presented, with further discussion of the modification made in recommendation 13.5.4. Details may be found in the Hepatitis C guidelines.
Kidney transplant recipients infected with hepatitis C virus (HCV) have worse patient- and allograft-survival rates than KTRs without HCV infection. In addition, HCV-infected KTRs are at increased risk for several complications, including worsening liver disease, NODAT and glomerulonephritis. Thus, close follow-up of the HCV-infected KTR is prudent.
There are few data to suggest when and how to screen HCV-infected KTRs for posttransplant complications. However, given the higher level of immunosuppression early after transplantation, the Transplant Guideline Work Group determined that liver enzymes should be checked every month for the first 6 months of the posttransplant period, and every 3 months thereafter. The detection of clinically worsening liver enzymes should prompt referral for hepatologic evaluation. Annual liver ultrasound and alpha-fetoprotein level to screen for hepatocellular carcinoma should be considered in patients with cirrhosis on liver biopsy.
Available evidence indicates that all currently available induction and maintenance immunosuppressive agents can be used in KTRs infected with HCV. Although immunosuppression may cause or contribute to complications of HCV in KTRs, there is scant evidence that any one type of immunosuppressive agent is more or less likely to be harmful. The exception is tacrolimus, which increases the risk for NODAT and might therefore be expected to impart at least an additive risk in HCV-infected KTRs.
Interferon is effective for viral eradication in HCV-infected patients, especially when combined with ribavirin. However, the administration of interferon after kidney transplantation can be deleterious to the allograft and should generally be avoided in KTRs, unless there is indication of worsening hepatic injury.
Hepatitis C virus infection has also been implicated in the pathogenesis of glomerular disease in both native and transplanted kidneys. Therefore, the Hepatitis C and Transplant Guideline Work Groups concluded that HCV-infected KTRs should be tested for proteinuria every 3–6 months. As recommended for all KTRs, patients who develop new-onset proteinuria (either urine protein/creatinine ratio >1 or 24-hour urine protein greater than 1 g on two or more occasions) should have an allograft biopsy with immunofluorescence and electron microscopy.
Interferon-based therapies may be effective in treating HCV-related glomerulopathy in native kidney disease. However, interferon use in KTRs is associated with an increased risk of rejection. The risk of kidney allograft loss from progressive HCV-associated glomerulopathy compared to that from interferon-induced rejection is unknown. Ribavirin can reduce proteinuria in HCV-associated glomerulopathy, although its impact on kidney function is unknown and it does not lead to viral clearance.
- 13.6: HEPATITIS B VIRUS
- 13.6.1: We suggest that any currently available induction and maintenance immunosuppressive medication can be used in HBV-infected KTRs. (2D)
- 13.6.2: We suggest that interferon treatment should generally be avoided in HBV-infected KTRs. (2C)
- 13.6.3: We suggest that all HBsAg-positive KTRs receive prophylaxis with tenofovir, entecavir, or lamivudine. (2B)
- 13.6.3.1: Tenofovir or entecavir is preferable to lamivudine, to minimize development of potential drug resistance, unless medication cost requires that lamivudine be used. (Not Graded)
- 13.6.3.2: During therapy with antivirals, measure HBV DNA and ALT levels every 3 months to monitor efficacy and to detect drug resistance. (Not Graded)
- 13.6.4: We suggest treatment with adefovir or tenofovir for KTRs with lamivudine resistance (>5 log10 copies/mL rebound of HBV-DNA). (2D)
- 13.6.5: Screen HBsAg-positive patients with cirrhosis for hepatocellular carcinoma every 12 months with liver ultrasound and alpha-fetoprotein. (Not Graded) (See Recommendation 19.3.)
- 13.6.6: We suggest that patients who are negative for HBsAg and have HBsAb titer <10 mIU/mL receive booster vaccination to raise the titer to ≥100 mIU/mL. (2D)
ALT, alanine aminotransferase; HBsAb, antibody to hepatitis B surface antigen; HBsAg, hepatitis B surface antigen; HBV, hepatitis B virus; KTRs, kidney transplant recipients.
Patients with CKD stage 5 are at increased risk of acquiring HBV infection. Infection can be acquired through infected blood products, or through transmission from another infected patient in a dialysis unit. The risk has declined considerably in Western countries following the introduction of universal immunization and strict isolation practices, but remains substantial in developing countries. Screening for HBV infection is done by serologic testing for hepatitis B surface antigen (HBsAg); NAT for HBV DNA provides a more accurate measure of viral load. Viral replication accelerates after the introduction of immunosuppression in KTRs. A number of studies have shown that HBV infection increases the risk of mortality, most often due to liver disease, and of graft failure. Effective antiviral therapy inhibits viral replication, retards the development of progressive liver disease, and may lower the risk of liver cancer.
- • HBV-infected patients exhibit increased viral replication and are at risk for progressive liver disease after kidney transplantation.
- • HBsAg positivity is an independent risk factor for mortality and graft failure.
- • HBsAg-negative patients are at low risk of increased viral replication and progressive liver disease.
- • Prospective studies have shown that antiviral agents normalize alanine aminotransferase (ALT), and induce clearance of HBV-DNA and hepatitis B E antigen (HBeAg). Antiviral agents are best used as prophylaxis, since KTRs not initiated on antiviral agents at the time of transplantation often develop enhanced viral replication and hepatic dysfunction.
- • ALT activity is lower in KTRs than in the general population, and is unreliable as a marker of liver disease activity by itself. Serial monitoring of HBV DNA is required to assess treatment efficacy. A rise in DNA copy number suggests development of resistance.
- • The newer nucleoside analogues, adefovir and tenofovir are effective for treatment of lamivudine-resistant HBV infection.
HBV-infected patients are at risk of exacerbation of the infection, progressive liver disease and development of hepatocellular carcinoma after kidney transplantation. The rate of HBV infection in CKD stage 5 patients, as determined by seropositivity for HBsAg, varies between 0% and 8% in developed countries (341). The US Centers for Disease Control and Prevention (CDC) estimates that the prevalence of HBsAg-positive patients in the US dialysis population has declined from 7.8% to 0.9%, with an estimated incidence of disease in 2000 of 0.05% (342). This decline has largely been due to widespread use of universal precautions, screening of the blood supply, the use of erythropoiesis-stimulating agents (ESAs), HBV vaccination and strict segregation of HBsAg-positive from HBsAg-negative patients during hemodialysis, with dedicated machines and staff for each group. The prevalence, however, is much higher (10–20%) in developing countries.
Hepatitis B virus infection in CKD stage 5 patients is usually asymptomatic even in the acute phase, with about 80% of patients progressing to a chronic carrier state (343). Immunosuppression following kidney transplantation leads to increased replication of HBV and results in progressive liver disease. Assessing the natural history of hepatitis B among KTRs is difficult for several reasons (344). Aminotransferase activity is lower in this population, which hampers recognition of HBV-related liver disease (345).
In a meta-analysis (346) of six observational studies (6050 patients), HBsAg positivity was found to be an independent and significant risk factor for mortality (RR 2.49, 95% CI 1.64–3.78) and graft failure (RR 1.44, 95% CI 1.02–2.04). This finding was confirmed in later observational studies. In a study of 286 kidney transplant patients, liver-related death was the most common cause of death in HBV-positive patients (347). A survey from the South Eastern Organ Procurement Foundation demonstrated a detrimental effect of HBV infection on patient survival (p = 0.02) and graft survival (p = 0.05) in 13 287 patients who underwent kidney transplantation between 1977 and 1987 in the United States (348). Patient survival was 62% and 66% at 10 years for HBsAg-positive and -negative KTRs (p = 0.02). The 10-year survival rate of HBsAg-positive KTRs (45%) compares poorly with HCV-infected patients (65%). In patients with biopsy diagnosis of cirrhosis, 10-year survival was 26% (349).
Many studies provided only limited virologic details and did not incorporate liver histology before kidney transplantation, leading to underestimation of the severity of liver disease at the time of transplantation. The only study that carried out serial biopsies found histological deterioration in 85% of HBsAg-positive patients at a mean interval of 66 months. Approximately 28% showed cirrhosis, whereas no patients had been cirrhotic on baseline biopsy (350). Among those with cirrhosis, hepatocellular carcinoma was found in 23%, suggesting an annual incidence of between 2.5% and 5%. Based on these data, an expert group recommended hepatic imaging every 3 months to detect hepatocellular carcinoma in patients with cirrhosis (351).
The standard practice of screening for HBV infection is testing for HBsAg. The place of routine NAT in these patients is unclear. Some recent studies (352–360), but not all (361–363), have shown that a proportion of dialysis patients exhibit occult HBV infection, detected by NAT despite a negative HBsAg. These patients generally have low viral loads and may harbor mutations that prevent the appearance of HBsAg. A large proportion of those with occult infection have antibody to hepatitis B core antigen (HBcAb), and it has been suggested that testing these patients by NAT may be a cost-effective strategy for confirming occult infection. The risk of reactivation of HBV among patients who are HBsAg-negative and HBcAb-positive is low, however (364). Berger et al. (365) found recurrence in 2 of 229 (0.9%) such patients. Savas et al. (366) reported two cases of reactivation and reviewed 25 previously reported cases. They noted a wide age range of patients experiencing recurrence (22–75 years), a male preponderance, and a posttransplant time of onset between 8 weeks and 15 years. All but one patient had HBsAb titers of less than 100 mIU/mL, leading the authors to suggest that vaccination of such patients may be an effective preventive measure. An expert group recommended routine vaccination in such patients to boost titers above 100 mIU/mL, and lamivudine prophylaxis (see section ‘Pharmacotherapy,’ below) during periods of intensified immunosuppression (351).
The primary goals of management are maximal suppression of viral replication, minimization of drug resistance, and prevention of hepatic fibrosis. In view of the low likelihood of seroconversion to HBsAb, low rates of conversion from HBeAg to anti-HBeAg antibody positivity, and the poor reliability of ALT as a measure of disease activity, HBV DNA levels need to be followed to assess response to therapy. Serological markers of fibrosis, such as the commercially available Fibrotest panel, have not been evaluated in KTRs with HBV infection. Since viral replication depends on the overall extent of immunosuppression rather than on any individual drug, efforts should be made to minimize the doses of all immunosuppressive drugs without compromising graft outcomes, including use of the lowest possible dose of steroids. Currently, there is no evidence for a differential effect of any specific immunosuppressive agent on HBV replication.
There are currently seven medications available for the treatment of hepatitis B: interferon alfa-2b, pegylated interferon alfa-2a, lamivudine, adefovir, tenofovir, telbivudine and entecavir. Interferon therapy for HBV infection in KTRs is associated with high rates of graft loss due to rejection. In a series (367) of 31 HBsAg-positive KTRs treated with recombinant interferon-alpha (three million international units) three times a week for 6 months, long-term ALT normalization was noted in 47% of patients and 13% cleared HBeAg. However, graft loss occurred in five of 17 patients during therapy, and in an additional four patients after completion of therapy. The use of interferon in this setting, therefore, is not recommended (351).
Lamivudine, a cytosine analog that inhibits HBV reverse transcriptase, has been used extensively in KTRs with HBV infection (Table 16). The utility of lamivudine in stabilization of liver function was shown in several observational studies. A meta-analysis (368) that included 14 prospective cohort studies (184 patients) determined the mean overall estimate for ALT normalization, and HBV-DNA and HBeAg clearance at 81% (95% CI 70–92%), 91% (95% CI 86–96%) and 27% (95% CI 16–39%), respectively. The duration of lamivudine therapy was 6–12 months in the majority (11 of 14) of the studies. Later clinical trials (369–375) have shown similar results with lamivudine monotherapy given for 24–69 months. HBeAg and HBV-DNA clearance occurred in 0–25% and 43–78%, respectively. Changes in ALT paralleled those in viremia, and 33–77% of patients maintained normal ALT levels.
| Author (year) (ref no) | ALT normalization (%) | HBsAg clearance (%) | HBeAg clearance (%) | HBeAg seroconversion (%) | HBV DNA clearance (%) |
| --- | --- | --- | --- | --- | --- |
| Rostaing (1997) (376) | 4/5 (80) | 0 | 0 | NA | 6/6 (100) |
| Goffin (1998) (377) | 4/4 (100) | 0 | 0 | 0/1 (0) | 4/4 (100) |
| Jung (1998) (378) | 6/6 (100) | 0 | 1/3 (33) | NA | 6/6 (100) |
| Kletzmayr (2000) (379) | 3/3 (100) | 0 | 2/12 (17) | 2/12 (17) | 15/16 (93) |
| Tsai (2000) (380) | NA | 0 | 0 | NA | 7/8 (87.5) |
| Lewandowska (2000) (381) | 17/28 (61) | 0 | 2/26 (8) | NA | 10/10 (100) |
| Antoine (2000) (382) | NA | 0 | 8/12 (67) | NA | 9/12 (75) |
| Mouquet (2000) (383) | 8/15 (53) | 0 | NA | NA | 13/15 (87) |
| Fontaine (2000) (384) | NA | 0 | 6/13 (46) | 6/13 (46) | 26/26 (100) |
| Lee (2001) (385) | NA | 1/13 (8) | 3/8 (37.5) | 3/8 (37.5) | 10/13 (77) |
| Han (2001) (386) | 6/6 (100) | 0 | 2/3 (67) | NA | 6/6 (100) |
| Chan (2002) (369) | 14/14 (100) | 0 | 3/14 (21) | NA | 26/26 (100) |
| Park (2001) (387) | 8/10 (80) | 0 | 1/5 (20) | NA | 7/10 (70) |
| Mosconi (2001) (388) | NA | 0 | NA | NA | 4/4 (100) |
Timing of initiation
Data on the optimal timing of initiation of antiviral therapy are scarce. However, the available data support starting treatment at the time of transplantation in HBsAg-positive patients, irrespective of HBV DNA levels. In a study of 15 patients with normal preoperative ALT (389), seven were started on lamivudine at the time of kidney transplantation. Half of those not treated showed transaminase elevations and HBV viremia in the first year of follow-up, requiring initiation of lamivudine therapy. In contrast, all seven individuals who received lamivudine at the time of transplantation continued to have normal ALT and were negative for HBV DNA throughout follow-up. In another study of HBsAg-positive KTRs (386), in which lamivudine was given prophylactically (HBV DNA negative) or preemptively (HBV DNA positive) to 10 patients, or reserved for hepatic dysfunction in another 10, 42% in the latter (reactive) group developed viremia during follow-up, compared to 10% in the former; six patients in the reactive group developed hepatic dysfunction, compared with none in the prophylactic/preemptive group. In a further study (369), in which the decision to start lamivudine was based on HBV DNA levels or liver function status, all patients had to be started on lamivudine at a mean of 8 months after transplantation, more than half because of abnormal ALT.
Duration of therapy
The optimal duration of therapy that ensures long-term remission of viremia and maintenance of normal liver function and minimizes the development of resistance is not known. In a meta-analysis, increased duration of lamivudine therapy was positively associated with frequency of HBeAg loss (r = 0.51, p = 0.04) (Figure 1) (368). Lamivudine discontinuation was attempted by Chan et al. (369) in 12 low-risk patients after stabilization, and was successful in only five (42%).
At least 24 months of prophylactic treatment has been recommended (390). The optimal treatment and the choice of drugs require further study. Withdrawal of antiviral therapy may be associated with a relapse and increased viral replication, even resulting in liver failure.
Development of resistance is a major clinical problem with long-term lamivudine use. This is usually reflected by a secondary increase in the HBV DNA titers. A commonly used definition is demonstration of >5 log10 copies/mL rebound of HBV DNA. In most, but not all, instances, it is caused by a mutation in the tyrosine–methionine–aspartate–aspartate (YMDD) locus of the HBV DNA polymerase (384). The clinical presentation varies. While some patients show no significant biochemical changes or clinical symptoms, others develop deterioration in liver function (391).
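Because the rebound threshold is stated on a log10 scale, it is easy to misread: >5 log10 copies/mL corresponds to >10^5, i.e. >100,000 copies/mL. A minimal sketch of the conversion (the function name is an assumption, for illustration only):

```python
import math

def rebound_exceeds_threshold(hbv_dna_copies_per_ml: float,
                              threshold_log10: float = 5.0) -> bool:
    """Check whether an HBV DNA rebound exceeds the commonly used
    lamivudine-resistance threshold of >5 log10 copies/mL
    (i.e. >10**5 = 100,000 copies/mL). Illustrative sketch only."""
    return math.log10(hbv_dna_copies_per_ml) > threshold_log10

# 3.2 x 10^5 copies/mL is log10 ~= 5.5, above the threshold
print(rebound_exceeds_threshold(3.2e5))  # True
# 5 x 10^4 copies/mL is log10 ~= 4.7, below the threshold
print(rebound_exceeds_threshold(5e4))  # False
```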
In a study of 29 KTRs (392), resistance was noted in 48% of patients during a mean follow-up period of 69 months; all due to YMDD mutations. Resistance was not related to patient demographics, HBeAg status, seroconversion rates or genotype. About 80% with the YMDD mutation had a hepatitis flare. In the meta-analysis (368), the mean overall estimate for lamivudine resistance was 18% (95% CI 10–37%). An increased duration of lamivudine therapy was positively associated with lamivudine resistance (r = 0.62, p = 0.02). The cumulative probability of developing resistance was approximately 60% in the later studies.
Patients with lamivudine resistance should be treated with adefovir or tenofovir. Limited data are available regarding use of these agents in KTRs. Fontaine et al. (393) gave adefovir to 11 KTRs with lamivudine-resistant HBV infection and found it effective in reducing serum HBV DNA, without any significant adverse effects. Entecavir, a guanosine analog, is 30 times more potent than lamivudine in suppressing viral replication. In a multicenter, double-blind RCT comparing entecavir to lamivudine in the general population, entecavir resulted in larger reductions in HBV DNA than lamivudine; at a dose of 0.5 mg daily, 83% of patients treated with entecavir had undetectable HBV DNA compared to 58% of those treated with lamivudine (394). In a study (395) that treated eight adefovir- and lamivudine-resistant KTRs with entecavir for 16.5 months, there was a significant decrease in HBV DNA viral load without any significant adverse effects. Data in the non-CKD population show that, while the risk of resistance to entecavir is low in treatment-naïve patients, it may be as high as 51% at 5 years in lamivudine-resistant cases (396). In a recent study, tenofovir was superior to adefovir in achieving remission of HBV viremia and improving hepatic histologic scores in non-CKD patients. Tenofovir was effective in lamivudine-resistant cases, and did not produce resistance up to 48 months of treatment (397). Of the two agents, tenofovir has much lower renal toxicity than adefovir, and hence would be the preferred agent in KTRs. It is not known whether substitution of lamivudine with entecavir or tenofovir for prophylaxis will prevent development of resistance.
- • The frequency of occult HBV infection in patients with CKD stage 5 should be evaluated in different parts of the world, and its impact on posttransplant outcomes determined.
- • Studies are required to determine whether substitution of lamivudine with entecavir or tenofovir for prophylaxis will prevent development of resistance in KTRs.
- 13.7: HUMAN IMMUNODEFICIENCY VIRUS
- 13.7.1: If not already done, screen for HIV infection. (Not Graded)
- 13.7.2: To determine antiretroviral therapy, refer HIV-infected KTRs to an HIV specialist, who should pay special attention to drug–drug interactions and appropriate dosing of medications. (Not Graded)
HIV, human immunodeficiency virus; KTRs, kidney transplant recipients.
Screening for human immunodeficiency virus (HIV) infection is defined as the performance of serologic testing for HIV. A two-step screening is usually performed. In the first step, patients are screened for antibodies against HIV, usually with an enzyme-linked immunosorbent assay (ELISA). This test is extremely sensitive but not specific; accordingly, patients who are positive on ELISA are then tested with a Western blot assay. A positive Western blot assay for HIV confirms the diagnosis of HIV infection, except in children <18 months of age, in whom a positive serologic test may be attributable to passive antibody acquired from the mother during pregnancy. NAT for the presence of HIV DNA or HIV RNA viral load should be performed in children <18 months of age with a positive HIV antibody. Antiretroviral medications are used specifically for the treatment of HIV infection. Drug–drug interactions are pharmacokinetic interactions between separate medications that may result in accumulation or more rapid metabolism of one or both compounds.
- • Patients with HIV require specialized care in centers with appropriate expertise.
- • Screening for HIV infection should be carried out on all KTRs (ideally before transplantation) in order to identify those KTRs that will require specialized care.
- • Antiretroviral therapy is necessary to maintain virologic suppression and normal immunologic function in HIV patients undergoing kidney transplantation.
- • The concomitant use of antiretroviral agents and immunosuppressive medications creates the potential for drug–drug interactions that may substantially alter blood levels of drugs and require appropriate monitoring and adjustments in dosing.
Case series have documented successful outcomes of KTRs with HIV (398–400). However, these HIV patients had been carefully selected and adequately treated for HIV at the time of transplantation (400). Although HIV is not an absolute contraindication to kidney transplantation, the presence of HIV has major implications for management after transplantation. A major concern is the potential for drug–drug interactions between antiretroviral agents and other medications, including immunosuppressants. Care must be taken to identify and select those HIV-infected patients who are most likely to benefit from kidney transplantation without an unacceptably high risk of opportunistic infections.
Evidence from a National Institutes of Health (NIH)-sponsored study of organ transplantation in HIV patients has demonstrated both the effectiveness of transplantation and the complexity of managing KTRs with HIV (400). Data accrued from this study have identified specific drug combinations that are associated with drug–drug interactions in these patients (401). Accordingly, careful monitoring and dose adjustment are needed to account for the potential impact of these interactions. Although the data from the NIH study demonstrate the feasibility of transplantation for HIV-infected KTRs, the limited number of HIV patients with CKD stage 5 undergoing kidney transplantation to date suggests the need to continue performing this procedure under research protocols and in selected centers with appropriate expertise. Finally, it is worth noting that experience to date suggests there may be an increased risk of acute cellular rejection in patients with HIV undergoing organ transplantation.
- • There is a need to determine the optimal immunosuppression medication regimen, as well as the best antiretroviral regimens, for HIV-infected KTRs.
Chapter 14: Other Infections
- 14.1: URINARY TRACT INFECTION
- 14.1.1: We suggest that all KTRs receive UTI prophylaxis with daily trimethoprim–sulfamethoxazole for at least 6 months after transplantation. (2B)
- 14.1.2: For allograft pyelonephritis, we suggest initial hospitalization and treatment with intravenous antibiotics. (2C)
KTRs, kidney transplant recipients; UTI, urinary tract infection.
A urinary tract infection (UTI) is an infection causing signs and symptoms of cystitis or pyelonephritis (including the presence of signs of systemic inflammation), which is documented to be caused by an infectious agent. Kidney allograft pyelonephritis is an infection of the kidney allograft that is usually accompanied by characteristic signs and symptoms of systemic inflammation and a positive urine and/or blood culture. Occasionally, pyelonephritis is diagnosed by allograft biopsy. Antibiotic prophylaxis is the use of an antimicrobial agent (or agents) to prevent the development of a UTI.
- • UTI is a frequent and potentially important complication of kidney transplantation.
- • The use of antibiotic prophylaxis can reduce the risk of UTI.
- • Kidney allograft pyelonephritis may be associated with bacteremia, metastatic spread, impaired graft function and even death.
- • KTRs with clinical and laboratory evidence suggestive of kidney allograft pyelonephritis should be hospitalized and treated with intravenous antibiotics.
Observational studies have documented a high incidence of UTI in KTRs (402). Pyelonephritis of the kidney allograft is a common complication in KTRs (402). It may cause graft failure, sepsis and death. Antibiotic prophylaxis with trimethoprim–sulfamethoxazole has been demonstrated to decrease the frequency of bacterial infections, including UTI, in KTRs (403). Use of trimethoprim–sulfamethoxazole for the first 9 months following kidney transplantation was associated with statistically significant decreases in the number of bacterial infections overall, the total number of UTIs, and the number of noncatheter-associated UTIs. There is moderate-quality evidence that the benefit of UTI prophylaxis (primarily preventing infection, with unclear evidence for reducing mortality or preventing graft loss) outweighs the risks (see Evidence Profile and accompanying evidence in Supporting Tables 50–51 at http://www3.interscience.wiley.com/journal/118499698/toc). Based upon this, and several other small studies, prophylactic trimethoprim–sulfamethoxazole for 6–12 months following kidney transplantation is warranted.
Although ciprofloxacin also appeared effective for prevention of UTI in KTRs, patients treated with this regimen were at risk for, and developed, Pneumocystis jirovecii pneumonia (PCP) (see Recommendation 14.2) (404). Accordingly, trimethoprim–sulfamethoxazole is preferred over ciprofloxacin, at least during the first 6 months after transplantation.
Although some investigators have recommended indefinite use of trimethoprim–sulfamethoxazole, there are no data demonstrating clinical benefit beyond the first 9 months following kidney transplantation. Evidence suggests that late UTIs tend to be benign, without associated bacteremia, metastatic foci or effect on long-term graft function (405). For this reason, we recommend providing prophylaxis for a minimum of 6 months. For patients who are allergic to trimethoprim–sulfamethoxazole, the recommended alternative agent is nitrofurantoin. Nitrofurantoin, widely recommended as an alternative to trimethoprim–sulfamethoxazole, is chosen over ciprofloxacin (despite the latter's demonstrated effectiveness in KTRs) in an effort to limit the emergence of antibacterial resistance.
Kidney allograft pyelonephritis may be associated with bacteremia, metastatic spread, impaired graft function and even death. Accordingly, KTRs with clinical and laboratory evidence suggestive of kidney allograft pyelonephritis should be hospitalized and treated with intravenous antibiotics for at least the initial course of therapy. This is particularly true for early infections (in the first 4–6 months following kidney transplantation). Recognition of the morbidity and mortality associated with allograft pyelonephritis led to recommendations in the 1980s to treat early UTI following transplantation with as long as a 6-week course of antimicrobials. More recently, UTI after kidney transplantation has been associated with considerably lower morbidity and mortality (405). Accordingly, a less prolonged course may suffice, although patients experiencing relapsing infection should be considered for a longer therapeutic course.
Because of the potential for serious complications, KTRs with kidney allograft pyelonephritis should be hospitalized and treated with intravenous antibiotics, at least initially. Although evidence derived from RCTs on the optimal duration of therapy for kidney allograft pyelonephritis is not available, it is anticipated that, in the absence of a kidney abscess, 14 days should be adequate.
- 14.2: PNEUMOCYSTIS JIROVECII PNEUMONIA
- 14.2.1: We recommend that all KTRs receive PCP prophylaxis with daily trimethoprim–sulfamethoxazole for 3–6 months after transplantation. (1B)
- 14.2.2: We suggest that all KTRs receive PCP prophylaxis with daily trimethoprim–sulfamethoxazole for at least 6 weeks during and after treatment for acute rejection. (2C)
- 14.2.3: We recommend that KTRs with PCP diagnosed by bronchial alveolar lavage and/or lung biopsy be treated with high-dose intravenous trimethoprim–sulfamethoxazole, corticosteroids, and a reduction in immunosuppressive medication. (1C)
- 14.2.4: We recommend treatment with corticosteroids for KTRs with moderate to severe PCP (as defined by PaO2 <70 mm Hg in room air or an alveolar–arterial gradient of >35 mm Hg). (1C)
KTRs, kidney transplant recipients; PaO2, partial pressure of oxygen in arterial blood; PCP, Pneumocystis jirovecii pneumonia.
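The severity thresholds in Recommendation 14.2.4 can be sketched as a small calculation. This is an illustrative helper, not part of the guideline: the alveolar–arterial (A–a) gradient formula below is the standard alveolar gas equation at sea level (Patm = 760 mm Hg, PH2O = 47 mm Hg, respiratory quotient 0.8), which the guideline itself does not spell out.

```python
def a_a_gradient(pao2_mmhg: float, paco2_mmhg: float,
                 fio2: float = 0.21, patm_mmhg: float = 760.0) -> float:
    """Alveolar-arterial oxygen gradient (mm Hg), standard sea-level assumptions."""
    alveolar_po2 = fio2 * (patm_mmhg - 47.0) - paco2_mmhg / 0.8
    return alveolar_po2 - pao2_mmhg

def moderate_to_severe_pcp(pao2_mmhg: float, paco2_mmhg: float) -> bool:
    """True if either Recommendation 14.2.4 threshold is met on room air."""
    return pao2_mmhg < 70.0 or a_a_gradient(pao2_mmhg, paco2_mmhg) > 35.0

# Example: PaO2 60 mm Hg, PaCO2 35 mm Hg on room air
# A-a gradient = 0.21*713 - 35/0.8 - 60 ~= 46 mm Hg -> moderate to severe PCP
```

Both criteria are checked because either alone qualifies the patient for adjunctive corticosteroids under the recommendation.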
Pneumocystis jirovecii (formerly known as Pneumocystis carinii) is an opportunistic fungal pathogen known to cause life-threatening pneumonia in immunocompromised patients, including KTRs. P. jirovecii pneumonia (PCP) is defined as the presence of lower respiratory-tract infection due to P. jirovecii. A definitive diagnosis of PCP is made by demonstration of organisms in lung tissue or lower respiratory-tract secretions. Because no specific diagnostic pattern exists on any given imaging test, it is imperative that the diagnosis of PCP be confirmed by lung biopsy or bronchoalveolar lavage.
- • Infection with P. jirovecii is life-threatening in KTRs.
- • Prophylaxis with trimethoprim–sulfamethoxazole is safe and effective.
- • Although thrice-weekly dosing of trimethoprim–sulfamethoxazole is adequate prophylaxis for PCP, daily dosing also provides prophylaxis for UTI and may be easier for patient adherence.
- • Treatment of PCP with high-dose, intravenous trimethoprim–sulfamethoxazole and reduction of immunosuppressive medications are the treatments of choice for PCP.
- • Based upon data from HIV-infected adults, the use of corticosteroids has been uniformly recommended for all patients experiencing moderate to severe PCP.
Pneumocystis jirovecii is an important opportunistic pathogen known to cause life-threatening PCP in KTRs (406). The most typical time of onset of symptoms of PCP is 6–8 weeks following initiation of immunosuppressive therapy. Although PCP is a potentially life-threatening complication in KTRs, chemoprophylaxis has been shown to be extremely effective in preventing clinical disease attributable to this pathogen. Trimethoprim–sulfamethoxazole prophylaxis resulted in a RR of 0.08 (95% CI 0.02–0.36) for developing PCP compared with placebo, control or no intervention (403). Prophylaxis also decreased mortality.
There was no difference in efficacy of PCP prophylaxis when trimethoprim–sulfamethoxazole was given daily or three times per week (407). However, in KTRs, the use of daily trimethoprim–sulfamethoxazole may be associated with a decreased risk of bacterial infection (403). Although definitive evidence for the duration of PCP prophylaxis is not available, most experts agree that it should be continued for at least 6 months (and perhaps as long as 1 year) following transplantation (406). Because most KTRs will remain on immunosuppression for the rest of their lives, some experts recommend a more prolonged, and perhaps even indefinite, use of PCP prophylaxis. Indications for the use of alternative preventive agents include the development of allergic reactions and/or drug-induced neutropenia from trimethoprim–sulfamethoxazole. Potential alternative agents include dapsone, aerosolized pentamidine, atovaquone or the combination of clindamycin and pyrimethamine (Table 17).
|Agent||Adult dose||Pediatric dose|
|Trimethoprim/sulfamethoxazoleb||Single-strength pill (80 mg as trimethoprim) or double-strength pill (160 mg as trimethoprim) daily or three times per week||150 mg/m2/day as trimethoprim daily or three times per week|
|Aerosolized pentamidine||300 mg inhaled every 3–4 weeks via Respirgard II™ nebulizer||For children ≥5 years old, 300 mg inhaled monthly via Respirgard II™ nebulizer|
|Dapsonec||100 mg/day as a single dose or 50 mg twice a day||Can be administered on a daily or weekly schedule as 2.0 mg/kg/day (maximum total dosage of 100 mg/day) or 4.0 mg/kg/week (maximum total dosage of 200 mg/week) orally. Approximately two thirds of patients intolerant to trimethoprim–sulfamethoxazole can take dapsone successfully. Studies in adults show dapsone is as effective as atovaquone or aerosolized pentamidine but slightly less effective than trimethoprim–sulfamethoxazole|
|Atovaquone||1500 mg/day||Administered with a meal as an oral yellow suspension in single dosage of 30 mg/kg/day for patients 1–3 months and >24 months of age, and 45 mg/kg/day for infants aged 4–24 months|
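The pediatric trimethoprim–sulfamethoxazole dose in Table 17 is body-surface-area based (150 mg/m2/day as trimethoprim). A minimal sketch of that arithmetic follows; the Mosteller BSA formula is an assumption here, since the guideline does not specify a BSA method, and this is illustration, not a prescribing tool.

```python
import math

def bsa_mosteller(height_cm: float, weight_kg: float) -> float:
    """Body surface area in m2, estimated with the Mosteller formula (assumed)."""
    return math.sqrt(height_cm * weight_kg / 3600.0)

def pediatric_tmp_dose_mg_per_day(height_cm: float, weight_kg: float) -> float:
    """Daily trimethoprim component (mg) at the Table 17 rate of 150 mg/m2/day."""
    return 150.0 * bsa_mosteller(height_cm, weight_kg)

# Example: a 110 cm, 19 kg child -> BSA ~0.76 m2 -> ~114 mg/day as trimethoprim
```

The same per-BSA rate applies whether the schedule is daily or three times per week, per the table.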
Prior to the use of trimethoprim–sulfamethoxazole, mortality from PCP in KTRs was very high (409,410). Treatment of PCP includes both intravenous trimethoprim–sulfamethoxazole and corticosteroids for KTRs with significant hypoxemia (406). RCTs have demonstrated that the use of corticosteroids in the first 72 hours of PCP treatment in HIV-infected patients improved outcomes, including reduced morbidity and mortality and avoidance of intubation (406). The usual duration of treatment is 2–3 weeks. Intravenous pentamidine isethionate should be considered in patients with proven trimethoprim–sulfamethoxazole allergy. Other treatment strategies should be restricted to patients with mild PCP.
- 14.3: TUBERCULOSIS
- 14.3.1: We suggest that TB prophylaxis and treatment regimens be the same in KTRs as would be used in the local, general population who require therapy. (2D)
- 14.3.2: We recommend monitoring CNI and mTORi blood levels in patients receiving rifampin. (1C)
- 14.3.2.1: Consider substituting rifabutin for rifampin to minimize interactions with CNIs and mTORi. (Not Graded)
CNI, calcineurin inhibitor; KTRs, kidney transplant recipients; mTORi, mammalian target of rapamycin inhibitor(s); TB, tuberculosis.
- • KTRs are at increased risk of developing disease due to tuberculosis (TB).
- • KTRs with latent TB, identified by a positive purified protein derivative (PPD) skin test or a history of TB disease without adequate treatment, are at highest risk of developing clinical TB after transplantation and are therefore good candidates for chemoprophylaxis with isoniazid.
- • Treatment of TB in KTRs has been shown to respond to standard antimycobacterial therapy.
- • The use of rifampin is associated with numerous drug–drug interactions through its induction of the CYP3A4 pathway.
- • This interaction can affect drug levels for CNIs as well as mTORi.
- • Rifabutin achieves similar therapeutic efficacy while minimizing the potential for drug–drug interactions.
The incidence of TB among KTRs varies according to geographic locations, with rates of 0.5–1.0% reported in North America, 0.7–5% in Europe and 5–15% in India and Pakistan (411,412). This represents a marked (50- to 100-fold) increase in the frequency of TB compared to the general population. In addition, there is also a marked increase in severity of disease in KTRs with mortality rates 10-fold higher than in immunocompetent individuals with TB.
The most frequent source of TB infections in KTRs is reactivation of quiescent foci of Mycobacterium tuberculosis that persist after initial asymptomatic infection (413). Accordingly, screening and identification of individuals with evidence of prior latent infection with TB should allow treatment prior to development of clinical disease, resulting in improved outcome.
Data from a variety of immunosuppressed populations demonstrate that treatment of latent TB markedly reduces the risk of subsequent progression to clinically active TB (414). A limited number of RCTs have evaluated the benefit of prophylactic treatment with isoniazid for KTRs (415) or organ transplant patients, including KTRs (416,417). Results of these studies suggest a benefit to KTRs, although study size and design limit the strength of these observations. The use of prophylactic isoniazid in patients with a past or current positive PPD skin test, and/or a history of TB without adequate documented treatment, has been previously recommended by the European Best Practice Guidelines for Renal Transplantation (411) and the American Society of Transplantation Guidelines for the Prevention and Management of Infectious Complications of Solid Organ Transplantation (418).
According to these guidelines, vaccination with bacille Calmette–Guérin (BCG) can give a ‘false-positive’ PPD skin test, and some patients may therefore be treated unnecessarily. Most believe that the effect of BCG does not persist for more than 10 years (419). The use of BCG vaccine is especially common in regions where the prevalence of TB is high. In these regions, it is therefore difficult to distinguish PPD skin tests that are positive due to BCG from those that are positive due to prior infection with M. tuberculosis. Accordingly, it is recommended that the history of BCG vaccination be ignored and that a 9-month course of prophylactic isoniazid be used (411). It is also possible that dialysis and transplant patients frequently have false-negative PPD skin tests. Accordingly, some experts have recommended the use of isoniazid prophylaxis in selected KTRs with a negative PPD skin test. These include patients with a history of active TB that was not adequately treated, those with radiographic evidence of previous TB without a history of treatment and those who have received an organ from a donor with a history of a positive PPD skin test (418).
Interferon-gamma release assays such as T-SPOT.TB and QuantiFERON are an alternative to the tuberculin skin test for detecting latent TB infection. Their sensitivity and specificity, however, have not been systematically evaluated in KTRs. Data from CKD stage 5 patients suggest important limitations for detecting latent TB infection which preclude their routine use at present (420–423).
Extensive experience in the treatment of immunosuppressed patients (including transplant recipients) suggests that the response to treatment is the same as in immunocompetent patients. Unfortunately, rifampin is a strong inducer of the microsomal enzymes that metabolize CNIs and mTORi, and it may be difficult to maintain adequate levels of these immunosuppressive drugs to prevent rejection. The use of rifampin has required doses of CNIs to be increased two- to threefold (418). One potential alternative is to substitute rifabutin for rifampin. Rifabutin has activity against M. tuberculosis that is similar to rifampin, but rifabutin is not as strong an inducer of CYP3A4 as rifampin. However, there is little published experience with rifabutin in KTRs.
There are reports of successful treatment of posttransplant TB with rifampin-sparing regimens (415,424–426). In one such regimen, rifampin is replaced with a fluoroquinolone, given along with isoniazid, ethambutol and pyrazinamide for the first 2 months; the latter two agents are then stopped, and the fluoroquinolone and isoniazid are continued for another 10–12 months. The authors report a 100% success rate.
Finally, the rate of recovery of drug-resistant TB is increasing. Since both KTRs and their donors may come from diverse geographic locations where the prevalence of drug resistance may vary, all isolates of TB recovered from KTRs should be submitted for susceptibility testing. Modifications in treatment should be made once the results of susceptibility testing become available.
- 14.4: CANDIDA PROPHYLAXIS
- 14.4.1: We suggest oral and esophageal Candida prophylaxis with oral clotrimazole lozenges, nystatin, or fluconazole for 1–3 months after transplantation, and for 1 month after treatment with an antilymphocyte antibody. (2C)
- • KTRs are at increased risk for oral and esophageal infections due to Candida species.
- • The use of oral clotrimazole troches or nystatin provides effective prophylaxis without systemic absorption and hence without concerns for side effects.
- • Although data regarding the duration of prophylaxis are not available for KTRs, prophylaxis should logically be continued until patients are on stable, maintenance immunosuppression, particularly corticosteroids.
Observational studies have reported a high incidence of oral and esophageal Candida infections in KTRs. There are limited data supporting the use of antifungal therapy in KTRs, although it is beneficial in liver transplant recipients (427). The standard immunosuppressive agents typically used in KTRs are associated with an increased risk of developing Candida infections. The most common source for these infections is colonization of the oral mucosa. Accordingly, use of topical antifungal therapies such as clotrimazole troches and nystatin offer the opportunity to eradicate fungal colonization without associated risks that may be present for systemically absorbed antifungal agents. However, a recent report suggested a potential drug–drug interaction between clotrimazole and tacrolimus (428). It is important to note that there are drug–drug interactions between fluconazole and CNIs.
Although data regarding the appropriate duration of prophylaxis for these agents are not available for KTRs, the risk is greatest early after transplantation when patients are receiving their highest levels of immunosuppression, and are more likely to be exposed to antibacterial agents that increase the risk for Candida infections. Accordingly, these agents can likely be discontinued once the patient is on maintenance immunosuppression, particularly when steroid doses are stable and low.
- • RCTs are needed to determine the optimal duration and type of prophylaxis for Candida infections in KTRs.
Section III: Cardiovascular Disease
The incidence of CVD is high after kidney transplantation (429–434). The annual rate of fatal or nonfatal CVD events is 3.5–5.0% in KTRs, 50-fold higher than in the general population (435). By 36 months after transplantation, nearly 40% of patients have experienced a CVD event (436). Although acute myocardial infarction is common after transplantation, especially in elderly patients and those with diabetes (437), congestive heart failure (CHF) is also a common CVD complication (436). Most of the ‘traditional risk factors’ in the general population, including cigarette smoking, diabetes, hypertension and dyslipidemias, are also risk factors for CVD in KTRs (Table 18). In addition, many KTRs have had CKD for an extended period of time prior to transplantation, and have thereby acquired additional CVD risk by the time they undergo transplantation. For all of these reasons, KTRs should be considered to be at the highest risk for CVD and managed accordingly.
|Predictor||Number of studies (number of analyses)||Total number of subjects (range)||Outcomes||Number statistically significant (p < 0.05)|
|Tobacco use (438–443)||6 (10)||57 027||CeVD||1/2|
|Diabetes (430,442,444–453)||12 (17)||115 510||All CVD||1/1|
|Obese/elevated BMI (14,443,454–456)||5 (6)||103 295 (2067–51 927)||CHF||1/1|
|Hypertensiona (439–441,443,450)||5 (5)||29 259||All CVD||1/1|
|Dyslipidemiab (457–465)||9 (9)||3657||All CVD (combined in||5/9|
Rating Guideline Recommendations
Within each recommendation, the strength of recommendation is indicated as Level 1, Level 2, or Not Graded, and the quality of the supporting evidence is shown as A, B, C, or D.
|Level 1||‘We recommend’|
|Level 2||‘We suggest’|
|Grade for quality of evidence||Quality of evidence|
Chapter 15: Diabetes Mellitus
- 15.1: SCREENING FOR NEW-ONSET DIABETES AFTER TRANSPLANTATION
- 15.1.1: We recommend screening all nondiabetic KTRs with fasting plasma glucose, oral glucose tolerance testing, and/or HbA1c (1C) at least:
- • weekly for 4 weeks (2D);
- • every 3 months for 1 year (2D); and
- • annually, thereafter. (2D)
- 15.1.2: We suggest screening for NODAT with fasting glucose, oral glucose tolerance testing, and/or HbA1c after starting, or substantially increasing the dose of, CNIs, mTORi, or corticosteroids. (2D)
CNI, calcineurin inhibitor; HbA1c, hemoglobin A1c; KTRs, kidney transplant recipients; mTORi, mammalian target of rapamycin inhibitor(s); NODAT, new-onset diabetes after transplantation.
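The screening calendar in Recommendation 15.1.1 (weekly for 4 weeks, every 3 months through the first year, then annually) can be sketched as simple date arithmetic. The approximations of "3 months" as 91 days and a year as 365 days are ours; the guideline does not prescribe exact intervals.

```python
from datetime import date, timedelta

def nodat_screening_dates(transplant: date, years_out: int = 3) -> list:
    """Illustrative NODAT screening dates per Recommendation 15.1.1."""
    dates = [transplant + timedelta(weeks=w) for w in range(1, 5)]       # weekly x 4 weeks
    dates += [transplant + timedelta(days=91 * q) for q in range(1, 5)]  # every ~3 months, year 1
    dates += [transplant + timedelta(days=365 * y)
              for y in range(2, years_out + 1)]                          # annually thereafter
    return sorted(set(dates))

# Example: nodat_screening_dates(date(2009, 1, 1)) yields 10 visits over 3 years
```

In practice, visits would be aligned with routine posttransplant follow-up rather than generated mechanically; the sketch only shows the recommended cadence.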
Diabetes is defined according to the WHO and American Diabetes Association (ADA) (Table 19).
|1.||Fasting plasma glucose ≥126 mg/dL (7.0 mmol/L). Fasting is defined as no caloric intake for at least 8 hours.*|
|2.||Symptoms of hyperglycemia and a casual plasma glucose ≥200 mg/dL (11.1 mmol/L). Casual is defined as any time of day without regard to time since last meal. The classic symptoms of hyperglycemia include polyuria, polydipsia and unexplained weight loss.|
|3.||Two-hour plasma glucose ≥200 mg/dL (11.1 mmol/L) during an oral glucose tolerance test. The test should be performed as described by the WHO, using a glucose load containing the equivalent of 75 g anhydrous glucose dissolved in water.*|
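The three WHO/ADA criteria in Table 19 are disjunctive: any one suffices. A toy check of the thresholds follows; note that, per the table's asterisk, criteria 1 and 3 should be confirmed by repeat testing in clinical practice, which this sketch omits.

```python
from typing import Optional

def meets_diabetes_criteria(fasting_glucose_mg_dl: Optional[float] = None,
                            casual_glucose_mg_dl: Optional[float] = None,
                            has_hyperglycemia_symptoms: bool = False,
                            ogtt_2h_glucose_mg_dl: Optional[float] = None) -> bool:
    """True if any Table 19 criterion is met (single measurement, unconfirmed)."""
    # Criterion 1: fasting plasma glucose >=126 mg/dL (7.0 mmol/L)
    if fasting_glucose_mg_dl is not None and fasting_glucose_mg_dl >= 126:
        return True
    # Criterion 2: symptoms of hyperglycemia plus casual glucose >=200 mg/dL
    if (has_hyperglycemia_symptoms and casual_glucose_mg_dl is not None
            and casual_glucose_mg_dl >= 200):
        return True
    # Criterion 3: 2-hour plasma glucose >=200 mg/dL on a 75-g OGTT
    if ogtt_2h_glucose_mg_dl is not None and ogtt_2h_glucose_mg_dl >= 200:
        return True
    return False
```

A casual glucose of 210 mg/dL without symptoms does not meet criterion 2 on its own, which the function reflects.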
New-onset diabetes after transplantation (NODAT) is diabetes, as defined by the WHO and ADA, that develops for the first time after kidney transplantation.
- • The chances of reversing or ameliorating NODAT may be improved by early detection and intervention.
- • Early treatment of NODAT may prevent complications of diabetes.
- • The incidence of NODAT is sufficiently high to warrant screening.
Fasting plasma glucose, 2-h glucose tolerance testing (after a 75-g glucose load) and hemoglobin A1c (HbA1c) are probably suitable screening tests to detect NODAT in KTRs. The frequency of screening for NODAT is based on the incidence of NODAT at different times after kidney transplantation. The reported incidence varies by the definition of diabetes and the type of immunosuppressive medications used. However, the incidence of NODAT is highest in the first 3 months after transplantation. The cumulative incidence of NODAT by the end of the first year has generally been found to be 10–30% in adults receiving CsA or tacrolimus plus corticosteroids (468–479), and 3–13% in children (480,481). The high incidence of NODAT justifies frequent screening during the first year after transplantation. A number of risk factors increase the incidence of NODAT (Table 20), and patients with one or more of these additional risk factors may benefit from more frequent screening.
|Predictor||No. of subjects (range)||Association (No. of studies p < 0.05)||No association (No. of studies)|
|Tacrolimus (474–477,479,482–485)||100 418 (386–28 941)||7||2|
|CsA (479,484)||1066 (528–538)||2|
|Corticosteroids (477,478,484,486)||2035 (386–589)||2||2|
|Sirolimus (479,484,487,488)||22 525 (528–21 459)||2||2|
|Acute rejection (477–479)||1436 (386–528)||3|
|Obesity/higher BMI (471,472,474,476–479,482,484,485,488)||97 702 (386–28 942)||9||2|
|African American ethnicity (471,472,474–476,479,482,485,488)||103 383 (528–28 942)||8||1|
|Hispanic ethnicity (US) (474)||15 787||1|
|Older age (471,472,474–479,484,485,488)||94 487 (386–28 942)||9||2|
|Male (471,474,476–479,484,485)||64 090 (386–28 942)||8|
|HLA mismatch (474,476,478,485)||60 560 (522–28 942)||2||2|
|Deceased-donor kidney (471,474,476–478,485)||63 024 (386–28 942)||1||5|
|Hepatitis C (474,477,478,482,485,488)||63 805 (386–21 459)||5||1|
|HCV risk (D+/R−) (476)||28 942||1|
|CMV risk (D+/R−) (477)||386||1|
|Type 2 diabetes in family (478,484)||1060 (522–538)||1||1|
|Impaired fasting glucose||nd|
|Impaired glucose tolerance||nd|
|HDL-C <40 mg/dL||nd|
|Triglycerides >150 mg/dL (472)||1811||1|
Since tacrolimus, CsA, mTORi and corticosteroids can cause NODAT, it is reasonable to screen for NODAT after starting, or substantially increasing the dose of one of these medications. Treating acute rejection with high-dose corticosteroids, for example, should prompt screening for NODAT.
Tacrolimus and CsA may cause NODAT by directly decreasing insulin secretion by pancreatic beta cells (489–493). Logically, reducing the dose of, or discontinuing, these agents as soon as possible could limit the damage to beta cells, although the clinical evidence, from case reports and case series suggesting that NODAT may be reversed by reducing, replacing or discontinuing CsA, tacrolimus or corticosteroids, is anecdotal (494,495). There are few data on the effects of corticosteroid reduction on reversing NODAT once it has occurred. Similarly, few, if any, data are available on whether discontinuing mTORi will reverse NODAT.
The relative effects of different immunosuppressive agents on NODAT are difficult to quantify, because RCTs use different regimens and doses, as well as different definitions of NODAT, all of which make comparisons difficult. Nevertheless, it appears that the risk of NODAT with tacrolimus is greater than with CsA. It is also clear that high doses of corticosteroids used immediately after transplantation, and in the treatment of acute rejection, are risk factors for NODAT. Sirolimus has not been as well studied. Some observational studies have found that sirolimus use was associated with an increased incidence of NODAT (487,496,497). Randomized trials have produced conflicting results (498–502). There is no evidence that azathioprine or MMF causes NODAT.
The risk of NODAT from immunosuppressive medications is no doubt higher in individuals with other risk factors, for example African American or American Hispanic ethnicity, obesity and age. Thus, the choice of immunosuppressive medications could be individualized to the risk for NODAT attributable to other risk factors in each individual patient. In addition, the risk of NODAT should be considered in light of the risk of acute rejection. Indeed, the occurrence of acute rejection and its treatment with corticosteroids is a risk factor for NODAT. Unfortunately, it is difficult to weigh the relative risks of rejection and NODAT in individual patients to determine the best immunosuppressive medication regimen.
By almost any definition, the risk of NODAT is increased by obesity. African American and Hispanic ethnicity are generally defined as self-reported. Since data on African American and Hispanic ethnicity are largely from the United States, it is unclear if ethnicities defined otherwise and in other countries have similar risk for NODAT. Older age is a risk factor that shows a linear relationship with risk, but there is no clear threshold. HCV infection is defined by the presence of antibody to the HCV at the time of transplantation.
A number of other risk factors for diabetes have not been rigorously studied in KTRs, but there is little reason to believe that they would not also be risk factors after transplantation. These risk factors include: family history (type 2 diabetes), gestational diabetes, impaired fasting glucose, impaired glucose tolerance and dyslipidemia (high fasting triglycerides and/or low HDL-C) (503–507).
Data from observational studies have shown that NODAT is associated with worse outcomes, including increased graft failure, mortality and CVD (474). It is possible that some of these associations result from unmeasured risk factors that are common to both NODAT and poor outcomes. However, it is certainly plausible that NODAT directly and indirectly contributes to worse outcomes. Untreated diabetes may increase the risk of metabolic complications, including hyperkalemia, and even ketoacidosis. However, there is no evidence from observational studies to suggest how frequently these complications occur after NODAT.
- • Future RCTs of immunosuppressive medication regimens should measure fasting glucose, HbA1c and/or glucose tolerance tests, and any treatments of diabetes, to determine the effect of the medication regimens on the incidence of NODAT.
- 15.2: MANAGING NODAT OR DIABETES PRESENT AT TRANSPLANTATION
- 15.2.1: If NODAT develops, consider modifying the immunosuppressive drug regimen to reverse or ameliorate diabetes, after weighing the risk of rejection and other potential adverse effects. (Not Graded)
- 15.2.2: Consider targeting HbA1c 7.0–7.5%, and avoid targeting HbA1c ≤6.0%, especially if hypoglycemic reactions are common. (Not Graded)
- 15.2.3: We suggest that, in patients with diabetes, aspirin (65–100 mg/day) use for the primary prevention of CVD be based on patient preferences and values, balancing the risk of ischemic events against that of bleeding. (2D)
CVD, cardiovascular disease; HbA1c, hemoglobin A1c; NODAT, new-onset diabetes after transplantation.
The management of diabetes that is present at the time of transplantation may be complicated by severe autonomic neuropathy and other complications of long-standing diabetes that may make ‘tight’ control of blood glucose difficult to achieve. Therefore, we recommend avoiding intensive therapies targeting HbA1c levels <6.0%. However, complications of long-standing diabetes that make the management of diabetes difficult are less likely to be present in patients with NODAT, and it is not clear whether NODAT can be safely and effectively managed within a narrow range of low blood glucose and HbA1c targets.
- • The benefits and harm of altering the immunosuppressive medication regimen in response to the development of NODAT are unclear.
- • In the general diabetic population, there is insufficient evidence for or against targeting a specific HbA1c level to reduce CVD; however, recent data suggest that mortality may be increased in patients with type 2 diabetes by targeting HbA1c levels that are <6.0%.
- • In KTRs, attempting to reduce HbA1c levels in order to reduce CVD may result in more complications than in the general diabetic population.
- • Randomized trials in the general population suggest that aspirin prophylaxis may prevent CVD in patients with diabetes.
There are no RCTs testing whether changing to different immunosuppressive medication regimens reverses or ameliorates NODAT. There are uncontrolled (largely anecdotal) reports on the effects of changing immunosuppressive agents once NODAT has developed (494,495). Given the associations of NODAT with CsA, tacrolimus, mTORi and corticosteroids, it is plausible that reducing or eliminating these immunosuppressive medications may reverse or ameliorate NODAT. Changes in immunosuppressive medications that may reverse or ameliorate NODAT include:
- i) reducing the dose of tacrolimus, CsA or corticosteroids;
- ii) discontinuing tacrolimus, CsA or corticosteroids;
- iii) replacing tacrolimus with CsA, MMF or azathioprine;
- iv) replacing CsA with MMF or azathioprine.
We could find no published reports of reducing the dose or discontinuing a mTORi to reverse or ameliorate NODAT.
Optimal glycemic control to prevent microvascular disease complications has been defined in a number of guidelines for the general population. A recent systematic review of these guidelines concluded that the goal for glycemic control should be as low as feasible without incurring undue risk for adverse events (508). These authors concluded that a HbA1c level <7% is a reasonable goal for many, but not all, patients in the general diabetic population.
While there is evidence in the general diabetic population that strict glycemic control reduces microvascular disease complications, there is less evidence that glycemic control reduces CVD. The United Kingdom Prospective Diabetes Study (UKPDS) and the Diabetes Control and Complications Trial reported nonsignificant trends toward lower CVD with lower HbA1c levels (509,510). A long-term follow-up of this trial reported that intensive insulin therapy reduced CVD (511). Similarly, in a 10-year follow-up of the UKPDS, there were reduced myocardial infarctions in the sulfonylurea–insulin and metformin intensive-therapy groups (compared to usual care) (512).
Recently, the blood glucose control arm of the Action to Control Cardiovascular Risk in Diabetes (ACCORD) trial was stopped early, because participants in the intensive-treatment group had experienced increased mortality (513). In ACCORD, 10 251 adults with long-standing (average 10 years) type 2 diabetes, and either heart disease or two or more other risk factors for heart disease, were randomly allocated to target HbA1c <6.0% vs. standard treatment targeting HbA1c 7.0–7.9%. Half of the participants in the intensive-treatment group achieved a HbA1c of <6.4%, and half of the participants in the standard-treatment group achieved a HbA1c of <7.5%. The Data Safety Monitoring Board halted the glucose control arms of the trial 18 months early because of a higher mortality rate in the group targeting lower HbA1c levels. In the intensive-treatment group, 257 participants died, compared with 203 in the standard-treatment group. This was a difference of 54 deaths, or 3 per 1000 participants per year, over an average of almost 4 years of treatment. For both the intensive- and standard-treatment groups in ACCORD, clinicians could use all major classes of diabetes medications available. Extensive analyses did not determine a specific cause for the increased deaths, and there was no evidence that any medication or combination of medications was responsible.
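The ACCORD mortality figures quoted above reduce to simple rate arithmetic. Arm sizes are approximated here as half of the 10 251 enrolled, and follow-up as the stated "almost 4 years"; both are rounding assumptions.

```python
# Deaths reported in each ACCORD glucose-control arm (from the text above)
intensive_deaths, standard_deaths = 257, 203
per_arm = 10_251 / 2    # approximate arm size (equal randomization assumed)
years = 4.0             # "almost 4 years" of treatment

excess_deaths = intensive_deaths - standard_deaths        # 54 excess deaths
excess_rate = excess_deaths / per_arm / years * 1000      # excess deaths per 1000/yr

# ~2.6 excess deaths per 1000 participants per year, consistent with the
# "3 per 1000 participants per year" figure quoted in the text
```

The small gap between ~2.6 and the quoted 3 per 1000 reflects rounding of the follow-up duration and arm sizes.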
Similarly, the Action in Diabetes and Vascular Disease (ADVANCE) study (514) failed to demonstrate that more intensive glycemic control compared to standard practice reduced CVD events. The ADVANCE study achieved a median HbA1c of 6.3% in the intensive-management group compared with 7.0% in the standard-intervention group. The results from ACCORD and ADVANCE studies may not apply to patients with type 1 diabetes, patients with recently diagnosed type 2 diabetes or those whose cardiovascular risk is different than the participants studied in ACCORD and ADVANCE. In particular, the results may not apply to patients with CKD or to KTRs. Nevertheless, the results of the ACCORD and ADVANCE trials cast serious doubt on the advisability of targeting low HbA1c levels to reduce CVD. Additional trials in the general diabetic population may help to determine the optimal strategy for managing diabetes (515).
Kidney transplant recipients with diabetes, especially if the diabetes was the cause of CKD stage 5, often have difficult-to-control diabetes, with advanced autonomic neuropathy causing diabetic gastroparesis and hypoglycemic unawareness. In a RCT comparing intensive glucose control with usual care in 99 KTRs, the incidence of severe hypoglycemia was significantly higher in the intensive glucose-control arm (516). Therefore, it may be more difficult to achieve a HbA1c level <7.0% without undue risk and burden in many KTRs. In addition, some medications used to treat diabetes may need dose reduction, or should be avoided in patients with reduced kidney function (Table 21).
| Class | Drug | Dose adjustment | Drug–drug interactions |
|---|---|---|---|
| First-generation sulfonylureas | Acetohexamide | Avoid (517) | CsA levels |
| | Chlorpropamide | 50% if GFR 50–70 mL/min/1.73 m2; avoid if GFR <50 mL/min/1.73 m2 (517,518) | CsA levels |
| | Tolbutamide | Use with caution (519,520) | CsA levels |
| Second-generation sulfonylureas | Glipizide | No dose adjustment | CsA levels |
| | Gliclazide | No dose adjustment | CsA levels |
| | Glyburide (Glibenclamide)a | Avoid if GFR <50 mL/min/1.73 m2 (521) | CsA levels |
| | Glimepiride | Start at 1 mg/day | CsA levels |
| | Gliquidoneb | No dose adjustment | |
| | Glisentideb | Avoid if advanced CKD | |
| Alpha-glucosidase inhibitors | Acarbose | Avoid if Scr >177 μmol/L (2 mg/dL) (522–524) | |
| | Miglitol | Avoid if GFR <25 mL/min/1.73 m2 (522–524) | |
| Biguanides | Metformin | Contraindicated if Scr ≥133 μmol/L (1.5 mg/dL) in men, ≥124 μmol/L (1.4 mg/dL) in women (522) | |
| Meglitinides | Repaglinide | Start 0.5 mg with meals if GFR <40 mL/min/1.73 m2 and titrate carefully (522) | Repaglinide levels with CsA (525) |
| | Nateglinide | Use with caution if advanced CKD (522) | |
| Thiazolidinedionesc | Pioglitazone | No dose adjustment (522) | |
| | Rosiglitazone | No dose adjustment (522) | |
| Incretin mimetic | Exenatide | Avoid if GFR <30 mL/min/1.73 m2 (522) | |
| Amylin analog | Pramlintide | No dose adjustment if GFR >20 mL/min/1.73 m2 | |
| DPP-4 inhibitors | Sitagliptin | 50% if GFR 30–50 mL/min/1.73 m2; 75% if GFR <30 mL/min/1.73 m2 | |
| | Vildagliptin | Avoid if advanced CKD on hemodialysis | |
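The GFR cut-points in Table 21 are, in effect, a lookup. The sketch below encodes three illustrative rows (glyburide, exenatide, sitagliptin) to show the pattern; the function name and return strings are assumptions of this sketch, not of the guideline, and clinical dosing must follow the full table and product labeling.

```python
# Illustrative sketch only: encodes three rows of Table 21 as a lookup.
# Not a clinical tool; consult the full table and product labeling.
def diabetes_drug_guidance(drug, gfr):
    """Return Table 21-style dose guidance for a GFR (mL/min/1.73 m2)."""
    drug = drug.lower()
    if drug == "glyburide":
        return "avoid" if gfr < 50 else "no dose adjustment"
    if drug == "exenatide":
        return "avoid" if gfr < 30 else "no dose adjustment"
    if drug == "sitagliptin":
        if gfr < 30:
            return "reduce dose 75%"
        if gfr <= 50:
            return "reduce dose 50%"
        return "no dose adjustment"
    raise ValueError("drug not encoded in this sketch")

print(diabetes_drug_guidance("sitagliptin", 40))  # reduce dose 50%
```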
Patients with difficult-to-control type 1 diabetes may be candidates for pancreas transplantation. There has never been a randomized trial of pancreas transplantation vs. kidney transplantation alone, but there is little question that a successful pancreas transplantation can improve the quality of life of patients with difficult-to-control diabetes (527–529). Whether pancreas transplantation reduces the risk for CVD is unknown. Pancreas transplantation is best performed either simultaneously with, or subsequent to, a living-donor kidney transplantation in patients who are already taking immunosuppressive agents (530). Islet transplantation is still experimental, and long-term survival of islets has yet to be achieved (531). In addition, the multiple infusions of islet cells required may sensitize the recipient to a number of major histocompatibility antigens that can make it difficult to find a compatible solid organ for transplantation when one is needed (532).
Evidence that the benefits of aspirin (e.g. prevention of CVD events) outweigh the harm (e.g. bleeding complications) for patients with diabetes, but without known CVD, is not strong. Therefore, while some guidelines in the general population suggest that aspirin be used for primary prevention in all patients with diabetes, others do not. For example, the ADA currently recommends:
- • Use aspirin therapy (75–162 mg/day) as a primary prevention strategy in those with type 1 or type 2 diabetes at increased cardiovascular risk, including those who are >40 years of age or who have additional risk factors (family history of CVD, hypertension, smoking, dyslipidemia or albuminuria). (C)
- • Use aspirin therapy (75–162 mg/day) as a secondary prevention strategy in those with diabetes with a history of CVD. (A)
- • Aspirin therapy should not be recommended in people under 30 years of age due to lack of evidence of benefit, and is contraindicated in patients under the age of 21 years because of the associated risk of Reye's syndrome. (E)
where A indicates ‘Clear evidence from well-conducted, generalizable, randomized clinical trials that are adequately powered …’, C indicates ‘Supportive evidence from poorly controlled or uncontrolled studies …’ and E indicates ‘Expert consensus or clinical experience …’ (533).
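The ADA logic quoted above can be summarized as a small decision function. This is a hedged sketch: the function name is illustrative, and 'increased cardiovascular risk' is reduced here to the age and risk-factor examples the recommendation itself lists.

```python
# Sketch of the ADA aspirin recommendations quoted above.
# Illustrative only; 'risk_factors' counts the listed examples
# (family history of CVD, hypertension, smoking, dyslipidemia, albuminuria).
def ada_aspirin_candidate(age, has_cvd_history, risk_factors=0):
    """Return whether low-dose aspirin (75-162 mg/day) is suggested."""
    if age < 21:
        return False  # contraindicated (Reye's syndrome risk, grade E)
    if has_cvd_history:
        return True   # secondary prevention (grade A)
    if age < 30:
        return False  # lack of evidence of benefit (grade E)
    return age > 40 or risk_factors > 0  # primary prevention (grade C)

print(ada_aspirin_candidate(55, False))  # True
```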
A recent RCT in patients with type 2 diabetes and peripheral vascular disease (PVD) reported that aspirin prophylaxis had no effect on CVD events (534). Another small trial of low-dose aspirin for primary prevention of atherosclerotic events in Japanese patients with type 2 diabetes also failed to show a clear benefit from aspirin (535). The results of these trials have cast doubt on the use of aspirin in patients with diabetes to prevent first CVD events. Thus, it is unclear whether the benefits of aspirin use outweigh the harm in KTRs with diabetes. The results of other pending trials of aspirin prophylaxis in the general population may help to clarify the benefits and harm of aspirin for primary prevention in patients with diabetes.
- • A RCT is needed to examine aspirin prophylaxis in KTRs with and without diabetes.
Chapter 16: Hypertension, Dyslipidemias, Tobacco Use, and Obesity
- 16.1: HYPERTENSION
- 16.1.1: We recommend measuring blood pressure at each clinic visit. (1C)
- 16.1.2: We suggest maintaining blood pressure at <130 mm Hg systolic and <80 mm Hg diastolic if ≥18 years of age, and <90th percentile for sex, age, and height if <18 years old. (2C)
- 16.1.3: To treat hypertension (Not Graded):
- • use any class of antihypertensive agent;
- • monitor closely for adverse effects and drug–drug interactions; and
- • when urine protein excretion ≥1 g/day for ≥18 years old and ≥600 mg/m2/24 h for <18 years old, consider an ACE-I or an ARB as first-line therapy.
ACE-I, angiotensin-converting enzyme inhibitor; ARB, angiotensin II receptor blocker.
Most guidelines for the general population define hypertension as persistent systolic blood pressure on at least 2 days ≥140 mm Hg and/or diastolic blood pressure ≥90 mm Hg if age ≥18 years, and ≥95th percentile for gender, age and height if age <18 years (Table 22). However, these same guidelines establish treatment goals for high-risk subpopulations, for example diabetes and CKD, that are generally systolic <130 mm Hg and/or diastolic <80 mm Hg for adults, and <90th percentile for gender, age and height for adolescents and children.
| Guideline | Hypertension definition | Treatment goals (mm Hg) |
|---|---|---|
| JNC 7 2003 (536) | ≥140/90 | <140/90; <130/80 in diabetes and CKD |
| WHO ISH 2003 (537) | ≥140/90 | <140/90; <130/80 in diabetes |
| KDOQI 2004 (538) | – | <130/80 in KTRs |
| NHBPEWG Children 2004 (539) | ≥95th percentilea | <95th percentilea; <90th percentilea in concurrent conditionsb |
| ESH ESC 2007 (540) | ≥140/90 | <140/90; <130/80 in diabetes and high riskc |
| USPSTF 2007 (541) | ≥140/90 | See JNC 7d |
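The adult thresholds in Table 22 amount to a simple comparison against 140/90 mm Hg, with a lower 130/80 mm Hg goal for high-risk groups such as diabetes and CKD. A minimal sketch (the function name is illustrative, not from any guideline):

```python
# Sketch of the adult office blood pressure goals in Table 22.
# 'high_risk' stands for groups such as diabetes and CKD.
def meets_bp_goal(systolic, diastolic, high_risk=False):
    """True if an adult's office blood pressure meets the treatment goal."""
    goal_sys, goal_dia = (130, 80) if high_risk else (140, 90)
    return systolic < goal_sys and diastolic < goal_dia

print(meets_bp_goal(135, 82))                  # True (below 140/90)
print(meets_bp_goal(135, 82, high_risk=True))  # False (goal is <130/80)
```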
- • In the general population, there is strong evidence that treatment of hypertension is effective in preventing CVD and in retarding the progression of CKD.
- • In KTRs, the prevalence of hypertension is high enough to warrant screening.
- • In KTRs, blood pressure is a risk factor for CVD and CAI.
- • In KTRs, there is little reason to believe that the prevention and treatment of hypertension would not also prevent CVD and kidney allograft injury.
Observational studies and RCTs have conclusively shown that hypertension is an independent risk factor for CVD and CKD in the general population.
In addition, evidence from RCTs in the general population has conclusively shown that reducing blood pressure reduces the risk of CVD. These trials have shown benefit to reducing blood pressure to <140/90 mm Hg even in low-risk adult populations. Additional benefit may extend to high-risk populations, such as those with diabetes. RCTs in CKD have generally shown that blood pressure reduction reduces proteinuria and slows the rate of decline in kidney function.
Life expectancy is lower in KTRs than in the general population, and it is possible that the benefits and harm of hypertension treatment in KTRs are different than in the general population. However, the leading cause of death in KTRs is CVD, making it likely that treatments that reduce the risk of CVD in the general population will also be cost-effective in KTRs. Although adverse effects of pharmacological treatment of hypertension in KTRs are different and likely more common than in the general population, small RCTs and observational studies suggest that these adverse effects are generally not severe enough to reduce quality of life or increase mortality.
The prevalence of hypertension in KTRs is 50–90% (435,542,543). Thus, even conservative estimates of the prevalence of hypertension in KTRs suggest that it is common enough to warrant close scrutiny. Observational studies have shown that hypertension is an independent risk factor for CVD after kidney transplantation (Table 18) (430,544). There are also studies linking hypertension to poor graft function, although it is difficult to separate cause and effect in these studies (545–547).
There are few data to suggest how often patients should be screened for hypertension after kidney transplantation. However, the high incidence of hypertension, the changing risk for hypertension and CVD in KTRs and the ease of obtaining blood pressure measurements are compelling arguments for measuring blood pressure at every clinic visit. Patients should be seated quietly for at least 5 min with feet on the floor and arm supported at heart level. An appropriately sized cuff with bladder encircling at least 80% of the arm should be used. At least two measurements should be made. Systolic blood pressure is the point at which the first of two or more sounds is heard (phase 1), and diastolic blood pressure is the point before the disappearance of sounds (phase 5). Patients should be provided with their specific blood pressure readings and goals (536).
Ambulatory blood pressure monitoring is warranted for the evaluation of possible ‘white coat hypertension,’ episodic hypertension, assessing apparent drug resistance, hypotensive symptoms with blood pressure treatment and autonomic dysfunction (536). Ambulatory blood pressure readings are lower than office blood pressure readings, with daytime values being higher than values during sleep (Table 23) (536).
| Method of measurement | Threshold (mm Hg) |
|---|---|
| Office or clinic | 140/90 |
Self-measured blood pressure is also useful in assessing treatment of hypertension and improving adherence to treatment (536). Home measurement devices should be checked regularly for accuracy.
It is unlikely that there will be RCTs in KTRs to determine whether blood pressure lowering reduces CVD events, or prolongs patient or graft survival. However, observational studies have reported that hypertension is associated with both CVD events and graft survival (Table 18). Guidelines for the general population recommend targeting <140/90 mm Hg for all patients, even low-risk patients. However, these same guidelines recommend targeting <130/80 mm Hg for high-risk patients, such as patients with diabetes and CKD (536,538). There are, however, few RCT data justifying this lower target in these populations. Although many transplant patients have diabetes and many have reduced GFR, whether the benefits of targeting <130/80 mm Hg outweigh the risks is unclear.
Causes of posttransplant hypertension include CNI use, corticosteroids, kidney allograft dysfunction, allograft vascular compromise (from within the allograft itself, from within the allograft artery and its anastomosis and from within arteries immediately proximal to the allograft artery anastomosis) (548–553), as well as factors related to the presence of the native kidneys (554–556). Treatment should include adjusting CNI dose, administering antihypertensive medications and managing other CVD risk factors. A number of small randomized trials have demonstrated the efficacy and safety of lowering blood pressure with most classes of antihypertensive medications. However, there is insufficient evidence to recommend any class of antihypertensive agents as preferred for long-term therapy for reducing CVD or improving long-term graft survival.
The choice of initial antihypertensive agent may be determined by the presence of one or more common posttransplant complications that may be made better or worse by specific antihypertensive agents (Table 24). Urine protein excretion ≥1 g per 24 h if age ≥18 years (and ≥600 mg/m2 per 24 h if age <18 years) is a threshold at which blood pressure lowering trials have shown efficacy in reducing the progression of kidney disease in nontransplant patients (538). To date, there are no RCTs showing that reducing urinary protein in KTRs preserves kidney allograft function.
| Agent class | Advantages (additional indications that are common in KTRs) | Disadvantages (adverse effects that are common in KTRs) |
|---|---|---|
| Thiazide diuretics | CHF with systolic dysfunction; high CAD risk; recurrent stroke prevention | Hypomagnesemia; hyperuricemia; hyponatremia |
| Aldosterone antagonists | CHF with systolic dysfunction | Hyperkalemia |
| Beta-blockers | CHF with systolic dysfunctiona; chronic stable angina; post MI; high CAD risk | Hyperkalemia; dyslipidemias; glucose intolerance |
| Angiotensin-converting enzyme inhibitorsc | CHF with systolic dysfunction; high CAD risk; recurrent stroke prevention | Hyperkalemia |
| Calcium-channel blockers | Chronic stable angina; high CAD risk; supraventricular tachycardia; increased CNI levels (allowing a reduction in dose and cost)b | Edema; increased CNI levelsb; reduced kidney function |
In general, no antihypertensive agent is contraindicated in KTRs. Data from nontransplant patients with CKD suggest that ACE-Is and ARBs may have beneficial effects on the progression of diabetic and nondiabetic CKD, particularly in patients with proteinuria (538). However, RCTs in KTRs have not had sufficient statistical power to determine whether ACE-I or ARB therapy improves patient or graft survival (557). On the other hand, ACE-Is and ARBs may be associated with an increased risk of hyperkalemia and anemia in KTRs (557–560). Hypertensive KTRs with ischemic heart disease and/or CHF may benefit from ACE-Is, ARBs and/or beta-blockers (561). Diuretics may be effective in treating hypertension in KTRs, since hypertension in CNI-treated KTRs may be sodium dependent (562).
Many patients will require combination therapy to control blood pressure. Most combinations should include a thiazide diuretic, unless it is contraindicated. Recent studies suggest that thiazides may be more effective than previously thought in patients with reduced kidney function (563–565). When hypertension is difficult to control, especially when it is associated with otherwise unexplained kidney allograft dysfunction, screening for allograft vascular compromise, within or proximal to the allograft artery, should be considered. This usually requires imaging of the allograft vasculature with conventional angiography, computed tomographic angiography or magnetic resonance imaging. When hypertension is difficult to control, and there are no reversible causes, bilateral native kidney nephrectomies may be considered, especially in a KTR <40 years old.
Randomized controlled trials are needed to determine:
- • the optimal blood pressure treatment target in KTRs;
- • the effect of reducing proteinuria on progression of CKD in KTRs;
- • the effects of ACE-Is/ARBs on patient survival and graft survival.
- 16.2: DYSLIPIDEMIAS (These recommendations are based on KDOQI Dyslipidemia Guidelines and are thus Not Graded)
- 16.2.1: Measure a complete lipid profile in all adult (≥18 years old) and adolescent (puberty to 18 years old) KTRs (based on KDOQI Dyslipidemia Recommendation 1):
- • 2–3 months after transplantation;
- • 2–3 months after a change in treatment or other conditions known to cause dyslipidemias;
- • at least annually, thereafter.
- 16.2.2: Evaluate KTRs with dyslipidemias for secondary causes (based on KDOQI Dyslipidemia Recommendation 3)
- 16.2.3: For KTRs with fasting triglycerides ≥500 mg/dL (≥5.65 mmol/L) that cannot be corrected by removing an underlying cause, treat with:
- • Adults: therapeutic lifestyle changes and a triglyceride-lowering agent (based on KDOQI Recommendation 4.1);
- • Adolescents: therapeutic lifestyle changes (based on KDOQI Recommendation 5.1).
- 16.2.4: For KTRs with elevated LDL-C:
- • Adults: If LDL-C ≥100 mg/dL (≥2.59 mmol/L), treat to reduce LDL-C to <100 mg/dL (<2.59 mmol/L) (based on KDOQI Guideline 4.2);
- • Adolescents: If LDL-C ≥130 mg/dL (≥3.36 mmol/L), treat to reduce LDL-C to <130 mg/dL (<3.36 mmol/L) (based on KDOQI Guideline 5.2).
- 16.2.5: For KTRs with normal LDL-C, elevated triglycerides and elevated non-HDL-C:
- • Adults: If LDL-C <100 mg/dL (<2.59 mmol/L), fasting triglycerides ≥200 mg/dL (≥2.26 mmol/L), and non-HDL-C ≥130 mg/dL (≥3.36 mmol/L), treat to reduce non-HDL-C to <130 mg/dL (<3.36 mmol/L) (based on KDOQI Guideline 4.3);
- • Adolescents: If LDL-C <130 mg/dL (<3.36 mmol/L), fasting triglycerides ≥200 mg/dL (≥2.26 mmol/L), and non-HDL-C ≥160 mg/dL (≥4.14 mmol/L), treat to reduce non-HDL-C to <160 mg/dL (<4.14 mmol/L) (based on KDOQI Guideline 5.3).
HDL-C, high-density lipoprotein cholesterol; KDOQI, Kidney Disease Outcomes Quality Initiative; KTRs, kidney transplant recipients; LDL-C, low-density lipoprotein cholesterol.
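The adult recommendations above define a clear priority order: severe hypertriglyceridemia first (pancreatitis risk), then elevated LDL-C, then elevated non-HDL-C. A minimal sketch in mg/dL units (the function name and return strings are illustrative, not from the guideline):

```python
# Sketch of the adult treatment-priority logic above (mg/dL units).
# Illustrative only; not a substitute for the full recommendations.
def adult_lipid_target(tg, ldl, non_hdl):
    """Return the next treatment priority for an adult KTR lipid panel."""
    if tg >= 500:
        return "lower triglycerides first (pancreatitis risk)"
    if ldl >= 100:
        return "treat to LDL-C <100 mg/dL"
    if tg >= 200 and non_hdl >= 130:
        return "treat to non-HDL-C <130 mg/dL"
    return "at target"

print(adult_lipid_target(tg=250, ldl=90, non_hdl=140))  # treat to non-HDL-C <130 mg/dL
```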
Dyslipidemias are abnormalities in circulating lipoproteins that are associated with an increased risk of CVD. The Work Group did not perform systematic reviews of the evidence for management of dyslipidemias in KTRs since this was performed recently for the KDOQI Dyslipidemia Guidelines. Rather, the recommendations of the Work Group are based on those of the KDOQI Dyslipidemia Guidelines for the management of dyslipidemia in CKD (566). The Work Group searched for, but did not find, large RCTs for dyslipidemia management in KTRs published since the publication of the KDOQI Dyslipidemia Guidelines. In addition, the Work Group searched for, but did not find, new guidelines for the management of dyslipidemia in the general population. Therefore, the Work Group concluded that there was little new evidence to require modification of the KDOQI Dyslipidemia Guidelines at this time. However, the Work Group amended the original guideline statements to apply to KTRs.
- • In the general population, there is strong evidence that reducing LDL-C decreases the risk for CVD events.
- • In KTRs, there is little reason to believe that reducing LDL-C would not be safe and effective in reducing CVD events.
- • In KTRs, the prevalence of dyslipidemia is high enough to warrant screening and intervention.
- • In KTRs, there is moderate evidence that dyslipidemias contribute to CVD and that treatment of increased LDL-C with a statin may reduce CVD events.
A large number of RCTs in the general population have demonstrated that lowering LDL-C reduces CVD events and mortality. There is less evidence that treating other lipoprotein abnormalities, such as increased triglycerides or reduced HDL-C is effective. Guidelines generally recommend treating patients based on the level of LDL-C and the level of risk for CVD events.
Although there are drug–drug interactions that must be monitored in KTRs, the use of 3-hydroxy-3-methylglutaryl coenzyme A reductase inhibitors (‘statins’) is generally safe and effective in lowering LDL-C, if appropriate dose modification is made for patients treated with CNIs. The use of other lipid-lowering therapies are less certain, but potentially beneficial in KTRs.
The incidence and prevalence of dyslipidemia is high in KTRs, in large part due to the fact that immunosuppressive agents cause or contribute to dyslipidemias. Agents implicated in causing dyslipidemias include corticosteroids, CsA and mTORi. The overall prevalence of dyslipidemia during the first year after transplantation is >50%, although the prevalence is greatly influenced by the type of immunosuppression used and the presence of other factors, such as proteinuria, acute rejection and graft dysfunction. In any case, this high prevalence of dyslipidemia justifies screening and monitoring.
Observational studies suggest that hypercholesterolemia and increased LDL-C are independently associated with CVD events in KTRs. A RCT found that treatment of LDL-C with fluvastatin did not significantly reduce the primary end point (major adverse cardiac events) (567). However, important secondary end points, including mortality, were reduced by fluvastatin, and long-term follow-up suggested that major adverse cardiac events were also reduced (568). Thus, this study generally confirmed evidence from observational studies in KTRs, and RCTs in the general population, which indicate that increased LDL-C causes CVD, and treatment of LDL-C with a statin reduces the risk of CVD.
Although many measurements of lipoproteins can be linked to CVD events (e.g. apolipoprotein B, lipoprotein (a), etc.), the preponderance of evidence suggests that elevations in LDL-C are most closely associated with CVD. As a result, most guidelines target the screening and treatment of LDL-C. The measurement of LDL-C, or its estimation with the Friedewald formula, is reliable and generally available in most major laboratories around the world. The calculation of LDL-C requires a fasting lipid panel with total cholesterol, HDL-C and triglycerides. Directly measured LDL-C changes little with fasting or nonfasting, but direct measurement is less readily available.
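The Friedewald estimation mentioned above is, in mg/dL units, LDL-C = total cholesterol − HDL-C − triglycerides/5. A minimal sketch (the function name is illustrative; the conventional validity limit of triglycerides <400 mg/dL is an assumption of this sketch, not stated in the text):

```python
# Sketch of the Friedewald LDL-C estimate (all values in mg/dL).
def friedewald_ldl(total_chol, hdl, triglycerides):
    """Estimate LDL-C from a fasting lipid panel."""
    if triglycerides >= 400:
        raise ValueError("Friedewald estimate unreliable at TG >= 400 mg/dL")
    return total_chol - hdl - triglycerides / 5

print(friedewald_ldl(200, 50, 150))  # 120.0
```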
Treating an underlying cause of dyslipidemia may improve the lipid profile. Although there are few data in KTRs, it is reasonable to expect that reducing or eliminating nephrotic-range proteinuria may improve the lipid profile. Similarly, treating poorly controlled diabetes may improve abnormal plasma lipids. Rarely, severe hypothyroidism may alter plasma lipoproteins. RCTs have shown that corticosteroids, CsA and especially mTORi can cause dyslipidemias in KTRs. In some cases, severe dyslipidemia may require modification of immunosuppressive medications.
The National Cholesterol Education Program Guidelines (569) and the KDOQI Guidelines on Dyslipidemia in KTRs (566) recommend first treating severe hypertriglyceridemia to avert the risk for pancreatitis. Very high levels of triglycerides (usually in the thousands) generally indicate elevations in chylomicrons. There is an association between severe hypertriglyceridemia and pancreatitis, prompting the recommendation to treat severe hypertriglyceridemia as the first priority. How often severe hypertriglyceridemia causes pancreatitis in KTRs is unknown.
If severe hypertriglyceridemia is not present, then LDL-C becomes the therapeutic target. In the KDOQI Dyslipidemia Guidelines, all adult KTRs are considered to be at high risk for ischemic heart disease, and therefore should be treated to maintain LDL-C <100 mg/dL (2.59 mmol/L) (566). The drug of first choice for reducing LDL-C is a statin. Doses of statins usually need to be reduced by approximately 50% in patients treated with CsA, and probably also in patients treated with tacrolimus (although fewer data are available).
The relatively small number of patients who have normal or low LDL-C, increased triglycerides and high non-HDL-C likely have high levels of atherogenic lipoprotein remnants. Treatment for these patients should be similar to treatment for patients with high LDL-C (566).
For adolescents, the KDOQI Dyslipidemia Guidelines increased the LDL-C target goal to reflect both the uncertainty of dyslipidemia treatment in adolescents and the possible increased risk. The US Preventive Services Task Force (USPSTF) was unable to determine the balance between potential benefits and harm of screening children and adolescents for dyslipidemia (570). The National Cholesterol Education Program Report of the Expert Panel on Blood Cholesterol Levels in Children and Adolescents recommended selective screening for children and adolescents with a family history of premature coronary heart disease or at least one parent with a high total cholesterol level (571).
- 16.3: TOBACCO USE
- 16.3.1: Screen and counsel all KTRs, including adolescents and children, for tobacco use, and record the results in the medical record. (Not Graded)
- • Screen during initial transplant hospitalization.
- • Screen at least annually, thereafter.
- 16.3.2: Offer treatment to all patients who use tobacco. (Not Graded)
KTRs, kidney transplant recipients.
Tobacco use includes the inhalation or ingestion of any tobacco product, including: the inhalation of tobacco smoke from cigarettes, cigars, water pipes or other devices; the nasal absorption of tobacco from snuff; and the oral absorption and ingestion of tobacco from chewing.
- • In the general population, there is strong evidence that tobacco use causes CVD, cancer, chronic lung disease and premature death.
- • In the general population, there is strong evidence that screening, prevention and treatment measures are effective in adults. The effectiveness of clinician counseling of children and adolescents is uncertain.
- • In KTRs, there is no reason to believe that the approach to prevention and treatment of tobacco use should be different than in the general population.
- • In KTRs, cigarette smoking is associated with CVD and cancer.
- • In KTRs, the prevalence of tobacco use is high enough to warrant intervention.
Evidence-based guidelines for the general population have concluded that there is strong evidence that tobacco use causes CVD, cancer and chronic lung disease (572–578). Although most studies have focused on cigarette smoking, there is evidence that any tobacco use is harmful (579). Evidence-based guidelines for the general population have also concluded that screening patients for tobacco use and implementing prevention and treatment measures are effective, at least in the short term, in improving the likelihood of abstinence in adults. However, there are few studies from the general population showing that interventions are effective for more than 1 year. There is also insufficient evidence that interventions are effective in children and adolescents.
A large number of observational studies have reported higher rates of CVD and mortality for cigarette smokers in the general population. In addition, there have been a large number of RCTs showing that different smoking cessation interventions are effective in increasing the number of patients who quit smoking (580–582). Recently, RCTs have also shown that smoking cessation interventions reduce mortality in the general population (583,584).
In KTRs, there is no reason to believe that the prevention and treatment of tobacco use would be different from that in the general population. In particular, there are no interactions between pharmacotherapies for aiding in tobacco abstinence and immunosuppressive agents that would prevent the use of either in KTRs (Table 25).
| Class | Agent | Comments |
|---|---|---|
| Nicotine replacement | Nicotine gum, inhaler, nasal spray, lozenge and patch | May use in combinations with other nicotine and non-nicotine replacement agents |
| Antidepressant | Bupropion SR | Monitor CsA blood levels and increase CsA dose as needed (585) |
| α4β2 nicotinic receptor partial agonist | Varenicline | Warn patients and monitor for serious neuropsychiatric symptoms including depression and suicidal ideationa |
Cigarette smoking at the time of kidney transplantation has been found to be an independent risk factor for patient survival, graft survival, ischemic heart disease, cerebral vascular disease, PVD and CHF (Table 18) (438,439,442,443,586,587). Smoking has also been found to be associated with posttransplant malignancies (588).
The prevalence of cigarette smoking at the time of transplantation varies between 25% and 50% (438,439,586,588). The prevalence of smoking varies from country to country, likely due to differences in the prevalence of smoking in the general populations of those countries. However, even in countries where the prevalence is relatively low, it is high enough to warrant interventions.
Screening (and counseling) adults for tobacco use is recommended for the general population (572–576). Guidelines in the general population have cited a lack of evidence that screening adolescents and children is effective, although there is likely little harm in including children and adolescents (573). Screening patients includes asking them about their tobacco use history (including start and stop dates), amounts and types of tobacco used and prior interventions. Patients may not admit that they use tobacco, and nicotine levels have been used to identify smokers among KTRs (589). However, there is insufficient evidence for or against the use of laboratory testing to detect tobacco use in KTRs or in the general population.
There is no evidence to suggest when and how often to screen for tobacco use in KTRs. However, there are studies in the general population that indicate screening and intervention during hospitalization is more effective than usual care (575). Therefore, we recommend screening and intervention for patients during the initial hospitalization for kidney transplantation. There is no evidence to suggest the optimal interval after hospitalization for screening and intervention. However, given that initial screening may not be effective, follow-up screening would seem to be prudent. In addition, given the fact that at least some patients who do not use tobacco may begin to use tobacco at some time after transplantation, periodic screening is indicated. The Work Group determined that annual screening is a reasonable minimum frequency.
Self-help alone is generally not adequate for smoking cessation. Both counseling and pharmacotherapy are effective, either alone or in combination. In general, the effectiveness of counseling is proportional to the amount of time spent counseling; however, even counseling for 3 min or less is effective (573). The ‘5 As’ of counseling include: (i) ask about tobacco use, (ii) advise to quit through clear and personalized messages, (iii) assess willingness to quit, (iv) assist quitting and (v) arrange follow-up and support (573).
A number of different pharmacological therapies are effective in increasing the rate of smoking abstinence. There are five nicotine replacement aids and two other medications that have been shown to be effective in RCTs in the general population (Table 25) (580–582). Some of these agents can be used in combination (Table 25).
- • Randomized controlled trials are needed to determine the optimal approach(es) for reducing tobacco use in KTRs.
- 16.4: OBESITY
- 16.4.1: Assess obesity at each visit. (Not Graded)
- • Measure height and weight at each visit, in adults and children.
- • Calculate BMI at each visit.
- • Measure waist circumference when weight and physical appearance suggest obesity, but BMI is <35 kg/m2.
- 16.4.2: Offer a weight-reduction program to all obese KTRs. (Not Graded)
BMI, body mass index; KTRs, kidney transplant recipients.
Obesity in adults is defined, as it is in major guidelines for the general population, as body mass index (BMI) ≥30 kg/m2 (Table 26). Because some individuals may have BMI ≥30 kg/m2 that is not due to excess body fat, it is recommended that the definition of obesity in adults include waist circumference ≥102 cm (≥40 in.) in men and ≥88 cm (≥35 in.) in women.
| Obesity class | BMI (kg/m2) | Disease riska |
|---|---|---|
| Obesity, class 1 | 30.0–34.9 | High |
| Obesity, class 2 | 35.0–39.9 | Very high |
| Extreme obesity, class 3 | ≥40 | Extremely high |
Body mass index can be calculated either as weight in kilograms divided by height in meters squared, or as weight in pounds divided by height in inches squared multiplied by 703 (both methods yielding units kg/m2).
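The two calculation methods described above can be checked against each other; the 703 factor converts lb/in2 to kg/m2 (function names are illustrative):

```python
# BMI computed both ways described above (results in kg/m2).
def bmi_metric(weight_kg, height_m):
    return weight_kg / height_m ** 2

def bmi_imperial(weight_lb, height_in):
    return weight_lb / height_in ** 2 * 703  # 703 converts lb/in2 to kg/m2

print(round(bmi_metric(70, 1.75), 1))   # 22.9
print(round(bmi_imperial(154, 69), 1))  # 22.7 (same person, within rounding)
```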
In children, obesity is generally defined as BMI above the 95th percentile for age and sex. However, this definition is largely based on data from the US Caucasian population, and may be less applicable to other populations. The CDC and the American Academy of Pediatrics recommend using BMI to screen for overweight in children beginning at 2 years of age (http://www.cdc.gov/nccdphp/dnpa/bmi/childrens_BMI/about_childrens_BMI.htm; last accessed March 30, 2009). For children, BMI is used to screen for overweight, risk of overweight, or underweight. However, BMI is not a diagnostic tool in children. For example, a child may have a high BMI for age and sex, but to determine whether excess fat is a problem, a health-care provider would need to perform further assessments. These assessments might include skinfold thickness measurements and evaluations of diet, physical activity, family history and other appropriate health screenings.
The USPSTF found ‘fair evidence’ that BMI is a reasonable measure for identifying children and adolescents who are overweight, or at risk for becoming overweight, and that overweight children and adolescents are at increased risk for becoming obese adults. Therefore, BMI thresholds should be used to define overweight based on percentiles of the general population for age and sex (Table 27) (591).
| Obesity risk | BMI percentilea | Risk |
| --- | --- | --- |
| At risk for being overweight | 85th–94th percentile | Becoming overweight |
| Overweight | ≥95th percentile | Being overweight as an adult |
- • In the general population, there is strong evidence that obesity is a risk factor for CVD events and mortality in adults.
- • In the general population, there are few studies examining the effects of obesity treatment on CVD events or mortality, but there is evidence that the benefits of treating obesity on intermediate outcomes for CVD outweigh harm in adults.
- • In KTRs, obesity is associated with CVD events and mortality.
- • In KTRs, there is little reason to believe that weight reduction measures are not equally effective as in the general population; however, there is some reason to believe that pharmacological and surgical management of obesity may be more likely to cause harm than in the general population.
Observational studies in the general population have shown that obesity is an independent risk factor for CVD (592). Obesity is also associated with a number of risk factors for CVD, including hypertension, dyslipidemias and diabetes (590).
A number of RCTs in the general population have shown that diet may cause modest weight reduction, at least over a period of 12 months. Pharmacological interventions are more effective in weight loss than diet alone, but are associated with more adverse effects. Bariatric surgery is effective, and may improve health outcomes. Guidelines in the general population generally recommend screening and treatment of obesity (http://www.cdc.gov/healthyweight/assessing/bmi/childrens_BMI/about_childrens_BMI.html; last accessed July 27, 2009) (591,593–597).
Observational studies in adult KTRs have reported an association between obesity and mortality, CVD mortality and CHF (Table 18).
Counseling KTRs to follow standard weight-reduction diets, as recommended in guidelines for the general population, is unlikely to cause harm. The effects of pharmacological management of obesity in KTRs are largely unexplored. Anecdotal evidence suggests that bariatric surgery can be performed safely in KTRs and results in weight loss, at least over a relatively short duration of follow-up (598–600).
Small, uncontrolled trials in KTRs suggest that diet and other behavior modifications are safe and help reduce weight over the short term (601,602). There is no evidence that any one diet is more effective than any other. A reasonable goal is to create a caloric deficit of 500–1000 kcal/day. Diets of 1000–1200 kcal/day for women and 1200–1500 kcal/day for men can be effective. Increased physical activity may help to sustain weight reduction and reduce CVD risk independent of weight reduction. Exercise may also be beneficial, although a small RCT in KTRs failed to show that counseling to encourage exercise reduced weight or CVD risk factors at 1 year (603). Nevertheless, exercise capacity increased in this study, and there was no harm associated with exercise.
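As a rough illustration of the 500–1000 kcal/day deficit target, the widely used approximation of about 7700 kcal per kilogram of body fat (roughly 3500 kcal/lb) implies on the order of 0.5–1 kg of weight loss per week. The conversion factor is an assumption for illustration, not a figure from the guideline.

```python
# Common approximation (~3500 kcal per lb of fat); an assumption, not from the guideline.
KCAL_PER_KG_FAT = 7700

def weekly_weight_loss_kg(daily_deficit_kcal: float) -> float:
    """Estimated weekly weight loss for a sustained daily caloric deficit."""
    return daily_deficit_kcal * 7 / KCAL_PER_KG_FAT

for deficit in (500, 1000):
    print(deficit, round(weekly_weight_loss_kg(deficit), 2))  # 0.45 and 0.91 kg/week
```

In practice, actual weight loss is smaller and slows over time as energy expenditure adapts, so this is an upper-bound back-of-the-envelope estimate.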
A large number of RCTs have examined pharmacological interventions for weight loss in the general population. These trials have shown modest weight reduction from medications vs. placebo at 12 months (604). There are few long-term studies, and even fewer that have examined health outcomes. In a 4-year RCT, 52% of patients completed treatment with orlistat, while 34% completed treatment with placebo. Mean weight loss was greater with orlistat (−5.8 kg) than with placebo (−3.0 kg, p < 0.001), and the cumulative incidence of diabetes was 6.2% with orlistat vs. 9.0% with placebo (p = 0.0032). In an RCT comparing the cannabinoid receptor antagonist rimonabant with placebo in 839 patients, rimonabant failed to reduce the primary end point, change in atheroma volume on coronary intravascular ultrasound (605). Of concern are reports of psychiatric adverse effects from rimonabant (606). Altogether, it remains unclear whether the benefits of pharmacological management of obesity outweigh the harm in the general population.
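One way to interpret the diabetes incidences above (6.2% with orlistat vs. 9.0% with placebo) is as an absolute risk reduction and number needed to treat. The sketch below is an illustrative calculation, not an analysis reported by the trial itself.

```python
def absolute_risk_reduction(control_risk: float, treated_risk: float) -> float:
    """ARR: difference in event probability between control and treatment arms."""
    return control_risk - treated_risk

def number_needed_to_treat(control_risk: float, treated_risk: float) -> float:
    """NNT = 1 / ARR: patients treated for one additional patient to benefit."""
    return 1 / absolute_risk_reduction(control_risk, treated_risk)

# Cumulative diabetes incidence over 4 years: 9.0% placebo vs. 6.2% orlistat
arr = absolute_risk_reduction(0.090, 0.062)
print(round(arr, 3))                                   # 0.028
print(round(number_needed_to_treat(0.090, 0.062)))     # ~36
```

That is, roughly 36 patients would need 4 years of treatment to prevent one incident case of diabetes, a useful scale against which to weigh the adverse-effect burden.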
Pharmacological treatment of obesity has not been adequately studied in KTRs. Adverse effects of available agents limit their usefulness in the general population, and these agents are likely to have an even greater potential for harm in KTRs. Orlistat may interfere with the absorption of fat-soluble vitamins, and there have been case reports of an interaction between orlistat and CsA resulting in lower CsA levels (607–609). Studies in the general population have shown that sibutramine can cause weight loss, but adverse effects are common and include increased blood pressure and heart rate (604). There have been no studies of sibutramine in KTRs.
There have been no RCTs examining the long-term effects of bariatric surgery on health outcomes in the general population. Nevertheless, bariatric surgery appears to be more effective than diet in causing weight reduction (610,611). In the largest case-control study to date, gastric bypass, vertical banded gastroplasty or gastric banding caused, respectively, −25%, −16% and −14% weight losses from baseline to 10 years (612). Importantly, there were 129 deaths in the control group and 101 deaths in the surgery group (p = 0.04). The most common cause of death in this study was myocardial infarction (612). In another large observational study, all-cause mortality (p < 0.0001), deaths from diabetes (p = 0.0005) and deaths from coronary artery disease (CAD) (p = 0.006) were lower among 7925 patients who had undergone bariatric surgery compared to 7925 matched controls (613). Thus, it appears that bariatric surgery can produce sustained weight reduction and improve health outcomes.
Guidelines in the general population recommend weight-loss surgery in patients with severe obesity, that is, BMI ≥40 kg/m2, or ≥35 kg/m2 with comorbid conditions. Bariatric surgery may include gastric banding or gastric bypass (Roux-en-Y). Uncontrolled studies suggest that bariatric surgery may be performed safely in selected KTRs (598–600). However, the incidence of complications may also be greater in KTRs (614).
Guidelines in the general population recommend tailoring treatment to the severity of obesity and its comorbidities (Table 28).
|Pharmacotherapy||If there are comorbiditiesb||Yes||Yes||Yes|
|Bariatric surgery||If there are comorbiditiesc||If there are comorbiditiesc||If there are comorbiditiesc|
Childhood obesity in the general population is associated with a higher prevalence of CVD risk factors, such as dyslipidemias, hypertension and diabetes. However, CVD events may take decades to develop. Few studies have examined the safety and efficacy of weight reduction in children or adolescents. The USPSTF concluded that evidence was insufficient to recommend for or against routine screening for obesity in children and adolescents as a means to prevent adverse health outcomes. There are likewise few studies on the treatment of obesity in children and adolescent KTRs; therefore, there is no basis for a different recommendation than for the general population.
- • Additional research is needed to determine the effect of bariatric surgery on outcomes in KTRs.
Chapter 17: Cardiovascular Disease Management
- 17.1: Consider managing CVD at least as intensively in KTRs as in the general population, with appropriate diagnostic tests and treatments. (Not Graded)
- 17.2: We suggest using aspirin (65–100 mg/day) in all patients with atherosclerotic CVD, unless there are contraindications. (2B)
CVD, cardiovascular disease; KTRs, kidney transplant recipients.
The Work Group chose to deal with the prevention of CVD after kidney transplantation, and considered the management of CVD complications to be beyond the scope of this guideline. However, in patients with known CVD, prophylaxis includes aspirin.
- • There is good evidence that atherosclerotic CVD is prevalent in KTRs.
- • There is no reason to believe that the management of complications of atherosclerotic CVD is different in KTRs than in the general population.
- • In the general population, there is strong evidence that aspirin reduces atherosclerotic CVD events in patients with known CVD.
- • There is little reason to believe that the benefits of aspirin would not exceed the harm in KTRs with CVD, as in patients with CVD in the general population.
Randomized controlled trials, and meta-analyses of these trials, have demonstrated that low-dose aspirin is safe and effective in reducing CVD events in patients at high risk for CVD. This has led to several guidelines suggesting that low-dose aspirin should be used in patients with known CVD (secondary prevention) (615–617). The American Heart Association, for example, recommends using aspirin for patients with established coronary and other atherosclerotic vascular disease, including peripheral arterial disease, atherosclerotic aortic disease and carotid disease (616).
In KTRs, there is little reason to believe that low-dose aspirin would not be as effective as it is in the general population. There is some evidence that platelet function is abnormal in KTRs, increasing the risk for thrombosis (618). Some observational data suggest that aspirin is safe in KTRs. In at least one retrospective observational study, the use of aspirin was associated with better graft survival (619). Given the high incidence of CVD in KTRs, the benefits of aspirin prophylaxis may be expected to outweigh risks, principally of bleeding.
Evidence from the general population suggests that aspirin prophylaxis is effective in preventing CVD events in patients at high risk for CVD events, such as patients with known CVD. Most guidelines recommend that patients in the general population with known CVD should receive aspirin prophylaxis unless aspirin is contraindicated. Data for other antiplatelet agents are sparse; however, many guidelines recommend that clopidogrel may be used in patients who cannot take aspirin.
- • An RCT is needed to determine the efficacy and safety of aspirin in KTRs.
Section IV: Malignancy
Introduction: Malignancy Risks After Kidney Transplantation
Kidney transplant recipients from around the world are at greater risk of developing cancer compared to the general population (Table 29). This is especially true for cancers associated with viral infections (e.g. EBV-associated lymphomas). Some cancers are common in the general population and also occur at a higher incidence in KTRs (e.g. colon cancer). Some are common in KTRs because they are common in the general population and have a similar incidence in KTRs (e.g. breast cancer). Others are rare, but occur at a substantially higher rate in KTRs (e.g. Kaposi sarcoma) (620,621). There are also cancers that may cause stage 5 CKD, and are therefore seen more commonly in KTRs (e.g. myeloma and renal cell carcinoma).
| | Common cancersb | Common cancers in transplant population (estimated)c | Rare cancersd |
| --- | --- | --- | --- |
| High SIRe (>5) | Kaposi's sarcoma (with HIV) | Kaposi's sarcomaf | Eye |
| Moderate SIRe (>1–5, p < 0.05) | Lung | Oro-nasopharynx | Melanoma |
| No increased risk showne | Breast | Ovary | |
Cohort studies have demonstrated that the risk of cancer varies with both age and sex: young KTRs have a risk 15–30 times greater than the general population of the same age, whereas the risk is only two times greater for 65-year-old KTRs (625). After the development of cancer, the survival of transplant recipients is poor, and treatment options are limited by the transplant or comorbidities. It is thus important to consider options for preventive measures and screening in KTRs, which can theoretically deliver benefits of lower morbidity and mortality through reduced incidence or early intervention.
Rating Guideline Recommendations
| Grade for strength of recommendation | Wording |
| --- | --- |
| Level 1 | ‘We recommend’ |
| Level 2 | ‘We suggest’ |

| Grade for quality of evidence | Quality of evidence |
| --- | --- |
| A | High |
| B | Moderate |
| C | Low |
| D | Very low |
Chapter 18: Cancer of the Skin and Lip
- 18.1: We recommend that KTRs, especially those who have fair skin, live in high sun-exposure climates, have occupations requiring sun exposure, have had significant sun exposure as a child, or have a history of skin cancer, be told that their risk of skin and lip cancer is very high. (1C)
- 18.2: We recommend that KTRs minimize life-long sun exposure and use appropriate ultraviolet light blocking agents. (1D)
- 18.3: We suggest that adult KTRs perform skin and lip self-examinations and report new lesions to a health-care provider. (2D)
- 18.4: For adult KTRs, we suggest that a qualified health professional, with experience in diagnosing skin cancer, perform annual skin and lip examination on KTRs, except possibly for KTRs with dark skin pigmentation. (2D)
- 18.5: We suggest that patients with a history of skin or lip cancer, or premalignant lesions, be referred to and followed by a qualified health professional with experience in diagnosing and treating skin cancer. (2D)
- 18.6: We suggest that patients with a history of skin cancer be offered treatment with oral acitretin, if there are no contraindications. (2B)
KTRs, kidney transplant recipients.
Skin cancers include basal cell carcinoma, squamous cell carcinoma and malignant melanomas.
Fair-skinned individuals are Caucasians, especially those with blond hair and a light complexion (626).
High sun-exposure climates are found in areas of the world near the Equator and in areas with poor ozone-layer protection.
Appropriate ultraviolet light/sun avoidance includes the use of shade and avoidance of sunlight during peak hours of radiation, wearing protective clothing and the use of ultraviolet light blocking sunscreens.
Skin and lip self-examination is accomplished by close inspection of all skin areas, using a mirror and/or the assistance of a family member, such as a spouse.
Qualified health professionals with experience in diagnosing skin cancer include physicians, physician's assistants or nurse practitioners with experience in diagnosing skin cancer.
Qualified health professionals with experience in diagnosing and treating skin cancer include dermatologists, physicians or surgeons with experience in diagnosis (including skin biopsies and their interpretation) and treatment of skin cancer.
Acitretin has been used at doses between 0.2 and 0.4 mg/kg/day in RCTs to prevent skin cancers.
- • Patients who are at high risk can be identified.
- • Patient behaviors can reduce the risk.
- • Educating patients who are at high risk will encourage them to undertake behaviors that will reduce that risk.
- • Sun exposure is a risk factor for skin cancer.
- • Avoiding sun exposure may reduce the incidence of skin cancer.
- • Self-examination will detect skin cancer at an earlier stage than other measures.
- • Early detection and treatment will reduce the morbidity and mortality of skin cancer.
- • Skin and lip examination by a qualified health professional can detect skin cancer early.
- • Advice to undertake regular skin self-examination is poorly recalled and implemented.
- • Acitretin may reduce the risk for recurrent squamous cell skin cancer in KTRs.
- • Although adverse effects associated with the use of acitretin are common, and often necessitate discontinuing therapy, the benefits may outweigh harm in selected KTRs.
Skin cancers occur with a much higher incidence in KTRs compared to the general population. In addition, risk factors for skin cancers in the general population are also likely to be risk factors for skin cancer in KTRs. These include: fair skin, living in high sun-exposure climates, having occupations with sun exposure, having had significant sun exposure as a child, or having a history of skin cancer (627).
Most measures for reducing the risk of skin cancer (described in guideline statements above) require patient cooperation. Although there are only limited RCT data demonstrating that informing KTRs of their increased risk for skin cancer helps to reduce that risk, the benefits of patient education are very likely to outweigh harm (628).
There is evidence that geographical locations associated with increased sun exposure are associated with increased risk of skin cancer in both KTRs and the general population (629). There is also evidence in the general population that the use of sunscreen reduces the incidence of squamous cell cancer (630). Although there is no evidence in KTRs that avoidance of sun exposure or the use of sun blockers reduces skin cancer, potential benefits likely outweigh harm. Sun can be blocked by staying in shaded environments, wearing protective clothing, a wide-brim hat and sunglasses that block ultraviolet light. There is a concern that use of sunscreens may lead to behaviors which increase total sunlight exposure (631).
It is plausible that self-examination will lead to earlier detection of skin cancer than less frequent skin examinations by health-care providers (632). It is also plausible that early detection will lead to early treatment, and thereby reduce morbidity and mortality. However, skin self-examination has not been shown to be effective in reducing overall cancer-specific mortality and morbidity in either the general population or in KTRs. Nevertheless, since the costs and adverse effects of self-screening are low, the use of education programs to encourage self-examination, especially in areas of high prevalence of skin cancer, is justified.
American (627) and European (633) transplantation professional guidelines recommend skin cancer screening in KTRs, monthly skin self-examination and at least annual total body skin examination by a dermatologist or expert physician (634). The USPSTF concluded that there is insufficient evidence to recommend for or against population skin cancer screening using total skin examination (635), while The American College of Preventive Medicine recommended screening for high-risk individuals (636).
Advice to undertake regular skin examination is poorly recalled and implementation is thus not reliable (637). Nonetheless, in a community-based RCT of regular skin screening, the intervention group reported considerably higher rates of performance (638). Visual inspection by KTRs is also likely not to be as reliable for detecting skin cancer as regular skin examinations by qualified health professionals. Studies in the general population have shown that individuals with adequate training and experience, for example dermatologists, detect skin cancer earlier than general practitioners (639). General practitioners with experience may perform as well as dermatologists in some areas (640). In the absence of experienced general practitioners, resources may be insufficient to allow KTRs to be seen annually by a dermatologist. Therefore, a strategy that combines primary screening with referral of suspicious lesions to a dermatologist may be most cost-effective. Patients who have had a skin cancer are much more likely to develop a second lesion than patients with no history of skin cancer (641). Therefore, patients who have had a skin cancer are more likely to benefit from regular screening by a dermatologist, or health-care professional with comparable training. Early diagnosis and removal of skin cancers is essential to reduce disfiguring surgery and to prevent mortality from advanced or metastatic lesions.
There is a paucity of RCT data assessing whether the benefits of altering the immunosuppressive medication regimen to reduce the incidence of skin cancer outweigh harm. For example, in a recent RCT, KTRs 10–15 years after transplant were randomly allocated to convert CNI to sirolimus (N = 555) vs. remaining on CNI (N = 275) (119). At 2 years of follow-up, 12 (2.2%) in the conversion group vs. 21 (7.7%) in the CNI group had investigator-reported skin cancer (p < 0.001). However, the number of adverse effects in the sirolimus conversion arm was higher than those in the CNI control arm. Indeed, the Drug Safety Monitoring Board halted enrollment for patients with eGFR 20–40 mL/min/1.73 m2 early, because in this stratum (N = 77) the composite safety end point (first occurrence of biopsy-proven acute rejection, graft failure or death) was significantly higher in the conversion vs. the control group (119). The Work Group concluded that it remains unclear whether there is a high-risk population of KTRs in which benefits from converting one immunosuppressive regimen to another to reduce skin cancer outweigh harm.
In three RCTs, which together included a total of 93 KTRs (10–15 years after transplant), those treated with acitretin for 6–12 months demonstrated a reduction in the rate of formation of new skin cancers compared to untreated controls, with no differences between doses of 0.2 and 0.4 mg/kg/day (642). In these trials, several individuals had adverse effects attributed to therapy (642); however, these adverse effects generally resolved upon discontinuation of treatment. Adverse effects that resulted in treatment withdrawal included: headache (N = 3), dyslipidemia (N = 2), musculoskeletal complaints (N = 2) and skin rash (N = 2). In addition, the duration of treatment and follow-up were relatively short in these trials. Altogether, the Work Group concluded that there is moderate-quality evidence that there are tradeoffs to prophylaxis with acitretin (see Evidence Profile in Supporting Table 52 at http://www3.interscience.wiley.com/journal/118499698/toc); some KTRs may consider that the benefits of treatment outweigh the harm.
- • An RCT is needed to better define the optimal dose and the benefits and harm of acitretin to prevent recurrent skin cancer in KTRs.
Chapter 19: Non-Skin Malignancies
- 19.1: Develop an individualized screening plan for each KTR that takes into account the patient's past medical and family history, tobacco use, competing risks for death, and the performance of the screening methodology. (Not Graded)
- 19.2: Screen for the following cancers as per local guidelines for the general population (Not Graded):
- • Women: cervical, breast and colon cancer;
- • Men: prostate and colon cancer.
- 19.3: Obtain hepatic ultrasound and alpha-fetoprotein every 12 months in patients with compensated cirrhosis. (Not Graded) [See Recommendations 13.5.4 (HCV) and 13.6.5 (HBV).]
HBV, hepatitis B virus; HCV, hepatitis C virus; KTR, kidney transplant recipient.
Screening for cancer has both benefits and harm. In KTRs with multiple comorbidities, it is essential to consider the extent and magnitude of potential harm, so it can be weighed against the risks of disease and benefits of early detection. There is good reason to believe that screening test performance, harm from interventions and the life years to be gained by early intervention may be substantially different in KTRs compared to that in the general population. Hence, careful individual appraisal needs to be exercised when making recommendations for screening of KTRs (643).
In general, the better the individual's prognosis, the higher the risk of disease, and the lower the risk of harm from screening, the greater the chance of benefit (644). If, on the other hand, the individual has a poor prognosis from cardiac or other comorbidity, the risk of the disease being screened for is not high, and the harm from screening is significant, screening is harder to justify. For example, cervical cancer screening of an unvaccinated 45-year-old patient with a well-functioning kidney allograft and no comorbidities is easier to recommend than fecal occult blood testing (FOBT) and subsequent colonoscopy in a 69-year-old patient with type II diabetes and severe CAD. The likely incidence of disease needs to be taken into account, as well as the standardized incidence ratio (SIR) as shown in Table 29, since the two factors taken together define the likely risk of any given disease in an individual KTR. Unfortunately, there are no RCTs on screening for cancer in KTRs.
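The interplay of SIR and baseline incidence can be sketched as follows. The rates used are hypothetical, chosen only to show that a large SIR applied to a rare cancer can still yield a lower absolute risk in KTRs than a modest SIR applied to a common cancer.

```python
def standardized_incidence_ratio(observed_cases: float, expected_cases: float) -> float:
    """SIR: observed cases in the cohort / cases expected from general-population rates."""
    return observed_cases / expected_cases

def ktr_rate_per_100k(general_rate_per_100k: float, sir: float) -> float:
    """Approximate absolute incidence in KTRs: baseline rate scaled by the SIR."""
    return general_rate_per_100k * sir

# Hypothetical illustration: rare cancer with SIR 10 vs. common cancer with SIR 2.
rare = ktr_rate_per_100k(2.0, 10.0)    # 20 per 100 000
common = ktr_rate_per_100k(60.0, 2.0)  # 120 per 100 000
print(rare, common)
```

This is why the text stresses considering both factors together: a high SIR alone does not make a cancer the dominant absolute risk for an individual KTR.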
- • Comorbidities and competing risks in KTRs may influence the potential benefits and harm from screening for some cancers.
- • The decision to screen for cancer should be individualized.
Screening for cervical cancer
- • In the general population, there is good evidence that the benefits of screening outweigh harm.
- • In KTRs, cervical cancer is more common than in the general population, and screening may therefore be more beneficial.
- • In KTRs with quality of life and life expectancy not greatly reduced from that of general population, the benefits of screening may outweigh harm.
- • In the general population, there is evidence that the benefits of human papillomavirus (HPV) vaccination outweigh harm.
- • In KTRs, although vaccination may be less effective, there is little reason to believe that benefits would not outweigh harm.
Initiation of screening for cervical cancer is recommended for women within 3 years of onset of sexual activity or age 21 (whichever comes first) in order to detect malignant lesions resulting from persistent human papillomavirus (HPV) infection (http://www.ahrq.gov/clinic/uspstf/uspscerv.htm; last accessed July 17, 2009) (645). Cervical cancer is more common, may develop more rapidly and may be more aggressive in immunosuppressed patients (646,647), suggesting that KTRs should be screened more frequently (648). American and European transplant guidelines recommend annual screening for cervical cancer with pelvic examination and Pap smear (627,633). Use of HPV DNA testing has not achieved widespread acceptance (http://www.ahrq.gov/clinic/uspstf/uspscerv.htm; last accessed July 17, 2009). Screening for cervical cancer also provides an opportunity to inspect the anal, vaginal and vulvar regions for cancers that are also increased in female KTRs. The cost of cervical cancer screening in KTRs is modeled at US$ 12 000 per life-year saved, comparable to the general population (US$ 25 000–50 000 per life-year saved) (649).
In the general population, there is strong evidence that the benefits of vaccination outweigh harm, but the longest duration of follow-up is 52 months at present. HPV vaccination of girls prior to exposure to HPV infection (for the oncogenic strains 16 and 18, which account for approximately 70% of cervical cancers, and for the wart-causing strains 6 and 11) has been adopted in a number of countries (650,651). The vaccine is inactivated and could thus be used both prior to transplantation and in KTRs, but there is no evidence for effectiveness or safety in immunosuppressed patients.
Screening for breast cancer
- • In the general population, there is weak evidence that the benefits of screening outweigh harm.
- • In KTRs, the incidence of breast cancer is similar to that in the general population.
- • In KTRs with quality of life and life expectancy similar to that of general population, the benefits of screening may outweigh harm.
Mammography for women in the general population ages 50–74 decreases breast cancer mortality by 23% (95% CI 13–31%) (652,653). The incidence of breast cancer is very similar in both the general population and in KTRs. There are no RCTs or studies on which to base advice for or against breast cancer screening in KTRs. The two factors that might influence the decision to screen are screening test performance and potential life-years saved from intervention. American and European transplant guidelines recommend screening in KTRs between 50 and 69 years with an option to screen above the age of 40 years (627,633). Test accuracy for mammography varies with the best results in older women, and the worst results in younger women. Consideration should also be given to the potential physical and emotional harm from false-positive and false-negative screening tests. Models of screening for breast cancer in KTRs suggest that it is cost-effective in nondiabetic Caucasians (654).
Screening for prostate cancer
- • In the general population, there is little evidence that the benefits of screening outweigh harm.
- • In KTRs, the incidence of prostate cancer is similar to that in the general population.
- • In KTRs, it is unclear whether the benefits of screening outweigh harm.
Screening for prostate cancer, using prostate-specific antigen (PSA) and/or digital rectal examination, is controversial in the general population. The most recent recommendation from the USPSTF is to avoid screening men 75 years or older (http://www.ahrq.gov/clinic/uspstf/uspsprca.htm; last accessed July 17, 2009). They also concluded that there was insufficient evidence to assess the balance of benefits and harm of screening men younger than 75 years (http://www.ahrq.gov/clinic/uspstf/uspsprca.htm; last accessed July 17, 2009). The incidence of prostate cancer in KTRs is similar to that in the general population, and because prostate cancer is one of the commonest cancers in males, the absolute risk is high (Table 29). However, there are no data on screening test performance or benefits in KTRs, and there is good reason to believe that the performance of PSA testing may be different in KTRs compared to the general population. No advice is thus given for or against screening for prostate cancer in KTRs, beyond following local recommendations/standards for prostate cancer screening in the general population.
Screening for colorectal cancer
- • In the general population, there is good evidence that the benefits of screening outweigh harm for individuals age 50 years and older.
- • In KTRs, the incidence of colon cancer is increased compared to the general population, especially among KTRs less than 50 years of age.
- • In KTRs, there are reasons to believe that FOBT may be less specific for colon cancer than in the general population, but there is no evidence to believe that colonoscopy is less sensitive or specific.
Studies in the general population have demonstrated that the benefits of screening generally outweigh the harm (655–658). Guidelines for the general population in Australia/New Zealand, the US and in Europe, recommend screening individuals 50 years and older, using annual FOBT and/or colonoscopy (655). The standardized incidence of colorectal cancer is increased in KTRs compared to the general population, and there is good evidence that colon cancer occurs at a younger age in KTRs compared to the general population (Table 29). American and European transplantation guidelines recommend screening either at age 50 years, or at the age at which it is recommended in the general population in each country (627,633).
Screening with FOBT may be less specific in KTRs, given that the incidence of positive tests from CMV infection and drug toxicities may be high. The harms of colonoscopy must be carefully considered for each individual based upon their comorbidities, since immunosuppression may worsen the consequences of potential complications of colonoscopy. In the absence of data on the benefits and harm of screening KTRs for colon cancer, it is suggested that screening be performed as currently recommended for the general population, with careful individual risk–benefit analysis based upon overall prognosis and comorbidities. A recent analysis suggests that the benefits may outweigh the harm of screening KTRs aged 35–50 years (659).
Screening for hepatocellular cancer
- • In KTRs, the risk of hepatocellular carcinoma is higher than in the general population.
- • In the general population, there is no evidence that the benefits of screening outweigh harm.
There are screening recommendations in high-risk groups (patients with cirrhosis and those who are hepatitis B carriers) that include abdominal ultrasound and alpha-fetoprotein testing every 6–12 months (660–662). Testing every 6 months is based on the estimated doubling time of this tumor (660). The Work Group chose a 12-month testing interval, given uncertainties about the benefits and harm of testing. Both tests have limited specificity and sensitivity (663). Although questionnaire surveys in the United States report that about 50% of gastroenterologists screen high-risk patients (664,665), the interventions have significant risks and no RCTs have demonstrated survival benefits. There have been several cost-effectiveness studies, but the conclusions have varied widely, from very cost-effective to values exceeding US$ 250 000 per quality-adjusted life-year (666). The US National Cancer Institute does not recommend screening (http://www.cancer.gov/cancertopics/pdq/screening/hepatocellular/healthprofessional/page2; last accessed July 17, 2009), largely because of concern about uncommon but significant harm due to invasive testing after false-positive screening. There have been two large population-based RCTs in Asia in HBV-infected subjects. The larger study showed some benefit but was of poor quality, and the second showed no benefit (667,668).
The highest-risk group of KTRs with otherwise good prognosis comprises those with compensated cirrhosis and chronic viral hepatitis, especially HBV (669). Given that the benefits are inconclusive in high-risk nontransplant patients, the recommendation of the US National Cancer Institute is not likely to differ in KTRs.
Screening for renal cell cancer
- • In KTRs, the incidence of renal cell carcinoma is much higher than in the general population; however, there is no evidence that the benefits of screening outweigh harm.
Screening is not generally recommended in the general population. Both relative and absolute risks of renal cell cancer are substantially increased in KTRs compared to the general population. Although there is no good evidence that mortality is reduced, several United States, European and Asian centers are screening for renal cell carcinoma after transplant (670–672). The rate of renal cell carcinoma (number per years of follow-up) is difficult to determine from these reports, but appears to vary considerably. Two important risk factors for renal cell carcinoma in these reports were prior renal cell carcinoma and the presence of acquired cystic disease. A medical decision analysis conducted several years ago, predominantly in dialysis patients with low expected survival rates, determined that the benefits of routine screening would be low (673). Screening will likely detect many unimportant lesions that will require further investigation, treatment and thus possible harm. Nonetheless, significant benefits could accrue to higher-risk transplant recipients with better-than-average life expectancy. Patients with prior renal cell cancer are at risk of both recurrence and new primaries, irrespective of whether they have been transplanted. Some diseases, such as analgesic nephropathy, tuberous sclerosis and acquired cystic disease are associated with an increased risk of renal cell carcinoma. The American Society of Transplantation guidelines found no evidence to advise screening with either imaging or urine cytology (627).
- • Observational studies are needed to better define age-specific SIR for most cancers, with preliminary analyses suggesting that younger KTRs have a greatly increased SIR compared to older KTRs.
- • Studies on the performance of FOBT in KTRs would help determine its potential role for screening KTRs.
- • A RCT should be performed to assess the benefits and harm of screening vs. no screening for renal cell carcinoma. Preliminary data are needed to define mortality rates from renal cell carcinoma after transplantation, and determine age-specific SIR, since analyses suggest that younger KTRs have a greatly increased SIR in comparison to older KTRs.
Chapter 20: Managing Cancer with Reduction of Immunosuppressive Medication
- 20.1: We suggest consideration be given to reducing immunosuppressive medications for KTRs with cancer. (2C)
- 20.1.1: Important factors for consideration include (Not Graded):
- • the stage of cancer at diagnosis;
- • whether the cancer is likely to be exacerbated by immunosuppression;
- • the therapies available for the cancer;
- • whether immunosuppressive medications interfere with ability to administer the standard chemotherapy.
- 20.2: For patients with Kaposi sarcoma, we suggest using mTORi along with a reduction in overall immunosuppression. (2C)
KTRs, kidney transplant recipients; mTORi, mammalian target of rapamycin inhibitor(s).
- • In KTRs, cancers that have a high or moderately increased SIR (e.g. ≥3.0) are likely caused or exacerbated by immunosuppressive medication.
- • In KTRs who develop cancers likely to be caused or exacerbated by immunosuppressive medication, reducing immunosuppressive medication may prolong survival.
- • In KTRs, cancers that have a low SIR (e.g. ≤1.5) are unlikely to have been caused or to be exacerbated by immunosuppressive medication.
- • In KTRs who develop cancers that are unlikely to be caused or exacerbated by immunosuppressive medication, reducing immunosuppressive medication is less likely to have a significant effect on survival, and may increase the risk for acute rejection.
- • Reduced quality of life from graft loss must be balanced against the potential for prolonging survival by reducing immunosuppression.
- • Reducing immunosuppressive medications may reduce complications of cancer chemotherapy.
- • In KTRs with Kaposi sarcoma, dramatic reductions in lesion size have been associated with a change in immunosuppressive medication to mTORi.
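The SIR-based reasoning above can be sketched as a simple decision aid. This is purely illustrative: the function name and the "indeterminate" middle band are this example's own, and the thresholds are the guideline's example values (e.g. ≥3.0, ≤1.5), not fixed clinical cutoffs.

```python
def immunosuppression_linkage(sir: float) -> str:
    """Classify a cancer's likely relationship to immunosuppression from
    its standardized incidence ratio (SIR) in KTRs, using the example
    thresholds quoted in this chapter (hypothetical helper)."""
    if sir >= 3.0:
        # High SIR: likely caused or exacerbated by immunosuppressive medication
        return "likely caused or exacerbated by immunosuppression"
    if sir <= 1.5:
        # Low SIR: reducing immunosuppression is unlikely to affect survival
        return "unlikely to be caused or exacerbated by immunosuppression"
    return "indeterminate"
```

Any such output would still need to be weighed against cancer stage, available therapies, and the quality-of-life cost of potential graft loss, as the recommendations above emphasize.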
In KTRs, non–renal cell cancers that have a high SIR (e.g. ≥3.0) are likely caused or exacerbated by immunosuppressive medication. There is strong evidence that immunosuppressive medication increases the risk of some specific types of cancer, notably cancer that may be caused by viruses (Table 30). There is little evidence that specific immunosuppressive agents are more likely than others to increase the risk of cancer. It is more likely that the total amount of immunosuppressive medication increases the risk for cancer, rather than the type of immunosuppressive medication per se. Observational data have suggested that there is an association between PTLD and the use of biological anti–T-cell agents (674). There is evidence from post hoc analysis of RCTs that there was a reduction in cancer incidence in sirolimus treatment arms (119,675). However, the numbers of patients developing cancer were small, and the post hoc nature of the analysis increases the possibility that the results were due to chance.
**Table 30:** Virus-associated malignancy sites/types of cancer

| Virus | Sufficient evidence | Limited evidence |
|---|---|---|
| HBV and HCV | Liver | |
| Human T-cell lymphotropic virus type 1 | Non-Hodgkin lymphoma | |
| Human herpes virus 8 | Kaposi sarcoma | |
| EBV | Nasopharynx, non-Hodgkin lymphoma, Hodgkin lymphoma | |
| HPV | Tongue, mouth, tonsil, anus, vagina, cervix, penis | Nonmelanoma skin, larynx |
The decision to reduce immunosuppressive medications in KTRs diagnosed with cancer is a difficult one. There is evidence that the risk of de novo cancer returns to pretransplant levels after graft failure (676–679), suggesting that reducing immunosuppressive medication may be warranted. Experimental studies have demonstrated the specific capacity of CNIs to increase metastasis (680). Clinical studies have implicated antiproliferative agents in an increased, and mTORi in a relatively reduced, cancer risk. However, there have been no RCTs testing the effects of reducing or withdrawing immunosuppressive medications in posttransplant cancer, and it is possible that established cancer and de novo cancer behave differently under the influence of immunosuppression. The standard established treatment for PTLD and Kaposi sarcoma includes reducing immunosuppression, and this has proven sufficient to control or eliminate tumors in some KTRs (681).
The decision to reduce or withdraw immunosuppressive medication must also balance quality of life with and without a functioning transplant, if cessation of medication results in graft rejection. Altogether, evidence suggests that consideration should be given to reducing immunosuppressive medications in each individual, but since this evidence is weak, the type of cancer, stage of disease, and patient preferences should be taken into account.
In KTRs, cancers that have a low SIR (e.g. <3.0) are unlikely to be caused or exacerbated by immunosuppressive medication. In contrast to those cancers in which the SIR is elevated in immunosuppressed KTRs, cancers for which there is no evidence of an increased risk from immunosuppression offer no rationale for reducing or ceasing therapy.
In KTRs who develop cancers that are unlikely caused or exacerbated by immunosuppressive medication, reducing immunosuppressive medication will likely have little effect on survival, and may increase the risk for acute rejection. There are no data to support or refute altering immunosuppression after development of cancer of the prostate, breast, ovary, uterus, pancreas, brain glioma or testis. However, many of the complications of cancer chemotherapy are also complications of immunosuppressive agents used in KTRs, and reducing immunosuppressive medications to prevent or treat complications of chemotherapy is warranted.
Several case series in patients with established Kaposi sarcoma have demonstrated benefits from conversion from standard immunosuppression to either sirolimus or everolimus. Cases with disease limited to the skin have had resolution of the skin lesions, while the responses of disseminated solid-organ invasive disease have been less convincing (682,683). The strong benefit seen in these case series, together with experimental data and a clear scientific rationale for efficacy through inhibition of vascular endothelial growth factor receptors, has led to the conclusion that patients with Kaposi sarcoma should preferentially be immunosuppressed with these agents. On the other hand, there are also case series that have shown regression of Kaposi sarcoma with a reduction in immunosuppressive medication alone (684).
Section V: Other Complications
Rating Guideline Recommendations
| Level | Implication |
|---|---|
| Level 1 | ‘We recommend’ |
| Level 2 | ‘We suggest’ |

| Grade for quality of evidence | Quality of evidence |
|---|---|
| A | High |
| B | Moderate |
| C | Low |
| D | Very low |
Chapter 21: Transplant Bone Disease
(See KDIGO Clinical Practice Guideline for the Diagnosis, Evaluation, Prevention and Treatment of Chronic Kidney Disease–Mineral and Bone Disorder [CKD–MBD].)
- 21.1: In patients in the immediate post-kidney transplant period, we recommend measuring serum calcium and phosphorus at least weekly, until stable. (1B)
- 21.2: In patients after the immediate post-kidney transplant period, it is reasonable to base the frequency of monitoring serum calcium, phosphorus and PTH on the presence and magnitude of abnormalities, and the rate of progression of CKD. (Not Graded)
- 21.2.1: Reasonable monitoring intervals would be (Not Graded):
- • In CKD stages 1–3T, for serum calcium and phosphorus, every 6–12 months; and for PTH, once, with subsequent intervals depending on baseline level and CKD progression.
- • In CKD stage 4T, for serum calcium and phosphorus, every 3–6 months; and for PTH, every 6–12 months.
- • In CKD stage 5T, for serum calcium and phosphorus, every 1–3 months; and for PTH, every 3–6 months.
- • In CKD stages 3–5T, measurement of alkaline phosphatases annually, or more frequently in the presence of elevated PTH.
- 21.2.2: In CKD patients receiving treatments for CKD–MBD, or in whom biochemical abnormalities are identified, it is reasonable to increase the frequency of measurements to monitor for efficacy and side effects. (Not Graded)
- 21.2.3: It is reasonable to manage these abnormalities as for patients with CKD stages 3–5. (Not Graded)
- 21.3: In patients with CKD stages 1–5T, we suggest that 25(OH)D (calcidiol) levels might be measured, and repeated testing determined by baseline values and interventions. (2C)
- 21.4: In patients with CKD stages 1–5T, we suggest that vitamin D deficiency and insufficiency be corrected using treatment strategies recommended for the general population. (2C)
- 21.5: In patients with an eGFR greater than approximately 30 mL/min/1.73 m2, we suggest measuring BMD in the first 3 months after kidney transplant if they receive corticosteroids or have risk factors for osteoporosis as in the general population. (2D)
- 21.6: In patients in the first 12 months after kidney transplant with eGFR greater than approximately 30 mL/min/1.73 m2 and low BMD, we suggest that treatment with vitamin D, calcitriol/alfacalcidol, or bisphosphonates be considered. (2D)
- 21.6.1: We suggest that treatment choices be influenced by the presence of CKD–MBD, as indicated by abnormal levels of calcium, phosphorus, PTH, alkaline phosphatases, and 25(OH)D. (2C)
- 21.6.2: It is reasonable to consider a bone biopsy to guide treatment, specifically before the use of bisphosphonates due to the high incidence of adynamic bone disease. (Not Graded)
- 21.6.3: There are insufficient data to guide treatment after the first 12 months. (Not Graded)
- 21.7: In patients with CKD stages 4–5T, we suggest that BMD testing not be performed routinely, because BMD does not predict fracture risk as it does in the general population and BMD does not predict the type of kidney transplant bone disease. (2B)
- 21.8: In patients with CKD stages 4–5T with a known low BMD, we suggest management as for patients with CKD stages 4–5 not on dialysis. (2C)
25(OH)D, 25-hydroxyvitamin D; BMD, bone mineral density; CKD, chronic kidney disease; CKD–MBD, chronic kidney disease–mineral and bone disorder; eGFR, estimated glomerular filtration rate; KDIGO, Kidney Disease: Improving Global Outcomes; PTH, parathyroid hormone.
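The stage-dependent monitoring intervals in Recommendation 21.2.1 can be summarized as a lookup sketch. The function name and return layout are this example's own, not part of the guideline; intervals are in months.

```python
def ckd_mbd_monitoring(ckd_stage: int) -> dict:
    """Suggested monitoring intervals (months) after the immediate
    post-kidney transplant period, per Recommendation 21.2.1
    (illustrative helper, not a guideline artifact)."""
    if ckd_stage in (1, 2, 3):
        plan = {"calcium_phosphorus": (6, 12),
                "pth": "once; repeat per baseline level and CKD progression"}
    elif ckd_stage == 4:
        plan = {"calcium_phosphorus": (3, 6), "pth": (6, 12)}
    elif ckd_stage == 5:
        plan = {"calcium_phosphorus": (1, 3), "pth": (3, 6)}
    else:
        raise ValueError("CKD stage must be 1-5")
    if ckd_stage >= 3:
        # Stages 3-5T: alkaline phosphatases annually, more often if PTH elevated
        plan["alkaline_phosphatases"] = "annually; more often if PTH elevated"
    return plan
```

Per Recommendation 21.2.2, any of these intervals would be shortened in patients receiving CKD–MBD treatment or with identified biochemical abnormalities.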
We largely deferred to the KDIGO CKD–MBD Guideline that is pertinent to KTRs (684a). We reviewed these recommendations, but did not conduct independent evidence reviews.
- • The risk of fractures following kidney transplantation is high.
- • It is not clear how to identify KTRs who might benefit from treatment.
- • Bone disease is multifactorial, and most KTRs have preexisting CKD–MBD.
- • In non-KTRs, low bone mineral density (BMD) or a loss of BMD predicts fractures, but data are scant for KTRs.
- • No RCTs in KTRs have examined bone-specific therapies on patient-level outcomes, including mortality or fractures.
- • Treatment with calcium, calcitriol or vitamin D analogs, and/or bisphosphonates has been suggested to improve BMD in KTRs.
- • A small study of calcitriol demonstrated worsened bone turnover, but improved mineralization.
- • A small study of treatment with bisphosphonates demonstrated worsening bone turnover and mineralization.
- • There are insufficient data to suggest any bone-specific therapies after the first year of kidney transplantation.
CKD–MBD is common in KTRs. Most KTRs have some degree of CKD, and thus CKD–MBD may be present. Transplant-specific therapies, especially corticosteroids, may further affect CKD–MBD management. Biochemical abnormalities are common after transplantation. The scope and magnitude of the biochemical abnormalities of CKD–MBD fluctuate early, compared to late after transplantation. Posttransplant bone disease represents an important complication observed in a substantial proportion of patients, but the etiology and pathology vary. Early studies have demonstrated a rapid decrease in BMD in the first 6–12 months after successful kidney transplantation, and continued loss, albeit at a lower rate, for many years (685). Fractures are common and are associated with substantial morbidity.
The etiology of transplant bone disease is multifactorial. Patients come to transplantation with preexisting CKD–MBD. In addition, there are potentially deleterious effects of immunosuppressive agents (see Supporting Table 53 at http://www3.interscience.wiley.com/journal/118499698/toc), impaired kidney function, and other factors, such as postmenopausal status, presence of diabetes, smoking, physical inactivity and duration of CKD stage 5 (686). Previous studies in KTRs have shown a correlation between the cumulative dose of glucocorticoids and BMD. Based on a few bone biopsy studies in KTRs, glucocorticoids appear to be the primary determinant of subsequent bone volume and turnover. Thus, the cumulative and mean prednisone dose correlated negatively with bone turnover, whereas there was no correlation with CsA cumulative dose or serum parathyroid hormone (PTH) (687). The possible role of CNIs remains incompletely studied, with contradictory reports on their effects on bone turnover (687).
Arterial calcification is also common after a kidney transplant, but it may be due to the effects of the uremic state and dialysis rather than the transplant itself. In KTRs (CKD stages 1–5T), only one prevalence study was identified, demonstrating a prevalence of calcification of 24.4% (444). Although this cross-sectional study was large (n = 1117), calcification was assessed by posteroanterior plain abdominal X-ray examination of the aorto-iliac region, which is likely to be less sensitive than computed tomography-based imaging. In addition, one of the major difficulties in interpreting calcification in the transplant population is the carryover effect from CKD stage 5 or stage 5D. Currently, only one preliminary study is available suggesting that the progression of cardiovascular calcification may be halted after renal transplantation (688).
KTRs who develop persistently low levels of serum phosphorus (<1.0 mmol/L) should be considered for treatment with phosphate supplementation. However, phosphate administration is not without risk, and caution should be exercised, as it may exacerbate existing secondary hyperparathyroidism. Therefore, the minimum effective dose should be prescribed.
Although no clinical trials have specifically addressed the frequency of monitoring in KTRs, KTRs usually have CKD, and therefore are likely to have CKD–MBD. Thus, the management of the biochemical abnormalities of CKD–MBD after transplant should be similar to that proposed for nontransplant CKD and based on the prevalence of abnormalities, and the risks associated with those abnormalities.
A recent study of 303 KTRs in the United States found that 11–25% had abnormal calcium or calcium × phosphorus product in the first year following transplant, and 24% with eGFR 40–60 mL/min/1.73 m2 had intact PTH levels >130 pg/mL (130 ng/L) at 1 year after kidney transplantation (689). Another series from the UK (690) evaluated 244 KTRs; 104 in the first year, and the remainder more than 1 year after transplant. Hypercalcemia was present in 40% of recently transplanted recipients and 25% of long-term patients. Vitamin D insufficiency (40–75 nmol/L) was present in 29% and 43%, deficiency (12–39 nmol/L) in 56% and 46%, and severe deficiency (<12 nmol/L) in 12% and 5%, respectively. A larger cohort from Switzerland (691) evaluated 823 KTRs, on average 7 years after transplantation. They found only 27% had a PTH within normal range (i.e., 15–65 pg/mL [15–65 ng/L]), whereas 70% had hyperparathyroidism (PTH >65 pg/mL [65 ng/L]), and 2.8% were hypoparathyroid (PTH <15 pg/mL [15 ng/L]). Serum phosphorus was normal in 74% (0.85–1.45 mmol/L), and increased in only 3.6%. Finally, serum calcium was normal in most patients (85.9%), with only 2.8% and 11.3% being hypo- and hypercalcemic, respectively. Thus, disorders of mineral metabolism may persist many years after transplantation.
There are few data describing the risk relationship of biochemical abnormalities of CKD–MBD and mortality in KTRs. A study of 773 KTRs found no relationship between serum calcium, phosphorus or PTH and mortality (692). However, patients with the highest quintile of phosphorus had increased risk of kidney allograft loss. Similarly, those with the highest quintile of calcium had an increased risk of kidney allograft loss.
Hypercalcemia following kidney transplantation is common and is usually due to hyperparathyroidism that persists from the preceding period of CKD. In 30–50% of KTRs, abnormal PTH secretion persists, causing hypercalcemia that may require parathyroidectomy (693–696). The same principles for managing patients with CKD stages 3–5 with CKD–MBD will apply for patients with CKD stages 3–5T.
Studies demonstrating that low BMD, or loss of BMD, predicts fractures are lacking in KTRs. Reductions in BMD have been associated with an increased fracture rate in studies of osteoporosis in postmenopausal women, in men, in patients treated with glucocorticoids, and in heart or liver transplant recipients (697). However, the etiology of posttransplant bone disease is likely influenced by pretransplant CKD–MBD, and by ongoing CKD–MBD following transplantation, given that most patients have some degree of CKD. Thus, studies in the general population and other solid-organ transplant recipients may not be applicable to KTRs.
Trials evaluating vitamin D as preventive therapy assessed changes in BMD as the primary outcome. In two studies, an increase in BMD was observed with calcitriol and alfacalcidol vs. ‘no treatment’ or placebo (698,699). Except for mild hypercalcemia in the study by Josephson et al. (700), there were few adverse effects. Unfortunately, there are no RCTs examining beneficial or harmful effects of bone-protective agents on patient-level outcomes, for example fractures, hospitalizations or mortality.
Two studies have evaluated bisphosphonates in KTRs. Coco et al. (701) studied KTRs who received intravenous pamidronate at baseline, 1, 2, 3 and 6 months after transplantation. A rapid decrease of lumbar spine BMD was prevented in the pamidronate group. No changes in hip BMD were observed. There were no differences in the number of fractures between the groups after 1 year. Bone biopsies were done at the time of transplantation in 21 patients and in 14 patients after 6 months, six in the pamidronate group and eight in the control group (701). The mean activation frequency after 6 months was significantly lower in the pamidronate-treated patients than in the controls. All of the pamidronate-treated patients had adynamic bone disease on the 6-month biopsy; four patients with initial hyperparathyroidism and one with mixed uremic osteodystrophy developed adynamic disease. In the control group, three of eight had adynamic bone disease. Bone turnover improved in five of eight (62%) of controls and in none of the pamidronate biopsies. It worsened in one control biopsy (12%) and in five of six (83%) of pamidronate biopsies. Overall, the histology shows development of adynamic bone disease in the pamidronate-treated patients, but the results are limited by small numbers and short follow-up time. It is also not clear if the potential benefit from preserving bone volume outweighs the potential harm of decreased bone formation and/or prolonged mineralization.
Grotz et al. (702) evaluated intravenous ibandronate at baseline and 3, 6, and 9 months after transplantation. Loss of trabecular and cortical bone assessed by BMD was prevented by ibandronate. Fewer vertebral deformities by X-ray were observed in the ibandronate group compared to the controls. No significant side effects or decreased GFR were reported.
Overall, the quality of the preventive studies with bisphosphonates was ranked as moderate. Some of the studies showed limited fracture data and/or bone biopsy information. The observation in the study by Coco et al. that patients showed early evidence of and progression to adynamic bone disease should raise caution about the use of bisphosphonates in KTRs.
Only one RCT in KTRs late after transplantation evaluated the effect of calcitriol plus calcium carbonate vs. no treatment (703). This study enrolled 45 patients, with only 30 completing the trial. Bone biopsies were an evaluated end point. Although BMD improved significantly after 1 year within the treatment group, no differences were observed between the treatment and nontreatment groups. No fracture data were reported. Thus, the overall quality of the evidence is low. Bone biopsy results showed that bone turnover was better in 43% of the control biopsies and 12% of the calcitriol biopsies, but worse in 28% of the control biopsies and 50% of the calcitriol biopsies. No adverse effects were recorded.
Only one randomized comparison trial examined the effect of bisphosphonates in long-term KTRs with established osteopenia or osteoporosis. Jeffery et al. evaluated 117 patients with reduced BMD (T score ≤−1). Patients were randomized to daily oral alendronate and calcium vs. calcitriol and calcium (704). One year of therapy was completed by 90 patients. Both treatments showed significant increases in lumbar spine and femur BMD. No differences between groups were demonstrated.
Special considerations in children
In a four-arm study of 60 pediatric KTRs, alfacalcidol ± calcitonin was compared to alendronate with respect to BMD and selected biochemical markers (705). No differences were found. No fracture data were reported. Another 30 patients from the same investigators were given either alfacalcidol or placebo therapy, and BMD and selected biochemistries were assessed (706). There were no differences in outcomes. Given the paucity of data about CKD stages 1–5T, and the inherent inaccuracy in the use of dual energy X-ray absorptiometry to assess BMD in pediatric patients, there is currently insufficient evidence to recommend specific treatments for posttransplant renal bone disease in children.
- • Observational studies are needed to determine the level of BMD that is predictive of fractures in KTRs.
- • RCTs are needed in KTRs with low BMD at the time of transplantation to evaluate the effects of bisphosphonates or calcitriol and vitamin D analogs on patient-level outcomes, such as all-cause mortality, hospitalization, fracture, cardiovascular morbidity and mortality and quality of life.
- • For KTRs with low serum calcidiol levels at the time of transplantation, RCTs are needed to determine the effect of vitamin D supplementation on change in BMD and patient-level outcomes, such as all-cause mortality, hospitalization, fracture, cardiovascular morbidity and mortality and quality of life.
Chapter 22: Hematological Complications
- 22.1: Perform a complete blood count at least (Not Graded):
- • daily for 7 days, or until hospital discharge, whichever is earlier;
- • two to three times per week for weeks 2–4;
- • weekly for months 2–3;
- • monthly for months 4–12;
- • then at least annually, and after any change in medication that may cause neutropenia, anemia or thrombocytopenia.
- 22.2: Assess and treat anemia by removing underlying causes whenever possible and using standard measures applicable to CKD. (Not Graded)
- 22.3: For treatment of neutropenia and thrombocytopenia, include treatment of underlying causes whenever possible. (Not Graded)
- 22.4: We recommend using ACE-Is or ARBs for initial treatment of erythrocytosis. (1C)
ACE-I, angiotensin-converting enzyme inhibitor; ARB, angiotensin II receptor blocker; CKD, chronic kidney disease.
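The tapering surveillance schedule in Recommendation 22.1 can be sketched as a lookup by time since transplant. The day counting and the 30-day month boundaries are this sketch's simplification; the recommendation itself is Not Graded.

```python
def cbc_frequency(days_posttransplant: int, discharged: bool = False) -> str:
    """Minimum complete-blood-count frequency per Recommendation 22.1
    (illustrative helper; month boundaries approximated as 30-day blocks)."""
    if days_posttransplant <= 7 and not discharged:
        return "daily"                # daily for 7 days, or until discharge
    if days_posttransplant <= 28:
        return "2-3 times per week"   # weeks 2-4
    if days_posttransplant <= 90:
        return "weekly"               # months 2-3
    if days_posttransplant <= 365:
        return "monthly"              # months 4-12
    return "at least annually (and after any medication change that may cause cytopenias)"
```

Note the "whichever is earlier" clause: a patient discharged on day 3 moves directly to the weeks 2–4 frequency.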
Hematologic abnormalities are common adverse effects of immunosuppressive medications and of transplant- or immunosuppression-related comorbidities. In addition, hematologic abnormalities can cause potentially life-threatening complications. Therefore, screening is warranted. In most laboratories, a complete blood count includes hemoglobin, white blood count (with differential) and platelet count. Anemia is defined as a hemoglobin <13.5 g/dL (135 g/L) in adult males, <12.0 g/dL (120 g/L) in adult females and <5th percentile for children (707). Neutropenia is defined as a neutrophil count <1500/μL (1.5 × 10⁹/L). Thrombocytopenia is defined as a platelet count <150 000/μL (1.5 × 10¹¹/L).
Erythrocytosis or polycythemia is variably defined in the literature as hemoglobin >16–18 g/dL, or hematocrit >50–52%. Some report gender-specific hematocrit thresholds (men 53–55%; women 48–51%) and others require evidence of persistence over a specified time period or on multiple determinations (627,708–710). The Work Group has chosen to define erythrocytosis as hemoglobin >17 g/dL or a hematocrit >51%.
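The adult cutoffs defined above can be collected into a single screening sketch. The function name and parameter layout are this example's own; pediatric anemia (<5th percentile for age) is omitted, and the erythrocytosis cutoffs are the Work Group's chosen definition.

```python
def classify_hematology(hb_g_dl: float, sex: str,
                        neutrophils_per_ul: float,
                        platelets_per_ul: float,
                        hematocrit_pct: float) -> set:
    """Flag adult hematologic abnormalities using the cutoffs in this
    chapter (hypothetical helper, not a guideline artifact)."""
    flags = set()
    anemia_cutoff = 13.5 if sex == "male" else 12.0   # g/dL
    if hb_g_dl < anemia_cutoff:
        flags.add("anemia")
    if neutrophils_per_ul < 1500:        # <1.5 x 10^9/L
        flags.add("neutropenia")
    if platelets_per_ul < 150_000:       # <1.5 x 10^11/L
        flags.add("thrombocytopenia")
    if hb_g_dl > 17.0 or hematocrit_pct > 51.0:       # Work Group definition
        flags.add("erythrocytosis")
    return flags
```

Any flag would then prompt the evaluation of underlying causes described in Recommendations 22.2–22.4.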
- • In KTRs, anemia, neutropenia and thrombocytopenia are common.
- • In KTRs, anemia is associated with morbidity and mortality, neutropenia with infection and thrombocytopenia with bleeding. In addition, these hematologic abnormalities may be an indication of treatable, but potentially life-threatening, underlying disorders.
- • In KTRs, monitoring and identifying the underlying cause and treatment will reduce the morbidity and mortality of anemia, neutropenia and thrombocytopenia.
The Work Group reviewed the KDOQI Guidelines on Anemia in CKD, and concluded that these evidence-based guidelines can and should guide anemia management in KTRs (707). Readers can find a detailed discussion of anemia in CKD in these guidelines. Anemia in the immediate posttransplant period is likely to be caused by pretransplant anemia and operative blood loss. The correction of anemia after transplantation depends on achieving hemostasis, and is influenced by immunosuppressive medications, iron deficiency, other causes of bone marrow suppression and factors affecting kidney function (e.g. DGF).
After the immediate posttransplant period, infections, rejection, immunosuppressive medications, other medications such as ACE-Is and ARBs (Table 31), hemolysis, and—less often—cancer, may cause or contribute to anemia. There is some evidence that KTRs may have a level of anemia greater than can be expected based on the level of kidney function, even without specific causes (711,712). When and how to evaluate anemia is well defined in the KDOQI guidelines for KTRs who are not actively bleeding, and have stable kidney function (707). Treatment should be directed at the underlying cause. Iron deficiency is common. There is evidence from a single small RCT that iron supplementation results in a higher hematocrit (44%) compared to no iron (36%) in KTRs (713).
**Table 31:** Medications that cause hematologic abnormalities

| Abnormality | Medications |
|---|---|
| Anemia | Azathioprine (714–717), MPA (718,719), CNIs (722,723), OKT3 (722,723) |
| Neutropenia | Azathioprine (714,715), MPA (718), sirolimus (50), leflunomide (720), lymphocyte-depleting antibodies (8), rituximab (726), ACE-I (727), ticlopidine/clopidogrel (728), other antimicrobials (728) |
| Thrombocytopenia | Sirolimus (42), MPA (729), azathioprine (729), lymphocyte-depleting antibodies (8), OKT3 (730), valganciclovir (722,723), ticlopidine/clopidogrel (731), heparin (732) |
Altering immunosuppressive agents to treat anemia should be considered, but may be difficult, especially in the early posttransplant period when acute rejection rates are highest and maintaining adequate immunosuppression is critical. Some, but not all, studies have identified anemia as an independent predictor of mortality in the intermediate posttransplant period (733–735). However, there are no RCTs in KTRs showing that the benefits of therapy with an ESA outweigh harm, or defining the optimal hemoglobin target. There are two small RCTs using ESAs in the early posttransplant period, but the overall effects on anemia were small (711,736). Another small trial showed that patients receiving ESAs before transplant, who attained normal hemoglobin levels, had outcomes that were no different than those with low hemoglobin levels (737). There is no evidence to support routine ESA administration in anticipation of anemia (see Supporting Tables 54–55 at http://www3.interscience.wiley.com/journal/118499698/toc).
The European Best Practices Guidelines for kidney transplantation recommend regular screening and careful evaluation of anemia (721). They also identify immunosuppressive agents, ACE-Is and ARBs as causative agents. They recommend following the European Best Practices Guidelines for anemia management, which recommend that an ESA not normally be discontinued in patients undergoing surgery or who develop an intercurrent illness (738). No recommendation was made on whether to continue or stop ESAs in the immediate posttransplant period. Patients with a failing kidney transplant should be followed as any other patient with failing kidney function.
Many of the same factors responsible for anemia also cause neutropenia (Table 31). Although there are no RCTs on screening for these abnormalities, the potential consequences of not screening are severe. Infection is the second most common cause of death, after CVD, in KTRs (739). In the nontransplant population with iatrogenic neutropenia (absolute neutrophil count <500/μL [0.5 × 10⁹/L]), patients are at increased risk for serious infection (740). A possible major contributor to neutropenia in KTRs is that kidney dysfunction may delay clearance of medications that can suppress leukocyte production by the bone marrow.
Medications are a common cause of leukocyte abnormalities. There are a number of RCTs that document leucopenia in the first 1–3 years after transplantation. Unfortunately, the definition of leucopenia differs among studies; therefore, direct comparison across trials is problematic. Different classes of immunosuppressive agents have differing effects on leucocytes. CNIs are not generally associated with leucopenia. In contrast, antiproliferative agents are an important cause of leucopenia. In early trials, azathioprine was associated with leucopenia (714,715). In the European Trial of MMF vs. placebo with CsA and prednisone, there was more leucopenia in the group treated with 2 g/day MMF (14%, n = 165) vs. placebo (4%, n = 166) (718). In the tricontinental MMF trial, there was slightly less (significance not stated) leucopenia in the arm treated with 2 g/day MMF (19%, n = 171) vs. the arm using 100–150 mg/day azathioprine (30%, n = 162) (729). In two trials evaluating the safety of EC-MPS vs. MMF, there were no significant differences in leucopenia (42,43). These study protocols included rules to reduce the dose or discontinue these agents in the presence of leucopenia, which likely limited the severity and overall incidence of very low counts.
In a Cochrane systematic review, mTORi were associated with more leucopenia (RR 2.02, 95% CI 1.12–3.66, by meta-analysis) than CNIs (50). No mention was made of differences in leucopenia in patients treated with sirolimus vs. placebo with CsA and prednisone, or in the meta-analysis comparing sirolimus to other antiproliferative agents (50,741). The Symphony trial compared four interventions: standard-dose CsA and MMF (n = 384), low-dose CsA with MMF (n = 408), low-dose tacrolimus and MMF (n = 403) and sirolimus and MMF (n = 380) (30). At the end of 12 months, leucopenia occurred in 10.2%, 10.1%, 13.4% and 10.3% of patients, respectively (p > 0.05).
There is no evidence that IL2-RAs cause significant hematologic abnormalities. In contrast, lymphocyte-depleting antibodies are associated with more (p < 0.001) leucopenia (33%, n = 141) compared to the IL2-RA, basiliximab (14.6%, n = 137) (8). More leucopenia was also demonstrated in a RCT comparing groups treated with lymphocyte-depleting antibodies plus tacrolimus or CsA to a group treated with tacrolimus without lymphocyte-depleting antibodies (7). The steroid regimen also has an impact on leucopenia. In one trial, leucopenia was seen more often (significance not stated) in the steroid-free (17.9%) and the steroid-withdrawal (16.5%) arms compared to the standard steroid arm (13.8%) (48).
Other medications commonly used in KTRs to treat comorbidities are associated with leucopenia. Valganciclovir was associated with more leucopenia compared to ganciclovir (8.2% vs. 3.2%) in a RCT of high-risk solid-organ transplant recipients (724). In contrast, the alternative antiviral valacyclovir was not associated with more leucopenia compared to placebo in a RCT of CMV prophylaxis in KTRs, although drug-induced leucopenia in the treatment arm may have offset CMV-induced leucopenia in the control arm (742). Combined therapy with antiviral and antiproliferative agents may increase the incidence of leucopenia (743).
The risk of neutropenia from trimethoprim–sulfamethoxazole in KTRs is unclear. Several small RCTs did not report differences in the incidence of leucopenia (744,745). In a bone marrow transplantation study, prophylaxis with trimethoprim–sulfamethoxazole (vs. ciprofloxacin) was associated with a 6-day delay in recovery from neutropenia (746). Agranulocytosis has been reported in case reports with trimethoprim–sulfamethoxazole (725).
Many of the factors that cause anemia and leucopenia also cause thrombocytopenia (Table 31). There are also relatively uncommon conditions, such as recurrent or de novo thrombotic microangiopathy, that can cause kidney dysfunction, hemolytic anemia and thrombocytopenia (722,723). Thrombocytopenia is also associated with several medications used in KTRs. mTORi are associated with much higher RRs of thrombocytopenia compared to CNIs (RR 7.0, 95% CI 3.0–16.4) (42). Sirolimus was also associated with more thrombocytopenia compared to azathioprine and MMF (RR 1.95, 95% CI 1.29–2.97) (50). Thrombocytopenia was also frequently observed in the tricontinental MMF trial (5% MMF 3 g/day; 9% MMF 2 g/day; 12% azathioprine, significance not stated) (729). In a (potentially underpowered) study comparing thymoglobulin to basiliximab induction, thrombocytopenia (platelet count <80 000/μL) was not significantly different (10.6% vs. 5.8%, p = 0.19) in the thymoglobulin group vs. the basiliximab group (8). Thrombocytopenia is also observed in patients with thrombotic microangiopathy associated with CNIs and, rarely, other medications such as clopidogrel and valacyclovir (722,723,731).
Other causes of leucopenia and thrombocytopenia include severe sepsis, viral infection (CMV, parvovirus B19) and other medications (716,717,719,720,726–728,730,732,747–753). Idiopathic thrombocytopenia has rarely been described after transplantation, and can be related to autoimmunity transferred from the donor (754). Transient thrombocytopenia has also been described in recipients of allografts whose donors had suffered disseminated intravascular coagulation (755).
Patients with low platelet counts are at increased risk of bleeding. Treatment of thrombocytopenia includes removing the offending drugs or treating other underlying causes. For example, case series have shown that parvovirus B19 associated hematologic abnormalities can be treated with intravenous immunoglobulin (751). Plasmapheresis has also been used to treat HUS/thrombotic microangiopathy that may be associated with thrombocytopenia (723). There are several case reports documenting the use of colony-stimulating factors (CSFs) to treat neutropenia in kidney transplant patients (756–758). However, there is potential for harm with treatment. One case report suggested that CSFs may have been associated with worsening graft function (758). There are clinical practice guidelines in the cancer literature that can be referred to for the use of CSFs (616). The review performed by the American Society of Clinical Oncology (616) found there is ample evidence that CSFs shorten the duration of neutropenia. There are, however, inadequate data to know whether or not there is benefit in afebrile neutropenic (absolute neutrophil count <1000/μL [1 × 10⁹/L]) patients. There is evidence, though, that patients with febrile neutropenia (absolute neutrophil count <500/μL [5 × 10⁸/L]) benefit from CSFs along with antibiotics if there is pneumonia, fungal infection, hypotension, sepsis syndrome or multisystem organ failure.
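The absolute neutrophil count thresholds cited above can be summarized in a brief sketch; the function names and classification labels below are our own illustrative assumptions, not guideline text.

```python
# Illustrative sketch of the absolute neutrophil count (ANC) thresholds
# discussed above; function names and classification labels are ours.

def anc_to_si(anc_per_ul: float) -> float:
    """Convert an ANC in cells/uL to SI units (x 10^9 cells/L).
    1000 cells/uL corresponds to 1 x 10^9 cells/L."""
    return anc_per_ul / 1000.0

def classify_anc(anc_per_ul: float) -> str:
    """Apply the cutoffs from the text: <500/uL marks severe neutropenia
    with increased risk of serious infection; <1000/uL marks the
    afebrile-neutropenia range in which benefit from CSFs is uncertain."""
    if anc_per_ul < 500:
        return "severe neutropenia"
    if anc_per_ul < 1000:
        return "neutropenia"
    return "not neutropenic"
```

As a check of the unit conversion, an ANC of 500/μL corresponds to 0.5 × 10⁹/L, matching the bracketed SI values in the text.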
The European Best Practice Guidelines on kidney transplantation recommend regular screening and careful evaluation of neutropenia in KTRs (759). The combination of allopurinol and azathioprine should be avoided to prevent neutropenia (616). There are not likely to be any RCTs to determine when to give CSFs in KTRs. Guidance for their use will be derived mostly from local clinical practice and oncology guidelines (708). There are similar guidelines for the treatment of thrombocytopenia with platelet transfusion (760).
- • Erythrocytosis is a well-known complication of kidney transplantation.
- • In the general population, erythrocytosis is associated with morbidity (fatigue, dyspnea, thrombotic events, etc.) and mortality.
- • In the general population, there is some evidence that correction is associated with a reduction in thrombotic events.
- • In KTRs, adverse consequences of erythrocytosis may be less common than in the general population.
- • In KTRs, treatment of erythrocytosis is effective and safe with angiotensin blockade.
The incidence of erythrocytosis varies from 8% to 22% among reports identified from earlier clinical practice guideline publications (627,708–710). More recent studies document that erythrocytosis still occurs in KTRs (761–765). Many studies do not differentiate between increased red cell mass and reduced plasma volume. Erythrocytosis tends to occur within the first 2 years, but can occur much later. It may revert spontaneously in 20% or more of cases (709,710).
The mechanisms of erythrocytosis are unclear and are likely multifactorial. Sustained increases in erythropoietin have not been consistently found, although levels appear higher than expected for the degree of hematocrit elevation (766). Other proposed mediators of erythrocytosis include endogenous androgens, renin–angiotensin system activation and other growth factors (710). Clinical risk factors that have been reported include male gender, polycystic kidney disease, smoking, immunosuppression, reduced kidney function, absence of rejection, renal artery stenosis, hydronephrosis, hypercalcemia, longer duration of dialysis, higher pretransplant hemoglobin, angiotensin-converting enzyme genotype, hypertension and diabetes mellitus (709,710,761–765,767–778).
The consequences of erythrocytosis can be severe. Evidence for the adverse outcomes related to erythrocytosis arises mostly from observations in patients with polycythemia vera. Historical observations document that 20% of polycythemia vera patients present with a thrombotic event, and subsequent thrombosis occurs in as many as 50%; however, the associated risk of thrombosis has been difficult to quantify (779,780). Patients with polycythemia vera have a reduced life expectancy, but this is, in part, related to malignant progression (781). In addition, a large study of elderly patients without polycythemia vera undergoing noncardiac surgery showed that an elevated hematocrit was associated with short-term mortality and cardiac morbidity (782).
In the general population, treatment of erythrocytosis is effective. In a large observational study of patients in the general population with polycythemia vera and a prior history of thrombosis, pharmacological therapy to reduce red cell volume was associated with a 53% reduction in recurrent thrombotic events (783). Many of the recurrences occurred in patients with inadequate treatment (hematocrit >45%).
In KTRs, erythrocytosis can be asymptomatic, or patients may complain of fatigue, headaches, plethora, dyspnea or blurred vision (709,767,776). The more serious consequences include increased risk of venous and arterial thrombosis (767,768,784). One small case-control study found more thromboembolic events in patients with erythrocytosis (11 events in 53 patients) compared to those without (0 in 49 matched controls) (767). Most other studies in KTRs either did not report adverse events, described no concurrent controls, or found no increase in adverse events (770,771,774). In a large registry analysis of KTRs, erythrocytosis was not found to be a risk factor for stroke (450). Since erythrocytosis is now readily treatable, and the potential consequences of not treating are severe (venous and arterial thrombosis), there are not likely to be any long-term RCTs to compare the effect of treatment vs. no treatment on outcomes.
There are a number of small RCTs of fair quality and case series demonstrating the use of ACE-Is or ARBs to reduce hematocrit by an absolute value of between 4% and 15% (785–797). Given the small sample sizes and the lack of data on critical clinical outcomes, there is only a low level of evidence (see Evidence Profile and accompanying evidence in Supporting Tables 56–58). In a RCT comparing enalapril (2.5 mg/day, n = 15) to placebo (n = 10), the hematocrit dropped by 6.6% in the treatment arm compared to only 1.3% in the control arm (p = 0.004) (788). In another small trial, 15 patients were randomized to an ACE-I (enalapril) and 12 patients to an ARB (losartan) (796). Hemoglobin levels decreased significantly in both groups (from 174 to 149 g/L with enalapril and from 171 to 159 g/L with losartan); however, the drop was greater (p = 0.05) with enalapril (32.6 g/L decrease) than losartan (17.0 g/L decrease). Theophylline has been found to be useful in the transplant population with dramatic absolute reductions in hematocrit of 8–12% (798,799). However, several trials have found that ACE-Is were superior when compared directly to theophylline (800–802). In the study by Trivedi et al., the hematocrit fell by 7.6% in the ACE-I arm (fosinopril, n = 9) and did not change significantly (rose by 2.3%) in the theophylline arm (n = 5) (802). Other strategies include phlebotomy and bilateral nephrectomy, but these are invasive and the latter can be associated with significant morbidity (803). Clinicians should also be aware that both ACE-Is and ARBs are associated with small, reversible reductions in kidney function (557).
The European Best Practice Guidelines on kidney transplantation recommend that first-line treatment of erythrocytosis (hematocrit >52% in men and >49% in women) be ACE-Is or ARBs (708). The American Society of Transplantation states that erythrocytosis (hemoglobin >17–18 g/dL or hematocrit >51–52%) causes potentially life-threatening complications and is readily treatable.
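As a worked illustration of the sex-specific hematocrit cutoffs just cited, a minimal check might look like the following; the function name is hypothetical and this is a sketch of the stated thresholds, not a diagnostic tool.

```python
# Minimal sketch of the EBPG erythrocytosis thresholds cited above
# (hematocrit >52% in men, >49% in women); the function name is ours.

def meets_ebpg_erythrocytosis_threshold(hematocrit_pct: float, male: bool) -> bool:
    """Return True when the hematocrit (%) exceeds the sex-specific cutoff."""
    threshold = 52.0 if male else 49.0
    return hematocrit_pct > threshold
```

For example, a hematocrit of 50% would exceed the threshold for a woman but not for a man.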
- • RCTs on the use of ESAs and the optimal hemoglobin in KTRs are needed.
- • RCTs on the use of CSFs and target cell counts are needed.
- • Studies are needed to document the incidence and severity of erythrocytosis with current drug regimens.
- • Studies are needed to document the role of ACE-Is and ARBs in reducing the incidence of erythrocytosis.
Chapter 23: Hyperuricemia and Gout
- 23.1: We suggest treating hyperuricemia in KTRs when there are complications, such as gout, tophi, or uric acid stones. (2D)
- 23.1.1: We suggest colchicine for treating acute gout, with appropriate dose reduction for reduced kidney function and concomitant CNI use. (2D)
- 23.1.2: We recommend avoiding allopurinol in patients receiving azathioprine. (1B)
- 23.1.3: We suggest avoiding NSAIDs and COX-2 inhibitors whenever possible. (2D)
CNI, calcineurin inhibitor; COX-2, cyclo-oxygenase-2; KTRs, kidney transplant recipients; NSAID, nonsteroidal anti-inflammatory drug.
Definitions of hyperuricemia differ widely. Local laboratories often report the upper normal range as a population mean plus two standard deviations (gender-specific), and this performs well in clinical practice (804). An international task force recommends that a level of >0.36 mmol/L (6.0 mg/dL) be defined as hyperuricemia in the general population (804). For each 0.06 mmol/L (1.0 mg/dL) increase above this level, the adjusted RR of gout increases by 2.33 (95% CI 2.00–2.71). The threshold of 0.36 mmol/L is associated with 67% sensitivity and 78% specificity for diagnosing gout. A threshold of 0.42 mmol/L (7.0 mg/dL) is associated with a 57% sensitivity and 92% specificity (805). However, because of gender differences, men are less likely to experience gout at levels between 0.36 and 0.42 mmol/L (6.0 and 7.0 mg/dL), and a higher level (>0.42 mmol/L [7.0 mg/dL]) is generally used for men (804). Detailed information is not available in KTRs, but the Work Group chose to define hyperuricemia as >0.36 mmol/L (6.0 mg/dL) in women and >0.42 mmol/L (7.0 mg/dL) in men.
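The paired mmol/L and mg/dL values above follow from the molar mass of uric acid (about 168.1 g/mol, a standard chemistry value rather than a figure quoted from this text); a short sketch of the conversion and the Work Group thresholds, with helper names of our own choosing:

```python
# Worked example of the serum uric acid unit conversion used above.
# mg/dL -> mmol/L: multiply by 10 (dL -> L gives mg/L), then divide by
# the molar mass of uric acid (~168.1 g/mol).

URIC_ACID_MOLAR_MASS_G_PER_MOL = 168.1

def uric_acid_mgdl_to_mmoll(mg_per_dl: float) -> float:
    """Convert a serum uric acid level from mg/dL to mmol/L."""
    return mg_per_dl * 10.0 / URIC_ACID_MOLAR_MASS_G_PER_MOL

def is_hyperuricemic(level_mmol_l: float, male: bool) -> bool:
    """Work Group definition: >0.42 mmol/L (7.0 mg/dL) in men and
    >0.36 mmol/L (6.0 mg/dL) in women."""
    return level_mmol_l > (0.42 if male else 0.36)

# 6.0 mg/dL converts to roughly 0.36 mmol/L, and 7.0 mg/dL to roughly
# 0.42 mmol/L, matching the paired values in the text.
```

A level of 0.40 mmol/L would thus be hyperuricemic for a woman but not for a man under these definitions.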
- • Hyperuricemia is very common in KTRs.
- • Hyperuricemia increases the incidence of gout and other complications in KTRs, and it may be associated with loss of kidney function and CVD.
- • Important drug interactions and precautions will alter treatment strategies in KTRs with gout.
The incidence of hyperuricemia approaches 80% in KTRs (806,807). A recent analysis of 29 597 US Medicare recipients found that the cumulative incidence of gout was 7.6% at 3 years after transplantation (808). This relatively high incidence is consistent with a number of smaller reports (809–812).
The mechanisms responsible for hyperuricemia and gout are complex. Several studies have shown rates to be higher with CNIs, and especially CsA, when compared to azathioprine (806,808,809,811). The incidence of hyperuricemia appears to be similar with CsA and tacrolimus regimens, both being higher compared to regimens without CNIs. For example, in a recent large RCT, uric acid levels were similar between patients treated with low-dose CsA and tacrolimus at the end of 1 year, and significantly higher in comparison to patients on sirolimus and MMF (813). Consistent with these results, a study of 35 patients converted from CsA to tacrolimus found no change in uric acid levels (814). However, in another report of patients converted from CNIs to sirolimus, there was a significant reduction in uric acid levels (815). Similarly, in a small (n = 28) RCT of liver transplant recipients, conversion from CNIs to MMF was associated with a 15–20% reduction in uric acid levels (816). Other risk factors associated with hyperuricemia and gout are prior history, higher BMI, diuretics, older age, more recent year of transplantation and hypertension (806–809,812,817).
Of the clinical manifestations of hyperuricemia, gout is the most common. It can be disabling and is associated with lost time from work. Impressive tophaceous deposits in the hands can occur (806,812). Evidence that hyperuricemia causes or contributes to progressive kidney disease or CVD is weak, even in the general population (804,818,819). Acute kidney injury from very high uric acid levels has been reported (820). A large registry cohort study recently demonstrated an association of gout with elevated mortality (adjusted hazard ratio 1.26, 95% CI 1.08–1.47) and graft loss (adjusted hazard ratio 1.22, 95% CI 1.01–1.49) (808). This association with mortality, though, has not been observed in other studies. There are no RCTs to show that lowering uric acid levels is associated with better graft survival, kidney function or patient survival. However, in one small, recent RCT (n = 54) in nontransplant patients with kidney impairment, the improvement in kidney function with uric acid reduction failed to reach statistical significance (821). Case series have not shown a consistent benefit of uric acid reduction on kidney function in CKD (822).
Monitoring patients for hyperuricemia at the time of other routine blood monitoring might help prevent further increases in uric acid levels and greater risks for gout. There is evidence that dietary interventions (losing weight and reduced meat and alcohol consumption) and avoiding diuretics in the general population can lower uric acid levels (804). There are no studies in KTRs. Several medications used in KTRs can lower uric acid levels. For example, in a randomized crossover trial of 26 KTRs, losartan was associated with an 8% fall in uric acid levels (823). The uric acid lowering effect would not be the sole reason for using these medications, but could be substituted if these medications were needed for other indications. Monitoring might also give clinicians an increased level of suspicion for dealing with atypical symptoms of gout. Measuring uric acid levels is indicated in patients with suspected gout; however, during an acute gouty attack, levels may be normal (804). Treatment of asymptomatic hyperuricemia has not been generally recommended in the general population or KTRs, but it is advocated in those with recurrent symptomatic episodes of gout, tophi or radiographic changes of gout (627,804,824).
Treatment of gout is beyond the scope of these guidelines. There are evidence-based reviews on the treatment of hyperuricemia and gout (824). Briefly, oral colchicine and/or nonsteroidal anti-inflammatory agents are recommended as first-line agents for gout (824). Nonsteroidal anti-inflammatory agents and cyclo-oxygenase-2 inhibitors can be associated with significant reductions in kidney function and acute kidney injury (825–827). Patients with normal kidney function may use these agents in moderate doses for short periods of time, but nonsteroidal anti-inflammatory agents should be avoided in KTRs whenever possible (806).
Colchicine levels may be increased in patients with reduced kidney function and in patients treated with CsA (and presumably tacrolimus). Life-threatening colchicine toxicity has been described in patients with reduced kidney function receiving colchicine 1 mg/day for only 5–8 days (828). A disabling myoneuropathy has also been described in patients with reduced kidney function receiving long-term colchicine therapy (829,830). Therefore, prolonged use of colchicine should be avoided in patients with eGFR <60 mL/min/1.73 m2. However, colchicine can be used at reduced doses for <1 week in patients with eGFR >10 mL/min/1.73 m2 not requiring dialysis. In patients with eGFR <60 mL/min/1.73 m2, avoid doses higher than 0.6 mg/day. Intraarticular or short-term systemic steroids have also been used if the above therapies are contraindicated or not tolerated (824).
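The eGFR-based cautions above can be summarized as simple decision logic. The sketch below is purely an illustration of the text's rules, not dosing advice; the function name and return strings are our own.

```python
# Illustrative encoding of the colchicine cautions above, keyed to eGFR
# in mL/min/1.73 m2; a sketch of the text's rules, not dosing advice.

def colchicine_caution(egfr: float, on_dialysis: bool = False) -> str:
    if on_dialysis or egfr <= 10:
        # The text permits short-course use only above eGFR 10 off dialysis.
        return "avoid"
    if egfr < 60:
        # Reduced dose for <1 week, no more than 0.6 mg/day, no prolonged use.
        return "short course (<1 week) at reduced dose, max 0.6 mg/day"
    # No eGFR-specific restriction is stated above 60; the CsA/tacrolimus
    # interaction still warrants caution.
    return "usual short-course dosing; monitor kidney function and CNI co-therapy"
```

For example, a patient with an eGFR of 40 mL/min/1.73 m2 falls into the reduced-dose, short-course category.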
Allopurinol is a common uric acid lowering agent (804). However, allopurinol and azathioprine used together can result in profound, life-threatening pancytopenia (627,753), and thus this combination should be used with extreme caution, or not at all. If used together, azathioprine should be reduced by at least 50% and frequent complete blood counts should be used to monitor the interaction (806). Further dose reductions may be needed. Mycophenolate does not interact with allopurinol and can be used in place of azathioprine if an antiproliferative agent is necessary for immunosuppression (831). Patients allergic to allopurinol may be given benziodarone (832,833).
The American Society of Transplantation guidelines recommended measuring uric acid levels once 2–3 months after transplantation, with additional screening in patients with reduced function and on diuretics (627). The Caring for Australasians with Renal Impairment guidelines for patients with CKD state that treating hyperuricemia does not retard progression and cannot be recommended; patients on protein-restricted diets treated with allopurinol may require dose reductions (822). The European Best Practice guideline on kidney transplantation recommends that the combination of allopurinol and azathioprine be avoided (708).
- • A RCT with adequate statistical power is needed to study the effect of treating asymptomatic hyperuricemia on preventing loss of kidney function, gout and CVD.
Chapter 24: Growth and Development
- 24.1: We recommend measuring growth and development in children (1C):
- • at least every 3 months if <3 years old (including head circumference) (Not Graded);
- • every 6 months in children ≥3 years until final adult height. (Not Graded)
- 24.2: We recommend using rhGH 28 IU/m2/week (or 0.05 mg/kg/day) in children with persistent growth failure after kidney transplantation. (1B)
- 24.3: We suggest minimizing or avoiding corticosteroid use in children who still have growth potential. (2C)
rhGH, recombinant human growth hormone.
- • CKD and CKD stage 5 can cause growth failure in children before kidney transplantation.
- • Despite successful kidney transplantation, growth failure can persist.
- • Recombinant human growth hormone (rhGH) is safe and effective in children with growth failure after kidney transplantation.
- • Children with growth failure (height <3rd percentile, height target standard deviation score <−2, or height velocity <25th percentile for chronological age) grow faster after kidney transplantation with 28 IU/m2/week of rhGH for 1 year compared to no treatment.
- • Long-term steroid use has a negative effect on normal growth in children.
- • Steroid minimization/avoidance protocols may be safe and effective in children.
The three major factors that can influence growth following successful kidney transplantation are age at transplantation (prepubertal vs. pubertal), allograft function and use of corticosteroid therapy. The height increment associated with the pubertal growth spurt is suboptimal in patients with CKD (834) and the lack of normal pubertal growth spurt in KTRs contributes to inadequate final adult height (835). Persistent growth failure despite successful kidney transplantation led to studies of rhGH use, which addressed concerns regarding its efficacy in the presence of corticosteroid immunosuppression, a possible increased risk of acute rejection, and the potential to further increase the already elevated incidence of malignancy in an immunosuppressed population.
Randomized controlled trials have shown that rhGH is effective in improving the growth of children with CKD during the first year of administration, with increases in all height indices (836), including in children with growth retardation after kidney transplantation. The summary of RCTs, eight of which included children with kidney transplants (836), showed that treatment with rhGH (28 IU/m2/week) resulted in a significant increase in height standard deviation score at 1 year and a significant increase in height velocity at 6 months and 1 year. However, there was no further increase in height indices during the second year of administration, compared to untreated controls. On average, children treated with rhGH had an improvement in height standard deviation score by 0.8, height velocity by 3.8 cm/year and height velocity standard deviation score by 6 above nontreated controls (836). Most of the children in the studies after kidney transplant were on relatively low doses of glucocorticoids, with GFR >20 mL/min/1.73 m2 and all were greater than 1 year after transplant with height <3rd percentile, height standard deviation score <−2 or height velocity <25th percentile for chronological age at the time of starting therapy. Overall, there is a moderate level of evidence that rhGH is better than placebo for increasing growth and that 28 IU/m2/week is superior to 14 IU/m2/week (see Evidence Profile and accompanying evidence in Supporting Tables 59–61 at http://www3.interscience.wiley.com/journal/118499698/toc). Alternatively, a multicenter placebo-controlled trial showed that a rhGH dose of 0.05 mg/kg/day significantly increased height in children with CKD (837).
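The two dosing conventions above (28 IU/m2/week by body surface area vs. 0.05 mg/kg/day by weight) can be compared with a small worked example; the helper names and the sample child's measurements below are hypothetical illustrations, not guideline content.

```python
# Worked example of the two rhGH dose conventions cited above; the
# function names and the example measurements are hypothetical.

def rhgh_weekly_dose_iu(bsa_m2: float) -> float:
    """Weekly dose under the 28 IU/m2/week convention."""
    return 28.0 * bsa_m2

def rhgh_daily_dose_mg(weight_kg: float) -> float:
    """Daily dose under the 0.05 mg/kg/day convention."""
    return 0.05 * weight_kg

# For a hypothetical child with BSA 0.8 m2 and weight 20 kg, the two
# conventions give 28 x 0.8 = 22.4 IU/week and 0.05 x 20 = 1.0 mg/day.
```

Note the two conventions are expressed in different units (IU vs. mg), so comparing them directly requires a potency conversion factor that the text does not supply.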
Cohort studies in children with CKD have demonstrated that response to rhGH therapy is better in prepubertal than pubertal children (838), and in CKD stages 3 and 4 compared to CKD stage 5 (838). However, in short-term studies, there was no significant difference in the magnitude of rhGH-related growth with either pubertal status (including pediatric KTRs (839,840)) or between CKD stages 3 and 4 compared to CKD stage 5 (836).
Although no RCTs have been published with final adult height as an outcome, published data do provide some indirect support that rhGH improves final adult height in children with CKD, including KTRs. A longitudinal study of children with CKD treated with rhGH and followed until they achieved final adult height indicated that treated children had sustained catch-up growth, whereas untreated matched children had progressive growth failure (838). Improved final height in rhGH-treated children has also been reported from US Transplant Registry data (841). However, it still needs to be determined whether rhGH therapy will result in an increase in final adult height in children who have received a kidney transplant.
Reported adverse events related to rhGH include asthma, acute rejection, deterioration in kidney function, papilledema, raised fasting glucose and glucose intolerance. However, a meta-analysis found no significant difference between treatment and controls in the change in bone age, kidney function, cholesterol, triglycerides, apolipoproteins and glucose tolerance (836). Additionally, there is no evidence that rhGH acts to advance the pubertal growth spurt.
Studies of rhGH use after kidney transplantation also addressed concerns regarding its efficacy in the presence of corticosteroid immunosuppression, a possible increased risk of acute rejection, and the potential to further increase the already elevated incidence of malignancy in an immunosuppressed population (842). None of the four RCTs in pediatric KTRs (839,843–845) reported an increase in acute rejection associated with rhGH therapy or an adverse effect of this treatment on graft function. However, two did determine that prior acute rejection history is a risk factor for the development of acute rejection following the initiation of rhGH therapy (844,845). The conclusion drawn from these RCTs was that rhGH is a well-tolerated and effective treatment in growth-retarded KTRs. However, no pharmaceutical company that manufactures rhGH has applied to the FDA or European agencies to extend approval for this treatment to the pediatric KTR population.
Concern about a relationship between rhGH use and the development of renal cell carcinoma in pediatric KTRs led researchers to probe databases maintained by the pharmaceutical companies that produce rhGH for evidence of an association (846). Only the International Growth Database collected data on kidney malignancy in KTRs on rhGH. rhGH was not found to be an independent risk factor for the development of renal cell carcinoma (846). Isolated incidents of PTLD have been reported in patients receiving rhGH, but a definitive relationship has not been shown.
When considering rhGH therapy for growth-delayed pediatric KTRs, the health-care provider should inform the patient and family that the benefits to growth need to be balanced with possible adverse events and the difficulty of adhering to a daily subcutaneous injection regimen.
Corticosteroids have been used in pediatric KTRs as maintenance immunosuppressive therapy and as a treatment for acute rejection since the 1960s (847,848). A correlation between a daily corticosteroid dose in excess of 7 mg/m2 of body surface area and impaired growth in pediatric KTRs has been reported (849). Over the years, practitioners have made efforts to reduce steroid use in pediatric KTRs in an effort to avoid the potentially negative impact on growth. In a prospective clinical trial of steroid minimization, researchers studied 35 KTRs at 14–27 months following transplantation, 17 of whom received alternate-day corticosteroid therapy and 18 of whom received daily corticosteroid therapy (850). At 1 year, the mean height standard deviation score was +0.49 in the alternate-day group, compared with −0.12 in the daily-dose group. An analysis of the North American Pediatric Renal Transplant Cooperative Study database also found that short-term improvement in height standard deviation score was associated with alternate-day dosing of corticosteroids (851). No decline in graft function was observed in patients receiving daily vs. alternate-day steroid therapy. However, when considering alternate-day dosing as a strategy for steroid minimization, the health-care provider should address the potential for increased nonadherence due to the difficulty of this dosing regimen.
In 2001, a pilot study reported the initial positive results of steroid avoidance using an anti-IL2 receptor antibody for induction and then every 2 weeks for the first 5 months after transplant, in addition to tacrolimus and MMF as maintenance immunosuppression therapy (852). A follow-up report in 2003 indicated a significant improvement in the mean height standard deviation score at 1 year in the corticosteroid-avoidance group when compared with a historical control group who had received corticosteroid treatment daily (853). This led to a prospective multicenter RCT of steroid avoidance where 130 unsensitized primary KTRs 0–21 years of age were randomized to steroid-free vs. steroid-based immunosuppression (2004–2006) with a 3-year follow-up (854). Patients in both arms received tacrolimus and MMF immunosuppression. Preliminary analysis does not reveal an overall significant growth advantage at 1 year in children receiving steroid-free or steroid-based immunosuppression. Longer-term follow-up of current and future RCTs will be important in determining the effect of steroid-free or minimization protocols on growth and graft function in pediatric KTRs.
- • A RCT is needed to determine whether higher doses of rhGH during puberty improve pubertal growth in children with persistent growth failure after kidney transplantation.
- • Further follow-up of ongoing and future studies is needed to evaluate the effect of steroid minimization or avoidance on growth.
Chapter 25: Sexual Function and Fertility
- 25.1: SEXUAL FUNCTION
- 25.1.1: Evaluate adults for sexual dysfunction after kidney transplantation. (Not Graded)
- 25.1.2: Include discussion of sexual activity and counseling about contraception and safe sex practices in follow-up of adult KTRs. (Not Graded)
KTRs, kidney transplant recipients.
- • Sexual dysfunction is common in men and women KTRs.
- • Many patients will not spontaneously report sexual dysfunction.
- • Modification of medications may alleviate sexual dysfunction.
- • Therapies are available, although fewer options exist for women than for men.
- • Sexual dysfunction negatively affects quality of life.
- • Contraception can help prevent unwanted pregnancies.
- • Safe sex practices can help prevent the acquisition of disease.
Sexual dysfunction is frequent in patients with all stages of CKD, particularly among patients with CKD stage 5 or after transplantation (855). Sexual dysfunction in KTRs may have both organic and psychological causes (856–858). The scope of the problem includes erectile dysfunction, decreased libido and lower frequency of intercourse.
Following kidney transplantation, the metabolic milieu improves, and for some patients sexual function improves as well (859). For others, sexual function does not change and may even worsen (860). One study compared sexual function among men and women on hemodialysis or peritoneal dialysis, patients with rheumatoid arthritis, and kidney transplant recipients, and found that men and women on dialysis had a significantly higher incidence of ‘hypoactive sexual desire’ than those after transplantation. Men on hemodialysis also had a significantly higher incidence of ‘sexual aversion disorder’ and ‘inhibited male orgasm.’ In this study, ‘male erectile disorder’ did not differ between the dialysis and transplant groups. Overall, the study concluded that sexual dysfunction in dialysis patients was a consequence of lost sexual interest attributable to fatigue (858).
Problems with sexual function are common following kidney transplantation, but the reported prevalence varies. Problems with sexual function in general have been reported in the range of 45–50% (855,861). One survey found that more than 30% reported a problem to be moderate or severe in magnitude (861). There are few studies focused on this issue for women (861). For men, erectile dysfunction can affect quality of life and be associated with anxiety, depression and loss of self-esteem (862). Following transplantation, erectile dysfunction may improve, especially for younger men (862,863). However, for others the problem may not change or may even get worse (862–864). An Egyptian study of 400 male KTRs reported erectile dysfunction in 36% (865). In a study of simultaneous pancreas–kidney transplant recipients, 79% suffered from some degree of erectile dysfunction (866). Transplant surgery may contribute to erectile dysfunction. Diversion of blood from the penile arteries when the internal iliac arteries are used for transplant anastomosis may play a role (863).
Therapy with phosphodiesterase-5 inhibitors may be effective. These agents are helpful for some, but not all, patients (864,867). A double-blind crossover RCT in KTRs found that sildenafil was more effective than placebo with respect to erectile function, orgasmic function, intercourse satisfaction and overall satisfaction (868). There was no significant difference in sexual desire in this study (868). Modification of medications may also be useful for patients with erectile dysfunction and/or decreased libido.
Whether to evaluate men with sexual dysfunction, or to initiate a trial of a phosphodiesterase-5 inhibitor, is often unclear. When phosphodiesterase-5 inhibitors are prescribed, care must be taken that the patient is hemodynamically stable and avoids alpha-adrenergic antagonists. How to counsel and approach therapy in women with sexual dysfunction is less clear.
Follow-up of KTRs should include discussion of sexual activity, and counseling about contraception and safe sex practices, as is true for patients in the general population (and therefore beyond the scope of this guideline). Sexually active patients who are not in long-term monogamous relationships should use latex condoms during sexual contact to reduce their risk for exposure to CMV, HSV, HIV, HPV, HBV, HCV and other sexually transmitted infections. Sexually active KTRs should avoid sexual practices that could result in oral exposure to feces or genital secretions.
Recommendations for contraception should be made on an individual basis with consideration given to what is most effective, as well as what can actually be used. Concerns regarding intrauterine devices have included the potential for infection, as well as that they may be less effective in transplant recipients (869). Whether current intrauterine devices may be more effective and less risky in this patient population is unknown.
- • Studies are needed to determine the etiology, diagnosis and treatment of sexual dysfunction in KTRs.
- 25.2: FEMALE FERTILITY
- 25.2.1: We suggest waiting for at least 1 year after transplantation before becoming pregnant, and only attempting pregnancy when kidney function is stable with <1 g/day proteinuria. (2C)
- 25.2.2: We recommend that MMF and EC-MPS be discontinued or replaced with azathioprine before pregnancy is attempted. (1A)
- 25.2.3: We suggest that mTORi be discontinued or replaced before pregnancy is attempted. (2D)
- 25.2.4: Counsel female KTRs with child-bearing potential and their partners about fertility and pregnancy as soon as possible after transplantation. (Not Graded)
- 25.2.5: Counsel pregnant KTRs and their partners about the risks and benefits of breastfeeding. (Not Graded)
- 25.2.6: Refer pregnant patients to an obstetrician with expertise in managing high-risk pregnancies. (Not Graded)
EC-MPS, enteric-coated mycophenolate sodium; KTRs, kidney transplant recipients; MMF, mycophenolate mofetil; mTORi, mammalian target of rapamycin inhibitor(s).
Female KTRs of child-bearing potential are those who are not peri- or postmenopausal, and those who have a uterus and at least one ovary. There are no prospective studies on the risks of immunosuppressive medications in pregnancies. Evidence that a drug is not safe in pregnancy may come from case reports, or animal studies demonstrating toxicity at doses comparable to those which might be used in humans (normalized to body surface area). In the absence of data, a drug should be presumed to be unsafe, and patients should be treated accordingly.
- • Fertility is increased in KTRs compared to CKD stage 5 before transplantation.
- • Pregnancy and childbirth in KTRs have a high incidence of complications to mother and child.
- • Complications of pregnancy and childbirth can be minimized by the use of lower-risk immunosuppressive agents and multidisciplinary care that includes an obstetrician with expertise in managing high-risk pregnancies.
Pregnancies in patients with CKD stage 5 are uncommon (870). However, fertility is improved and often restored after successful kidney transplantation (871,872). The risks of pregnancy and childbirth to both mother and child are higher for KTRs, compared to the general population, but in stable KTRs pregnancies most often have a good outcome. In KTRs with good kidney function, no proteinuria, and well-controlled blood pressure, there is little risk of graft loss (873–876). However, KTRs with reduced kidney function are at higher risk for allograft dysfunction and graft failure (877). There are few published data in KTRs on which to base a safe recommended GFR. Data in the nontransplant population indicate that women with GFR <40 mL/min/1.73 m2 and proteinuria >1 g/day are at increased risk for a significantly accelerated GFR decrease, as well as low-birth-weight babies (878). These data were used for the recommendations noted above. It is unclear whether or not the same levels apply to KTRs. Cyclosporine levels decline during pregnancy (877). Nevertheless, the incidence of acute rejection during pregnancy appears to be relatively low (877).
An American Society of Transplantation consensus conference recommended that patients wait 1 year without acute rejection before becoming pregnant, with the proviso that individual circumstances may shorten or lengthen the appropriate time frame; each situation needs to be evaluated on a case-by-case basis (879). It was recently reported that pregnancy within the first 2 years after transplantation may increase the risk of graft loss (880). On the other hand, there have been successful pregnancies before the end of the first posttransplant year (872). Some reports suggest a high incidence of hypertension (873) and preeclampsia (881) in pregnant KTRs. Deliveries are more likely to be by caesarean section, performed for medical indications. The transplanted kidney neither affects, nor is affected by, vaginal delivery; in the absence of medical indications, vaginal deliveries are possible (873).
There is also a higher risk to the fetus for pregnancies in KTRs. There is a higher risk of preterm delivery (<37 weeks) and low birth weight (<2500 g) (873,877,882). The fetus, of course, is exposed to potentially teratogenic immunosuppressive agents (882). There are no RCTs indicating which, if any, immunosuppressive agents are safe to use in pregnancy.
Mycophenolate has been reported to cause severe structural malformations. A characteristic phenotype associated with in utero exposure to MMF is emerging that includes cleft lip and palate, microtia, and absence of external auditory canals (883–885). Thus, MMF should generally be changed to azathioprine during pregnancy, a practice endorsed by the European Best Practice Guidelines (886). These guidelines suggest a 6-week window after discontinuing MMF and starting azathioprine, before pregnancy is attempted (886). These same concerns should also apply to EC-MPS. Azathioprine is rated by the FDA as category ‘D’ (i.e. there is evidence of human fetal risk, but the benefits from use in pregnant women may be acceptable despite the risk). Despite the FDA category D, azathioprine has been used safely over the years in pregnant transplant recipients. It is considered an acceptable immunosuppressant to use in this clinical setting.
In a meta-analysis of the use of CsA during pregnancy, the incidence of major fetal malformations was 4.1% (2.6–7.0%) (877). This was numerically higher than, but not statistically significantly different from, the rate with non-CNIs. Prednisone at low doses (usually less than 15 mg/day) is safe in pregnant KTRs; higher doses can cause fetal thymic aplasia. Similarly, high doses of azathioprine can be associated with problems that do not occur when the drug is used at standard doses.
There are few reports on the use of mTORi in pregnancy. The FDA categorizes sirolimus as ‘C.’ Category C indicates that either studies in animals have revealed adverse effects on the fetus (teratogenic, embryocidal or other effects) and there are no controlled studies in women, or studies in women and animals are not available (887). The FDA-approved package labeling for sirolimus notes that sirolimus was embryotoxic or fetotoxic in rats at 0.2–0.5 times the clinical doses adjusted for body surface area (887). A voluntary registry reported only seven cases of pregnancy in organ transplant recipients receiving sirolimus (885). None was associated with adverse outcomes, although in most cases the drug was discontinued when pregnancy was discovered. There are case reports of normal-term pregnancies in women receiving sirolimus (888,889). However, in the absence of adequate safety data, it is prudent to avoid mTORi in pregnancy.
An American Society of Transplantation consensus conference concluded that breastfeeding for KTRs is not contraindicated (879). For KTRs who opt to breastfeed, prednisone is likely to be safe (890). Prednisone and azathioprine are detectable in breast milk (891), but there are no data for MMF or sirolimus. CsA is excreted into breast milk and is not recommended in breastfeeding mothers (877).
- • Observational studies are needed to determine the incidence and complications of pregnancies in KTRs.
- 25.3: MALE FERTILITY
- 25.3.1: We suggest that male KTRs and their partners be advised that:
- • male fertility may improve after kidney transplantation (2D);
- • pregnancies fathered by KTRs appear to have no more complications than those in the general population. (2D)
- 25.3.2: We recommend that adult male KTRs be informed of the possible risks of infertility from mTORi. (1C)
- 25.3.3: We suggest that adult male KTRs who wish to maintain fertility should consider avoiding mTORi, or banking sperm prior to mTORi use. (2C)
KTRs, kidney transplant recipients; mTORi, mammalian target of rapamycin inhibitor(s).
- • Male fertility improves in most KTRs, and may become normal.
- • Outcomes of pregnancies fathered by KTRs are similar to those of the general population.
- • Rapamycin is associated with low sperm counts. The abnormality is reversible with discontinuation of rapamycin.
Chronic kidney disease is associated with impaired spermatogenesis, decreased testosterone production, decreased libido and increased gonadotropins (892). Uremic hypogonadism is reversible in individuals with successful long-term kidney transplantation (893). Studies from both the azathioprine and CsA eras show that testosterone levels rise after transplantation. Gonadotropins may decrease, but may not normalize, and semen analysis in most KTRs is normal (893–898). Longer time on dialysis prior to transplantation and kidney dysfunction may be risk factors for those with residual testicular dysfunction (896,899). Testicular biopsies performed after transplantation show significant improvement, with some residual reduction in Sertoli cells and spermatogonia (900).
Although it is unclear whether CsA plays a role in male infertility, it is clear that rapamycin can lead to male infertility (893,899). It causes low sperm counts (901–904) by interrupting the stem cell factor/c-kit system that regulates germ cell proliferation, meiosis and apoptosis, consequently inhibiting spermatogenesis (905). The effects of rapamycin appear to be reversible (901–903).
With regard to potential congenital abnormalities, outcomes of pregnancies fathered by male KTRs do not differ from those of the general population (906). These conclusions are based on data from the National Transplantation Pregnancy Registry. This registry is voluntary and thus potentially subject to reporting bias, and fewer pregnancy outcomes are reported to it for men than for women. Nevertheless, the data it captures, although limited, are valuable.
- • Observational studies are needed to determine the incidence and complications of pregnancies fathered by KTRs.
Chapter 26: Lifestyle
- 26: We recommend that patients are strongly encouraged to follow a healthy lifestyle, with exercise, proper diet, and weight reduction as needed. (1C) (See also Obesity, Recommendation 16.4.1.)
- • There are abundant data from the general population that a lifestyle that includes exercise, a proper diet and avoidance of obesity improves longevity and quality of life.
- • Although there is only one small RCT in KTRs, there is no reason to believe that exercise is not as beneficial in KTRs as in the general population.
- • There is no reason to believe that a proper diet cannot help prevent CVD and other complications in KTRs, as it does in the general population.
- • There is little harm associated with exercise, a proper diet and weight reduction; therefore, any benefit is likely to outweigh harm.
Data from RCTs in the general population suggest that exercise, proper diet and weight reduction (in obese patients) improve longevity, quality of life and other major health outcomes. In a RCT involving 100 KTRs, the group randomly allocated to receive regular telephone counseling on exercise had a greater exercise tolerance at 1 year after transplantation compared to the control group that did not receive counseling (603). This study did not have adequate statistical power to examine major CVD outcomes.
Chapter 27: Mental Health
- 27: Include direct questioning about depression and anxiety as part of routine follow-up care after kidney transplantation. (Not Graded)
- • Depression and anxiety are more common in KTRs than in the general population.
- • Depression and anxiety may be associated with medication nonadherence, sleep disorders and other adverse effects that make the diagnosis and treatment of depression and anxiety important.
- • Therapies are available for the treatment of depression and anxiety.
Anxiety and depression are common in dialysis patients, much more so than in the general population (907–913). For many patients with kidney failure, kidney transplantation is a better kidney replacement therapy than dialysis. However, kidney transplantation does not change underlying systemic disorders or reverse previously sustained physical damage. When the kidney graft is not working well, new medical problems may arise, and when the graft fails, the patient generally faces a return to an unwelcome form of therapy, thereby increasing stress. Whether because of new or preexisting medical conditions, medications such as corticosteroids, or work status changes, KTRs are at risk for anxiety and depression.
There have been few studies of mental-health disorders in KTRs. Many of the studies focus on quality of life, which encompasses psychological domains but does not necessarily examine anxiety and depression, or other mental-health disorders (914). A meta-analysis comparing emotional distress and psychological well-being among different forms of renal replacement therapies revealed less emotional distress and greater well-being with successful kidney transplantation than with other CKD stage 5 treatment modalities (915). However, case mix differences among the groups likely influenced the results, making it unclear whether it was the difference in patients or treatment modalities that accounted for the differences in outcome (915).
Studies of depression and anxiety in KTRs report conflicting results. Some have reported similarly high rates of depression in KTRs compared to dialysis patients (916,917), while others have reported less depression (918,919). Some have reported less anxiety in KTRs (920) and, in others, no difference compared to dialysis patients (917,919). A study examining 5- to 22-year-old KTRs found that 36% had emotional trauma and/or depression (921). Return to dialysis after graft loss has been associated with severe depression (918).
Anxiety and depression in KTRs have been associated with poor quality of life, poor marital relations, and impaired sexual function and sleep quality (922). In one study, a high level of posttransplantation anxiety was associated with reduced social functioning, physical complaints and more economic problems. High levels of anxiety were also associated with depression (923). Depression has also been associated with medication nonadherence (283,924).
Hospitalization for psychosis is not increased in KTRs compared with patients on chronic dialysis (925). Hospitalization for psychosis has been associated with increased risk for death, as well as graft loss (925). Depression identified in Medicare claims has been associated with an increased risk of graft failure, return to dialysis and death with a functioning graft (926).
Psychotherapy may be helpful (927). One RCT comparing individual and group psychotherapy in KTRs found that both approaches resulted in lower Beck Depression Inventory scores. Individual therapy was associated with a better outcome in this study (928). As most transplant centers have social workers, these individuals may be a useful resource for counseling. Antidepressants are often used. Given that some of these drugs are metabolized by the CYP3A4 enzyme pathway, levels of immunosuppressive medications also metabolized through this pathway may need to be adjusted (929).
How best to assess the mental health of transplant recipients is unclear. The screening tools used by psychologists are time-consuming and unfamiliar to general practitioners. Direct questioning on review of systems, or brief screening tools (930), may be a simple and useful initial screening approach. Further studies are needed to better understand how to monitor for mental-health disorders in KTRs.
- • Studies are needed to determine the optimal approach to screening and intervention for depression and other mental disorders in KTRs.
Appendix: Methods for Guideline Development
The overall aim of the project was to create a clinical practice guideline with recommendations for the care of KTRs using an evidence-based approach. After topics and relevant clinical questions were identified, the pertinent scientific literature on those topics was systematically searched and summarized.
Overview of process
The development of the guideline included sequential and concurrent steps:
- • Appoint the Work Group and Evidence Review Team (ERT), which were responsible for different aspects of the process.
- • Confer to discuss process, methods and results.
- • Develop and refine topics.
- • Define specific populations, interventions or predictors and outcomes of interest.
- • Create and standardize quality assessment methods.
- • Create data-extraction forms.
- • Develop literature search strategies and run searches.
- • Screen abstracts and retrieve full articles based on predetermined eligibility criteria.
- • Extract data and perform critical appraisal of the literature.
- • Grade quality of the outcomes of each study.
- • Tabulate data from articles into summary tables.
- • Grade the quality of evidence for each outcome and assess the overall quality and findings of bodies of evidence with the aid of evidence profiles.
- • Write recommendations and supporting rationale statements.
- • Grade the strength of the recommendations based on the quality and strength of the evidence and other considerations.
- • Peer review by KDIGO Board of Directors in December 2008 and the public (March 2009), with subsequent revisions.
The Work Group, KDIGO Co-Chairs, ERT, liaisons and KDIGO support staff met for four 2-day meetings for training in the guideline development process, topic discussion and consensus development.
Creation of groups
The KDIGO Co-Chairs appointed the Co-Chairs of the Work Group, who then assembled the Work Group to be responsible for the development of the guideline. The Work Group consisted of domain experts, including individuals with expertise in adult and pediatric nephrology, transplant surgery and medicine, critical-care medicine, cardiology, infectious diseases, oncology and epidemiology, along with a patient advocate. Tufts Center for Kidney Disease Guideline Development and Implementation at Tufts Medical Center in Boston, MA, USA, was contracted to provide expertise in guideline development methodology and systematic evidence review. The ERT consisted of physician–methodologists with expertise in nephrology and internal medicine, and research associates and assistants. The ERT instructed and advised Work Group members in all steps of literature review, critical literature appraisal and guideline development. The Work Group and the ERT collaborated closely throughout the project. The ERT also included methodological input and assistance with literature searches from methodology experts at the Cochrane Renal Group in Sydney, Australia.
Systematic Review: General Process
The first task of the Work Group was to define the overall topics and goals for the guideline. The Work Group Co-Chairs drafted a preliminary list of topics. The Work Group identified the key clinical questions. The Work Group and ERT further developed and refined each topic, specified screening criteria, literature search strategies and data-extraction forms.
The ERT performed literature searches, and organized screening of abstracts and articles. The ERT also coordinated the methodological and analytic processes of the report, and defined and standardized the methodology of performing literature searches, data extraction and summarizing the evidence. Throughout the project, the ERT offered suggestions for guideline development, led discussions on systematic review, literature searches, data extraction, assessment of quality and applicability of articles, evidence synthesis, grading of evidence and recommendations and consensus development. With input from the Work Group, the ERT finalized eligible studies, performed all data extraction and summarized data into summary tables. They also created preliminary evidence profiles (described below), which were completed by the Work Group members. The Work Group members reviewed all included articles, data-extraction forms and summary tables for accuracy and completeness. The Work Group took the primary role of writing the recommendations and rationale statements, and retained final responsibility for the content of the recommendation statements and the accompanying narrative.
For questions of treatment in KTRs, systematic reviews of the eligible RCTs were undertaken (Table 32). For these topics, the ERT created detailed data-extraction forms; extracted information on baseline population characteristics, interventions, study design and results; and assessed the quality of the evidence. The ERT then tabulated studies in summary tables and assigned grades for the quality of the evidence in consultation with the Work Group.
| Chapter 1: Induction Therapy | |
| --- | --- |
| Population | KTRs in the first 24 h after transplant |
| Predictor, reference standard | Anti-IL2 receptor antibody vs. no induction, antithymocyte globulin vs. no induction, antithymocyte globulin vs. anti-IL2 receptor antibody |
| Outcomes | All-cause mortality, DGF, slow graft function, acute rejection, graft failure/survival, kidney function, CAN, infection, cancer, NODAT, fracture, BMD, erythrocytosis, neutropenia, quality of life, adverse events |
| Minimum number of subjects | N ≥ 50 |

| Chapter 2: Initial Maintenance Immunosuppressive Medications | |
| --- | --- |
| Intervention, reference standard | Tac vs. CsA (CsA or CsA-ME) (with AZA, MMF, sirolimus, everolimus), CNI vs. non-CNI regimens, MMF vs. AZA, MMF formulation vs. other MMF formulation, CNI-sparing (withdrawal), CNI-free, steroid withdrawal, steroid avoidance |
| Outcomes | All-cause mortality, DGF, slow graft function, acute rejection, graft failure/survival, kidney function, CAN, infection, cancer, NODAT, erythrocytosis, neutropenia, fracture, BMD, hypertension, hyperuricemia, hyperlipidemia, quality of life, adverse events |
| Minimum number of subjects | N ≥ 100 |

| Chapter 3: Long-Term Maintenance Immunosuppressive Medications | |
| --- | --- |
| Intervention | Tac vs. CsA (CsA or CsA-ME) (with AZA, MMF, sirolimus, everolimus), CNI vs. non-CNI regimens, MMF vs. AZA, MMF formulation vs. other MMF formulation, CNI-sparing (withdrawal), CNI-free, steroid withdrawal, steroid avoidance |
| Outcomes | All-cause mortality, DGF, slow graft function, acute rejection, graft failure/survival, kidney function, CAN, infection, cancer, NODAT, erythrocytosis, neutropenia, fracture, BMD, hypertension, hyperuricemia, hyperlipidemia, quality of life, adverse events |
| Minimum number of subjects | N ≥ 100 |

| Chapter 4: Strategies to Reduce Drug Costs | |
| --- | --- |
| Intervention | CsA-ME generics, other generic medications |
| Minimum number of subjects | N ≥ 20 |

| Chapter 5: Monitoring Immunosuppressive Medications | |
| --- | --- |
| Intervention | MMF fixed dose vs. AUC-adjusted doses, C0 vs. C2 CsA to determine dosing, anti-HLA antibodies |
| Outcomes | All-cause mortality, DGF, slow graft function, acute rejection, graft failure/survival, kidney function, CAN, infection, cancer, NODAT, erythrocytosis, neutropenia, fracture, BMD, hypertension, hyperuricemia, hyperlipidemia, quality of life, adverse events |
| Minimum number of subjects | N ≥ 10 |
| Chapter 6: Treatment of Acute Rejection | |
| --- | --- |
| Population | KTRs with biopsy-proven acute rejection |
| Predictor | Adding induction agents or other (intravenous immunoglobulin, plasma exchange), change of maintenance regimen |
| Outcomes | All-cause mortality, DGF, slow graft function, acute rejection, graft failure/survival, kidney function, CAN, infection, cancer, NODAT, fracture, BMD, erythrocytosis, neutropenia, quality of life, adverse events |
| Minimum number of subjects | N ≥ 100 |

| Chapter 7: Treatment of Chronic Allograft Injury | |
| --- | --- |
| Population | KTRs with CAN or biopsy-proven CNI toxicity |
| Intervention, predictor | Reduction in CNI, change in maintenance immunosuppression, adding ancillary treatments (ACE-I, ARB, etc.), CNI dose reduction, CNI withdrawal, replacement of CNI with another immunosuppression agent, comparisons with placebo or other treatments |
| Outcomes | All-cause mortality, DGF, slow graft function, acute rejection, graft failure/survival, kidney function, CAN, biopsy changes, infection, cancer, NODAT, erythrocytosis, neutropenia, fracture, BMD, hypertension, hyperuricemia, hyperlipidemia, quality of life, adverse events |
| Minimum number of subjects | N ≥ 100 |

| Chapter 8: Monitoring Kidney Allograft Function | |
| --- | --- |
| Intervention | Protocol monitoring vs. no protocol, different frequencies of monitoring |
| Outcomes | All-cause mortality, DGF, slow graft function, acute rejection, graft failure/survival, kidney function, CAN |
| Study design | RCT; minimum follow-up time: ≥6 months |
| Minimum number of subjects | N ≥ 10 |

| Chapter 9: Kidney Allograft Biopsy | |
| --- | --- |
| Intervention | Protocol biopsy vs. not, different protocols, treatment of ‘borderline’ rejection based on protocol biopsy vs. no biopsy |
| Outcomes | All-cause mortality, DGF, slow graft function, acute rejection, graft failure/survival, kidney function, CAN, recurrent disease |
| Study design | RCT; minimum follow-up time: ≥6 months |
| Minimum number of subjects | N ≥ 10 |
| Chapter 10: Recurrent Kidney Disease | |
| --- | --- |
| Population | KTRs with biopsy-proven recurrent disease |
| Outcomes | All-cause mortality, DGF, slow graft function, acute rejection, graft failure/survival, kidney function, CAN, recurrent disease, GFR/SCr or eGFR, biopsy changes, serious adverse events |
| Minimum number of subjects | N ≥ 20 |

| Chapter 11: Preventing, Detecting, and Treating Nonadherence | |
| --- | --- |
| Outcomes | All-cause mortality, DGF, slow graft function, acute rejection, graft failure/survival, kidney function, CAN, use of immunosuppressive medications as prescribed |
| Minimum number of subjects | N ≥ 20 |

| Chapter 12: Vaccination | |
| --- | --- |
| Population | KTRs (for PCP: any solid-organ recipient) |
| Intervention | Polyoma virus/BKV nephropathy: biopsies, urine NAT, urine decoy cells; EBV: acyclovir/ganciclovir, change immunosuppression agent, intravenous immunoglobulin, anti-CD20 antibody; PCP: sulfamethoxazole–trimethoprim vs. dapsone vs. pentamidine, prophylaxis vs. no prophylaxis, different protocols; HBV: monitoring, drug prophylaxis; UTI: antibiotic prophylaxis; TB: PPD, QuantiFERON screening; Fungal: screening, prophylaxis |
| Outcomes | All-cause mortality, DGF, slow graft function, acute rejection, graft failure/survival, kidney function, CAN, relevant disease, adverse events |
| Minimum number of subjects | N ≥ 10 |
| Chapter 13: Viral Diseases | |
| --- | --- |
| Intervention | Polyoma virus/BKV nephropathy: reduce immunosuppression, cidofovir, leflunomide; CMV: reduce immunosuppression, ganciclovir, valganciclovir, intravenous immunoglobulin, acyclovir; EBV: acyclovir, ganciclovir, reduce immunosuppression, intravenous immunoglobulin, anti-CD20 antibody; HBV: interferon (timing), pegylated interferon, lamivudine, adefovir, entecavir |
| Outcomes | All-cause mortality, DGF, slow graft function, acute rejection, graft failure/survival, kidney function, CAN, relevant disease, BKV nephropathy, change in management, HBV, liver disease progression (by biopsy), hepatocellular carcinoma, adverse events |
| Study design | RCT, cohort |
| Minimum number of subjects | N ≥ 20 for RCT; N ≥ 100 for cohort |

| Chapter 14: Other Infections | |
| --- | --- |
| Intervention | Antibiotic prophylaxis, PPD, QuantiFERON screening, screening and prophylaxis for fungal infections |
| Outcomes | UTI, active TB, fungal disease, mortality, acute rejection, graft loss, kidney function, DGF, CAN, adverse events |
| Study design | RCT, cohort |
| Minimum number of subjects | N ≥ 20 for RCT; N ≥ 100 for cohort |
|Chapter 15: Diabetes Mellitus|
|Population||KTRs with NODAT|
|Intervention||Change in immunosuppressive medications|
|Outcomes||All-cause mortality, DGF, slow graft function, acute rejection, graft failure/survival, kidney function, CAN, CVD events, fasting glucose|
|Study design||RCT, cohort; minimum follow-up time: ≥6 months|
|Minimum number of subjects||N ≥ 100 for RCT; N ≥ 500 for cohort|
|Chapter 16: Hypertension, Dyslipidemias, Tobacco Use, and Obesity|
|Population||KTRs with CVD risk factors|
|Intervention||Smoking: cessation; obesity: weight loss|
|Outcomes||Reduction in risk factor, all-cause mortality, DGF, slow graft function, acute rejection, graft failure/survival, kidney function, CAN, CVD|
|Study design||RCT; minimum follow-up time: ≥6 months|
|Minimum number of subjects||N ≥ 20|
|Systematic reviews were not performed for hypertension or dyslipidemia|
|Referred to KDOQI Guidelines for hypertension and dyslipidemia|
|Chapter 17: Cardiovascular Disease Management|
|Population||KTRs with CVD|
|Intervention||Aspirin, dipyridamole, ticlopidine, clopidogrel, cilostazol, pentoxifylline|
|Minimum number of subjects||N ≥ 20|
|Chapter 18: Cancer of the Skin and Lip|
|Study Design||Registry data|
|Minimum number of subjects||N ≥ 1000|
|Chapter 19: Non-Skin Malignancies|
|Study design||Registry data or systematic review|
|Minimum number of subjects||N ≥ 1000|
|Chapter 20: Managing Cancer with Reduction of Immunosuppressive Medication|
|Population||KTRs with cancer|
|Intervention||Change in immunosuppressive regimens|
|Outcomes||Mortality, acute rejection, graft loss, kidney function, DGF, CAN, adverse events|
|Minimum number of subjects||N ≥ 10|
|Chapter 21: Transplant Bone Disease|
|Systematic review not performed|
|Referred to KDIGO CKD–MBD Guideline|
|Chapter 22: Hematological Complications|
|Population||KTRs with anemia, erythrocytosis or neutropenia|
|Intervention||Erythrocyte stimulation therapies, changes in immunosuppressive medications, granulocyte CSF, other treatments|
|Outcomes||All-cause mortality, DGF, slow graft function, acute rejection, graft failure/survival, kidney function, CAN, quality of life, CVD, infections, hemoglobin or hematocrit, neutropenia duration, adverse events|
|Minimum number of subjects||N ≥ 10|
|Chapter 23: Hyperuricemia and Gout|
|Population||KTRs with hyperuricemia|
|Intervention||Changes in immunosuppressive medications, allopurinol, serum uric acid|
|Outcomes||Gout, all-cause mortality, DGF, slow graft function, acute rejection, graft failure/survival, kidney function, CAN, CVD events|
|Study Design||RCT, cohort|
|Minimum number of subjects||N ≥ 20 for RCT; N ≥ 100 for cohort|
|Chapter 24: Growth and Development|
|Population||Adult and pediatric KTRs|
|Intervention||Growth hormone, immunosuppressive regimens|
|Outcomes||Growth, growth retardation, development|
|Study design||RCT, cohort, systematic review|
|Minimum number of subjects||N ≥ 20 for RCT; N ≥ 100 for cohort|
|Chapter 25: Sexual Function and Fertility|
|Population||Kidney transplant patients with sexual dysfunction; mothers who are pregnant, receive a transplant during pregnancy, or are lactating; fathers who have a transplant at conception|
|Intervention||Erectile dysfunction medications|
|Outcomes||All-cause mortality, DGF, slow graft function, acute rejection, graft failure/survival, kidney function, CAN, erectile dysfunction, pregnancy outcomes, pregnancy complications, immunosuppression medication levels in milk|
|Study design||RCT, cohort|
|Minimum number of subjects||N ≥ 10 for RCT; for cohort studies, N ≥ 1 for mothers or fathers and N ≥ 50 for pregnancy|
|Chapter 26: Lifestyle|
|Systematic review not performed|
|Chapter 27: Mental Health|
|Population||Kidney transplant patients with depression|
For nontreatment questions, that is, questions related to prevalence, evaluation and risk relationships, the ERT conducted systematic searches, screened the yield for relevance and provided lists of citations to the Work Group. The ERT created summary tables of selected observational incidence and predictor studies. The Work Group took primary responsibility for reviewing and summarizing this literature in a narrative format. The ERT also searched online databases for estimates of incidence rates of different cancers among larger countries representative of different regions. The primary database used was Cancer Mondial (http://www-dep.iarc.fr). SIRs for cancer in solid-organ transplant recipients were taken from a meta-analysis by Grulich et al. (623).
For topics on which previous or ongoing KDIGO or KDOQI guidelines have provided recommendations for KTRs, new systematic reviews were not performed. These include anemia, hepatitis C, mineral and bone disorders and pediatric nutrition. For these topics, the relevant recommendations and rationale text were excerpted and refined as necessary. The Work Group Chairs and selected members conferred with Co-Chairs of the concurrent KDIGO mineral and bone disorder guideline and KDOQI pediatric nutrition guideline on transplant bone disease (Chapter 21), and growth and development (Chapter 24).
Refinement of Topics
The Work Group Co-Chairs prepared the first draft of the scope of work document as a series of mock (preliminary) recommendations to be considered by Work Group members. At their first 2-day meeting, members added further mock guideline topics until the initial working document included all topics of interest to the Work Group. The inclusive, combined set of questions formed the basis for the deliberation and discussion that followed. The Work Group strove to ensure that all topics deemed clinically relevant and worthy of review were identified and addressed. The four major topic areas of interest for the care of KTRs included immunosuppression, infections, CVD and cancer. In addition, there were several miscellaneous topics.
At the initiation of the guideline development process, it was agreed that this guideline would focus on patients who have had kidney transplantations. Thus, with few exceptions (e.g. the timing of vaccinations), all topics, systematic reviews and study eligibility criteria were restricted to patients with existing kidney transplantations. The guideline does not address management issues regarding choosing patients for kidney transplantation, pretransplant care, intraoperative care (except for the timing of initiating immunosuppression) or management of patients who have lost their grafts. In addition, in regards to care of comorbidities and complications after kidney transplantation (e.g. infections, cancer and CVD), this guideline focuses primarily on monitoring and prevention of the conditions, as opposed to treatment of the conditions (with some exceptions, e.g. for infectious diseases). However, where the recommended treatment of conditions differed from the general population (e.g. due to drug interactions with immunosuppression agents), standard treatment recommendations are offered.
Based on the list of topics, the Work Group and ERT developed a list of specific research questions for which systematic review would be performed (Table 32). For each systematic review topic, the Work Group Co-Chairs and the ERT formulated well-defined systematic review research questions using a well-established system (931). For each question, explicit criteria were agreed on for the population, intervention or predictor, comparator, outcomes of interest and study design features. A list of outcomes of interest was generated. The Work Group ranked patient-centered clinical outcomes (such as death, graft loss or infections) as more important than intermediate outcomes (such as cholesterol level or hypertension). The outcomes were further categorized as being of critical, high or moderate importance to KTRs. Outcomes of low importance were not considered for the purpose of systematic review and evidence synthesis. The specific criteria used for each topic are described below in the description of the review topics. In general, eligibility criteria were determined based on clinical value, relevance to the guideline and clinical practice, determination whether a set of studies would affect recommendations or the strength of evidence and practical issues such as available time and resources.
Literature Searches and Article Selection
The MEDLINE, Cochrane Central Registry for trials, and Cochrane database of systematic reviews were searched from 1985 through January 2007 by the ERT to capture all citations relevant to the topic of kidney transplantation, including original articles, systematic reviews and previous guidelines. The Cochrane Renal Group ran parallel searches in their Renal Registry database and these supplemented the primary ERT searches. The search was updated through February 2008 and supplemented by articles identified by Work Group members through November 2008.
During citation screening, journal articles reporting original data were reviewed. Editorials, letters, stand-alone abstracts, unpublished reports and articles published in non–peer-reviewed journals were excluded. The Work Group also decided to exclude publications from journal supplements and from Transplantation Proceedings because of potential differences in how articles are solicited, selected, reviewed and edited compared to peer-reviewed publications.
Potentially relevant existing systematic reviews were examined. If these reviews were deemed to adequately address topics of interest (even if only selected outcomes were reviewed), de novo searches on these topics were limited to the time period since the end of the literature search within the systematic reviews.
The MEDLINE and Cochrane search results were screened by the ERT for relevance using predefined eligibility criteria (Table 32). Restrictions by sample size and duration of follow-up were based on methodological and clinical considerations. Generally, it was deemed that trials with fewer than 100 people would be unlikely to have sufficient power to find significant differences in patient-centered clinical outcomes in KTRs. However, for specific topics where sparse data were available, lower sample-size thresholds were used to provide some information for descriptive purposes.
For most topics, a minimum mean follow-up duration of 6 months was chosen based on clinical reasoning. For the treatments of interest, the proposed effects on patient-centered clinical outcomes require long-term exposure and, typically, would not be expected to become evident before several months of follow-up. For all treatment topics, all RCTs in children with five or more individuals per arm were included.
From the onset of the guideline development process, it was known that for numerous topics of interest (e.g. care of comorbidities and complications after kidney transplantation) very few or no RCTs of KTRs exist. In addition, several topics required data on predictors of outcomes as opposed to treatment efficacy. Therefore, for selected topics, large observational studies were reviewed. As described below, in general, associations from only multivariable regression analyses were considered. The observational studies were not graded for quality. For these topics, the ERT completed its search in December 2007 and did not update the search.
Literature Yield for Systematic Review Topics
|Topic||Abstracts identifieda||RCTs retrieved||RCTs accepted||RCTs data- extracted||RCTs included in summary tablesb||Systematic reviews in evidence profiles|
|Monitoring and infections||24||23||17||17||5|
The ERT designed data-extraction forms to capture information on various aspects of the primary studies. Data fields for all topics included study setting, patient demographics, eligibility criteria, kidney transplantation details, numbers of subjects randomized, study design, study funding source, descriptions of interventions (or predictors), description of outcomes, statistical methods used, results, quality of outcomes (as described below), limitations to generalizability and free-text fields for comments and assessment of biases.
Summary tables were developed to tabulate the data from studies pertinent to each question of intervention (see Supporting Tables 2 and 4 as examples at http://www3.interscience.wiley.com/journal/118499698/toc). Each summary table contains a brief description of the outcome, baseline characteristics of the population, intervention, results and methodological quality. Baseline characteristics include a description of the study size, country of residence, age, percentage of deceased donors and dates of transplant. Intervention and concomitant therapies and the results were all captured. The final column was assigned for a grade for methodological quality. The studies were listed by outcome within the table based on the hierarchy of important outcomes (Table 34). Categorical and continuous outcomes were summarized in separate sets of tables. Work Group members were asked to proof all data in summary tables on RCTs. Separate sets of summary tables were created for nonrandomized studies of incidence and predictors of outcomes.
|Critical importance||Mortality, graft loss, cardiovascular events, malignancy (except skin)|
|High importance||Acute rejection, CAN, skin cancer, NODAT, infection (disease), bone fracture, quality of life|
|Moderate importancec||DGF, kidney function, proteinuria, lipids, blood pressure, BMD, bone marrow suppression, diarrhea, infection (marker)|
Due to the large number of recommendations included here and the large volume of literature reviewed, summary tables are not published with this report. They are available at http://www3.interscience.wiley.com/journal/118499698/toc.
Evaluation of Individual Studies
Study size and duration
The study (sample) size is used as a measure of the weight of the evidence. In general, large studies provide more precise estimates. Similarly, longer-duration studies may be of better quality and more applicable, depending on other factors.
Methodological quality (internal validity) refers to the design, conduct and reporting of the outcomes of a clinical study. A three-level classification of study quality was used (Table 35). Given the potential differences in quality of a study for its primary and other outcomes, the study quality was assessed for each outcome. Variations of this system have been used in most KDOQI and all KDIGO guidelines, and have been recommended by the US Agency for Healthcare Research and Quality Evidence-Based Practice Center program (http://effectivehealthcare.ahrq.gov/repFiles/2007_10DraftMethodsGuide.pdf; last accessed March 30, 2009).
|Good quality: Low risk of bias and no obvious reporting errors, complete reporting of data. Must be prospective. If study of intervention: Must be RCT.|
|Fair quality: Moderate risk of bias, but problems with the study/paper are unlikely to cause major bias. If study of intervention: Must be prospective.|
|Poor quality: High risk of bias or cannot exclude possible significant biases. Poor methods, incomplete data, reporting errors. Prospective or retrospective.|
Each study was given an overall quality grade. Each reported outcome was then evaluated and given an individual grade depending on the quality of reporting and methodological issues specific to that outcome. However, the quality grade of an individual outcome could not exceed the quality grade for the overall study.
The type of results used from a study was determined by the study design, the purpose of the study and the Work Group's question(s) of interest for which the results were used. Decisions were based on the screening criteria and outcomes of interest.
Grading the quality of evidence and the strength of a recommendation
A structured approach, based on GRADE (932–934) and facilitated by the use of Evidence Profiles (see Table 36 for an example), was employed in order to grade the quality of the overall evidence and the strength of recommendations. For each topic, the discussion on grading of the quality of the evidence was led by the ERT, and the discussion regarding the strength of the recommendations was led by the Work Group Chairs. The ‘strength of a recommendation’ indicates the extent to which one can be confident that adherence to the recommendation will do more good than harm. The ‘quality of a body of evidence’ refers to the extent to which our confidence in an estimate of effect is sufficient to support a particular recommendation (934).
Grading the quality of evidence for each outcome
Following GRADE, the quality of a body of evidence pertaining to a particular outcome of interest was initially categorized based on study design. For questions of interventions, the initial quality grade was ‘High’ when the body of evidence consisted of RCTs. In theory, the initial grade would have been ‘Low’ if the evidence consisted of observational studies or ‘Very Low’ if it consisted of studies of other study designs; however, the quality of bodies of evidence was formally determined only for topics where we performed systematic reviews of RCTs. The grade for the quality of evidence for each intervention/outcome pair was decreased if there were serious limitations to the methodological quality of the aggregate of studies, if there were important inconsistencies in the results across studies, if there was uncertainty about the directness of evidence including limited applicability of the findings to the population of interest, if the data were imprecise or sparse, or if there was thought to be a high likelihood of bias. The final grade for the quality of the evidence for an intervention/outcome pair could be one of the following four grades: ‘High,’‘Moderate,’‘Low’ or ‘Very Low’ (Table 37).
|Step 1: Starting grade for quality of evidence based on study design||Step 2: Reduce grade||Step 3: Raise grade||Final grade for quality of evidence for an outcome|
|Randomized trials = High||Study quality: −1 level if serious limitations; −2 levels if very serious limitations; −1 level if important inconsistency||Strength of association: +1 level if strong,a no plausible confounders; +2 levels if very strong,b no major threats to validity||High|
|Observational study = Low||Directness: −1 level if some uncertainty; −2 levels if major uncertainty||+1 level if evidence of a dose-response gradient||Moderate|
|Any other evidence = Very Low||−1 level if sparse or imprecise data||+1 level if all residual plausible confounders would have reduced the observed effect||Low|
|||−1 level if high probability of reporting bias||||Very Low|
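The stepwise adjustment described above can be illustrated with a small sketch. This is purely an illustration under assumed names (the function and its arguments are hypothetical); in practice the guideline applies these steps through structured expert judgment, not a program.

```python
# Illustrative sketch of the GRADE-style grade adjustment (Table 37).
# All identifiers here are hypothetical; they are not part of the guideline.

GRADES = ["Very Low", "Low", "Moderate", "High"]

def final_grade(study_design, downgrades=0, upgrades=0):
    """Start from the design-based grade, then lower or raise it.

    study_design: "rct", "observational", or any other design
    downgrades: total levels subtracted (study quality, inconsistency,
                indirectness, imprecision, reporting bias)
    upgrades: total levels added (strong association, dose-response
              gradient, residual confounding favoring the null)
    """
    start = {"rct": 3, "observational": 1}.get(study_design, 0)
    # Clamp to the four-grade scale.
    idx = max(0, min(len(GRADES) - 1, start - downgrades + upgrades))
    return GRADES[idx]

# A body of RCT evidence with serious limitations (-1) and imprecise
# data (-1) drops from High to Low.
print(final_grade("rct", downgrades=2))  # -> Low
```

For example, an observational body of evidence with a strong association and no plausible confounders would be raised one level, from Low to Moderate.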
Grading the overall quality of evidence
Each clinical outcome was ranked by the Work Group as to its level of clinical importance to the patient. The quality of the overall body of evidence was then determined based on the quality grades for all outcomes of interest, taking into account explicit judgments about the relative importance of each outcome. The resulting four final categories for the quality of overall evidence were: ‘A,’‘B,’‘C’ or ‘D’ (Table 38) (932). This evidence grade is indicated within each recommendation.
|A: High quality of evidence. We are confident that the true effect lies close to that of the estimate of the effect.|
|B: Moderate quality of evidence. The true effect is likely to be close to the estimate of the effect, but there is a possibility that it is substantially different.|
|C: Low quality of evidence. The true effect may be substantially different from the estimate of the effect.|
|D: Very low quality of evidence. The estimate of effect is very uncertain, and often will be far from the truth.|
Assessment of the net health benefit across all important clinical outcomes
The net health benefit was determined based on the anticipated balance of benefits and harm across all clinically important outcomes. This assessment required judgment by the Work Group and the ERT, and is summarized in Table 39.
|When there was evidence to determine the balance of medical benefits and harm of an intervention to a patient, conclusions were categorized as follows:|
|• Net benefits = the intervention clearly does more good than harm.|
|• Tradeoffs = there are important tradeoffs between the benefits and harm.|
|• Uncertain = it is not clear whether the intervention does more good than harm.|
|• No net benefits = the intervention clearly does not do more good than harm.|
Grading the strength of the recommendations
The strength of a recommendation is graded as Level 1 or Level 2. Table 40 shows the KDIGO nomenclature for grading the strength of a recommendation and the implications of each level for patients, clinicians and policy makers. Recommendations can be for or against doing something. Table 41 shows that the strength of a recommendation is determined not just by the quality of the evidence, but also by other, often complex, judgments regarding the size of the net medical benefit, values and preferences and costs. Formal decision analyses, including cost analysis, were not conducted.
|Level 1: ‘We recommend’||Most people in your situation would want the recommended course of action and only a small proportion would not||Most patients should receive the recommended course of action||The recommendation can be adopted as a policy in most situations|
|Level 2: ‘We suggest’||The majority of people in your situation would want the recommended course of action, but many would not||Different choices will be appropriate for different patients. Each patient needs help to arrive at a management decision consistent with her or his values and preferences||The recommendation is likely to require debate and involvement of stakeholders before policy can be determined|
|Balance between desirable and undesirable effects||The larger the difference between the desirable and undesirable effects, the more likely a strong recommendation is warranted. The narrower the gradient, the more likely a weak recommendation is warranted.|
|Quality of the evidence||The higher the quality of evidence, the more likely a strong recommendation is warranted.|
|Values and preferences||The more variability in values and preferences, or more uncertainty in values and preferences, the more likely a weak recommendation is warranted.|
|Costs (resource allocation)||The higher the costs of an intervention—that is, the more resources consumed—the less likely a strong recommendation is warranted.|
The KDIGO consensus statement on grading (933) had recommended a category for a ‘consensus-based statement.’ This category was designated for guidance by the Work Group based predominantly on expert opinion in areas of low- or very low-quality evidence. However, it became clear that ‘consensus-based’ was not a distinguishing feature, since all recommendations are supported by Work Group consensus. Still, it was felt that having a category that allows the Work Group to issue general advice would be useful. Typically, an ungraded statement meets the following criteria: it provides guidance based on common sense; it provides reminders of the obvious; it is not sufficiently specific to allow application of evidence to the issue, and therefore it is not based on systematic evidence review. Common examples include recommendations about frequency of testing, referral to specialists and routine medical care. We strove to minimize the use of ungraded recommendations.
This grading scheme with two levels for the strength of a recommendation together with four levels of grading the quality of the evidence, and the option of an ungraded statement for general guidance, was adopted by the KDIGO Board in December 2008.
Format for Guideline Recommendations
Each section contains one or more specific recommendations. Within each recommendation, the strength of recommendation is indicated as level 1 or level 2 and the quality of the supporting evidence is shown as A, B, C or D. These are followed by a brief background with relevant definitions of terms, then the rationale starting with a ‘chain of logic,’ which consists of declarative sentences summarizing the key points of the evidence base and the judgments supporting the recommendation. This is followed by a narrative in support of the rationale. In relevant sections, research recommendations suggest future research to resolve current uncertainties.
Limitations of Approach
While the literature searches were intended to be comprehensive, they were not exhaustive. MEDLINE and various Cochrane databases were the only databases searched. Hand searches of journals were not performed, and review articles and textbook chapters were not systematically searched. However, important studies known to the domain experts that were missed by the electronic literature searches were added to retrieved articles and reviewed by the Work Group. Not all topics and subtopics covered by this guideline could be thoroughly and systematically reviewed. Decisions to restrict the topics were made to focus the systematic reviews on those topics where existing evidence was thought to be likely to provide support for the guideline. Although nonrandomized studies were reviewed, the majority of the ERT and Work Group resources were devoted to review of randomized trials, since these were deemed to be most likely to provide data to support level 1 recommendations with very high- or high-quality (A or B) evidence. Where randomized trials are lacking, it was deemed sufficiently unlikely that studies previously unknown to the Work Group would result in higher-quality level 1 recommendations. A small number of supplemental sets of evidence were collected with a nonsystematic review approach. Any such evidence that is summarized is noted. Decisions to take a nonsystematic review approach for these topics were made due to time constraints and resource limitations.
Review of the Guideline Development Process
Several tools and checklists have been developed to assess the quality of the methodological process for guideline development. These include the Appraisal of Guidelines for Research and Evaluation (AGREE) criteria (936) and the Conference on Guideline Standardization (COGS) checklist (937). Supporting Table 62 shows the COGS criteria that correspond to the AGREE checklist and how each one of them is addressed in this guideline.
Biographic and Disclosure Information
Bertram L. Kasiske, MD (Work Group Co-Chair), is Professor of Medicine and Medical Director of Kidney and Pancreas Transplantation at University of Minnesota, USA. He received his medical degree from the University of Iowa and completed his Internal Medicine residency and fellowship training in Nephrology at Hennepin County Medical Center, where he is also currently Director of Nephrology and Medical Director of Kidney Transplantation. His primary research interests encompass immunosuppression, dyslipidemia and CVD in transplant recipients. He is a Co-Investigator in an RCT of homocysteine (FAVORIT), the Study of Heart and Renal Protection (SHARP), and the US Renal Data System (USRDS). Dr Kasiske has served as Medical/Scientific Representative to the Board of Directors of United Network for Organ Sharing (UNOS) and was formerly Editor-in-Chief of American Journal of Kidney Diseases. He has published over 200 journal articles and has recently contributed book chapters in Brenner and Rector's The Kidney, Comprehensive Clinical Nephrology, The Kidney: Physiology and Pathophysiology and Kidney Transplantation: Principles and Practice. Dr Kasiske is also a recipient of the Outstanding Research Accomplishment Award from University of Minnesota School of Medicine in 2002 and the National Kidney Foundation's Garabed Eknoyan Award in 2003.
Advisor/Consultant: Astellas; LithoLink; Novartis; Wyeth
Grant/Research Support: Bristol-Myers Squibb; Genzyme; Merck-Schering Plough
Martin G. Zeier, MD, FASN (Work Group Co-Chair), is Head of the Department of Nephrology and Hypertension at the University of Heidelberg, where he also received his training in Internal Medicine and Nephrology. He maintains an active interest in hereditary (polycystic) kidney disease, and in hypertension and viral infections in kidney transplant patients. Dr Zeier is a member of numerous professional organizations, including the American Society of Nephrology, the European Dialysis and Transplantation Association and the International Society of Nephrology. He has published more than 200 publications and is currently Section Editor of Nephrology Dialysis Transplantation and Co-Editor of Der Nephrologe.
Grant/Research Support: Astellas; Novartis; Parexel
Jonathan C. Craig, MBChB, MM (Clin Epi), DCH, FRACP, PhD, is Professor of Clinical Epidemiology at the University of Sydney's School of Public Health; Senior Staff Specialist in Pediatric Nephrology at the Children's Hospital at Westmead; and Head of the Centre for Kidney Research at the Children's Hospital at Westmead, Australia. He began his current appointment as the Coordinating Editor of the Cochrane Renal Group in 2000 and has cultivated his research interests in RCTs, systematic reviews and diagnostic test evaluation, and applied them to a variety of kidney areas. Dr Craig is a contributor to and an editor of the text, Evidence-Based Nephrology, and has served on many editorial boards, including American Journal of Kidney Diseases, Journal of the American Society of Nephrology and Nephrology. Dr Craig has authored over 200 articles, including publications in the Lancet, The New England Journal of Medicine, Annals of Internal Medicine and the British Medical Journal. He is also a member of more than 12 professional societies and a recipient of the TJ Neill Award for Outstanding Contribution to Nephrological Science in 2009.
Dr Craig reported no relevant financial relationships.
Henrik Ekberg, MD, PhD, is Senior Transplant Surgeon at the University Hospital in Malmö, Sweden, and Professor of Transplant Surgery at Lund University, Lund, Sweden. After obtaining his medical and doctoral degrees and subsequent board certification in General Surgery, he spent 2 years (1986–1988) performing research in the field of transplantation at the University of Sydney in Australia. His main research interests center on clinical and experimental immunosuppressive treatments and for the past 18 years, he has had an active role in a large number of pivotal multicenter trials on immunosuppression in kidney transplantation. Professor Ekberg was also the sponsor and chairman of the steering committee of the recently completed, multicenter, international Symphony study. He is an active member of several professional societies and has served as Councillor for The Transplantation Society (TTS) 2006–2008 and Editor-in-Chief of the TTS Newsletter. Until recently he was also Vice-President of the European Society of Organ Transplantation (ESOT) and chairman of the ESOT education committee. Dr Ekberg is also Associate Editor of the American Journal of Transplantation and an Editorial Board member of Transplant International, Transplantation, and Clinical Transplantation. He is a frequently invited lecturer and has published more than 160 original articles and reviews in the field of transplantation.
Advisor/Consultant: Astellas; Bristol-Myers Squibb; Hansa Medical; Hoffmann-LaRoche; Life Cycle Pharma; Novartis; Wyeth
Speaker: Astellas; Hoffmann-LaRoche
Catherine A. Garvey, RN, BA, CCTC, is a Transplant Coordinator at the University of Minnesota Medical Center, Fairview, where she manages all living organ donor programs, oversees the screening and evaluation of living organ donors, directs donor evaluations for kidney and pancreas transplant recipients and provides ongoing education to donors and their families. Ms Garvey is currently President of the NATCO (North American Transplant Coordinators Organization) Board and a member of the National Living Donor Assistance Program (NLDAC) Advisory Committee, AST (American Society of Transplantation) and ITNS (International Transplant Nurses Society). In recognition of her contributions to this field, she was honored with the NATCO Quality of Care Award in 2005 and, most recently, the Health Care Hero for Nursing Award.
Mrs Garvey reported no relevant financial relationships.
Michael D. Green, MD, MPH, is Professor at Children's Hospital of Pittsburgh and Professor of Clinical and Translational Science at University of Pittsburgh School of Medicine. Dr Green obtained his medical degree from University of Illinois and his Master of Public Health from University of Pittsburgh, and he is board certified in Pediatrics with a subspecialty in Pediatric Infectious Diseases. His current research interests include the study of prophylactic antimicrobials in children with vesicoureteral reflux and the impact of antimicrobial prophylaxis on the development of antimicrobial resistance. Dr Green has written over 130 publications and contributed more than 50 review articles and book chapters in notable texts such as Infectious Diseases, Pediatric Solid Organ Transplantation, Principles & Practice of Pediatric Infectious Diseases and Textbook of Pediatric Infectious Diseases. In addition, he has served as a peer reviewer for more than 20 journals and as an advisory committee member to various organizations including the Food & Drug Administration, International Pediatric Transplant Association and American Society for Transplantation. Dr Green was named the St. Giles Foundation Fellow (2008–2009) at the Columbia University Seminar on National Health & Science Policy.
Dr Green reported no relevant financial relationships.
Vivekanand Jha, MD, FRCP, is Additional Professor of Nephrology and Coordinator of the Stem Cell Research Facility at the Postgraduate Institute of Medical Education & Research, Chandigarh, India. Dr Jha has held numerous committee positions in professional bodies such as The Transplantation Society and the International Society of Nephrology, and is most recently a Steering Committee member of the WHO initiative on data harmonization in transplantation. His ongoing research projects include the development of optimal strategies for immunosuppressive drug use after kidney transplantation through pharmacogenomic approaches, and the study of bone mineral density (BMD) and histomorphometry in chronic kidney failure and their evolution posttransplantation. He is an Editor of The Cochrane Renal Group and a frequent peer reviewer for 14 journals. Dr Jha has authored over 135 publications and 25 book chapters, and serves as an editor of an upcoming textbook, Management of Kidney Transplant Recipient.
Dr Jha reported no relevant financial relationships.
Michelle A. Josephson, MD, is Professor, Department of Medicine, Section of Nephrology, and Medical Director of Transplant Nephrology at the University of Chicago. She obtained her medical degree from the University of Pennsylvania and completed a Nephrology fellowship at the University of Chicago. Having served as an investigator in areas related to bone disease and infections posttransplantation and the efficacy of immunosuppressive medications, she maintains an active research interest in the medical complications of kidney transplantation. Dr Josephson is an author of more than 60 publications and a peer reviewer for journals including American Journal of Transplantation, American Journal of Kidney Diseases and Kidney International. She is a frequent lecturer at national and international meetings and received the Excellence in Medical Education and Clinical Care Award from the University of Chicago in 2007.
Advisor/Consultant: Digitas Health; MKSAP; Wyeth
Grant/Research Support: Amgen; Astellas; Wyeth
Bryce A. Kiberd, MD, is Professor of Medicine at Dalhousie University, Halifax, NS, Canada. He received his medical degree from the University of Toronto and completed nephrology fellowships at Toronto and Stanford University. Dr Kiberd is a member of 11 professional medical societies, in addition to his ongoing participation on the Scientific Advisory Board for the Chronic Renal Insufficiency Cohort Study (NIH-NIDDK). He has long-standing interests in immunosuppressive medications and the prevention of kidney disease progression and complications in transplant recipients. Dr Kiberd has authored over 100 publications, and he is a recipient of the Excellence Award for Research from Dalhousie University (1998 through 2006).
Henri A. Kreis, MD, is Professor Emeritus of Nephrology at Université Paris Descartes and past Chairman of the Department of Transplantation and Intensive Care at Hôpital Necker, Paris, France. He was a founding member and past President of the French Transplantation Society, and also served previously as General Secretary and Vice-President of the France Transplant Association. Dr Kreis has been involved in many clinical and organizational aspects of transplantation in Paris since 1960 and maintains active interests in immunosuppression and rejection. In addition to being a past member of various Health Ministry commissions, he currently serves as Treasurer of the Comité de Protection des Personnes (CPP) Ile-de-France II. In 2004, he was made a Knight of the Legion of Honour.
Ruth A. McDonald, MD, is Professor of Pediatrics at the University of Washington and Clinical Director of Nephrology at Children's Hospital and Regional Medical Center in Seattle, Washington. She completed her medical degree at the University of Minnesota School of Medicine, where she received the Top Medical Graduate: Hewlett-Packard Award. Dr McDonald is currently involved in numerous multicenter clinical studies, including a controlled trial of anti-CD20 monoclonal antibody therapy in historically unsensitized renal transplant recipients with donor-specific antibodies; a Phase II study to determine the safety and immunomodulatory functions of induction therapy with Campath 1H combined with MMF and sirolimus; a surveillance study of viral infections in renal transplant recipients; and many others. She is also a member of eight professional organizations, including the American Society of Pediatric Nephrology, the American Society of Transplantation and the International Pediatric Transplant Association. Among her teaching responsibilities, she has trained over 25 fellows and served as a Medical Student Research Mentor. Dr McDonald has written over 50 publications and has given close to 40 invited and extra-institutional lectures in the past 10 years.
Dr McDonald reported no relevant financial relationships.
John M. Newmann, PhD, MPH, is President of Health Policy Research & Analysis, Weston, Virginia. As a prominent health-care consultant and a kidney transplant recipient himself, Dr Newmann has been a tireless patient advocate on issues related to the Medicare end-stage renal disease program, organ transplantation policies, quality of life for dialysis patients and kidney disease education. In addition to speaking on these issues before Congress, he has sought to improve patient and physician communication in presentations at national meetings such as those of the American Society of Nephrology, the National Kidney Foundation and the Renal Physicians Association. Dr Newmann is also extensively involved in many organizations, including the American Kidney Fund, the Optimal Renal Care Medical Advisory Board and the American Association of Kidney Patients, and is past Vice President of Patient and Donor Affairs at UNOS. He is a recipient of the NKF's Trustee Award and the Quality of Life Award from Nephrology News & Issues, and he was recently recognized by the American Nephrology Nurses' Association (ANNA) for his 30 years of service to the renal community.
Advisor/Consultant: Arbor Research Collaborative; Renaissance Health Care
Gregorio T. Obrador, MD, MPH, is Professor of Medicine and Dean at the Universidad Panamericana School of Medicine in Mexico City. He also serves as Adjunct Staff in the Division of Nephrology of the Tufts-New England Medical Center and Assistant Professor of Medicine at the Tufts University School of Medicine in Boston. He earned his medical degree from the University of Navarra (Pamplona, Spain), completed his medicine residency at the Western Pennsylvania Hospital (Pittsburgh) and his Nephrology fellowship at Boston University. While doing a clinical research fellowship at Tufts and a Master of Public Health at Harvard University, he demonstrated that the management of patients with CKD prior to stage 5 is suboptimal, and that this is an important factor in the high morbidity and mortality observed in these patients. Dr Obrador has been a member of the Advisory Board and of the Anemia Workgroup of the NKF KDOQI guidelines, and has participated in several committees of the Kidney Disease: Improving Global Outcomes (KDIGO) initiative, the American Society of Nephrology, the International Society of Peritoneal Dialysis, the Mexican Academy of Medicine and other Mexican nephrology societies. He is a founding member of the Mexican Kidney Foundation and one of the leaders of the Mexico Kidney Early Evaluation Program (KEEP). He has given 97 lectures in national and international forums and has several publications in the area of CKD.
Dr Obrador reported no relevant financial relationships.
Jeremy R. Chapman, MD, FRACP, FRCP, is Network Director of Acute Interventional Medicine at Sydney West Area Health Service; Director of Renal Medicine at Westmead Hospital; and Clinical Professor at the University of Sydney. He received his undergraduate and medical degrees from Cambridge University and completed specialist training in nephrology and transplantation at Oxford. Dr Chapman is currently President of The Transplantation Society and past President of the World Marrow Donor Association. His research interests include clinical immunosuppression, post-kidney transplant malignancies, chronic allograft nephropathy (CAN) and clinical bone marrow, islet, kidney and pancreas transplantation. He has authored over 260 publications and book chapters and has given more than 120 lectures since 1998. Dr Chapman is also a member of numerous professional organizations and a recipient of the Order of Australia Medal in 2003.
Advisor/Consultant: Astellas; Hoffmann-LaRoche; Novartis; Wyeth
Grant/Research Support: Bristol-Myers Squibb; Novartis; Wyeth
Flavio G. Vincenti, MD, is Professor of Clinical Medicine and Surgery at the University of California, San Francisco and Medical Director of the Kidney Pancreas Program. He received his medical degree from the American University Medical Center in Beirut, Lebanon and completed his fellowship in nephrology at Emory University School of Medicine, Atlanta, Georgia. Dr Vincenti is past President of the American Society of Transplantation and a current member of the Steering Committee, Cooperative Clinical Trials in Pediatric Transplantation (CCTPT), for the NIH. As an attending physician in an outpatient clinic and transplant ward, he devotes much of his time to teaching and mentoring while still actively engaging in 13 research projects, primarily in the area of immunosuppressive medications. He is currently Associate Editor of CJASN and an editorial board member of Transplantation. He has written over 190 publications and authored chapters in several well-known texts, including Goodman & Gilman's The Pharmacological Basis of Therapeutics, Pediatric Solid Organ Transplantation and Smith's General Urology.
Grant/Research Support: Astellas; Bristol-Myers Squibb; Genentech; Hoffmann-LaRoche; Novartis; Wyeth
Kai-Uwe Eckardt, MD, is Professor of Medicine and Chief of Nephrology and Hypertension at the University of Erlangen—Nuremberg, Germany. He received his MD from the Westfälische Wilhelms-Universität Münster, Germany. In 1993, following postgraduate training in internal medicine, pathology and physiology, he was appointed Assistant Professor of Physiology at the University of Regensburg, Germany. Subsequently, he continued his training in internal medicine and nephrology at the Charité, Humboldt University in Berlin, where he was appointed Associate Professor of Nephrology in 2000. His major scientific interests are the molecular mechanisms and physiological/pathophysiological relevance of oxygen sensing, and the management of anemia. Professor Eckardt is Subject Editor of Nephrology, Dialysis and Transplantation and serves on the editorial boards of several other journals. He contributed to the development of the European Best Practice Guidelines (EBPGs) for Anemia Management and is a member of the executive committee of KDIGO. Dr Eckardt is associated with the CREATE and TREAT studies.
Advisor/Consultant: Affymax; Amgen; HEXAL; Hoffmann-LaRoche; Johnson & Johnson; Ortho Biotech; STADA
Speaker: Amgen; Hoffmann-LaRoche; Johnson & Johnson; Ortho Biotech
Grant/Research Support: Hoffmann-LaRoche; Ortho Biotech
Bertram L. Kasiske, MD. Please see earlier entry for Work Group Co-Chair.
Evidence Review Team
Ethan M. Balk, MD, MPH, is Director of Evidence-based Medicine at the Tufts Center for Kidney Disease Guideline Development and Implementation in Boston, MA, and Assistant Professor of Medicine at Tufts University School of Medicine. Dr Balk completed a fellowship in Clinical Care Research. His primary research interests are evidence-based medicine, systematic review, clinical practice guideline (CPG) development and critical literature appraisal.
Dr Balk reported no relevant financial relationships.
Martin Wagner, MD, MS, is currently a research fellow at the Tufts Center for Kidney Disease Guideline Development and Implementation in Boston, MA. He started his training in internal medicine at the University of Würzburg, Germany, where he also received his MD degree. He completed his fellowship in nephrology at Tufts Medical Center, and received an MS in the Clinical Care Research program at the Sackler School at Tufts University. After his return to the University of Würzburg, he will remain as adjunct staff at the Division of Nephrology of Tufts Medical Center. Dr Wagner's research interests include CPG development, systematic review and predictive modeling.
Dr Wagner reported no relevant financial relationships.
Gowri Raman, MD, is Assistant Professor of Medicine at Tufts University School of Medicine and Assistant Director of the Tufts Evidence Practice Center at the Center for Clinical Evidence Synthesis. Dr Raman is currently a Clinical Care Research Trainee fellow in the Institute for Clinical Research and Health Policy Studies at Tufts Medical Center. She has over 7 years' experience in conducting systematic reviews and CPG development. Her primary research interests are health technology assessment, systematic review and CPG development.
Dr Raman reported no relevant financial relationships.
Amy Earley, BS, is a project coordinator at the Tufts Center for Kidney Disease Guideline Development and Implementation in Boston, MA. She assists in the development of CPGs and conducts systematic reviews and critical literature appraisals.
Ms. Earley reported no relevant financial relationships.
Samuel Abariga, MD, MS, was a Research Associate in the Division of Nephrology at Tufts Medical Center, where he worked with the Evidence Review Team at the Tufts Center for Kidney Disease Guideline Development and Implementation. His research interests include evidence-based medicine, systematic review, CPG development and critical literature appraisal.
Dr Abariga reported no relevant financial relationships.
A special debt of gratitude is owed to the current KDIGO Co-Chair Kai-Uwe Eckardt, founding Co-Chairs Garabed Eknoyan and Norbert Lameire, and the KDIGO Board for their invaluable guidance throughout the development of this guideline. We also thank the Kidney Disease: Improving Global Outcomes/National Kidney Foundation (KDIGO/NKF) staff, particularly Michael Cheung, Donna Fingerhut and Dekeya Slaughter-Larkem, for the efficient support that was instrumental to the success of this endeavor. In particular, we thank the Evidence Review Team members, Ethan Balk, Martin Wagner, Amy Earley, Gowri Raman and Sam Abariga, for their substantial contribution to the rigorous assessment of the available evidence. We are also especially grateful to the Work Group members for their expertise throughout the entire process of literature review, data extraction, meeting participation, and the critical writing and editing of the statements and rationale, which made the publication of this guideline possible. The generous gift of their time and dedication is greatly appreciated.
Finally, and on behalf of the Work Group, we gratefully acknowledge the careful assessment of the draft guideline by external reviewers. The Work Group considered all of the valuable comments made and where appropriate, suggested changes were incorporated into the final publication. The following individuals and organizations provided review of the draft guideline:
Kevin Abbott (Walter Reed Medical Center), Omar I Abboud, Mario Abbud-Filho (Instituto de Urologia & Nefrologia), Daniel Abramowicz (Hôpital Erasme; European Renal Best Practice), Nasrulla Abutaleb, Vidya N. Acharya (National Kidney Foundation/India), Horacio E. Adrogue (University of Texas Medical Branch), Mona Al-Rukhaimi (Dubai Hospital), Josefina Alberú (Instituto Nacional De Ciencias Medicas Y. Nutricion S Z), Upton Allen (Hospital for Sick Children), Farhana Amanullah (Sindh Institute of Urology and Transplantation), American Society of Transplantation, American Society of Transplant Surgeons, Mariano Arriola, Richard Baker (UK Renal Association), Ashraf Bakr (Mansoura Faculty of Medicine), Rashad S. Barsoum (Cairo University), Gregor Bartel, Scott Batty (Genzyme), Yolanda Tai Becker (University of Wisconsin), Luboslav Bena, Mohamed Benghanem Gharbi (Ibn Rochd Quartier des Hospitaux), Bassam Bernieh (Tawam Hospital), Emily A. Blumberg (University of Pennsylvania), Daniel C. Brennan (Washington University), Brenda S. Brewer (NATCO), Bristol-Myers Squibb, Ghil Busnach, Evelyn Butera, Canadian Society of Nephrology, Canadian Society of Transplantation, Marcelo Cantarovich (McGill University Health Center; The Transplantation Society Council), Fernando Carrera (Hospital SAMS), Sue Cary (American Nephrology Nurses Association), Valerie Cass (McGill University Health Center), Blanche M. Chavers (University of Minnesota), Pierre Cochat (Hospices Civils de Lyon et Université de Lyon; European Renal Best Practice), Jean Colaneri (American Nephrology Nurses Association), William G. Couser (University of Washington), Ana Cusumano (Latin American Society of Nephrology and Arterial Hypertension), Romina A. Danguilan (National Kidney and Transplant Institute), Gabriel M. Danovitch (David Geffen School of Medicine, UCLA), Jane Davis (University of Alabama), Jean-Yves De Vos (AZ Werken Glorieux Dialysis Unit), Ian Dittmer (Auckland City Hospital), Grahame Elder (Westmead Hospital), Donna L. Ennis (NATCO), Tracy A. Evans-Walker (NATCO), Jeffrey Fadrowski (Johns Hopkins University School of Medicine), Randall J. Faull (Australian and New Zealand Society of Nephrology), Stuart M. Flechner (Cleveland Clinic), Joseph T. Flynn (Seattle Children's Hospital; American Society of Pediatric Nephrology), Patricia G. Folk (The International Transplant Nurses Society), Lorenzo G. Gallon (Northwestern University Feinberg School of Medicine), Alvaro A. Garcia, Valter D. Garcia (Brazilian Association of Organ Transplantation), Rozina Ghazalli (Penang Hospital), Osama Gheith, Elaine Go (St Joseph Hospital), Simin Goral (University of Pennsylvania), Darla K. Granger (American Society of Transplant Surgeons), Debbie Gregory (Renal Society of Australasia), Sinead Gregory (Roche Diagnostics), Ann P. Guillot (University of Vermont), William E. Haley (Mayo Clinic), Judith Hambleton (Rogosin Institute), Jeff Harder (University of Washington), Erica L. Hartmann (American Society of Transplantation), Susan Hayes (S Hayes Consulting), Rebecca Hays (University of Wisconsin Hospital), Domingo Hernandez-Marrero (Hospital Regional Universitario Carlos Haya), Hans H. Hirsch (University Hospital Basel), Hallvard Holdaas (Oslo University Hospital Rikshospitalet), Herwig Holzer (Medical University, Graz), Chakko K. Jacob (Christian Medical College and Hospital), Michel Jadoul (Cliniques Universitaires St Luc), Michelle James (University of Minnesota), Alan Jardine (UK Renal Association), Pradeep V. Kadambi (University of Chicago), Nada Kanaan, Bruce Kaplan (University of Arizona School of Medicine), Frederick J. Kaskel (Children's Hospital at Montefiore), Tammy Keough-Ryan (Dalhousie University), Ron H. Kerman (University of Texas Medical School), Markus Ketteler (Klinikum Coburg), Allan D. Kirk (Emory University Hospital), Greg A. Knoll (Canadian Society of Transplantation), Stephan Korte (Novartis), Andreas Kribben (Universitatsklinikum Essen), K. Sampath Kumar (Meenakshi Mission Hospital), Dirk Kuypers (University Hospitals Leuven), Sondra Kybartiene, Jack Lake (University of Minnesota), Craig B. Langman (Northwestern University Feinberg School of Medicine), Anastasia Laskari, Daniel Lavanchy (World Health Organization), Edgar V. Lerma (University of Illinois at Chicago College of Medicine), Zhi-Hong Liu (Nanjing University School of Medicine), Francesco Locatelli (‘A. Manzoni’ Hospital; European Renal Best Practice), Robert MacTier (UK Renal Association), Samir G. Mallat, Kevin C. Mange (Novartis), Martin Maraschio (Hospital Privado Centro Médico De Córdoba S.A.), Pablo U. Massari (Catholic University of Córdoba), Arthur Matas (University of Minnesota), Dianne McKay (Scripps Research Institute), Enisa Mesic (University Medical Center Tuzla), Marian G. Michaels (Children's Hospital of Pittsburgh), Mark Mitsnefes (Cincinnati Children's Medical Center), Rafique Moosa (University of Stellenbosch), Jose M. Morales (Hosp Universitario Doce De Octubre), Alfredo Mota (Coimbra University Hospital), Barbara T. Murphy (Mount Sinai School of Medicine; American Society of Transplantation), Judit Nagy (Hungarian Society of Nephrology), Nancy Nardelli (Medical City Dallas Hospital), Alicia M. Neu (Johns Hopkins University School of Medicine), Joseph Nogueira (University of Maryland), Ole Oyen (Rikshospitalet University Hospital), Fatma N. Ozdemir (Baskent University), Alejandro Lucas Penagos, Ronald Perrone (Tufts Medical Center), Todd E. Pesavento (Ohio State University; United Network for Organ Sharing), Mark Pescovitz (Indiana University), Thomas G. Peters (University of Florida Health Science Center), Helen Pilmore (Auckland City Hospital), Lee Po-Chang, G.V. Ramesh Prasad (University of Toronto), Erasmia Psimenou, Ruth Rahamimov (Belinson Medical Center), Harun Ur. Rashid (Bangladesh Renal Association), Luis Re, Rafael Reyes (Hospital Miguel Hidalgo), Sally I. Rice (University of Louisville), John P. Roberts (American Society of Transplant Surgeons), Bernardo Rodriguez-Iturbe (Hospital Universitario and Universidad del Zulia, Maracaibo), Lionel Rostaing (Toulouse University Hospital), Mahmoud Sadeghi, Kaija Salmela (Helsinki University Central Hospital), John D. Scandling (Stanford University Medical Center), Heidi Schaefer (Vanderbilt University Medical Center), Tammy Sebers (Oregon Health and Science University), Myra Sgorbini, Ayo Shonibare (St Nicholas Hospital), Christopher O. Simon (Independent Dialysis Foundation), Ammar Sirawan (Ajn W Sein Hospital), Dhavee Sirivongs (Khon Kaen Medical School), William T. Smith (Wyeth), Edison Souza, Goce B. Spasovski (University Clinical Center, Skopje; European Renal Best Practice), William P. Stamford, Sharon Swofford (American Nephrology Nurses Association), Hiroshi Toma (Tokyo Women's Medical University), Yvette Tomacruz (National Kidney and Transplant Institute), Marcello Tonelli (University of Alberta; Canadian Society of Nephrology), Armando Torres (Universidad de La Laguna), Ray Trevitt (EDTNA/ERCA), Lara E. Tushla (Society for Transplant Social Workers), Ray Vanholder (University Hospital, Ghent; European Renal Best Practice), Joseph Vassalotti (National Kidney Foundation), Ruben L. Velez (Dallas Nephrology Associates), Alberto Vianello (Feltre General Hospital), Jelka Zaletel Vrtovec, Sean Wagner, Yi Wang, Bruno Watschinger (Medizinische Universität Wien), Willem Weimar (University Medical Center Rotterdam), Matthew R. Weir (University of Maryland), Barbara Weis Malone (University of Colorado), José R. Weisinger (Baptist Health South Florida), Patricia Weiskittel (American Nephrology Nurses Association), Kerstin W. Westman (University Hospital of Malmö), David C. Wheeler (University College London Medical School), Anne Wiland (Novartis), Alan Wilkinson (David Geffen School of Medicine, UCLA), Alexander Wiseman (University of Colorado; American Society of Transplantation), Germaine Wong (Children's Hospital at Westmead), David Wu (Wyeth), Chul Woo Yang (Catholic University of Korea), Jo-Anne Young (University of Minnesota), Carlton J. Young (University of Alabama), Zhu Zhou, Carmine Zoccali (Ospedali Riuniti; European Renal Best Practice).
Participation in the review does not necessarily constitute endorsement of the content of this report by the above individuals or the organizations and institutions they represent.
Bertram L. Kasiske, MD, Work Group Co-Chair Martin G. Zeier, MD, Work Group Co-Chair