Cord blood transplantation for haematological malignancies: conditioning regimens, double cord transplant and infectious complications

Authors


Colleen Delaney, Fred Hutchinson Cancer Research Center/University of Washington, D2-100, 1100 Fairview Ave North, Seattle, WA 98109, USA. E-mail: sdelaney@fhcrc.org

Summary

Growing evidence supports the efficacy of cord blood transplantation (CBT) to treat patients with haematological malignancies, and the number of CBTs is rapidly increasing. Herein, we review considerations regarding conditioning regimens for CBT, the impact of double unit transplantation on CBT outcomes, and data regarding infectious complications following CBT.

Conditioning regimens

An increasing number of agents with variable immunosuppressive, myelosuppressive and anti-malignancy activity have broadened the options for conditioning regimens for haematopoietic stem cell transplantation (HCT). Disease type and remission status, as well as patient age and comorbidities, are important considerations in optimizing HCT regimens. Additional considerations relevant to cord blood transplantation (CBT) include the possibility of delayed neutrophil and platelet engraftment, concerns about delayed acquired immune reconstitution, and the increased potential for graft failure. While a variety of factors, including cell dose, degree of human leucocyte antigen (HLA) matching, prior treatment and post-grafting immunosuppression, contribute to the likelihood of engraftment following CBT, the conditioning regimen is critically important. A summary of reported CBT conditioning regimens is provided in Table I.

Table I. Summary of reported conditioning regimens in CBT.
Abbreviations: TBI, total body irradiation; FLU, fludarabine; CY, cyclophosphamide; Ara-C, cytosine arabinoside; BU, busulfan; TT, thiotepa; ATG, anti-thymocyte globulin; MEL, melphalan.

Regimens with high reported engraftment rates

High-dose TBI-based regimens:
13·2 Gy TBI, 75 mg/m2 FLU, 120 mg/kg CY (Barker et al, 2005a)
13·2–13·75 Gy TBI, 120 mg/kg CY, 90 mg/kg equine ATG (Wagner et al, 2002; Kurtzberg et al, 2008)
12 Gy TBI, 120 mg/kg CY, 8–12 g/m2 Ara-C (Yamada et al, 2008; Ooi et al, 2008, 2009)
12 Gy TBI, 150 mg/m2 FLU, 10 g/m2 Ara-C (Okada et al, 2008)

Non-high-dose TBI-based and reduced-intensity regimens:
9·6 mg/kg IV BU, 10 mg/kg TT, 150 mg/m2 FLU, 8 mg/kg ATG (Sanz et al, 2007)
100 mg/m2 MEL, 180 mg/m2 FLU, 6 mg/kg rabbit ATG (Ballen et al, 2007)
80 mg/m2 MEL, 125 mg/m2 FLU, 4 Gy TBI (Miyakoshi et al, 2004)
200 mg/m2 FLU, 50 mg/kg CY, 2 Gy TBI ± 90 mg/kg equine ATG (Brunstein et al, 2007)

Regimens with low reported engraftment rates

520 mg/m2 IV BU, 160 mg/m2 FLU (Horwitz et al, 2008)
BU (targeted to 600–900 ng/ml), 135 mg/m2 MEL, 90 mg/kg equine ATG (Wall et al, 2005)
8 mg/kg BU, 200 mg/m2 FLU, 2 Gy TBI (Barker et al, 2003)
180 mg/m2 MEL, 120 mg/m2 FLU (Narimatsu et al, 2008)

Total body irradiation (TBI)-based regimens

High dose total body irradiation (TBI) has been the cornerstone of most myeloablative CBT conditioning regimens used to treat haematological malignancies. In many large series reporting outcomes of patients undergoing CBT in the late 1990s and early 2000s, either all patients (Wagner et al, 2002; Takahashi et al, 2007; Kurtzberg et al, 2008) or the majority of patients (85% and 76%) (Laughlin et al, 2004; Eapen et al, 2007) received conditioning with fractionated TBI at doses ranging from 12 Gy to 13·75 Gy. In subsequent series reporting on patients transplanted more recently, many of which involved fewer patients, TBI continued to be the backbone of most myeloablative conditioning regimens (Barker et al, 2005a; Okada et al, 2008; Ooi et al, 2008, 2009; Yamada et al, 2008; Atsuta et al, 2009). Given the heterogeneity of diseases treated in these reports, differences in patient age, evolving standards for infused cell doses and degree of HLA matching, as well as the varying additional therapies used in conjunction with TBI, it is not possible to draw conclusions regarding optimal TBI dose, dose rate, or fractionation schema.

Optimal therapies to accompany TBI have evolved and remain an area of active investigation. In earlier series, 120 mg/kg of cyclophosphamide (CY) and 90 mg/kg of equine anti-thymocyte globulin (ATG) were commonly incorporated into regimens (Wagner et al, 2002; Kurtzberg et al, 2008). More recently, however, the University of Minnesota has replaced ATG with 75 mg/m2 fludarabine (FLU) (Barker et al, 2005a). Though it is uncertain to what extent this change, along with a change in post-transplantation immunosuppression from ciclosporin (CSA) and methylprednisolone (MP) to CSA and mycophenolate mofetil (MMF), has contributed to the improvements in outcomes reported by the University of Minnesota (Barker et al, 2008), the combination of 13·2 Gy TBI, 120 mg/kg CY and 75 mg/m2 FLU has now emerged as a standard myeloablative conditioning regimen at many centres, including ours. Other centres have investigated regimens including 12 Gy TBI, 120 mg/kg CY and high dose (8–12 g/m2) cytosine arabinoside (Ara-C) (Ooi et al, 2008, 2009; Yamada et al, 2008), as well as 12 Gy TBI, 150 mg/m2 FLU and 10 g/m2 Ara-C (Okada et al, 2008) and 12 Gy TBI, 24 g/m2 Ara-C, 90 mg/m2 FLU and granulocyte colony-stimulating factor (G-CSF) (Tomonari et al, 2006).

Small patient numbers prevent drawing conclusions regarding the relative efficacy of these conditioning regimens for particular diseases; however, all appear to provide adequate immunosuppression to promote engraftment. As long as cell dose and HLA matching thresholds are met, more recent series consistently report engraftment rates of >90% following myeloablative TBI-based conditioning (Barker et al, 2005a; Okada et al, 2008; Ooi et al, 2008, 2009). The toxicity of TBI-based regimens, however, limits their widespread use. A recent Japanese report describes comparable outcomes among CBT patients aged 50–55 years with no marked organ dysfunction receiving 12 Gy TBI-based conditioning regimens and younger patients receiving similar regimens (Konuma et al, 2009); nonetheless, the toxicity of TBI increases with age. At our centre, the current upper age limit for our TBI-based CBT conditioning regimen is 45 years. Additionally, younger patients who have received significant prior irradiation during therapy are not eligible.

Non-TBI based regimens

Development of non-high dose TBI-based conditioning regimens is an area of active investigation, with the goals of retaining potent anti-malignancy potential and reducing the toxicity of high dose TBI while establishing regimens that remain sufficiently immunosuppressive to ensure engraftment.

The literature concerning high intensity non-TBI based regimens is limited. Busulfan (BU) has been a cornerstone of many high dose non-TBI based allogeneic transplant regimens, particularly for the treatment of acute myeloid leukaemia (Santos et al, 1983; Tutschka et al, 1987). In the CBT setting, however, the available data raises concerns about the engraftment potential of BU-based regimens. While BU may be myeloablative and stem cell ablative at higher doses, it has limited toxicity against mature lymphocytes and is not markedly immunosuppressive (Peters et al, 1987). In a recent report from the Duke University School of Medicine group, eight of ten patients experienced primary or secondary graft failure following conditioning with 520 mg/m2 intravenous (IV) BU (median daily area under the curve 4225 μmol-min) and 160 mg/m2 FLU (Horwitz et al, 2008). Similarly, at the MD Anderson Cancer Center, only 6 of 11 CBT patients conditioned with pharmacokinetically guided IV BU/FLU engrafted (Ciurea & Andersson, 2009). In an arm of the Cord Blood Transplantation Study (COBLT) investigating a non-TBI based regimen among infants and young children, a regimen of BU targeted to 600–900 ng/ml, 135 mg/m2 melphalan (MEL) and 90 mg/kg equine ATG was associated with a cumulative incidence of engraftment of only 59% (Wall et al, 2005). In their initial efforts at developing a reduced intensity conditioning (RIC) CBT regimen, the University of Minnesota group used 8 mg/kg BU, 200 mg/m2 FLU and 2 Gy TBI. In addition to prolonged neutropenia following this regimen, four of 21 patients experienced graft failure, leading to the replacement of BU with 50 mg/kg CY (Barker et al, 2003). BU-based regimens have also been described in several large European series, but these reports also included large proportions of patients receiving TBI-based regimens. Because outcomes analyses were not stratified according to conditioning regimen, interpreting the relative efficacy of the different regimens is challenging (Locatelli et al, 1999; Rocha et al, 2000, 2001, 2004; Michel et al, 2003; Kögler et al, 2005; Arcese et al, 2006).

Addition of further immunosuppressive agents to BU-based regimens may facilitate engraftment, but raises concerns about regimen-related toxicity and further delays in immune reconstitution following CBT. Sanz et al (2007) presented encouraging preliminary data regarding the efficacy of 9·6 mg/kg IV BU, 10 mg/kg thiotepa, 150 mg/m2 FLU and 8 mg/kg ATG as a myeloablative regimen capable of promoting engraftment without excessive toxicity. Among 73 consecutive patients with a median follow-up of 7 months, the cumulative incidence of engraftment, which occurred at a median of 22 d, was 89%, and day-180 transplant-related mortality (TRM) was 20% (Sanz et al, 2007). At our centre, we have recently initiated an investigation of a treosulfan (TREO)-based conditioning regimen comprising 42 g/m2 TREO, 150 mg/m2 FLU and 2 Gy TBI. TREO, a BU analogue, is a novel agent that in vitro data suggests may have more potent anti-leukaemic and immunosuppressive activity than BU, and early clinical experience suggests it may be less toxic. Encouraging preliminary data on TREO-based regimens has been reported in the non-CBT allogeneic transplant setting (Casper et al, 2004; Holowiecki et al, 2007; Nemecek et al, 2008).

Melphalan, dosed at varying intensities, has also been explored in several smaller studies of non-TBI based regimens. Similar to high dose BU/FLU regimens, high dose MEL/FLU regimens may not be sufficiently immunosuppressive to ensure engraftment. Narimatsu et al (2008) reported only a 60% engraftment rate in 10 recent patients conditioned with 180 mg/m2 MEL and 120 mg/m2 FLU. The addition of either ATG or low dose TBI may, however, be sufficient to promote engraftment in MEL-based regimens; regimens comprising 180 mg/m2 FLU, 100 mg/m2 MEL and 6 mg/kg rabbit ATG (Ballen et al, 2007) or 125 mg/m2 FLU, 80 mg/m2 MEL and 4 Gy TBI (Miyakoshi et al, 2004) have been associated with engraftment rates of >90%. There is insufficient data to determine the relative efficacy of MEL-based versus BU-based regimens with regard to disease-specific relapse or overall TRM.

Beyond developing non-TBI based regimens with potent anti-leukaemic activity, establishing minimally intensive conditioning regimens suited to older patients or patients with significant comorbidities is an important goal in CBT. While as little as 2 Gy TBI is sufficient to promote engraftment following matched related peripheral blood transplants (McSweeney et al, 2001), the minimum therapy necessary to promote engraftment following CBT is not certain. The University of Minnesota has pioneered a regimen of 200 mg/m2 FLU, 50 mg/kg CY, 2 Gy TBI ± 90 mg/kg equine ATG that has established benchmark outcome expectations for minimally intensive RIC CBT (Brunstein et al, 2007). In a series of 110 patients, median age 51 (range 17–69) years, with heterogeneous haematological diseases, investigators observed a 45% 3-year overall survival, a 26% 3-year incidence of TRM and a 31% 3-year incidence of relapse. Patients who had received fewer than two cycles of multiagent chemotherapy within the 3 months before enrolment (and had no history of autologous transplant) were deemed at higher risk for graft failure and received ATG in their conditioning regimen. Primary neutrophil recovery occurred in 92% of patients at a median of 12 d, and the cumulative incidence of sustained engraftment (neutrophil recovery with complete chimaerism) was 85% (95% confidence interval, 77–92%). In univariate analysis, ATG was associated with increased TRM, while in multivariate analysis ATG was associated with a lower risk of acute graft-versus-host disease (GVHD). In a subsequent analysis, the University of Minnesota group confirmed the safety of this regimen in patients older than 55 years (Majhail et al, 2008). Important questions raised by the University of Minnesota experience include whether further dose reductions might be possible to establish a more minimally intensive regimen for heavily pretreated patients and, given concerns about immune reconstitution in the CBT setting, whether ATG might be replaced with alternative dose escalations in patients deemed at higher risk for graft failure. At our centre, we are currently investigating the possibility of replacing ATG with small increases in TBI.
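To make the risk-stratification rule explicit, the following minimal sketch (Python) encodes the regimen assignment as described above. It is illustrative only and not a clinical decision tool; the function and parameter names are our own.

```python
def assign_ric_regimen(chemo_cycles_last_3_months: int,
                       prior_autologous_transplant: bool) -> dict:
    """Sketch of the RIC CBT regimen described by Brunstein et al (2007):
    FLU 200 mg/m2, CY 50 mg/kg and 2 Gy TBI, with 90 mg/kg equine ATG added
    for patients judged at higher risk of graft failure (fewer than two
    cycles of multiagent chemotherapy in the preceding 3 months and no
    history of autologous transplant). Illustrative only."""
    regimen = {"FLU_mg_per_m2": 200, "CY_mg_per_kg": 50, "TBI_Gy": 2}
    higher_risk = chemo_cycles_last_3_months < 2 and not prior_autologous_transplant
    if higher_risk:
        regimen["equine_ATG_mg_per_kg"] = 90
    return regimen


# Example: one recent chemotherapy cycle and no prior autologous transplant
# -> deemed at higher risk of graft failure -> ATG-containing regimen.
print(assign_ric_regimen(chemo_cycles_last_3_months=1,
                         prior_autologous_transplant=False))
```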

Disease targeted conditioning

Limited data exists regarding conditioning regimens targeted to specific disease types in the CBT setting, but developing such regimens should be an important goal. A growing body of literature supports the efficacy of CBT using minimally intensive regimens to treat low and intermediate grade lymphoid malignancies (Majhail et al, 2006a; Brunstein et al, 2009; Rodrigues et al, 2009). Promising novel therapeutics for lymphoma, including immunotherapeutic and radioimmunotherapeutic agents, might be incorporated into novel conditioning regimens targeting specific diseases, and post-transplant maintenance therapy may further improve outcomes. Similarly, for Philadelphia chromosome positive acute leukaemias, tyrosine kinase inhibitors have already shown promise as post-transplant maintenance after conventional transplantation (Carpenter et al, 2007) and require examination in the cord blood setting. As additional molecular aberrations are identified in specific acute leukaemias, the potential benefit of targeted therapies (e.g. FLT3 inhibitors in patients with FLT3 mutations), used either in the conditioning regimen or as maintenance therapy, also warrants evaluation.

Double cord transplant

Clinical experience has demonstrated that both infused cell dose and degree of HLA matching of cord blood units correlate with engraftment and outcomes. Numerous studies have demonstrated that total nucleated cell (TNC) count and CD34+ cell doses above minimum thresholds are associated with improved engraftment and decreased TRM (Locatelli et al, 1999; Michel et al, 2003; Gluckman & Rocha, 2004; Arcese et al, 2006; Majhail et al, 2006b). Determining appropriate minimum thresholds, however, is complicated by the interaction between cell dose and HLA typing. Based on a recent analysis, the New York Blood Center recommended a minimum TNC of ≥2·5 × 10⁷/kg for 5/6 or 6/6 matched (low resolution typing at HLA-A and -B, high resolution at DRB1) single unit transplants and a minimum TNC of ≥5 × 10⁷/kg for 4/6 matched single unit transplants (Barker et al, 2007). For many adults and large children, no single unit is available that meets these cell dose requirements. To overcome this obstacle, the University of Minnesota introduced the double cord transplant model, in which two cord units are infused simultaneously to increase the total cell dose.

Double unit CBT appears to have significantly increased engraftment rates among adults and older children and contributed to decreased TRM in these patients. In contrast to outcomes for adults in the COBLT trial, in which the median prethaw TNC dose was 2·3 × 10⁷/kg (range 1·4–5·5) and the cumulative incidence of neutrophil engraftment among 34 patients was 66% (Cornetta et al, 2005), in the initial report of outcomes following double unit CBT the median infused TNC dose was 3·5 × 10⁷/kg (range 1·1–6·3) and 21 of 23 consecutive patients (all evaluable patients) achieved donor engraftment following myeloablative conditioning (Barker et al, 2005a). Subsequent reports in the RIC setting (Ballen et al, 2007; Brunstein et al, 2007) and larger preliminary series in the myeloablative setting (Brunstein et al, 2008) have confirmed high engraftment rates following double unit CBT among adults and large children. In spite of apparent increases in engraftment rates following double unit CBT, time to engraftment remains similar to, or only modestly shorter than, that following single unit CBT (Majhail et al, 2006b; Waller et al, 2007). Double unit CBT appears to be associated with an increased incidence of moderate acute GVHD as compared to single unit CBT, though not with increased severe acute GVHD or chronic GVHD (Brunstein et al, 2007; MacMillan et al, 2009). Preliminary evidence also suggests, at least for patients with good disease control at the time of transplant, that double unit CBT may be associated with a decreased risk of relapse (Verneris et al, 2005; Brunstein et al, 2007; Gutman et al, 2008; Rodrigues et al, 2009).

The biology underlying double unit CBT is not well understood. In the vast majority of cases, a single unit emerges as the sole source of long term haematopoiesis. By day 21 post-transplant, single unit dominance is detectable in over 80% of patients, though some degree of mixed chimaerism may be present in a larger proportion of patients undergoing RIC than myeloablative conditioning. By 1 year post-transplant, single unit haematopoiesis is present in nearly all patients (Barker et al, 2005a; Brunstein et al, 2007). To date, no factor, including unit viability, infused TNC, CD34+ and CD3+ cell doses, sex mismatch, ABO blood group, HLA mismatch and order of infusion, has been identified that reliably predicts which unit will emerge as the winner (Majhail et al, 2006b). The observation that only one unit ‘wins’ raises questions about how the infusion of two units enhances engraftment rates; it is possible that the ‘losing’ unit facilitates engraftment, perhaps by increasing the number of infused ancillary cells, such as mesenchymal stem cells. Alternatively, the use of two units may simply increase the chance of infusing a unit with engrafting potential. The observations of increased acute GVHD following double unit CBT, as well as the possibility of decreased relapse rates, suggest that immunological interactions may underlie the emergence of a winning unit. The Sloan Kettering group has recently provided murine model data supporting this hypothesis (Eldjerou et al, 2008), and preliminary work by our group suggests that CD8+ T cells may mediate rejection of the losing unit (Gutman et al, 2009).

Double unit CBT has become standard practice at many centres for patients who do not have an adequately sized single unit. At our centre, we have developed a conservative algorithm such that single unit transplant is permitted for patients with 6/6 matched units with TNC ≥ 3·0 × 10⁷/kg, 5/6 matched units with TNC ≥ 4·0 × 10⁷/kg and 4/6 matched units with TNC ≥ 6·0 × 10⁷/kg. If these criteria are not met, double unit transplant is performed. Each unit is required to have a TNC ≥ 1·5 × 10⁷/kg and must be at least 4/6 matched to the patient. Preference is given to better-matched units that are acceptably sized, even if smaller, and units must be at least 3/6 matched to each other. Because of the high variability of assays assessing CD34+ cells, we do not consider CD34+ counts unless comparing units of comparable TNC, in which case we select the unit with the higher CD34+ count. An algorithm for unit selection and conditioning regimen assignment for CBT patients treated at our centre is provided in Fig 1.

Figure 1. Algorithm for donor selection and conditioning regimen. A basic algorithm for donor selection and conditioning regimens as used at the Fred Hutchinson Cancer Research Center/Seattle Cancer Care Alliance is presented.
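As a complement to Fig 1, the following minimal sketch (Python) encodes the unit-size and matching criteria described above. It is illustrative only and not a clinical decision tool; all names are our own, and unit-ranking preferences are noted only in comments.

```python
# Thresholds as quoted in the text. TNC doses are expressed as multiples of
# 10^7 cells/kg and HLA matches as matched loci out of 6.
SINGLE_UNIT_MIN_TNC = {6: 3.0, 5: 4.0, 4: 6.0}  # minimum TNC (x10^7/kg) by match grade
DOUBLE_UNIT_MIN_TNC = 1.5                        # per unit, x10^7/kg
MIN_MATCH_TO_PATIENT = 4                         # each unit at least 4/6 to the patient
MIN_MATCH_BETWEEN_UNITS = 3                      # units at least 3/6 to each other


def qualifies_for_single_unit(tnc_e7_per_kg: float, hla_match: int) -> bool:
    """True if a single unit meets the size threshold for its HLA match grade."""
    threshold = SINGLE_UNIT_MIN_TNC.get(hla_match)
    return threshold is not None and tnc_e7_per_kg >= threshold


def qualifies_as_double_unit_pair(unit_a: dict, unit_b: dict,
                                  inter_unit_match: int) -> bool:
    """True if two units (each a dict with 'tnc_e7_per_kg' and 'hla_match')
    meet the double unit criteria; better-matched, acceptably sized units
    would be preferred when ranking candidates (not shown here)."""
    return (
        all(u["hla_match"] >= MIN_MATCH_TO_PATIENT for u in (unit_a, unit_b))
        and all(u["tnc_e7_per_kg"] >= DOUBLE_UNIT_MIN_TNC for u in (unit_a, unit_b))
        and inter_unit_match >= MIN_MATCH_BETWEEN_UNITS
    )


# Example: a 5/6 matched unit providing 3.2 x 10^7 TNC/kg falls short of the
# 4.0 x 10^7/kg single unit threshold, so a second qualifying unit is sought.
print(qualifies_for_single_unit(3.2, 5))  # False
```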

For patients who have an adequately sized single unit, it is uncertain whether addition of a second unit might improve outcomes, and an ongoing randomized Blood and Marrow Transplant Clinical Trials Network trial is investigating double versus single unit transplant in patients aged 2–21 years with adequately sized single units.

Infectious complications

Infectious complications are a major concern following CBT. Prolonged neutropenia, particularly following myeloablative regimens, extends the period of significant vulnerability to bacterial and fungal infections, and delayed acquired immune reconstitution raises concerns about viral complications. Data suggests, however, that vigilant monitoring and aggressive supportive measures can do much to overcome these challenges. Additionally, the decreased incidence of GVHD following CBT may reduce infections related to immunosuppressive therapy. Novel strategies to improve infectious outcomes by decreasing the time to neutrophil engraftment, through ex vivo expansion of cells or manipulation to improve homing to the stem cell niche, as well as strategies to improve acquired immunity through adoptive immunotherapy and ex vivo T cell manipulation, are beyond the scope of this review.

General data

Infectious complications have been reported in several studies comparing outcomes between CBT patients and unrelated donor (URD) transplant patients. Interpretation of these data is complicated by heterogeneous degrees of mismatching among URDs. In some (Rocha et al, 2000, 2001; Laughlin et al, 2004; Eapen et al, 2007; Atsuta et al, 2009), but not all (Rocha et al, 2004), registry comparative studies, deaths due to infection accounted for a higher proportion of early deaths among CBT patients than among URD patients. Several additional studies reported high rates of early TRM largely due to infection among CBT patients, though low infused cell doses and patient selection biases may have confounded these results (Locatelli et al, 1999; Michel et al, 2003; Arcese et al, 2006). A limited number of studies have focused exclusively on comparing rates of infection among CBT patients and patients receiving peripheral blood stem cell transplants (PBSCTs) and bone marrow transplants (BMTs). In a report describing infectious outcomes among paediatric patients in the 2 years after transplantation, 60 umbilical cord blood (UCB) patients were compared to 52 bone marrow (BM) patients and 24 recipients of T cell-depleted (TCD) grafts (Barker et al, 2005b). The cumulative incidence of one or more serious infections was comparable between groups (BM 81%, TCD 83%, UCB 90%; P = 0·12) and only TCD patients had a higher risk of infection. A Spanish study comparing 48 adult CBT patients with 144 BMT and PBSCT patients demonstrated a higher incidence of severe bacterial infections among CBT patients prior to day 100, but no increase in infection-related mortality at day 100 or 3 years, nor any increase in fungal or viral infections at 3 years (Parody et al, 2006). Among CBT patients, neutropenia on day +30 and infused TNC < 2 × 10⁷/kg were associated with increased infection-related mortality. Similarly, a smaller single centre study comparing 23 adult matched unrelated donor (MUD) patients with 28 adult UCB patients reported a higher incidence of bacterial infections among CBT patients prior to day 50 post-transplant, but no increase in early viral or fungal infections, nor any increase in any type of infection after day 50 post-transplant (Hamza et al, 2004).

Bacterial and fungal infections

Several additional non-comparative series have specifically examined the incidence of bacterial and fungal infections among CBT patients. A recent large retrospective Japanese study examined 664 paediatric patients and 1208 adult patients transplanted between 1997 and 2005 (Yazaki et al, 2009). The cumulative incidence of bacterial infection (defined as clinical symptoms associated with pathogenic microorganisms) by day 100 was 11% for children and 21% for adults. Seventy-four percent of bacteraemias were caused by Gram-positive organisms. Early bacterial infection was associated with a significant increase in mortality in adults but not in children. Among children, older age was associated with an increased risk of early bacterial infection; among adults, no risk factors for early infection were identified. A Japanese study of 102 consecutive patients undergoing RIC CBT reported a cumulative incidence of bacteraemia of 32% within 100 d of CBT, of which 25% were directly fatal (Narimatsu et al, 2005). A retrospective review of 128 patients investigating fungal infections following RIC CBT demonstrated a 3-year cumulative incidence of invasive fungal infection of 10·2%, with a median onset at day 20 post-transplantation (range 1–82 d) (Miyakoshi et al, 2007). Use of prednisolone at >0·2 mg/kg was significantly associated with an increased risk of infection.

Viral infections

A growing body of literature is emerging regarding viral complications following CBT. Cytomegalovirus (CMV) has been the most extensively reported. Heterogeneous series have noted varying incidences of CMV infection (subclinical activation or reactivation), ranging from 22% to 100%, as well as varying incidences of CMV disease, ranging from 6% to 16%, following CBT (Takami et al, 2005; Parody et al, 2006; Matsumura et al, 2007; Narimatsu et al, 2007; Walker et al, 2007; Tomonari et al, 2008a,b; Montesinos et al, 2009). Part of this heterogeneity is attributable to different monitoring strategies, inclusion of different patient populations in the analyses (i.e. all patients versus CMV seropositive patients only) and different treatment regimens. In a recent large series comparing the incidence of CMV infection and disease among 228 CBT patients and 525 BMT or PBSCT patients, the University of Minnesota reported a CMV infection rate of 22% and a disease rate of 6% among CBT patients, rates comparable to those in BMT and PBSCT patients (Walker et al, 2007). CMV infection and disease were significantly more likely in patients who were CMV seropositive, in those with acute GVHD and in those receiving TCD grafts. CMV antigenaemia was used to monitor for reactivation, and all CMV seropositive recipients, as well as seronegative recipients with seropositive donors, received high dose acyclovir prophylaxis (500 mg/m2 (10–12 mg/kg) IV every 8 h or 800 mg (18 mg/kg in paediatric patients) orally five times daily) up to day +100 following transplant (Walker et al, 2007). Montesinos et al (2009) recently reported a cumulative incidence of CMV infection of 47·3% at 1 year among 151 patients (57·6% reactivation among 117 seropositive patients and 11·8% infection among seronegative patients) and a 10·8% cumulative incidence of CMV disease (12·1% among seropositive patients and 5·9% among seronegative patients). Monitoring was by antigenaemia for the first 60 patients and by polymerase chain reaction (PCR, threshold 500 CMV DNA copies) for the remaining patients. Prophylaxis consisted of acyclovir 500 mg IV three times daily until engraftment, followed by either ganciclovir 5 mg/kg IV daily or oral valganciclovir 900 mg daily up to day 120. CMV infection occurred at a median of 45 d after transplant (range 14–239 d). An ATG-containing conditioning regimen and the development of severe GVHD were risk factors for CMV infection. In a Japanese series reporting on CMV following RIC CBT in 140 patients, 55% experienced infection at a median of day 35 (range 4–92) and 16% developed disease at a median of day 33 (range 15–106) (Matsumura et al, 2007). Acute GVHD and low CD34+ cell dose were risk factors for CMV disease. CMV antigenaemia was used to monitor for reactivation and viral prophylaxis was 600 mg/d acyclovir.

Optimal CMV treatment strategies following CBT are not defined. Management strategies must balance the toxicity and cost of therapeutic agents against the potential benefits of aggressive treatment. Many centres continue to employ a pre-emptive strategy for CMV treatment, monitoring for CMV and treating upon evidence of activation (Takami et al, 2005; Matsumura et al, 2007; Narimatsu et al, 2007; Tomonari et al, 2008a,b), but more aggressive treatment may be appropriate in this high risk population. As described above, the University of Minnesota utilizes a high dose acyclovir prophylactic strategy and reports a low incidence of CMV infection. The Spanish group reported no significant differences in toxicity or efficacy between valganciclovir and ganciclovir prophylaxis started at the time of engraftment. The Columbia group has reported excellent outcomes among paediatric patients at high risk for CMV infection, including 29 CBT patients. Using a prophylactic strategy of intravenous foscarnet (90 mg/kg per 48 h) alternating with intravenous ganciclovir (5 mg/kg per 48 h) initiated at engraftment, only two of 57 patients in the study (3·5%) developed PCR reactivation at the centre's positivity threshold of 600 DNA copies/ml, and no patients developed disease (Shereck et al, 2007). At our centre, we have recently altered our CMV treatment strategy for CBT patients. Using a pre-emptive strategy for CMV management, we observed reactivation in 25 of 26 consecutive seropositive patients (using PCR monitoring with a threshold for positivity of 100 DNA copies/ml) and a disease rate of 25%. Based on these findings, we have implemented a more aggressive management strategy for CMV seropositive patients, which includes treatment with 5 mg/kg ganciclovir IV once daily on days −8 to −2 pre-transplant and then high dose valacyclovir (2 g three times daily) or 500 mg acyclovir IV every 8 h (or as appropriate for paediatric patients) up to day 100 post-transplant. All CBT patients are monitored for CMV reactivation twice weekly by PCR, beginning on the day of transplant, with initiation of ganciclovir or foscarnet for any positive PCR; seropositive patients then receive 900 mg valganciclovir daily (or as appropriate for paediatric patients) from day 100 post-transplant up to 1 year. Preliminary observations using this regimen indicate a significant reduction in, and delay of, initial reactivation, and we have observed no cases of CMV disease. No cases of primary CMV infection have been observed among seronegative patients and we continue to manage these patients with a pre-emptive strategy.
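As an illustration only, the decision logic of this strategy for a given patient and post-transplant day might be summarized as in the following sketch (Python). The structure and names are our own simplification; paediatric dose adjustments and the choice between agents are omitted, and this is not a clinical decision tool.

```python
def cmv_plan(seropositive: bool, day: int, pcr_positive: bool) -> str:
    """Simplified sketch of the CMV management strategy described above."""
    # All patients: twice-weekly PCR monitoring from the day of transplant,
    # with pre-emptive therapy for any positive PCR.
    if pcr_positive:
        return "start pre-emptive ganciclovir or foscarnet"
    if not seropositive:
        # Seronegative recipients remain on a purely pre-emptive strategy.
        return "continue twice-weekly PCR monitoring"
    # Seropositive recipients additionally receive prophylaxis.
    if -8 <= day <= -2:
        return "ganciclovir 5 mg/kg IV once daily (pre-transplant)"
    if day <= 100:
        return "high dose valacyclovir (or IV acyclovir); continue PCR monitoring"
    if day <= 365:
        return "valganciclovir 900 mg daily; continue PCR monitoring"
    return "routine follow-up"


# Example: a seropositive patient at day 30 with a negative PCR remains on
# high dose valacyclovir prophylaxis.
print(cmv_plan(seropositive=True, day=30, pcr_positive=False))
```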

Reporting on other viral complications has been more limited. ATG-containing regimens, particularly RIC regimens, have been associated with a particularly high rate of Epstein-Barr virus (EBV) reactivation and related complications, and monitoring for EBV reactivation and early treatment with rituximab may be appropriate for patients treated with ATG (Brunstein et al, 2006). A Japanese group described a high incidence of human herpesvirus 6 (HHV6) reactivation following CBT, and a number of case reports have documented HHV6 complications following CBT (Tanaka et al, 2005; Tomonari et al, 2005; Matà et al, 2008; Muramatsu et al, 2009). A recent Canadian report comparing 37 children undergoing CBT with 77 children undergoing BMT, in which no patients received varicella zoster virus (VZV) prophylaxis, demonstrated a 46% cumulative incidence of VZV disease at 3 years in the CBT group versus 31% in the BMT group (Vandenbosch et al, 2008). No large studies have reported on viral respiratory tract infections following CBT. At our centre, we have observed frequent urinary symptoms and haemorrhagic cystitis associated with BK viruria and viraemia following CBT. Symptoms typically resolve within several weeks with supportive measures, though we have administered cidofovir to patients with persistent problems.

Conclusion

Umbilical cord blood is a viable and increasingly utilized source of stem cells for HCT. Continued investigation and optimization of conditioning regimens should lead to improved stratification of patients to disease-specific regimens of appropriate intensity. Double unit CBT appears to have improved outcomes among adults and older children and raises important biological questions that may yield insight into the mechanisms of both graft-versus-leukaemia and GVHD. While decreasing infectious complications related to prolonged neutropenia and delayed acquired immune reconstitution remains an important area for future improvement, aggressive monitoring and supportive care for CBT patients appear to result in long-term infection-related outcomes comparable to those of HCT with other donor sources. We are optimistic that ongoing investigation will lead to continued improvements and further expansion of the role of CBT in the future.
