Correspondence: Mitchell S. Cairo, Pediatric Hematology, Oncology and Stem Cell Transplantation, Children and Adolescent Cancer and Blood Diseases Center, Cellular and Tissue Engineering Laboratory, Departments of Pediatrics, Medicine, Pathology, Microbiology & Immunology and Cell Biology & Anatomy, New York Medical College, Munger Pavilion, Room 110, Valhalla, NY 10595, USA.
Most children, adolescents and young adults with acute lymphoblastic leukaemia (ALL) in first complete remission (CR1) have an excellent prognosis with multi-agent chemotherapy in induction, consolidation, re-induction and maintenance therapy. However, there is a subset of patients with a more guarded prognosis using this approach, who may benefit from allogeneic haematopoietic stem cell transplantation (alloHSCT). Commonly used criteria for alloHSCT in children, adolescents and young adults with ALL in CR1 include: induction failure, poor cytogenetics, persistent minimal residual disease (MRD), age, immunophenotype, white blood cell count at diagnosis and rapidity of induction response. Two-year event-free survival following alloHSCT in patients with ALL in CR1 ranges from 50 to 80%, depending on disease status, donor source, conditioning therapy, age and other risk factors. Future studies should focus on more precisely identifying poor-risk features, such as disease genomics and host pharmacogenomics, refining MRD measurements, improving unrelated donor matching, reducing MRD prior to alloHSCT, and developing post-alloHSCT humoral and cellular therapy approaches.
The prognosis of children and adolescents with acute lymphoblastic leukaemia (ALL) has dramatically improved in the last half century, with current 5-year event-free survival (EFS) rates ranging from 76 to 86% (Pui et al, 2011). However, there is a significant amount of heterogeneity in childhood and adolescent ALL and patients with certain disease subsets have a 5-year EFS of less than 40% and may benefit from alternative strategies, including allogeneic haematopoietic stem cell transplantation (alloHSCT).
Overview of risk factors in children and adolescents with ALL
Age, white blood cell (WBC) count, T-cell immunophenotype
Population-based analyses of incidence and outcomes in adults and children with acute leukaemia demonstrate distinct biological differences between childhood and adult ALL which significantly impact outcomes (Table 1). Overall, infants <1 year of age have worse survival outcomes, whereas children 1–4 years old typically have the highest survival. Overall survival (OS) decreases with increasing age, with a notable decline after 20 years of age. A report from the Children's Oncology Group (COG) recently addressed the issue of the adolescent and young adult (AYA) population (16–21 years old) with ALL and their tendency for worse outcomes. While encouraging data have been used as an argument for treating older adolescent and young adult patients with more intensive therapy, there are a couple of risk factors worth noting. Patients with WBC >50 × 10⁹/l at diagnosis had significantly poorer outcomes (75·4% vs. 43·9%, P = 0·0004) (Nachman et al, 2009). Stock et al (2008) also analysed 321 AYA patients (aged 16–20 years) and similarly found that there was benefit in intensified treatment, especially early central nervous system (CNS) prophylaxis. Again, patients with WBC >50 × 10⁹/l fared worse; while an elevated WBC is not at this time an indication for transplant, it remains a significant poor prognostic factor, and consideration of alloHSCT in CR1 is suggested.
Table 1. Risk factors associated with significantly poorer outcomes in children with ALL

Age: infants <1 year old have worse survival outcomes in general; event-free survival decreases with older age (>10 years old), with a notable decline after age 20 years
WBC count at diagnosis: elevation >50 × 10⁹/l
Immunophenotype: T-cell phenotype with worse outcome among all age groups
Hypodiploidy: low hypodiploid (<40 chromosomes) or near-haploid karyotypes with significantly worse event-free survival
Ph+ ALL t(9;22): present in only about 2–3% of children with ALL; very low event-free survival; possible improvement with addition of TKIs, but less so in higher risk groups (high WBC, older age)
11q23 (MLL) rearrangement: predominance in infant leukaemia; difficult to treat after relapse
Induction failure: risk increases with high-risk features, such as older age, high WBC, T-cell phenotype or high-risk cytogenetics; M3 marrow (>25% blasts) at most risk; improved survival with upfront alloHSCT
Minimal residual disease: cutoff used is ≥10⁻⁴; independent risk factor; utilized to assign risk to patients early in treatment
CRLF2 overexpression: seen in Ph-like ALL; associated with JAK2 and other mutations leading to overexpression of regulatory cytokine receptors
Infant ALL genomics: complex associations with various additional gene mutations, such as MCL1, FLT3, IRX2 and TACC2, which confer poorer prognosis
T-cell ALL heterogeneity: early T-cell precursor leukaemia has significantly increased risk of induction failure and/or subsequent relapse

ALL, acute lymphoblastic leukaemia; WBC, white blood cell count; TKI, tyrosine kinase inhibitor; alloHSCT, allogeneic haematopoietic stem cell transplantation.
T-cell phenotype shows worse outcomes among all age groups in children under 20 years of age compared to observations in the adult population (Dores et al, 2012). In the German ALL-Berlin-Frankfurt-Münster (BFM) 90 and ALL-BFM 95 trials, children with T-cell ALL (T-ALL) (n = 191) who received alloHSCT in CR1 had a 5-year disease-free survival (DFS) of 67% vs. only 42% in patients treated with chemotherapy alone, when compared using the Mantel-Byar method and Cox regression analysis, including time to SCT as a time-dependent covariate (Schrauder et al, 2006).
Cytogenetic risk factors
It is now accepted that children with ALL and hypodiploidy have progressively worse outcomes with decreasing chromosome number. Hypodiploidy (<45 chromosomes) is rare in children with ALL, occurring in only about 2–6% of patients, but confers a dismal prognosis (Figs 1A and 2B). A study from the UK reported on the survival of patients with low hypodiploidy (<40 chromosomes) or near-haploidy and demonstrated significantly poorer outcomes within these groups (Harrison et al, 2004). Children with karyotypes of 25–39 chromosomes have been shown to have a 3-year EFS of only 29%, compared to 66% for those with 42–45 chromosomes. In the near-haploid group, outcome is even worse (Harrison et al, 2004). A similar review of 10 international cooperative groups did not find significant differences among those with increasingly lower chromosome numbers; however, patients with 44 or more chromosomes had significantly improved EFS (52% vs. 30%, P = 0·01) and OS (69% vs. 37·5%, P = 0·017), suggesting that patients with fewer than 44 chromosomes should be considered higher risk and candidates for alloHSCT (Nachman et al, 2007).
Ph+ ALL in children
The Philadelphia chromosome, t(9;22), is present in ~2–3% of newly diagnosed children with ALL. An initial small report of 21 children treated in Toronto found, using Cox multivariate analysis, superior outcomes for up-front matched related donor transplants, with an EFS of 53 ± 15%, compared to both unrelated donor transplants and chemotherapy alone, which showed EFS of only 33 ± 17% and 35·7 ± 20%, respectively (Sharathkumar et al, 2004). Arico et al (2000) reported on 326 children over a 10-year period treated with chemotherapy alone versus related and unrelated HSCT. At that time, human leucocyte antigen (HLA)-matched related, but not unrelated, alloHSCT was found to be superior to chemotherapy alone (Arico et al, 2000). However, the same group later analysed a larger cohort of 610 children and adolescents, 325 of them in first remission, and, using a landmark and Cox regression analysis, found that with longer follow-up and presumably better transplant strategies over time, both related and unrelated donor alloHSCT offered more durable remission and better overall outcomes than chemotherapy alone (Arico et al, 2010).
Most of the historical data presented reflect outcomes prior to the use of tyrosine kinase inhibitors (TKIs). The Children's Oncology Group (COG) evaluated whether imatinib combined with an intensive chemotherapy regimen would improve outcome in children and adolescents with Ph+ ALL in CR1 (Schultz et al, 2009) and reported a doubling of EFS compared to historical controls (80 ± 11% vs. 35 ± 4%). Furthermore, log-rank testing showed no statistical difference between patients receiving intensive chemotherapy plus imatinib versus sibling donor versus unrelated donor alloHSCT, thus calling into question whether intensive chemotherapy plus imatinib might replace alloHSCT altogether in children with Ph+ ALL in CR1 (Schultz et al, 2009). However, the follow-up in this study is short, and the number of higher-risk patients with older age (≥10 years) and high WBC (≥100 × 10⁹/l) is not discussed. Burke et al (2009) reported on their single-institution experience, in which 37 children treated with imatinib either pre- or post-alloHSCT did not show differences in DFS or relapse rates at 3 years. In addition, there was no significant difference in outcomes between matched related and unrelated donor sources. The only significant difference found was the DFS of patients transplanted in CR1 vs. ≥CR2 (71% vs. 29%, P = 0·01), which supports the continuation of alloHSCT from the best available donor as standard practice for children with Ph+ ALL in CR1. This practice is supported by the recent report from the Associazione Italiana Ematologia Oncologia Pediatrica (AIEOP)-HSCT group, which showed that leukaemia-free survival (LFS) was higher in children who underwent alloHSCT in CR1 (56% vs. 34%, P = 0·008) (Fagioli et al, 2012).
11q23 rearrangement in childhood ALL in CR1
Kosaka et al (2004) reported the results of 44 infants with ALL found to have MLL gene rearrangements who received intensive chemotherapy followed by alloHSCT over a 5-year period, demonstrating a 3-year OS and EFS of 58·2% and 43·6%, respectively. For patients transplanted in first remission, the 3-year post-transplantation EFS was 64·4% (46·4–82·4%). Similarly, Jacobsohn et al (2005) reported on their experience with infant ALL treated with either matched sibling or unrelated umbilical cord blood (UCB) transplantation in first remission. Twelve of 16 patients (75%) remain in long-term remission, and alloHSCT was found to be well tolerated, with the majority of patients achieving rapid neutrophil engraftment and minimal graft-versus-host disease (GVHD) (Jacobsohn et al, 2005).
An extensive review of close to 45 000 children and adolescents with newly diagnosed ALL, treated on 14 different cooperative group studies in Europe, North America and Asia over the 15-year period from 1985 to 2000, analysed the effects of disease characteristics as well as treatment differences on outcomes (Schrappe et al, 2012). It is clear that patients with induction failure often present at diagnosis with older age, high WBC count, T-cell phenotype or high-risk cytogenetic features (Philadelphia chromosome or 11q23 rearrangement). The 10-year survival rate in this cohort only reached 32% (Schrappe et al, 2012). Patients >10 years old, those with a T-cell phenotype, and those with an M3 marrow (>25% blasts) at the end of induction were considered particularly poor risk (Schrappe et al, 2012). Another recent multivariate analysis of over 2000 patients with either relapsed or refractory ALL who underwent alloHSCT, tracked through the Center for International Blood and Marrow Transplant Research (CIBMTR), demonstrated superior post-alloHSCT survival for patients with primary induction failure or untreated first relapse who had fewer than 25% marrow blasts at the time of transplantation and were younger than 10 years (Duval et al, 2010).
Minimal residual disease (MRD) in children with ALL
Early studies prospectively assessed MRD in children with newly diagnosed ALL in clinical remission. The incidence of relapse among patients with MRD at the end of the induction phase was 68 ± 16% if MRD persisted through week 14 of continuation therapy, compared with 7 ± 7% if MRD became undetectable (P = 0·035) (Coustan-Smith et al, 2000). MRD remains an independent prognostic risk factor even after adjusting for other adverse features. Infants were also shown to have lower DFS with higher levels of MRD at the end of the induction and/or consolidation time points. A report from the Interfant-99 protocol showed that 100% of patients in the MRD-high group ultimately relapsed, compared to only 13% in the MRD-low group (Van der Velden et al, 2009). Additional groups have examined MRD significance at earlier time points as well, with most trials finding that this information can be utilized to assign risk to patients as early as Day 15 of induction (Basso et al, 2009). Newer markers now allow for detection of one leukaemic cell among 10⁵ bone marrow cells, which will allow for broader use with greater sensitivity (Coustan-Smith et al, 2011).
Over 75% of ALL cases have a detectable genetic abnormality (Fig 1A). Mullighan et al (2007) showed that while some genetic mutations, such as mixed lineage leukaemia (MLL) rearrangements, require few DNA alterations, others, such as BCR-ABL1 or ETV6-RUNX1, are often more complex, with multiple gene mutations required for oncogenesis (Mullighan et al, 2007). For example, Ph+ ALL with its characteristic BCR-ABL1 translocation is also associated with IKZF1 deletion (IKAROS mutation) in 80% of cases (Mullighan et al, 2008). Furthermore, it appears that the IKZF1 deletion itself, in conjunction with a high prevalence of JAK2 mutations, is responsible for a subtype of high-risk ALL that is similar in genetic profile but negative for BCR-ABL1, and which results in overexpression of the cytokine receptor CRLF2 (Mullighan et al, 2009a,b). These patients have a very poor overall prognosis independent of clinical or response-based predictors, and probably require alloHSCT for improved outcome until newer targeted agents can be developed. Similarly, gene expression profiles of infant ALL have shown that additional genetic factors beyond MLL rearrangements become important in predicting response and outcomes. MCL1, FLT3, IRX2 and TACC2 have all been shown to be overexpressed in MLL-rearranged infants who have a poor/resistant steroid response or who were later found to have poor long-term EFS (Kang et al, 2012). Perhaps one of the newest identified genetic alterations in acute leukaemia is the discovery of genetic heterogeneity in T-cell leukaemia. Early T-cell precursor leukaemia has been shown to have an increased risk of induction failure or subsequent relapse compared to standard T-ALL (57–72% vs. 10–14%, respectively) (Coustan-Smith et al, 2009), thus making it a distinct subset that requires a more aggressive therapeutic strategy.
Gene mutations have been discovered involving activating mutations of regulatory cytokines, inactivating deletions controlling haematopoiesis, and histone-modifying genes (Zhang et al, 2012). It is unknown at this time if allografting in CR1 will overcome the above-mentioned poor prognostic genetic factors. This is an important research question that is yet to be addressed.
Donor sources and conditioning regimens for alloHSCT in children with ALL
A recent review of over 500 children with acute leukaemia transplanted with either UCB, T-cell depleted unrelated bone marrow or un-manipulated bone marrow, demonstrated that upon comparison of donor source there were no significant differences in 2-year OS or EFS among the three groups. There were tradeoffs between toxicities, such that UCB showed higher 100-day transplant-related mortality (TRM) but lower acute and chronic GVHD (Rocha et al, 2001). For patients without an identifiable HLA matched related or unrelated donor, there is some promise in the use of parental haploidentical transplantation (Lang et al, 2003; Geyer et al, 2012).
Traditional conditioning regimens for ALL have been fully myeloablative, utilizing high-dose cyclophosphamide (CY), sometimes in combination with other agents, and total body irradiation (TBI). These combinations have been shown to be feasible, with the benefit of prompt bone marrow engraftment. CY at a dose of 60 mg/kg/d for 2 consecutive days is the benchmark against which more current regimens are compared. More recent trials have demonstrated the safety and benefit of adding thiotepa, which has CNS penetration, to the standard TBI-CY regimen in an attempt to minimize post-transplant relapse. This combination was shown to have a DFS of 65% at 3 years overall, with an 85% DFS in children with ALL transplanted in CR1 (Zecca et al, 1999). Multiple newer agents with enhanced anti-leukaemic activity are currently being evaluated as part of myeloablative preparative regimens, including TKIs, clofarabine, sirolimus and targeted immunotherapy (Pulsipher et al, 2009). Our group has been investigating the use of clofarabine in combination with TBI and cytosine arabinoside in children with ALL with poor-risk features, including induction failure, with equally promising results (Radhakrishnan et al, 2012). One major change will be the use of non-myeloablative or reduced-intensity conditioning (RIC) regimens in acute leukaemia. While RIC has until now been utilized primarily for patients with acute myeloid leukaemia (AML) and lymphomas, there are some promising data emerging for ALL. OS and relapse-free survival rates become even higher when RIC is utilized for patients in first remission without evidence of MRD (Ram et al, 2011). As mentioned earlier, post-transplant therapy modifications with TKIs appear safe and, although there are no data to support improvement over standard myeloablative transplant, they may ultimately serve a role when utilized in combination with reduced-intensity conditioning in patients with Ph+ ALL.
A similar model may be found to be beneficial with other biological agents or post-alloHSCT immune modulators.
Outcomes following alloHSCT in CR1 in children with ALL
Outcomes in children and adolescents undergoing alloHSCT for ALL in CR1 tend to be directly related to both the patient's individual risk profile and the chosen alloHSCT approach. The criteria for alloHSCT in CR1 mentioned above are useful indicators of which patients may benefit from upfront transplant. However, they are only useful if transplant as an alternative is found to be successful, with both low morbidity and superior relapse-free survival. There is evidence of a significant graft-versus-leukaemia (GVL) effect that justifies the use of matched family or alternative donor stem cell transplant in patients thought to be at high risk of relapse. In addition, there are also some data to suggest that, in patients who lack a suitable allogeneic donor, autologous stem cell transplant with chemo-purged bone marrow and post-transplant immunochemotherapy is safe and can induce a GVL effect leading to prolonged DFS (Houtenbos et al, 2001). A recent review of over 1000 patients aged 0–18 years among 14 cooperative groups during a 15-year period identified many of the high-risk features of ALL discussed previously in this article. This intergroup collaboration found that, for children with ALL who failed induction therapy, the 10-year survival rate among patients who underwent alloHSCT was statistically similar, based on Mantel-Byar analysis, to that for patients treated with chemotherapy alone (43 ± 4% vs. 41 ± 3%, respectively) (Schrappe et al, 2012). However, it is clear from their data that certain patients are at higher risk and do benefit from transplant. While children under the age of 6 years without additional adverse risk factors showed no benefit from transplant, older children and adolescents who received a matched related donor transplant, and patients with T-cell ALL of any age with any donor, had a clear survival advantage (Fig 2A–C) (Schrappe et al, 2012).
Satwani et al (2007), reporting for the COG, demonstrated that in children with ALL and ultra-high-risk features, alloHSCT in CR1 yielded a 5-year EFS for all patients of 58·6%, but a much higher EFS (77·8%) in those patients without cytogenetic abnormalities. In addition, patients with Ph+ ALL had an EFS of 66·7% and those with induction failure, an EFS of 71% (Satwani et al, 2007). The TBI and CY regimen was well tolerated overall and TRM was low, at 3% (Satwani et al, 2007). This is comparable to other published studies, in which there is a balance between relapse-free survival and transplant-associated toxicities (Table 2) (Satwani et al, 2007). Infant leukaemia trials have demonstrated the importance of this balance more clearly than trials in older age groups. Often, reviews of alloHSCT outcomes show little benefit from transplant in infant leukaemia over chemotherapy alone, due to a higher degree of mortality and not necessarily due to relapse. Eapen et al (2006) reported that, among very young children <18 months of age with acute leukaemia, 3-year LFS was similar between HLA-matched sibling and unrelated donor transplantation in CR1 (49% vs. 54%, respectively). Survival rates dropped to 20–30% for those with advanced leukaemia (refractory, CR3), suggesting that upfront alloHSCT in these patients with a family or unrelated donor was feasible and should be considered prior to relapse (Eapen et al, 2006).
Table 2. AlloHSCT in CR1 for children and adolescents with ALL (outcomes reported as relapse rate (%) and 5-year EFS (%))
UHRF-ALL, ultra high risk features acute lymphoblastic leukaemia; TRM, transplant-related mortality; EFS, event-free survival; WBC, white blood cell count; CNS, central nervous system; MSD, matched sibling donor; MUD, matched unrelated donor; MMSD, mis-matched sibling donor; MPD, matched paternal donor; TBI, total body irradiation; CY, cyclophosphamide; HD-ARA-C, high dose cytosine arabinoside; BU, busulfan.
Reprinted from Satwani et al, 2007, Copyright (2007), with permission from Elsevier.
Recent advances in the treatment of ALL have led to better survival rates in the paediatric population; however, this may not be the case for young adult patients with newly diagnosed ALL. Despite the achievement of high induction remission rates for ALL (~80–90%), the rate of successful consolidation of this remission is relatively low (~30–40%) (Bacigalupo et al, 2001). AlloHSCT as a consolidating strategy for adult patients in CR1 remains controversial.
Characteristics of young adults with ALL
There is no consensus in the medical literature regarding the definition of ‘young adults’ (Warren et al, 2010). The most recent NCCN Guidelines® for ALL (National Comprehensive Cancer Network, 2012) adopted 15–39 years as the definition of AYA; however, for the purpose of this review we will consider the term young adult to include ages 18–35 years, as patients over 35 years of age are classified as high risk by the Medical Research Council United Kingdom Acute Lymphoblastic Leukaemia XII/Eastern Cooperative Oncology Group E2993 (MRC UKALL XII/ECOG E2993) international trial (Goldstone et al, 2008). Traditionally, young adult patients with ALL have been treated under protocols designed for adults, based on the assumption that most adults will not tolerate the intensive protocols designed for the paediatric population. Survival rates for young adults remain inferior to those of adolescents: the 5-year survival of patients aged 15–19 years with ALL in CR1 is superior to that of those aged 20–29 years (61% vs. 44%, respectively) (Pulte et al, 2009). Patients in this age group are also markedly underrepresented in clinical trials: Fern and Whelan (2010) reported that only 2% of the 20–29 year-old group are enrolled in clinical trials in the United States, compared to 60% of paediatric patients under 15 years old. Other factors that may contribute to the disparity in disease outcomes between young adults and paediatric and adolescent populations include psychosocial support, compliance with the regimen protocol by patients and physicians, type of chemotherapy intensification, and disease biology (McNeer & Raetz, 2012).
The high rate of relapse during or after ALL induction chemotherapy has led to exploration of alloHSCT in CR1 as a possible consolidation therapy in young adults with ALL. AlloHSCT can provide better disease control by two mechanisms: first, through the cytotoxic effect of the conditioning regimen, which eradicates the leukaemic cells using high-dose therapy; and second, through the immune-mediated GVL effect. The GVL effect in ALL has been indicated by observations of higher relapse rates after syngeneic or autologous HSCT (autoHSCT) compared with alloHSCT, a lower incidence of relapse in patients with GVHD, and increased relapse rates in recipients of T-cell-depleted marrow grafts (Appelbaum, 1997).
Selection criteria for transplantation
The optimal approach for selecting post-induction consolidation therapy in young adults with ALL in CR1 continues to evolve. AlloHSCT may be the only curative therapeutic option for some subsets of patients with ALL; however, due to the high treatment morbidity and mortality, careful selection of transplant candidates from among patients with ALL in CR1 is warranted. The current NCCN Guidelines® for Acute Lymphoblastic Leukaemia (National Comprehensive Cancer Network, 2012) recommend transplantation in first remission after induction therapy for adolescents and young adults with Ph+ disease. For all relapsed patients with ALL, alloHSCT is recommended if remission can be achieved. AlloHSCT is suggested as a consideration for patients with available donors who have MRD positivity, hypodiploidy, MLL rearrangement, or WBC ≥30 × 10⁹/l (B-lineage) or ≥50 × 10⁹/l (T-lineage). Outcomes of transplant have obvious implications for the decision to proceed with transplant in CR1 and must be weighed against outcome data for high-risk patients who do not undergo alloHSCT. Prognostic factors that may weigh in on the decision to proceed to transplant are discussed in the following sections.
Age and WBC count at diagnosis
Many studies have determined that age greater than 35 years and an increased WBC count (>30 × 10⁹/l in B-cell ALL and >100 × 10⁹/l in T-cell ALL) are important adverse prognostic risk factors in young adults with ALL (Hoelzer et al, 1988; Kantarjian et al, 1990; Larson et al, 1995). The importance of these factors has been confirmed by a large prospective study carried out by the MRC in the UK in collaboration with the ECOG in the US. Increased WBC (as defined above) and age older than 35 years are also independently associated with inferior DFS and OS in patients with Ph-negative (Ph−) ALL (Goldstone et al, 2008).
Cytogenetics in young adults
Although chromosomal abnormalities associated with prognosis are well identified and regularly utilized in the clinical classification and treatment of the paediatric population, this is not the case in adult patients. Perhaps due to the complex cytogenetic profiles and the relative rarity of the disease (Moorman et al, 2007), there is inconsistency in the prognostic significance assigned to specific aberrations across adult ALL clinical trials. Despite these complications, certain karyotypic abnormalities are associated with an adverse prognosis, among them the well-known Philadelphia chromosome from t(9;22)(q34;q11.2), resulting in the BCR-ABL1 fusion (discussed in detail below). Other adverse chromosomal abnormalities include t(v;11q23) (MLL rearrangement), t(8;14)(q24.1;q32), complex cytogenetics, low hypodiploidy (33–39 chromosomes) and near-haploidy (22–29 chromosomes) (Harrison et al, 2004; Moorman et al, 2007; Swerdlow et al, 2008). Ph+ ALL is a clear indication for alloHSCT in CR1; for other adverse genetic abnormalities in young adults, alloHSCT in CR1 could be considered if a donor is available.
Recently, other factors found to be associated with poor outcomes in young adults with ALL in CR1 include CD20 positivity, altered expression of the IKZF1 and BAALC genes, and pharmacogenetic factors (Mullighan et al, 2009a; Thomas et al, 2009; Kuhnl et al, 2010; Rowe, 2010); however, these factors need to be validated in future clinical trials prior to use in clinical practice. Perhaps the most important risk factor in adults with newly diagnosed ALL, in addition to age and cytogenetics at diagnosis, is the achievement of complete remission as a direct reflection of sensitivity to chemotherapy.
Minimal residual disease (MRD) in young adults
The debate over whether to offer alloHSCT in CR1 to patients with standard-risk ALL has led to increasing interest in assays quantifying post-induction MRD. Although MRD assessment research is further developed in the paediatric arena, MRD in adults with ALL in CR1 is also associated with poor outcomes. Despite the disparity in the methods used to assess MRD, their sensitivity, and the time points at which they are measured, there is consensus on the increased risk of relapse in young adult patients with ALL with persistent MRD. Table 3 illustrates the adverse influence of MRD on DFS at various time points for young adult patients with ALL. The presence of MRD can be used to tailor post-consolidation maintenance with appreciable clinical benefit: patients with available donors can proceed to alloHSCT, while those without donors receive high-dose regimens supported by autoHSCT (Bassan et al, 2009). Intriguingly, conversion from MRD-negative to MRD-positive after completion of initial treatment has been found to be an indicator of relapse in ~60% of patients (Raff et al, 2007). In Europe, MRD assessment is being standardized and has recently been suggested as a future primary endpoint for clinical trials, replacing CR (Bruggemann et al, 2010; Brüggemann et al, 2012). For young adult patients with MRD positivity following induction therapy, physicians may want to consider alloHSCT when a donor is available.
Table 3. Minimal residual disease in adults with ALL in CR1
Philadelphia chromosome positive BCR-ABL1 ALL in young adults
The incidence of Philadelphia chromosome positivity (Ph+) in adults with ALL ranges between 15% and 30%. There is evidence that the incidence of Ph+ ALL in adults increases with age, occurring in ~18% of patients between the ages of 20 and 39 years (Moorman et al, 2007). The prognosis of Ph+ ALL is poor, especially in older populations, and alloHSCT remains the only potentially curative option (Fielding et al, 2009). In the French Leucémie Aiguë Lymphoblastique de l'Adulte (LALA) 94 trial, the estimated 3-year OS for patients with Ph+ ALL who achieved CR1 and were allocated to the ‘donor’ group (either matched related or unrelated) was 37%, vs. 12% for the ‘no-donor’ patients (P = 0·02) (Dombret et al, 2002). This finding was confirmed in patients treated on the MRC/ECOG 2993 trial. Patients with Ph+ ALL were found to have inferior EFS and OS compared with Ph− ALL: 16% vs. 36% for EFS and 22% vs. 41% for OS (Moorman et al, 2007). In that trial, all Ph+ patients were allocated to a donor group that allowed unrelated donors, whereas Ph− patients were allocated to the combined donor (related only) and no-donor groups. Long-term benefit of alloHSCT has been observed in a retrospective follow-up study of adults with ALL receiving alloHSCT in CR1 at City of Hope (COH) and Stanford University, with a 10-year OS of 54% (Fig 3) (Laport et al, 2008).
Outcomes for young adults with ALL after alloHSCT in CR1
Transplant in CR1 for high-risk patients
As discussed previously, outcomes following alloHSCT in CR1 must be balanced against relapse-free survival in patients with upfront high-risk features who do not undergo transplant. A few prospective and retrospective studies have attempted to answer the question, ‘what is the optimal consolidation treatment in young adults with ALL who achieve first remission?’ However, due to small sample sizes, lack of randomization in some studies, and heterogeneity in the inclusion criteria of others, it is difficult to draw meaningful conclusions. A strategy used by some trials has been to adopt specific stratification criteria or biological allocation based on the availability of HLA-matched siblings (donor versus no-donor). The donor versus no-donor multi-centre trial LALA-87 found that only patients with high-risk features (Ph+ ALL, WBC >30 × 10⁹/l, undifferentiated ALL, age >35 years, or time to CR1 >4 weeks) have better OS (P = 0·03) and DFS (P = 0·01) with alloHSCT, while patients with standard risk showed no significant advantage of alloHSCT over chemotherapy or autoHSCT (Sebban et al, 1994). These results were confirmed in a larger study from the same group, LALA-94, which additionally concluded, from a randomization within the ‘no-donor’ arm, that there is no significant difference in DFS between autoHSCT and chemotherapy for high-risk patients (Thomas et al, 2004).
Several other reports indicate that alloHSCT offers long-term survival rates of between 40% and 60% for high-risk adults with ALL in CR1 (Blume et al, 1987a; Chao et al, 1991; Doney et al, 1991). Researchers at Stanford University reported on a series of 55 patients with high-risk ALL, children (29%) and adults (71%) with a median age of 24 years (range, 0–48), who underwent alloHSCT in CR1 (Jamieson et al, 2003). Selection criteria included WBC > 25 × 10⁹/l; chromosomal translocations t(9;22), t(4;11), t(8;14); age older than 30 years; extramedullary disease at the time of diagnosis; and/or requiring > 4 weeks to achieve CR1. With a median follow-up of 6 years, EFS was 64% with a relapse rate of only 15% in primarily young adult patients who would otherwise be expected to fare poorly (Fig 4) (Jamieson et al, 2003).
Transplant in CR1 for standard-risk patients
The recently reported MRC/ECOG trial compared alloHSCT, autoHSCT and chemotherapy alone in adult ALL patients in CR1. All Ph+ patients with either a related or unrelated donor proceeded to alloHSCT, but standard- and high-risk patients underwent a sibling-donor versus no-sibling-donor biological allocation to alloHSCT. Patients without donors were then randomized to autoHSCT or chemotherapy. AlloHSCT resulted in improved disease control in all adult patients with ALL, but long-term benefit was seen only in younger patients (15–35 years) with standard-risk disease (Fig 5) (Goldstone et al, 2008). This could be due to the high non-relapse mortality (NRM) rate of 36% in the older high-risk patients, which offset any potential survival advantage of the reduced relapse rate conferred by myeloablative alloHSCT. Another study from the Dutch-Belgian Haemato-Oncology Cooperative Group (HOVON), in which 67% of the patients were younger than 55 years old (median age 31 years), indicates that patients with standard-risk ALL with an available sibling donor have more favourable survival following alloHSCT than patients allocated to the no-donor group (Cornelissen et al, 2009). The 5-year cumulative incidence of relapse was 24% in the donor group versus 55% in the no-donor group (hazard ratio [HR], 0·37; 0·23–0·60; P < 0·001). DFS at 5 years was significantly better: 60% in patients with HLA-matched sibling donors versus 42% for those with no donor (HR: 0·60; 0·41–0·89; P = 0·01). Considering the previously reported data from the LALA trials indicating benefit from transplant for high-risk patients in CR1, and the high treatment-related mortality from alloHSCT, these trials have led to considerable debate about the implications for adults with standard-risk ALL. Some experts advocate early alloHSCT for nearly all adult patients, whereas others advocate continued individualized assessment.
The presence of MRD after induction therapy should also be included in the risk stratification to help identify additional ‘standard-risk’ patients as candidates for alloHSCT.
Effects of donor source and conditioning regimens on transplant outcomes
Related versus unrelated alloHSCT
A retrospective analysis of 221 adult patients (175 of whom were 17–40 years old) with ALL in CR1 who received alloHSCT between 1990 and 2002 from matched related or unrelated donors revealed no significant difference in 5-year DFS between patients who received alloHSCT from matched related donors (45%) and those who received alloHSCT from matched unrelated donors (42%) (Kiehl et al, 2004). Nishiwaki et al (2010) reported that OS is not significantly different between related and unrelated donor transplants (65% and 62%, respectively) in patients with ALL in CR1. However, the risk factors differ between the two groups, as the relapse rate was higher for related alloHSCT while NRM was higher in the unrelated group (Nishiwaki et al, 2010).
There is increasing use of UCB as an alternative source of stem cells for adult allografts; however, alloHSCT with UCB has been associated with delayed neutrophil engraftment and an increased risk of infection (Laughlin et al, 2004; Rocha et al, 2004). Some recent reports indicate that OS and DFS are similar between UCB and well-matched unrelated donor stem cell transplantation. An OS of 52% and LFS of 45% were reported following unrelated UCB transplantation, compared with 58% and 51%, respectively, following unrelated adult donor alloHSCT (Atsuta et al, 2009). In another report (Tomblyn et al, 2009), LFS was comparable between UCB, unrelated donor and related allogeneic donor stem cell transplants (49%, 42% and 40%, respectively).
The choice of conditioning regimen used for alloHSCT not only influences regimen-related morbidity and mortality but can also affect the relapse rate, OS and DFS. Conventionally, the preparative regimens for alloHSCT in ALL have been radiation-based, due to the ability of TBI to eradicate leukaemic cells sequestered within the central nervous system and testes, providing prolonged EFS and lower relapse rates (Thomas et al, 1975a,b). TBI in combination with cyclophosphamide (TBI/CY) is the most common adult ALL transplant preparative regimen. Due to radiation toxicities, including secondary malignancies, cataracts and interstitial pneumonitis, alternative preparative regimens continue to be investigated.
Busulfan/cyclophosphamide (Bu/CY) conditioning yields OS, relapse rates and DFS comparable to the TBI/CY regimen; however, it is also associated with serious side effects, including hepatic sinusoidal obstruction syndrome (SOS; previously termed veno-occlusive disease, VOD) and haemorrhagic cystitis. Conversion from oral to intravenous (IV) Bu in the Bu/CY regimen has decreased the incidence of SOS and improved 100-day survival. More recently, IV Bu/CY conditioning in adults with ALL in CR1 was associated with a 30-month OS of 65·7% and a relapse rate of 40%, with decreased TRM and SOS (Tang et al, 2011).
At the City of Hope we substituted etoposide (VP16) for cyclophosphamide in combination with fractionated TBI (13·2 Gy) followed by alloHSCT for adults with ALL (Blume et al, 1987a,b). DFS was 57% with a 32% relapse rate, suggesting this regimen has significant activity in adult patients with advanced ALL. This result was confirmed in a subsequent trial from the Southwest Oncology Group, comparing TBI/VP16 with Bu/CY (Blume et al, 1993). A comparative analysis of TBI combined with either CY or etoposide chemotherapy concluded that there is an advantage to regimen intensification by substituting etoposide for CY or, when CY is used, by increasing the TBI dose to > 13 Gy (Marks et al, 2006).
The role of reduced-intensity conditioning (RIC) regimens in the treatment of AYA patients is limited, as this type of regimen is primarily indicated for the older population; however, patients with co-morbidities and high-risk features may require an RIC regimen. Multiple trials have attempted to assess the feasibility and effectiveness of RIC specifically for treatment of ALL in high-risk populations. The 2-year OS rates for these studies average 32% (median: 31%, range: 18–50%), with variable, but high, relapse rates and TRM, depending upon remission status. Despite the small number of patients, there is a trend toward improved survival in adults with ALL who received RIC prior to alloHSCT in CR1 (Arnold et al, 2002; Martino et al, 2003). In a retrospective analysis of 24 high-risk patients with ALL who received RIC with fludarabine and melphalan, six patients were under the age of 39 years. Of those six, only one patient died from non-relapse causes and another experienced disease progression; the remaining four were alive and in remission at the time of the report (Stein et al, 2009).
Role of tyrosine kinase inhibitors pre- and post-alloHSCT in young adults with Ph+ ALL
Tyrosine kinase inhibitors (TKIs), beginning with the first-generation drug imatinib, were introduced in the treatment of BCR-ABL1-positive chronic myeloid leukaemia (CML) more than a decade ago. Since then there has been increasing interest in launching trials that combine TKIs with induction chemotherapy and/or alloHSCT in patients with Ph+ ALL. Imatinib has been studied both as a single agent and incorporated within induction chemotherapy regimens for newly diagnosed Ph+ ALL, demonstrating high response rates, improved survival in combination settings, and feasibility and tolerability in older populations (Lee et al, 2005; Yanada et al, 2006; Ribera et al, 2010). TKIs, such as imatinib, dasatinib and nilotinib, are now standard therapy for Ph+ patients, and the resultant increase in initial remission rates has allowed greater eligibility for alloHSCT. A recent study (Mizuta et al, 2011) compared 51 patients who received imatinib in combination with chemotherapy followed by alloHSCT in CR1 with 122 patients who received alloHSCT in CR1 during the pre-imatinib era. The 3-year OS of the imatinib group was 65%, compared to 44% for the pre-imatinib group (Fig 6) (Mizuta et al, 2011). Furthermore, patients with BCR-ABL1 transcript positivity after alloHSCT appear to experience higher relapse rates compared to BCR-ABL1-negative patients (45% vs. 23%, P = 0·0013) (Stirewalt et al, 2003). These results have led to early initiation of post-transplant TKI therapy, when leukaemic cell burden is minimal, with some suggesting therapy initiation at 90 days post-transplantation (Carpenter et al, 2007). The optimal duration of TKI therapy post-alloHSCT has not yet been determined, but some investigators continue TKI treatment for as long as patients tolerate it. In a recent study, the median duration of imatinib therapy after alloHSCT was ~1 year (range, 3–50 months) (Ram et al, 2011).
Late effects of alloHSCT
Post-transplant survivors can face a number of long-term complications, and younger patients may be coping with these late effects for decades. A long-term survivor study compared more than 400 patients diagnosed with leukaemia (ALL and AML) with 319 siblings of participants, and demonstrated an increased incidence of multiple late effects, including diabetes, hypothyroidism, osteoporosis and neurosensory impairments, in survivors of HSCT (Baker et al, 2010). Socie et al (1999) also reported inferior life expectancy of 2-year transplant survivors compared to the normal population, despite their excellent long-term survival. This could be due to multiple factors, including late relapse, persistence of GVHD, prolonged immunodeficiency and secondary malignancies (Socie et al, 1999). Proactive post-transplant long-term follow-up, as well as the use of less toxic agents and stricter control of GVHD without impairment of immune reconstitution, could minimize these late effects and should be a focus of future studies.
The optimal therapeutic approach for children and AYA with ALL in CR1 continues to evolve. AYA patients in particular have great potential for improvement in survival. The most recent guideline from the National Comprehensive Cancer Network suggested the use of high-intensity induction chemotherapy in the AYA group, due to the mounting evidence of improved EFS in patients receiving augmented paediatric regimens compared with patients treated on adult protocols (de Bont et al, 2004; Hallbook et al, 2006; Seibel et al, 2008). AlloHSCT in CR1 remains the curative option for both children and AYA with ALL with high-risk features. Patient disease-specific genomic studies, host pharmacogenomic identification and improved MRD detection can help to identify candidates for alloHSCT in CR1 who are otherwise considered at standard risk. Philadelphia chromosome positivity is a clear indication for allografting in first remission for children, adolescents and young adults. AlloHSCT should also be considered in CR1 for patients with other adverse cytogenetics, high WBC count at diagnosis and presence of MRD. Additional outcome data are needed to determine which high-risk features are truly predictive of poor outcome and may require more aggressive upfront treatment strategies. In addition, questions remain regarding the benefit of early chemotherapy intensification, immunotherapy or other novel agents for long-term relapse-free survival. There is ongoing debate as to whether higher-risk patients can be successfully managed without alloHSCT. Important determinants of outcome will depend not only on the efficacy of newer agents but also upon changing alloHSCT strategies. Collaboration and interaction between paediatric and adult cooperative groups, together with multidisciplinary investigations, will be required to establish the best therapeutic approaches to treatment of children and AYA with ALL in CR1.
The authors would like to acknowledge the expert assistance of Sandra Thomas, PhD and Erin Morris, RN in the preparation of this manuscript. Supported in part by grants from the Pediatric Cancer Research Foundation, Marisa Fund, Ashley G. Foundation and St. Baldrick's Foundation.
J.H., S.K., S.J.F. and M.S.C. each wrote the paper and approved the submitted and final version.