Planned progressive antimicrobial therapy in neutropenic patients

Authors


Dr Claudio Viscoli, Immunocompromised Host Disease Unit, Istituto Nazionale per la Ricerca sul Cancro, L.go R. Benzi 10, 16132 Genoa, Italy.

Major advances have been achieved in medical science in recent decades, such that patients with illnesses previously considered untreatable now receive appropriate therapy and may survive. However, this achievement has usually been obtained with aggressive and invasive procedures, which in turn lead to multiple disruptions in immunological protection and therefore to increased susceptibility to opportunistic infections. The net result is that patients surviving severe underlying diseases may succumb to bacterial, fungal, viral or protozoan infections. This is particularly true for cancer patients, in whom infectious complications represent a major cause of morbidity and mortality (Meyers & Thomas, 1988; Morittu et al, 1989; Pizzo & Meyers, 1989).

In recent years the management of infectious complications in cancer patients has improved substantially, especially in the field of bacterial infections. For example, among about 800 documented bacteraemia cases observed in the eight therapeutic trials (I, II, III, IV, V, VIII, IX and XI) performed by the International Antimicrobial Therapy Co-operative Group (IATCG) of the European Organisation for Research and Treatment of Cancer (EORTC) from 1978 to 1994, the overall mortality rate decreased from 21% to 7%. In particular, the 30-day mortality rate from any cause in patients with Gram-negative and Gram-positive bacteraemia is now as low as 10% and 6%, respectively (personal communication, Marianne Paesmans, Statistician, EORTC–International Antimicrobial Therapy Co-operative Group). This represents a dramatic improvement with respect to, for example, a classic study of Gram-negative rod bacteraemias performed in 1962, in which the mortality rate approached 90% (McCabe & Jackson, 1962), and to the first IATCG–EORTC study, performed in 1978, in which more than 20% of the patients with Gram-negative sepsis and about 15% of those with Gram-positive sepsis died (EORTC–International Antimicrobial Therapy Project Group, 1978). The reasons for this improvement are probably multiple, although the strategy of rapidly instituting empirical broad-spectrum antibacterial therapy with very active antimicrobial compounds at the onset of fever has probably played a major role.

The pattern of infective pathogens has also changed significantly over time. Whereas in the early days of antineoplastic chemotherapy Gram-negative rods were the most common pathogens in neutropenic patients, nowadays Gram-positive micro-organisms are prevalent and cause about 60% of documented bacteraemias (EORTC–International Antimicrobial Therapy Cooperative Group, 1990; Klastersky et al, 1988; Brown et al, 1995). The reasons for this modification in the pattern of infecting pathogens remain unclear. The treatment of cancer has become more intensive and is now associated with more severe oral mucositis and diarrhoea, leading to major damage to mucosal barriers and an increased risk of infection due to the resident Gram-positive oral flora. Herpes simplex oral lesions, frequently observed in granulocytopenic cancer patients, may further compromise mucous membrane integrity, allowing bacteria to enter the bloodstream (Corey & Spear, 1986). In addition, patients with cancer are often fitted with indwelling catheters (especially Broviac or Hickman type and Port-a-Cath), which can easily become colonized with skin bacteria and constitute a well-recognized risk factor for Gram-positive infections (Viscoli et al, 1988, 1994; Viscoli & Castagnola, 1995). Finally, the selective pressure of antibiotics such as third-generation cephalosporins and fluoroquinolones, which are more active against Gram-negative than Gram-positive bacteria, may also have played a role. Elting et al (1992) showed that the use of fluoroquinolones was one of the factors significantly associated with the development of streptococcal bacteraemia, especially when used in combination with H2-blockers and other antacids, probably because of increased colonization of the stomach and oesophagus by oral streptococci. Interestingly, in the trials performed by the IATCG of the EORTC the respective proportions of Gram-negative and Gram-positive organisms causing bacteraemia have remained stable over the last 5 years, and there are reports suggesting that Gram-negative infections are again increasing, especially in paediatric populations (Aquino et al, 1995). In a recent surveillance study of bacteraemia in neutropenic and non-neutropenic children with cancer performed by the Italian Association for Paediatric Haematology and Oncology, 46% of the isolated pathogens were Gram-positive cocci, versus 44% Gram-negative rods and 10% fungi (Castagnola et al, unpublished observation).

In recent years several reports have emphasized the increasing role of fungal infections as a cause of morbidity and mortality in compromised patients, including those with cancer. For example, a national survey of nosocomial fungal infections in American hospitals (Hospital Infection Program, CDC, Atlanta, U.S.A.) showed that the incidence of these infections increased from 2.0 to 3.8 infections/1000 discharges from 1980 to 1990, with the incidence of nosocomial candidaemia increasing from 1.0 to 4.9 infections/1000 discharges (Beck-Sagué & Jarvis, 1993). Studies performed on both sides of the Atlantic, including autopsy series, have documented this increase in cancer patients (Bodey et al, 1992; Groll et al, 1996; Meunier et al, 1992; Martino et al, 1993). For years, Candida albicans was the most frequently isolated species among Candida isolates. However, data from a recently completed surveillance study of fungaemia in cancer patients conducted by the Invasive Fungal Infection Group of the EORTC reveal that non-albicans candidaemia is now prevalent among patients with haematological malignancies (Viscoli et al, unpublished observation). Finally, as shown by an autopsy study performed in Germany (Groll et al, 1996), in recent years there has been an increase in infections due to filamentous fungi, including species not previously known to cause human infection.

Antibiotic strategies and the results of clinical trials

Since the discovery of penicillin, bacteria have shown a remarkable propensity to become resistant to antibiotics by means of several mechanisms. The subject has recently been reviewed in detail (Gold & Moellering, 1996; Swartz, 1997). Gram-negative rods producing extended-spectrum β-lactamases, penicillin-resistant streptococci (including pneumococci), methicillin-resistant staphylococci and vancomycin-resistant enterococci and staphylococci are the most dramatic examples. As a consequence, it is the responsibility of physicians working in tertiary-care hospitals and with high-risk patients (for example, in haematology), as well as of regulatory agencies and hospital administrators, to devise a more rational way of using antibiotics, in order to prevent infections due to multi-resistant pathogens and to minimize the widespread diffusion of resistance. Indeed, antibiotics should not be considered simply as ancillary drugs, potentially able to overcome troublesome but trivial incidents that disturb the treatment of neoplastic disease.

The strategic decision to administer antibacterial and/or antifungal prophylaxis, the choice of the antibiotic regimen for prophylaxis or empirical therapy, and any subsequent modification of the allocated antibiotic regimen are often based on the results of phase III or IV prospective, randomized clinical trials. Unfortunately, as recently reviewed (Viscoli et al, 1995), the results of these trials cannot be readily transferred to clinical practice, because the methodology is often flawed and definitions, study endpoints and eligibility criteria often differ from trial to trial. More importantly, the weight given to the results of these trials has frequently reflected a misunderstanding of their true message. The antibacterial activity of the regimen under study and the practical benefit it confers on the overall population of patients treated for fever and neutropenia have often been confused. The two are often taken as perfectly interchangeable, whereas they are completely distinct (although not independent) treatment effects. The antibacterial activity of a given regimen is a necessary, but not a sufficient, condition for a beneficial effect in the patients receiving that regimen. Most trials conducted in this patient population show that a given drug is able to do its work (i.e. it is able to prevent or treat an infection). This result does not, however, imply that use of the drug is justified in daily clinical strategy. Recognizing this distinction is important in order to make good decisions in clinical practice.

Another important problem in this field concerns the patient populations in which clinical trials are conducted. For reasons of sample size, most study groups have been compelled to perform their trials in mixed cancer patient populations, which therefore include highly variable levels of risk of infectious complications. Indeed, older patients with acute myeloid leukaemia in first induction or in relapse have very different risk factors from children with acute lymphoblastic leukaemia or from patients with lymphoma or solid tumours, and their prognosis is also different (Hann et al, 1997). As a consequence, the effect of an aggressive antibiotic strategy, which might be effective among the highest-risk patients, becomes diluted in a mixed population of high-, medium- and low-risk patients. The results of such a study may be misleading, since they could suggest that the investigational regimen is not effective, when it might actually be effective, but only in a selected population of high-risk patients. Similarly, a clinical trial including mostly high-risk patients may conclude that a given approach is suitable, but this does not mean that it is suitable for low-risk patients.
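
As a purely illustrative aside (the failure rates, the proportion of high-risk patients and the simple normal-approximation sample-size formula below are assumptions chosen for the example, not data from the trials discussed here), the following sketch shows how a real treatment effect confined to high-risk patients shrinks, and how steeply the required sample size grows, once the trial population is mixed:

```python
# Illustrative sketch with hypothetical numbers (not data from the cited trials):
# a treatment effect confined to high-risk patients is diluted in a mixed population,
# and the sample size needed to detect it grows sharply.
from math import ceil

# Assumed failure rates of the regimens being compared (hypothetical):
p_control_high, p_treated_high = 0.30, 0.15   # high-risk patients: real benefit
p_control_low, p_treated_low = 0.05, 0.05     # low/medium-risk patients: no benefit

high_risk_fraction = 0.25                     # assumed share of high-risk patients

# Failure rates that would be observed in the mixed trial population
p_control_mixed = (high_risk_fraction * p_control_high
                   + (1 - high_risk_fraction) * p_control_low)
p_treated_mixed = (high_risk_fraction * p_treated_high
                   + (1 - high_risk_fraction) * p_treated_low)

def n_per_arm(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Approximate patients per arm for 80% power at two-sided alpha = 0.05
    (normal approximation for comparing two proportions)."""
    return ceil((z_alpha + z_beta) ** 2
                * (p1 * (1 - p1) + p2 * (1 - p2)) / (p1 - p2) ** 2)

print(f"High-risk patients only: {p_control_high:.0%} vs {p_treated_high:.0%}, "
      f"about {n_per_arm(p_control_high, p_treated_high)} patients per arm")
print(f"Mixed trial population:  {p_control_mixed:.1%} vs {p_treated_mixed:.1%}, "
      f"about {n_per_arm(p_control_mixed, p_treated_mixed)} patients per arm")
```

With these hypothetical figures, a benefit detectable with roughly 120 patients per arm in a purely high-risk trial would need nearly 950 patients per arm in the mixed population, which is the dilution described above.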

Infection prophylaxis

Ever since the early trials of intensive chemotherapy for neoplastic diseases, prevention of infection has been considered a very attractive approach, potentially able to decrease the infection rate, improve the patients' quality of life, decrease infectious mortality and, hopefully, improve overall survival from the underlying disease. Several non-chemotherapeutic procedures have been proposed, from simple reverse isolation to a total protective environment. These procedures have been reviewed recently (Wiblin & Wenzel, 1995). Strict isolation procedures with the use of laminar air-flow rooms have been abandoned almost everywhere because of poor cost-effectiveness. Careful hand-washing is the only procedure considered to be effective, whereas isolation measures should instead be aimed at preventing the spread of transmissible diseases or multi-resistant organisms from patient to patient (Wiblin & Wenzel, 1995; Gold & Moellering, 1996). An exception is the use of HEPA filters for the prevention of aspergillosis in units where allogeneic bone marrow transplantation is performed: this procedure has been shown to decrease the incidence of this dreadful disease in this patient population (Sherertz et al, 1987). Since transplant candidates with haematological malignancies can develop aspergillosis (or become colonized with Aspergillus spores) even during remission induction therapy, all units where severely immunocompromised cancer patients are cared for should probably be equipped with air conditioning and HEPA filters.

Chemoprophylaxis of bacterial and fungal infections was first introduced when it became clear that 80% of the infecting bacterial pathogens originated from the patient's endogenous flora, and that about half of them were acquired during the hospital stay (Schimpff et al, 1972). The first approach attempted was the administration of non-absorbable antibiotics aimed at suppressing the intestinal bacterial and fungal flora and preventing the acquisition of exogenous organisms (intestinal decontamination). Two types of intestinal decontamination were envisaged: one active against both the aerobic and the anaerobic bacterial flora (total intestinal decontamination) and the other active only against aerobes (selective intestinal decontamination).

As extensively reviewed (Hathorn, 1993; Donnelly, 1995), total intestinal decontamination was directed at complete suppression of the endogenous microbial flora, thought to be the main source of bacteraemia and fungaemia in cancer patients, and was usually associated with the use of laminar air-flow rooms and other protective measures (total protective isolation). The antibiotics most commonly used for this type of decontamination were gentamicin, vancomycin, framycetin, colistin, neomycin-polymyxin, nystatin and amphotericin B, in various combinations. Poor palatability, nausea, diarrhoea and psychological depression (together with a lack of convincing efficacy) made this procedure very unpopular among patients, thus affecting compliance. Poor compliance was associated with the risk of recolonization with opportunistic invasive organisms from the hospital flora, development of resistance, and failure to prevent infections (Schimpff et al, 1975).

Selective intestinal decontamination was designed to provide suppression or virtual elimination of the aerobic and fungal flora, while preserving the anaerobic bacterial flora. The conceptual basis of this prophylaxis rests on a number of studies in immunosuppressed animals in which suppression of the aerobic flora, with preservation of the anaerobic flora, was accompanied by a decrease in the number of infections, without colonization by potential hospital pathogens (Guiot et al, 1983). Some clinical trials in humans confirmed these findings, but others did not. Cotrimoxazole, the drug most commonly used for this type of prophylaxis, usually given in combination with oral nystatin or amphotericin B, was sometimes associated with an increased duration of granulocytopenia. This approach was optimal only when accompanied by meticulous monitoring of the enteric flora, aimed at detecting any bacterial or fungal regrowth at an early stage (with an obvious increase in costs). Although still very popular in many Dutch cancer centres, it has been largely abandoned, mainly because of the intrinsic need for intensive microbiological support and because of the increasing resistance of Gram-negative and Gram-positive pathogens to cotrimoxazole (Donnelly, 1995). Cotrimoxazole (three times a week) is, of course, still being used for preventing Pneumocystis carinii infections.

Chemoprophylaxis of bacterial infections

The administration of cotrimoxazole was clearly providing not only a topical effect (intestinal decontamination) but also a form of systemic prophylaxis, since the drug was well absorbed and able to reach detectable blood levels. The concept of combining intestinal decontamination and systemic prophylaxis became popular in many cancer centres when a new group of very active antibiotics, the fluoroquinolones, became available. In the few placebo-controlled trials that were performed, fluoroquinolone antibiotics provided a consistent reduction in the incidence of Gram-negative bacteraemia (Donnelly, 1995; Cruciani et al, 1996), but had no effect on, or even resulted in an increase in, Gram-positive bacteraemias. This effect was also indirectly confirmed by epidemiological studies. For example, in a multivariate analysis of factors associated with a diagnosis of bacteraemia in febrile and granulocytopenic cancer patients, fluoroquinolone prophylaxis was associated with a reduced risk of bacteraemia due to Gram-negative rods, with no effect on the risk of Gram-positive bacteraemias (Viscoli et al, 1994). Norfloxacin, the quinolone antibiotic most commonly used in the early days, was soon replaced by ciprofloxacin, which was shown to be superior to norfloxacin in a large multicentre study performed in Italy (The GIMEMA Infection Program, 1991). After an initial wave of enthusiasm, a certain degree of scepticism began to spread among experts when it became evident that the decreased incidence of Gram-negative bacteraemias was accompanied neither by a decreased incidence of febrile episodes nor by a reduction in the use of intravenous antibiotics (Bow et al, 1995; Castagnola & Viscoli, 1997). In addition, there was an increasing incidence of staphylococcal infections in patients receiving quinolone prophylaxis (Kotilainen et al, 1990). Moreover, although no trial had been designed to show a reduction in the number of febrile episodes or in mortality, no effect on these ‘strong’ endpoints was shown in a well-conducted meta-analysis of prophylactic trials (Cruciani et al, 1996). Indeed, the general impression now is that the effect of quinolone prophylaxis is probably not an absolute reduction in the incidence of Gram-negative bacteraemias but rather a change in the respective proportions of Gram-positive bacteraemias, Gram-negative bacteraemias and unexplained fevers. Whether or not this is a beneficial effect in pragmatic terms remains to be determined. Scepticism about the use of quinolone prophylaxis spread even further when data indicating an increased incidence of bacterial resistance after prolonged use of fluoroquinolone prophylaxis became available. In the early days of this prophylaxis there was no evidence of any risk of rapid development of resistance among Gram-positive and Gram-negative bacteria. After some years, however, it became evident that the use of quinolones as prophylaxis in neutropenic patients was indeed associated with the development of resistance, at least among staphylococci (Kotilainen et al, 1990) and E. coli (Kern et al, 1994). In addition, a review of the sensitivity patterns of pathogens isolated in the trials performed by the IATCG of the EORTC (Cometta et al, 1994) showed that between 1983 and 1993 the proportion of patients randomized in these trials who were receiving fluoroquinolone prophylaxis increased from 1.4% to 45%.
During this time 1118 bacterial strains were isolated from blood and sent to the reference laboratory of the group in Lausanne, Switzerland. All 92 strains of Escherichia coli isolated from 1983 to 1990 were susceptible to ciprofloxacin, norfloxacin, ofloxacin, L-ofloxacin and sparfloxacin. In contrast, 11 of the 40 (27%) strains of Escherichia coli isolated between 1991 and 1993 were resistant to all fluoroquinolones. The 11 resistant strains were isolated from 10 patients, all of whom were receiving fluoroquinolone prophylaxis, whereas the 29 sensitive strains were isolated from 29 patients, only one of whom had been given quinolone prophylaxis.
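
For orientation only, the counts reported above (0 of 92 resistant isolates in 1983–1990 versus 11 of 40 in 1991–1993) can be put through a one-sided Fisher exact test; the calculation below is ours, offered as an illustration rather than as part of the cited analysis, and it ignores the clustering of isolates within patients:

```python
# Sketch (our calculation, not from the cited analysis): how unlikely the shift in
# fluoroquinolone resistance among E. coli blood isolates would be by chance,
# given the counts reported above (0/92 in 1983-1990 versus 11/40 in 1991-1993).
from math import comb

def fisher_one_sided(resistant_a, total_a, resistant_b, total_b):
    """One-sided Fisher exact P: probability of seeing resistant_a or fewer
    resistant isolates in period A, given the margins of the 2x2 table."""
    n = total_a + total_b
    resistant_total = resistant_a + resistant_b
    return sum(
        comb(total_a, k) * comb(total_b, resistant_total - k) / comb(n, resistant_total)
        for k in range(resistant_a + 1)
    )

early = (0, 92)    # resistant / total E. coli isolates, 1983-1990
late = (11, 40)    # resistant / total E. coli isolates, 1991-1993

for label, (r, t) in (("1983-1990", early), ("1991-1993", late)):
    print(f"{label}: {r}/{t} resistant ({r / t:.1%})")
print(f"One-sided Fisher exact P = {fisher_one_sided(*early, *late):.1g}")
```

On these figures the contrast between the two surveillance periods is very unlikely to be a chance finding, although the unadjusted test overstates the precision because several isolates came from the same patients.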

In conclusion, it seems logical to discourage the widespread use of fluoroquinolone prophylaxis in all neutropenic cancer patients (Hughes et al, 1997; Murphy et al, 1997; Castagnola & Viscoli, 1997). However, this does not mean that certain high-risk patient populations cannot benefit from short prophylactic courses: examples include patients colonized with aggressive organisms, or patients in centres with epidemic clusters due to organisms that are susceptible to fluoroquinolones. New extended-spectrum fluoroquinolones will become available in the near future and might have a wider spectrum of action than the ‘old’ fluoroquinolones. In order to prevent the same mistake from happening again, patients who would genuinely benefit from antibacterial prophylaxis should be selected more carefully and appropriately. This should be the responsibility of drug companies, individual investigators and study groups.

Chemoprophylaxis of fungal infections

At the present time no single drug can cover all of the fungal pathogens that cause infection in cancer patients. Thus, as with bacterial infections, no ‘comprehensive’ prophylactic approach can be recommended. As far as Candida infection is concerned, the issue has recently been reviewed in the setting of a consensus conference (Edwards et al, 1997). At least two large clinical trials showed that fluconazole, at a dosage of 400 mg/d, reduced the incidence of these infections in patients undergoing allogeneic bone marrow transplantation (Goodman et al, 1992; Slavin et al, 1995). A similar effect was also shown in immunocompromised children, including BMT recipients, at a dose of 3 mg/kg/d (Ninane, 1994). In contrast, no beneficial effect has been shown in patients with acute leukaemia (Winston et al, 1993; Menichetti et al, 1994). As in the field of bacterial infections, antifungal prophylaxis also has pitfalls that should be recognized. Several reports have focused on the increasing rate of colonization and infection due to naturally resistant Candida species (C. krusei and C. glabrata) in patients receiving fluconazole (Wingard et al, 1991, 1993; Fan-Havard et al, 1991). Although this phenomenon was not confirmed in large prospective studies (Goodman et al, 1992; Slavin et al, 1995; Winston et al, 1993), it would be surprising if antifungal prophylaxis had no effect on the pattern of pathogens causing infection in patients receiving it. Indeed, in a surveillance study of candidaemia in cancer patients we found that the administration of antifungal prophylaxis (not only with fluconazole) was one of the factors associated with an increased risk of non-albicans candidaemia (Viscoli et al, unpublished observations), the others being the severity of the underlying disease and neutropenia. In addition, as shown in HIV-infected patients undergoing prolonged treatment with fluconazole, Candida strains can develop resistance to the new azoles. An interesting approach is the one proposed by Martino et al (1993) and Guiot et al (unpublished observations), in which antifungal prophylaxis and empirical antifungal therapy are reserved for those patients who are colonized with Candida at multiple sites.

Fluconazole cannot be expected to be effective against filamentous fungi such as Aspergillus, which represent an important cause of morbidity and mortality, especially in patients with acute leukaemia and in those undergoing bone marrow transplantation. Chemoprophylaxis of aspergillosis has so far been unsuccessful. The spectrum of action of itraconazole is promising, but large clinical trials showing its efficacy in preventing aspergillosis are lacking. In addition, small trials showed that failures correlated with low blood levels, probably the consequence of the drug's inconsistent oral absorption (Boogaerts et al, 1989). A new oral itraconazole formulation with improved bioavailability, in which the drug is solubilized in a cyclodextrin solution, has recently been tested in a large controlled clinical trial (Del Favero et al, unpublished observations). No major advantage in the prevention of fungal infections, including aspergillosis, was shown, although only a small number of patients at high risk of aspergillosis were included and the study was not specifically designed for the prevention of aspergillosis. Intranasal amphotericin B has also been proposed as a means of reducing Aspergillus nasal colonization, which is considered a risk factor for invasive aspergillosis. Unfortunately this approach was ineffective in a recent randomized study (Heinemann et al, unpublished observations), although it was shown to be effective, when combined with oral itraconazole, in a small non-comparative study (Todeschini et al, 1993). Finally, patients who develop pulmonary aspergillosis during remission induction for leukaemia and subsequently undergo BMT can be treated successfully with pre-emptive low-dose amphotericin B (0.5 mg/kg every other day) during the post-transplant neutropenic period (Karp et al, 1988). The same approach could be recommended for centres with Aspergillus outbreaks, although protection is not guaranteed and toxicity is not negligible (Perfect et al, 1992).

In conclusion, most experts in this field agree that chemoprophylaxis of bacterial and fungal infections cannot be recommended in all instances and that a selective approach should be adopted. Certainly, no prophylaxis should be administered to patients in whom only a short episode of neutropenia (i.e. 7–10 d) is expected. Nowadays, this is the case in most solid tumour and lymphoma therapeutic protocols, including those followed by stem cell infusion. However, prophylaxis could be considered for selected populations of high-risk patients, such as those receiving very intensive chemotherapeutic regimens for the treatment of haematological malignancies, especially if they are colonized at multiple sites with organisms known to be able to cause severe infections in this patient population. Narrowing the spectrum of prophylaxis to specific organisms has been shown to be a reliable approach in some instances (EORTC–International Antimicrobial Therapy Co-operative Group, 1994). The term ‘pre-emptive therapy’ is probably more appropriate for describing this type of prophylactic approach. In any case, the duration of pre-emptive therapy should cover only the high-risk period, in order to minimize its unavoidable impact on resistance induction. The problem of multi-drug resistant pathogens is, of course, of major concern in both immunocompromised and immunocompetent hosts.

Empirical therapy of febrile neutropenia

Fever without clinical signs of a localized infection is the most common presentation of a potentially overwhelming infection in neutropenic patients, and it is usually considered a medical emergency (Armstrong et al, 1971). For example, in the randomized therapeutic trials performed by the IATCG of the EORTC over more than 20 years, more than 50% of the patients had no sign of infection other than fever at randomization (Klastersky et al, 1988). As is widely known, the incidence and severity of fever and infection in this patient population are inversely related to the absolute neutrophil count and to the duration of neutropenia, and are highest when the granulocyte count falls below 0.1 × 10⁹/l (Bodey et al, 1966). Historical data have shown that if empirical treatment is not promptly undertaken, mortality in severely neutropenic patients with Gram-negative bacteraemia can approach 40% (Schimpff et al, 1971; Klastersky, 1986). This is the rationale behind the early empirical administration of broad-spectrum antibiotics at the onset of fever, an approach which has become common practice in this patient population, with a substantial improvement in prognosis. However, the specific composition of the empirical regimen remains controversial and subject to change, owing to the changing pattern of pathogens, the rapid development of bacterial resistance, the emergence of new clinical entities and the availability of new effective drugs.

Several controversies have arisen among investigators about which therapy constitutes the best approach for the febrile and neutropenic patient and, especially, whether a double or triple antibiotic combination is superior to monotherapy with third-generation cephalosporins or carbapenems.

The classic β-lactam–aminoglycoside combination has long been considered the best therapeutic approach for febrile neutropenia (Hughes et al, 1990, 1997), because of its wide spectrum of action, its potential synergistic activity against Gram-negative rods, and its potential ability to reduce the emergence of resistant strains, both in the individual patient during treatment and in the environment over time. The disadvantages of this regimen include poor activity against staphylococci and streptococci, the possible development of resistance in Gram-negative rods (Rains et al, 1995), aminoglycoside-related toxicity, and the need to administer multiple daily doses of both antibiotics. Two developments in this area have been the demonstration that a penicillin combined with a β-lactamase inhibitor might provide better anti-streptococcal coverage (Cometta et al, 1995) and the demonstration that single daily dosing (with ceftriaxone and amikacin) is a feasible, effective and safe practice (EORTC–International Antimicrobial Therapy Co-operative Group, 1993). Another approach, used mainly in recipients of bone marrow transplantation, has been the double β-lactam combination, usually piperacillin plus ceftazidime (Winston et al, 1991). The main disadvantages of this therapy are its high cost and the relatively unpredictable pharmacodynamics of antibiotics sharing the same mechanism of action; the advantages include better coverage of streptococci (due to the inclusion of the ureidopenicillin) and lower toxicity.

A monotherapeutic approach can also be considered reliable for febrile and neutropenic patients, especially since very broad-spectrum antibiotics, such as the third- and fourth-generation cephalosporins with anti-Pseudomonas activity (ceftazidime and cefepime) and the carbapenems, have become available. This has been clearly shown in several trials (Pizzo et al, 1986; de Pauw et al, 1994; Freifeld et al, 1995; Cometta et al, 1996). At present the available evidence suggests that, at the end of the neutropenic period, there should be little difference (if any) in terms of survival between febrile and neutropenic patients who started with monotherapy and those who received combinations. However, how much the patient (in terms of duration of fever, and therefore quality of life) and the health-care system (in terms of cost) have to pay with each approach is largely unknown. In addition, resistance to third-generation cephalosporins is increasing (Rains et al, 1995; Gold et al, 1996), and this may be a cause for concern in centres using ceftazidime monotherapy.

The issue of the early inclusion of an anti-Gram-positive glycopeptide antibiotic (vancomycin or teicoplanin) in the empirical regimen has been discussed thoroughly at many meetings and addressed by several authors, in the light of the increased incidence of these infections and their decreased response rate to β-lactam–aminoglycoside combinations. Some studies (Karp et al, 1986; Del Favero et al, 1987; Shenep et al, 1988) reported favourable results from this practice, with more rapid resolution of fever and reductions in Gram-positive secondary infections, total febrile days and the need for empirical amphotericin B. More convincingly, other groups (Rubin et al, 1988; Viscoli et al, 1991; EORTC–International Antimicrobial Therapy Cooperative Group and the National Cancer Institute of Canada–Clinical Trials Group, 1991; Micozzi et al, 1993) did not find any significant advantage in early anti-Gram-positive coverage and suggested that vancomycin should be added to the empirical regimen only upon documentation of a Gram-positive infection not responding to the initial empirical treatment. Indeed, the recent reports of staphylococci and enterococci displaying variable levels of resistance to glycopeptide antibiotics (Gold et al, 1996) should further discourage physicians from using these important drugs on an empirical basis.

The results of clinical trials are not the only factor on which to base the choice of the empirical regimen for febrile neutropenia. Although these studies tell us that a given regimen could be used as empirical therapy for febrile neutropenia, precisely which regimen to use in an individual institution is another problem, and remains the physician's responsibility. Other factors are at least as important as the data stemming from clinical trials, including local antibiotic policies, bacteriological statistics and resistance patterns, as well as antibiotic toxicity and cost. In addition, a number of patient-related factors should be taken into account, such as the clinical presentation, the presence of organ failure, the state of the underlying disease, and the expected duration of neutropenia, among others. For example, the demonstration that meropenem as a single agent is as effective as the combination of ceftazidime and amikacin should not induce a change in the therapeutic approach of a given centre, unless that centre has a high incidence of Gram-negative bacteraemias due to ceftazidime-resistant pathogens. This is not only for reasons of cost (which in this case is higher for the single-agent therapy than for the combination), but also because the carbapenems are both potential inducers of antibiotic resistance (through β-lactamase production) and the most active drugs now available. For these reasons, they should probably not be used in first-line therapy, but only upon specific indication.

In conclusion, it is the general feeling of most experts in the field that in febrile and neutropenic patients the treatment of choice should be tailored to local and patient-related considerations, and that the regimen combining adequate activity with the least toxicity and cost should be used first, reserving the most active drugs for complicated cases.

Treatment modification in non-responding patients

The indications for modifying the antibiotic regimen in non-responding patients are controversial. The decision depends upon several factors, including not only the persistence of fever but also the physician's self-confidence, the type of infection documented (i.e. bacteraemia, clinically documented infection, fever of unknown origin), the pathogen involved, its antibiotic susceptibility pattern, the expected duration of neutropenia, and the presence of detectable signs of infection. Obviously, there is little need to discuss the action required when a pathogen is isolated in blood culture and sensitivity tests show resistance to the allocated antibiotic: the majority of experts believe that microbiologically documented infections should be treated with antibiotics to which the isolated pathogen is susceptible in vitro, even if the patient's clinical condition improves spontaneously. More controversial is what to do when the patient remains febrile in the absence of any microbiological or clinical documentation of infection (fever of unknown origin), or in infections due to pathogens that are sensitive in vitro to the allocated regimen. Indeed, the real question is how to define lack of response to treatment. In general, good clinical practice in infectious diseases suggests that persistence of fever does not necessarily mean failure of a given antibiotic regimen, especially if the patient is otherwise clinically stable. A febrile and neutropenic patient with bacteraemia might require 2–7 d to defervesce, even if the isolated pathogen is sensitive to the allocated antibiotic(s). As a consequence, in patients with fever of unknown origin, or with an infection caused by a pathogen that is sensitive to the allocated therapy, the initial empirical regimen should probably not be modified in the absence of clear signs of clinical deterioration, antibiotic-related toxicity or secondary infection. Obviously, the treatment should be modified in the presence of a clinical picture suggestive of a specific aetiology which is not likely to be covered by the allocated antibiotic regimen (catheter-related infection, perianal cellulitis, abdominal typhlitis, pulmonary infiltrates, etc.).

It has become common practice in many European cancer centres to add a glycopeptide antibiotic in the ‘non-responsive patient’, even in the absence of any documentation or clinical suggestion that the patient actually has an infection caused by a methicillin-resistant Gram-positive organism (which is the main indication for using a glycopeptide antibiotic). The effectiveness of this practice has never been tested in controlled clinical trials and it therefore cannot be recommended. Unlike fungal organisms, Gram-positive bacteria can easily be grown in culture and rarely cause death in the first 48 h of fever. The potential for inducing resistance to glycopeptides is a further argument against this practice (Gold et al, 1996).

The only modification which has been shown to be somewhat effective in persistently febrile and neutropenic cancer patients is the addition of an antifungal agent. The rationale for this practice comes both from autopsy studies, which showed the increasing role of fungal infections as a cause of death in cancer patients (Bodey et al, 1992; Groll et al, 1996), and from clinical observations showing the importance of early treatment for the prognosis of fungal infections (Aisner et al, 1997), at least in the presence of rapid bone marrow recovery. Burke et al (1976) showed a decreased mortality rate from invasive mycosis in patients receiving empirical amphotericin B compared with historical controls. Following these observations, two groups of investigators, on either side of the Atlantic, tested the effectiveness of this procedure in a more controlled fashion. Pizzo et al (1982) at the National Cancer Institute in Bethesda randomized patients who were still febrile and granulocytopenic after 7 d of empirical antibiotic therapy, and who lacked any documentation of infection, to discontinue all antibiotic treatment (16 patients), to continue the same initial combination (16 patients), or to continue it with the addition of empirical amphotericin B (18 patients). Withholding amphotericin B resulted in an increased incidence of fungal infections: in the patients who did not receive added amphotericin B there were one bacterial infection and five fungal infections, two of which were fatal, and one further patient, who died, had a disseminated fungal infection detected at autopsy. In the 18 patients randomized to receive amphotericin B and continue antibacterial therapy, only two documented infections developed (a disseminated Cytomegalovirus infection and a Petriellidium boydii pneumonia and arteritis), and both patients died. Discontinuation of antibiotic therapy, as carried out in the first group, resulted in serious complications of both bacterial and fungal origin.

The International Antimicrobial Therapy Co-operative Group of the EORTC randomized 132 persistently febrile and granulocytopenic cancer patients, not responding to empirical antibacterial therapy and with either fever of unknown origin or a clinically documented infection, to receive empirical amphotericin B or to continue their antibacterial coverage without modification. There was no statistically significant difference between the two groups in terms of defervescence and survival, although no deaths due to fungal infection occurred among the patients receiving empirical amphotericin B, compared with four in the other group (P = 0.05), and the number of documented fungal infections was higher in the patients not receiving amphotericin B (six versus one; P = 0.1). Subgroup analyses showed that the addition of amphotericin B was apparently effective only in high-risk patients, such as adults not receiving antifungal prophylaxis and severely granulocytopenic patients. This led to the suggestion that empirical antifungal therapy should probably be reserved for selected groups of high-risk patients (EORTC–International Antimicrobial Therapy Cooperative Group, 1989).
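
To illustrate how fragile such a borderline P value is when it rests on so few events, the sketch below recomputes a one-sided Fisher exact test for 0 versus 4 fungal deaths. The per-arm sizes (66 and 66) are our assumption for the purpose of the example, since the exact split of the 132 patients is not given here, so the result only approximates the reported P = 0.05:

```python
# Illustration only: one-sided Fisher exact test for 0 vs 4 fungal deaths,
# assuming two arms of 66 patients each (the 132 randomized patients split
# evenly); the true arm sizes in the EORTC trial may have differed slightly.
from math import comb

def fisher_one_sided(deaths_a, n_a, deaths_b, n_b):
    """Probability of observing deaths_a or fewer deaths in arm A,
    given the margins of the 2x2 table (hypergeometric tail)."""
    n = n_a + n_b
    deaths_total = deaths_a + deaths_b
    return sum(
        comb(n_a, k) * comb(n_b, deaths_total - k) / comb(n, deaths_total)
        for k in range(deaths_a + 1)
    )

p = fisher_one_sided(0, 66, 4, 66)   # amphotericin B arm vs control arm
print(f"One-sided Fisher exact P ~ {p:.2f}")   # about 0.06 with these assumed counts
```

Shifting a single death from one arm to the other would move such a P value well away from significance, which underlines the limited statistical power of these comparisons, as noted below.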

Although in both of these trials the statistical power was very limited, especially for the subgroup analyses, the use of empirical amphotericin B in persistently febrile and granulocytopenic cancer patients without documented infection has become the rule in many cancer centres world-wide. Amphotericin B remains the drug of choice for this indication, despite its considerable intrinsic toxicity. The optimal time at which empirical antifungal therapy should be started remains undetermined, although most experts recommend waiting until the 5th to 7th day of persistent fever and granulocytopenia (Hughes et al, 1990, 1997; Viscoli et al, 1997). It has recently been suggested that fluconazole might also be effective for empirical antifungal therapy (Viscoli et al, 1996; Edwards et al, 1997), although only in patients not receiving fluconazole prophylaxis and at low risk of aspergillosis.

New treatment modalities for febrile neutropenic patients

Every recommendation regarding the duration of empirical therapy and the time to discharge patients from hospital is necessarily arbitrary, since few studies have been performed. For patients with documented infections (especially bacteraemia) a reasonable approach might be to treat for at least 8–15 d after the last positive blood culture, depending on the isolated pathogen, the presence of an infected site, and the granulocyte count (Pizzo, 1993). The duration of treatment with respect to the granulocyte count in patients with fever of unknown origin is a matter of debate (Klastersky et al, 1988; Pizzo, 1993; Lucas et al, 1996). Some physicians prefer to treat these patients until recovery from neutropenia (granulocytes > 0.5 × 10⁹/l) in order to minimize the incidence of further bacterial infections (Pizzo et al, 1982), whereas others treat for a total of 7 d, or for 4 d after defervescence, and discontinue all antibiotic therapy even if neutropenia persists, with the aim of decreasing the incidence of secondary fungal infections (Di Nubile, 1988).

There is sufficient evidence, especially in the paediatric population, that certain patients with febrile neutropenia can be discharged early from hospital and followed as out-patients (Hughes et al, 1990, 1997; Mullen et al, 1990; Martino et al, 1992; Tomiak et al, 1994; Cohen et al, 1995; Lucas et al, 1996; Sundararajan et al, 1997). This can probably be done if the patient has been afebrile for at least 2 consecutive days, the underlying disease is in remission, there are early signs of bone marrow recovery, and proper monitoring at home can be ensured. In one study, therapy was even continued orally with success (Lau et al, 1994).

Not only early discharge but also entirely home-based or out-patient management of febrile neutropenic patients has been suggested. Indeed, the classic principle that all febrile and neutropenic patients should be hospitalized, and should remain so until complete defervescence, is currently being questioned, since it has been shown that, in selected cases and under favourable logistical conditions, these patients can be evaluated and treated at home or in an out-patient setting (Wiernikowski et al, 1991; Martino et al, 1992; Rubenstein et al, 1993; Talcott et al, 1994). In some instances adult patients have been managed safely with oral, self-administered treatment (Malik et al, 1994). Although patients at low risk of severe infection (Viscoli et al, 1994) and/or of serious medical complications (Talcott et al, 1992) have been identified in a reasonably reliable way, out-patient management of febrile neutropenia cannot yet be widely recommended as a safe procedure and should probably be reserved for experimental approaches or very experienced centres.

Conclusions

In conclusion, there is little evidence that a planned, progressive succession of antibiotic therapies in cancer patients actually exists or can be recommended for widespread use. Indeed, with the exception of empirical antibacterial therapy for febrile neutropenia (and, possibly, of empirical antifungal therapy in persistently febrile and neutropenic patients with fever of unknown origin), every approach to the prophylaxis and treatment of infectious complications in cancer patients should be tailored to the individual patient and to the local epidemiological situation. The adverse biological consequences of excessive use of antibiotics for prophylaxis and treatment, both in the individual patient and in the environment, are of the utmost importance, and they are even more serious when the effectiveness of these procedures has not been properly demonstrated. It would be desirable to replace the concept of prophylaxis with that of pre-emptive therapy and to halt the spiral of empiricism in the use of antibiotics in cancer patients, replacing it with a more clinical, patient- or infection-orientated approach.

Acknowledgements

We are grateful to Mrs Mauri S. Ulivi for reviewing the English text and to Mrs Laura Veroni for her assistance in preparing the typescript.
