At the moment there seems to be no policy at international or national level to prevent the emergence of parasite resistance to anti-leishmanial drugs. The World Health Organization (WHO) issues guidelines on the control of leishmaniasis that include recommendations for treatment. These are intended to create common policies in endemic areas and to improve the treatment of individual patients, which includes the prevention of relapse and thereby of the emergence of secondary resistance, but they do not address the question of resistance per se. Some countries follow these guidelines, some have established their own and some have no consensus. But none has developed guidelines designed to prevent or control drug resistance. The questions are: would it be possible to develop such a policy, and would it be practicable to implement one?
The nature and extent of the problem
Leishmaniasis presents a great variety of clinico-pathological forms that present in many different circumstances and make different demands of treatment. The requirements of self-healing cutaneous sores, diffuse cutaneous disease and visceral disease, for example, are not the same. This paper addresses the problems of visceral leishmaniasis (VL). VL occurs mainly in poor, rural communities with sparse and weak medical facilities. It is currently important in four main settings, each with its own particular problem for treatment. In India the epidemic in Bihar and West Bengal continues and has spread into Pakistan; resistance to antimonials is the rule north of the River Ganges in Bihar. The HIV epidemic is spreading from the towns into the country villages and is likely to make the situation worse: drug resistance is the priority. In Sudan, the epidemic that killed 100 000 people between 1984 and 1996 (Seaman et al. 1996) is spreading along the borders with Ethiopia; management under difficult field conditions is the priority. In Brazil VL has urbanized and is a common cause of admission to hospital (Costa et al. 1990); efficient hospital management is the priority. In southern Europe co-infection with HIV, which has led to difficulties with diagnosis and treatment, is changing the epidemiological pattern of the disease and may be introducing a new, human, reservoir of infection (Alvar et al. 1997); management of the co-infection is the priority.
The literature has often confused unresponsiveness (the failure of the patient to respond to treatment), relapse after treatment, and resistance of the parasite to the drug (Bryceson 1987). Unresponsiveness and relapse may or may not indicate resistance, but the link has seldom been investigated formally. With antimonials, which have been widely and successfully used for decades, unresponsiveness is in principle likely to indicate resistance.
Primary unresponsiveness or resistance to antimonials is found in about 1% of previously untreated patients with VL in most parts of the world outside India (Chulay et al. 1983; Ben Salah et al. 2000), and there is considerable variation in sensitivity to antimonials among primary isolates from untreated patients with cutaneous leishmaniasis in South America (Berman et al. 1982), which was often found to correlate with the patient’s response to treatment. Secondary resistance develops in patients with visceral, mucosal and diffuse cutaneous leishmaniasis who have relapsed. Twice-relapsed patients are almost always unresponsive to further courses of antimonials. Additional factors that may contribute to the development of secondary resistance include poor compliance, under-dosage and incomplete treatment (Bryceson et al. 1985a), and the use of a single drug. Clinical unresponsiveness to antimonials in humans (Lira et al. 1999) and in dogs (Gramiccia et al. 1992) with relapsed VL has been shown to correlate with decreased sensitivity of parasite isolates, respectively, in macrophage cultures in vitro by a factor of 3–4, and in mice by a factor of 8–10. Thus unresponsiveness is at least in part due to drug resistance, which in turn is due to selection of resistant mutants by drug pressure. Experimentally, promastigote resistance to antimonials can be increased by a factor of 33–212 by passage through increasing concentrations of sodium stibogluconate.
In endemic foci with a canine reservoir, as in Europe and Brazil, secondary resistance has not led to an increase in primary resistance. But in India, there has been an epidemic of primary resistance. In Muzaffarpur, the epicentre of the outbreak, more than 60% of previously untreated patients are unresponsive to antimonials (Gramiccia et al. 1992; Sundar et al. 2000a,b). It seems likely that primary resistance emerges where man is the reservoir of infection, there is a large biomass of parasites and transmission is intense. At the moment these conditions exist in India, but they may develop elsewhere soon. In Sudan it is likely that the disease is anthroponotic, no reservoirs other than man having been convincingly demonstrated, and the disease is now settling into a state of high endemicity. Post-kala-azar dermal leishmaniasis (PKDL), which is an established reservoir state in India, is being seen there commonly for the first time. Treatment under difficult field conditions is likely to be inadequate and incomplete.
In Europe and Brazil, HIV is changing the nature of the human infection, the response to treatment and the epidemiology. VL shares with tuberculosis and leprosy some of the problems of treatment of intracellular organisms in an immunosuppressed patient, including the need for a long course of treatment, the risk of relapse and the development of drug resistance. Further immune-suppression by HIV or drugs for renal transplantation aggravates the situation. Experimentally, HIV drives leishmanial infection, and vice versa (Alvar et al. 1997). Co-infected patients have higher parasite burdens and weaker or absent immune responses. Co-infected patients respond slowly to treatment with antimonials (Rosenthal et al. 1995). Apparent clinical improvement is not matched by parasite clearance from splenic aspirate smears. Relapse rates are of the order of 60% within 1 year, regardless of the antileishmanial drug used (Lopez Velez et al. 1998), and secondary resistance has emerged to all of them (Bryceson et al. 1985b; Davidson & Russo 1994). Under experimental conditions Phlebotomus ariasi, an important vector in southern Europe, can become infected by feeding on HIV co-infected patients (Alvar et al. 1997). Without HIV, patients with VL are not normally infectious to this sandfly. Whether such patients will provide a human reservoir adequate to alter the epidemiology of the disease in southern Europe and lead to the emergence of primary drug resistance remains to be seen. Concomitant treatment with highly active anti-retroviral drugs (HAART) has improved the efficacy of anti-leishmanial treatment in southern Europe, and postponed relapses (Lopez Velez et al. 1998), but this benefit may not be available everywhere, for example in India or Sudan.
Characteristics of drugs used to treat leishmaniasis, with respect to the induction of drug resistance – pharmacokinetics, toxicity and mode of action
The greatest problem is with the antimonials, but it is not clear whether this is simply because of the extent of their use over so many years, or whether they are inherently more likely than other drugs to induce resistance because of their mode of action or pharmacokinetic profile. Pentavalent antimonials are excreted in two phases (Chulay et al. 1988). The first (t½ ≈ 2 h) accounts for over 99% of the drug; the second (t½ ≈ 56 h) probably represents the excretion of a trivalent component, and its possible significance in relation to drug resistance is unknown. Standardization of dosage at 20 mg Sb/kg body weight daily for 30 days has led to better rates of cure, but increasing intolerance and toxicity have reduced compliance. Although the antimonials have been the mainstay of treatment for 60 years, more reliable and safer alternatives need to be developed. Antimonials are thought to act by inhibiting the enzymes of glycolysis and other metabolic pathways (Berman 1988).
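The biphasic excretion can be sketched as a simple biexponential elimination model. The 99:1 split between the phases and the two half-lives are taken from the figures above (Chulay et al. 1988); the parallel first-order model itself is an illustrative assumption, not a fitted pharmacokinetic analysis.

```python
import math

def sb_fraction_remaining(t_h, fast_frac=0.99, t_half_fast=2.0, t_half_slow=56.0):
    """Fraction of a pentavalent antimony dose still in the body at t hours,
    modelled as two parallel first-order phases: a fast phase carrying ~99%
    of the drug (t1/2 ~ 2 h) and a slow phase carrying the remainder
    (t1/2 ~ 56 h), the latter probably a trivalent component."""
    fast = fast_frac * math.exp(-math.log(2) * t_h / t_half_fast)
    slow = (1 - fast_frac) * math.exp(-math.log(2) * t_h / t_half_slow)
    return fast + slow

# After a day the fast phase is essentially gone; what lingers is the slow
# component, i.e. a prolonged low-level tail of drug exposure.
for t in (2, 24, 168):  # 2 h, 1 day, 1 week
    print(f"t = {t:4d} h: {sb_fraction_remaining(t) * 100:.3f}% of dose remaining")
```

On these assumptions roughly 0.1% of the dose is still present a week after dosing, which illustrates why even a minor slow phase could, in principle, expose parasites to sub-therapeutic concentrations for long periods.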
In principle a drug that has a short half-life and high therapeutic ratio is least likely to induce resistance. Conventional amphotericin B deoxycholate has a half-life of 24 h, and is the most efficient of the anti-leishmanial drugs. Primary resistance has not been reported, and secondary resistance only in HIV co-infected patients. Isolates from HIV co-infected adult patients in France were increasingly resistant to amphotericin B following relapse after treatment (Di Giorgio et al. 1999). Children who did not have coincidental HIV did not relapse after treatment with amphotericin B. Standard regimens of ~2 mg/kg/dose are poorly tolerated and toxic, but in India 1 mg/kg on alternate days gives comparable cure rates and is tolerable (Mishra et al. 1994; Thakur et al. 1996). Liposomal amphotericin B (AmBisome®) is the least toxic and most efficient of the lipid-associated preparations, but at the moment its price restricts its use to wealthy countries. The plasma level is higher than with the deoxycholate, and tissue levels remain high for several days. Resistance may emerge in relapsed HIV co-infected patients. Amphotericin acts by binding to ergosterol in the parasite cell membrane. The azole compounds are inevitably weaker than amphotericin because they act several enzyme stages earlier in the synthesis of ergosterol, by inhibiting the demethylation of lanosterol. They have not proved consistently effective in the treatment of VL on their own (Wali et al. 1992; Sundar et al. 1996). Azoles antagonize amphotericin in animal models of fungal disease.
Paromomycin (aminosidine) is an aminoglycoside antibiotic. It has a major short half-life (2.5 h) and a minor long one (40 h) but a low therapeutic ratio. In vitro, sensitivity varies between and within species of Leishmania. It is at least as effective as the antimonials in patients with VL (Chunge et al. 1990; Jha et al. 1998). Resistance has not yet been reported, but experience with its antibacterial usage suggests that the potential is there if it were used widely on its own. Its mode of action against Leishmania is not known. It is synergistic with antimonials in vitro and the combination has been used effectively in India (Thakur et al. 1995) and Sudan (Seaman et al. 1993) (Table 2).
Table 2. Possible combinations of drugs for use in trials of visceral leishmaniasis in man or animal models
Miltefosine is an oral drug that has undergone extensive trials against VL in Bihar. It has a low therapeutic ratio, but yields cure rates in excess of 95% (Jha et al. 1999). It interacts with cell membranes. Miltefosine has a long terminal half-life, between 150 and 200 h. About four half-lives (25–33 days) are required to reach more than 90% of the plateau (steady-state) levels. The plasma levels of oral miltefosine are roughly dose proportional and urinary excretion is negligible. Because of the long half-life, a sub-therapeutic level of miltefosine may remain for some weeks after a 4-week course (Zentaris, personal communication). This characteristic might encourage the emergence of resistance. Its widespread use as a single agent in India might lead to the rapid emergence of widespread resistance, which would be a tragedy. Its abortifacient and teratogenic properties may limit its distribution in the field. Its mode of action against leishmania is unknown.
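The four-half-life figure follows from first-order accumulation kinetics, under which the fraction of the steady-state plateau reached after n elimination half-lives is 1 − 2⁻ⁿ. A quick check of the arithmetic, assuming simple exponential accumulation:

```python
def fraction_of_plateau(n_half_lives):
    """Fraction of the steady-state (plateau) concentration reached after
    n elimination half-lives of repeated dosing, assuming first-order
    (exponential) accumulation."""
    return 1 - 2 ** (-n_half_lives)

# Four half-lives reach 1 - 2**-4 = 93.75% of plateau, i.e. "more than 90%".
print(f"After 4 half-lives: {fraction_of_plateau(4):.2%} of plateau")

# With a terminal half-life of 150-200 h, four half-lives span 25-33 days,
# matching the time to steady state quoted above.
for t_half_h in (150, 200):
    print(f"t1/2 = {t_half_h} h -> 4 half-lives = {4 * t_half_h / 24:.1f} days")
```

The same arithmetic, run in reverse after the last dose, is what makes the sub-therapeutic tail after a 4-week course last for weeks rather than days.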
Allopurinol has not proved effective on its own in human VL. It is a purine analogue and interferes with ribonucleic acid synthesis. It is effective in vitro, but it seems that Leishmania are capable of scavenging purines in vivo (Berman 1988). It is however, synergistic with antimonials in vitro against certain species of Leishmania, including L. donovani, and the combination has proved clinically useful in antimony unresponsive patients in Kenya (Chunge et al. 1985).
Prevention of primary resistance
If primary resistance to antimonials is a product of intense transmission in highly endemic areas of anthroponotic leishmaniasis, then public health measures should be taken to reduce transmission wherever and however possible. In India, where transmission is domestic and peridomestic, the methods of vector control through insecticide spraying and early case detection and treatment are well proven (Saxena et al. 1996) but are not at the moment adequately implemented. In Sudan little is known about the nature of man–sandfly contact, and logical methods to interrupt transmission have not been developed. In areas of zoonotic leishmaniasis, where the dog is the reservoir, control of dogs through slaughter or diagnosis and treatment would be of little value in preventing primary resistance unless it were shown that human cases co-infected with HIV were infecting sufficient numbers of sandflies to alter the epidemiology of the disease (Alvar et al. 1997).
Prevention of secondary resistance
This means the prevention of relapse and the emergence of resistant mutants. To some extent this may be achieved by adequate compliance with a strong drug given in adequate dosage for adequate duration. However, drug toxicity, rural setting, poverty, variation in wild parasite sensitivity to antimonials and presumably to other drugs, the effects of HIV on immune response and parasite burden, and the pharmacokinetic features of individual drugs render this policy inadequate on its own in many individuals and in certain endemic areas. Resistant mutants emerge when a population of parasites is exposed to low concentrations of a drug for long periods of time. Two approaches have been taken to prevent this in two other tropical infections. In leprosy, which is a systemic intracellular infection, the epidemic of resistance to dapsone was controlled by the introduction of triple therapy for multibacillary cases, who have a load of ~10⁹ viable bacteria (Ellard 1984). The rationale, untried at that time, was that a strong drug, rifampicin, would kill dapsone-resistant mutants and reduce the bacterial load in a few days to a level below which new mutants were unlikely to emerge, and that long-term treatment with the less efficient drug dapsone would deal with the low level of rifampicin-resistant mutants. In malaria, which is a protozoal infection, the first rationale is that two drugs with different modes of action and which therefore do not share the same resistance mechanisms will reduce the chance of selection because the chance of a resistant mutant surviving is the product of the parasite mutation rates for the individual drugs, multiplied by the number of parasites in an infection that are exposed to the drugs (White 1999). This theory underlies the success of artemether–mefloquine combinations that have controlled and reversed drug resistance on the borders of Thailand (Simpson et al. 2000).
The second rationale for malaria is similar to that for leprosy: that one very active drug with a short half-life will reduce the biomass of parasites to a level at which a second, more slowly acting drug will be able to kill the remainder. The extent of the kill by the first drug, and the plasma concentration and area under the curve of the second drug are critical to the success of the combination (White et al. 1999). There is a third approach, that two relatively weak drugs may be used together if their actions are synergistic.
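The selection arithmetic behind the first rationale can be sketched with hypothetical numbers; neither the per-drug mutation rates nor the parasite biomass below are measured values for Leishmania, and are chosen purely to illustrate the multiplicative effect of combining drugs with independent resistance mechanisms.

```python
def expected_resistant_parasites(biomass, *per_drug_rates):
    """Expected number of parasites in an infection that are resistant to
    every drug in a combination, assuming resistance to each drug arises
    independently, so the per-parasite probability is the product of the
    per-drug rates (the rationale described by White 1999)."""
    p = 1.0
    for rate in per_drug_rates:
        p *= rate
    return biomass * p

# Hypothetical illustration: 1e12 parasites, each drug with a 1-in-1e10
# chance per parasite of a pre-existing resistant mutant.
biomass, rate = 1e12, 1e-10

print(expected_resistant_parasites(biomass, rate))        # one drug: mutants expected
print(expected_resistant_parasites(biomass, rate, rate))  # two drugs: effectively none
```

On these illustrative figures a single drug would be expected to meet about a hundred pre-existing resistant mutants per infection, whereas a two-drug combination would be expected to meet essentially none, which is the whole basis of the approach.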
Can these approaches be applied to leishmaniasis? Nothing is known about the ways in which leishmania acquire resistance to drugs, nor of their population dynamics in the human host. But pharmacokinetics of most of the drugs used and the mode of action of some are well understood. A rational approach to the question of combinations should be possible. But the trials have not been conducted and in many instances there is little support from animal models.
Combinations of an efficient drug with a short half-life combined with a longer acting drug (Table 2)
This would include paromomycin plus miltefosine. Paromomycin is better tolerated, less toxic and intended to be cheaper than amphotericin B. It can be given intramuscularly. Trials of the combination are needed to determine the rate of parasite disappearance from splenic aspirates and therefore the duration for which paromomycin would need to be given. The slow accumulation and metabolism/excretion of miltefosine make it suitable as the second drug. Trials could be performed to see if the standard 4-week course could be shortened. Antimonials would be less suitable in combination with miltefosine because of toxicity, the variation in natural parasite sensitivity and the variable rate of parasite clearance from splenic aspirates (Chulay et al. 1983).
Drugs with known synergy include allopurinol with antimony and paromomycin with antimony. Allopurinol with antimony is no more effective than antimony alone in previously untreated patients, but has not been looked at with respect to the prevention of relapse and secondary resistance. A study in an area of India where there is low level of resistance would be justifiable. Paromomycin plus antimonial would be less practicable as it would involve two injections daily.
Combinations of existing ‘weak’ oral drugs
Three reports of the use of an azole with allopurinol are of interest. They describe four immuno-depressed patients with VL in whom the combination seems to have induced prolonged remission.
• Two with HIV-VL who received fluconazole 400 mg and allopurinol 300 mg daily for 3 weeks, followed by fluconazole 200 mg daily (Torrus et al. 1996)
• One previously untreated HIV-VL patient who received allopurinol 21 mg/kg and itraconazole 400 mg daily (Raffi et al. 1995)
• One Saudi Arabian woman who developed VL after receiving a renal transplant and was treated with allopurinol 300 mg and ketoconazole 400 mg daily (Halim et al. 1993)
It is difficult to deny the therapeutic effect of the combination of allopurinol and an azole in these four cases. Trials in patients and in animal models are justified.
Other oral drugs that have some action against leishmania include dapsone, which has clear efficacy against cutaneous leishmaniasis (CL) in western India (Dogra 1991); the antimalarial atovaquone, which has been shown to have useful activity in L. donovani-infected mice, especially when combined with an antimonial (Murray & Hariprashad 1996), but not on its own in human VL; the lepidine WR6026 (Sitamaquine®), which is undergoing trials in Kenya; and roxithromycin, which produced a definitive cure rate of 87% in Indian VL (Lal et al. 1996). Formal randomized trials of combinations in animal models and patients would seem warranted.
What should be the indications for introducing a policy of combination treatment?
Ideally, parasite resistance should be monitored, rather than patient relapse rates, and endemic countries should be encouraged, in the absence of genetic markers, to set up reference laboratories capable of testing drug sensitivity of amastigotes in macrophage cultures. It would then be possible to establish guidelines along the lines of those that exist for malaria or tuberculosis. But at the moment countries in areas endemic for anthroponotic VL should consider introducing such a policy as soon as the relapse rate to the standard drug starts to increase; for antimonials and paromomycin this would be about 3%. In areas where there is a high rate of HIV coinfection, combination therapy should be introduced as soon as possible. When suitable combinations have been established, these countries should introduce them routinely. In countries endemic for zoonotic VL, a proven combination therapy would be used to reduce duration of treatment, the cost of hospitalization and the risk of relapse. Trials would be needed to establish the role of combination treatment in HIV-co-infected patients.