Exposure to factor VIII and prediction of inhibitor development: exposure days vs. danger days, or both?


Cedric Hermans, Hemostasis and Thrombosis Unit, Haemophilic Clinic, Division of Haematology, Cliniques Universitaires Saint-Luc, Université Catholique de Louvain, Avenue Hippocrate 10, 1200 Brussels, Belgium.
Tel.: +32 2 764 17 85 (direct line); +32 2 764 17 40 (secretary); fax: +32 2 764 89 59.
E-mail: cedric.hermans@uclouvain.be

The development of alloantibodies directed against factor VIII (FVIII) is the most severe and challenging complication of hemophilia. A prerequisite is exposure to exogenous FVIII administered as replacement therapy. Exposure to FVIII is indeed necessary to induce an immune response resulting in the development of alloantibodies against FVIII, as illustrated by the observation that untreated hemophilic patients do not develop FVIII inhibitors [1]. Exposure to FVIII may be quantified by calculating the number of exposure days (EDs). An ED is usually understood by those treating hemophiliacs as a unit of time (1 day) on which replacement treatment is given to a patient. The concept of EDs has proven to be useful. Indeed, in patients with severe hemophilia A, inhibitors develop after a median FVIII treatment time of 10–15 days [2–4]. During this period, regular screening for inhibitor development is recommended. After 50–75 EDs, the cumulative incidence of inhibitors reaches a plateau. The incidence rate of inhibitor development in patients with hemophilia A who have previously been treated for at least 150 EDs has been estimated at approximately 2–5 per 1000 patient-years [5]. Thus, in practice, careful monitoring of the number of EDs when initiating treatment is recommended, in order to enable early detection of inhibitor development. The cumulative number of EDs has also been used to categorize patients into various groups, comprising previously untreated patients, previously treated patients, and minimally treated patients. In patients with mild disease who may require intensive replacement much later in life, a good estimate of previous exposure and its intensity is valuable for estimating the risk of inhibitor development.
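As a purely illustrative aside, the cumulative ED count described above can be derived from an infusion log by counting distinct calendar days on which at least one infusion was given; the function and variable names below are hypothetical conveniences, not part of any clinical standard.

```python
from datetime import date

def exposure_days(infusion_dates):
    """Cumulative number of exposure days (EDs): each calendar day with
    at least one FVIII infusion counts once, regardless of how many
    infusions were given on that day."""
    return len(set(infusion_dates))

# Two infusions on the same day still count as a single ED.
log = [date(2023, 1, 5), date(2023, 1, 5), date(2023, 1, 9)]
print(exposure_days(log))  # 2
```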

Quantifying exposure to FVIII by calculating the number of EDs has several limitations. First, it does not allow for integration of treatment-related features, such as the amount of factor concentrate given, the frequency of infusions on the treatment day, and the type of or change in concentrate at the time of exposure. Other non-integrated features include the context of treatment (hemophilia severity, patient age, prior exposure, and concomitant inflammation or infection) and the indication for replacement therapy (prevention or treatment). Thus, EDs mainly, if not exclusively, constitute a quantitative measurement, i.e. an estimate of the total number of days of FVIII exposure. The question must therefore be raised as to whether the concept of EDs should be refined in the light of recent insights into the pathophysiology of inhibitor development.

Only a small proportion of patients treated with FVIII eventually develop an inhibitor, which clearly suggests that the risk of inhibitor development is modulated by genetic and environmental factors [6–8]. In addition, the risk of inhibitor development varies over time. The development of FVIII inhibitors has a complex and multifactorial etiology, which has not yet been fully elucidated. Apart from hemophilia-causing mutations, a patient's risk of developing an FVIII inhibitor has been found to be related to other genetic variants or polymorphisms located in the genes of the major histocompatibility complex class II and in immunomodulatory genes [8,9]. Recently, FVIII gene haplotypes differing from those present in recombinant concentrates were also identified in certain populations [10]. These genetic variants are suggested to account for the higher incidence of inhibitors in certain families and ethnic groups, independently of FVIII mutations, but more data are required for these findings to be fully understood.

Recent data have suggested that several external factors play a decisive role in inhibitor development, such as the reason for the first infusion at a young age and the intensity of treatment. Indeed, when concentrates are administered in an acute setting (bleeding, trauma, or surgery), which often requires intensive and prolonged substitution (e.g. surgery), the immune system may be stimulated in such a way that the risk of inhibitor development increases [11–13]. These observations clearly support the concept that the risk of developing inhibitors is markedly increased if concentrates are given when the immune system is stimulated, underlining the fact that it is not only the number of EDs that matters, but also the situation in which FVIII is administered. Indeed, exposure to the deficient factor for a preventive reason, without any concomitant stimulation of the immune system, is not the same as exposure in an inflammatory setting. Therefore, introducing the concept of danger days (DDs) may allow a better appreciation of the number of exposures that actually carry a risk of inhibitor development. This concept of DDs is supported by the results of recent clinical studies, in which FVIII administered prophylactically at low doses, in the absence of danger signals, was associated with a low inhibitor risk [14–16].

In order to be introduced into clinical practice, the concept of DDs should be defined unambiguously. An ED may be defined as a day of exposure to concentrate in the absence of any associated immune system challenge, such as systemic infection and/or inflammation, bleeding requiring repeated infusions, an invasive procedure, or vaccination, whereas treatment given in the presence of these conditions or any concomitant immune system challenge should be defined as DD treatment. In other words, only treatment given preventively, whether as part of long-term or occasional prophylaxis, or curatively to treat a limited bleed, should be considered as ED treatment. For patients on long-term effective prophylaxis without bleeding, all days of treatment should thus be considered as EDs and not DDs. These suggestions emphasize the need for a clear and practical definition of DDs that reflects a consensus before implementation in research and clinical practice. The concept of DDs therefore requires standardization by the relevant committee of the ISTH.
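The proposed distinction could be operationalized along the lines sketched below. This is a hypothetical illustration of the definition above (the condition labels and function name are assumptions chosen for clarity), not a validated clinical tool, and it does not anticipate whatever consensus definition a standardization committee might adopt.

```python
# Hypothetical immune system challenges, taken from the definition above.
DANGER_CONDITIONS = {
    "systemic_infection",
    "inflammation",
    "bleeding_requiring_repeated_infusions",
    "invasive_procedure",
    "vaccination",
}

def classify_treatment_day(conditions):
    """Classify one treatment day as an exposure day ('ED') or a danger
    day ('DD'): any concomitant immune system challenge makes the day a
    DD; otherwise it remains an ED."""
    return "DD" if DANGER_CONDITIONS & set(conditions) else "ED"

# Prophylactic infusion without immune challenge vs. infusion covering surgery.
print(classify_treatment_day([]))                      # ED
print(classify_treatment_day(["invasive_procedure"]))  # DD
```

Under this sketch, a patient's profile would simply be the pair of cumulative ED and DD counts over all treatment days.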

The concept of DDs appears to be complementary to the calculation of EDs. All patients should be profiled with respect to their cumulative numbers of EDs and DDs. The DD concept has several potential uses: it should be re-evaluated in previously conducted studies as well as incorporated into ongoing and future prospective studies. First, this concept has relevant practical consequences for the treatment and follow-up of patients [17]. Physicians should better understand the usefulness of low-dose prophylaxis initiated at a young age, with the aim of not only preventing bleeding, but also avoiding, as much as possible, intensive treatments that may promote inhibitor development rather than tolerance to FVIII. If intensive treatment is required, those treating hemophiliacs should perform laboratory assays in order to detect the potential appearance of an inhibitor as early as possible. The concept of DDs should contribute to determining to what extent previous prolonged exposure in the absence of danger signals provides an element of tolerization and reduces the risk of delayed inhibitor development induced by high-risk situations in patients with, for instance, 100 EDs without any DDs. Second, beyond these practical considerations, we believe that retrospective and prospective studies evaluating inhibitor development should consider the number of DDs. This would allow us to better distinguish the influences of genetic and environmental risk factors, and to make a more valid comparison between concentrates (plasma-derived vs. recombinant), as, in our view, such a comparison should take into account patients with both similar EDs and similar DDs. Similarly, studies comparing the risk of inhibitor development associated with treatment strategies, such as continuous vs. repeated infusions, should ideally include patients with similar profiles in terms of EDs and DDs.
Although fewer inhibitors are observed in patients with hemophilia B, the environmental risk factors are likely to be similar, meaning that the concept of DDs should also be applied to patients with FIX deficiency.

In conclusion, we suggest that the concept of DDs should be integrated into future studies, as it is complementary to EDs in terms of estimating more objectively the risk of a single patient developing an FVIII inhibitor. On the basis of both immunological findings and clinical study results, we propose recording both EDs and DDs in hemophiliacs requiring FVIII substitution. This would allow for a better appreciation of the influence of environmental factors on inhibitor development, along with improved clinical management of patients.

Disclosure of Conflict of Interests

The author states that he has no conflict of interest.