Increased resting heart rate is an independent modifiable risk factor for the development of cardiovascular disease. Numerous studies have demonstrated improved clinical outcomes with heart rate reduction in patients with coronary artery disease and heart failure, but its role in transplanted hearts is not yet established. Sinus tachycardia is more common in heart transplant recipients due to graft denervation. Although a large number of studies have recognized increased heart rate as a predictor of native coronary artery atherosclerosis and overall cardiac mortality, contradictory results have been observed in heart transplant recipients. There is no clear consensus about what the normal range of heart rate should be following heart transplantation. The aim of this article was to review the literature to evaluate whether heart rate reduction should be considered in heart transplant recipients.
Dr. Simon G. Williams and Dr. Steven M. Shaw have received honoraria from Servier for advisory work.
The authors have no other funding, financial relationships, or conflicts of interest to disclose.
Elevated resting heart rate (HR) is an independent modifiable risk factor for the development of cardiovascular disease. Several epidemiological studies have demonstrated that faster HR is associated with increased cardiovascular morbidity and mortality in the elderly population1 as well as in patients with hypertension,1 coronary artery disease,2 and heart failure.2 Although earlier studies showed a strong relationship between resting HR and cardiovascular mortality in men, the association appeared to lack significance in healthy women. However, the National FINRISK study,3 a large prospective population-based observational study, extended this risk to healthy women as well.
The exact mechanism by which faster HR increases cardiovascular risk remains unclear. Several plausible mechanisms have been postulated; for example, faster HR may promote atherogenesis by causing injury to the arterial wall and endothelial cell dysfunction. In the later stages of atherosclerosis, faster HR facilitates plaque rupture and causes coronary thrombosis.4 In patients with coronary artery disease, faster HR induces myocardial ischemia by causing a mismatch between oxygen delivery and consumption.4
About 105 adult heart transplants are performed each year in the United Kingdom. The most common indications for heart transplantation are heart failure due to dilated cardiomyopathy and ischemic heart disease. Less common indications are severe secondary ventricular dysfunction due to valvular heart disease, restrictive and hypertrophic cardiomyopathy, refractory angina, and hemodynamically compromising ventricular arrhythmias not amenable to conventional treatment. Overall, the 1-year survival rate is about 85%; after the steep fall in survival during the first 6 months, survival then decreases at a roughly linear rate (approximately 3%–4% per year). During the early postoperative period, graft failure, multiorgan failure, and infection account for most deaths; thereafter, cardiac allograft vasculopathy and malignancy are the major causes of death.5 Other factors, such as the etiology of the heart failure leading to transplantation, have also been found to significantly influence survival. Recent International Society for Heart and Lung Transplantation registry data have shown that patients with nonischemic cardiomyopathy who undergo transplantation have the best survival, followed by patients with ischemic cardiomyopathy, whereas those with congenital heart disease or valvular cardiomyopathy and those in need of retransplantation have decreased survival compared with the former 2 groups.5 Despite changes in immunosuppressive therapy and the modification of risk factors (eg, hypertension, obesity, hyperlipidemia, and diabetes), transplant-related morbidity and mortality remain high.
There is increasing evidence that HR reduction in patients with coronary artery disease, heart failure, and hypertension significantly improves cardiovascular outcome, but its role in heart transplant patients is not yet established. The aim of this article was to review the literature to assess whether HR reduction in heart transplant (HTX) recipients should be considered.
Pathophysiology and Consequences of Elevated HR After Heart Transplantation
The intact heart is richly innervated by parasympathetic and sympathetic fibers of the autonomic nervous system. Transection of these autonomic fibers alters the electrophysiologic properties of the transplanted heart. With parasympathetic denervation, suppression of sinoatrial (SA) node automaticity is lost, leading to a persistent increase in resting HR.6 Sympathetic denervation, on the other hand, delays exercise- or stress-induced augmentation of SA node automaticity, resulting in a diminished maximum HR response to exercise.6 The natural history of HTX recipients involves gradual autonomic reinnervation of the cardiac allograft, with improved HR response to exercise and decreased HR at rest.7,8 Although sympathetic reinnervation occurs in more than 50% of adult HTX recipients, parasympathetic reinnervation occurs in <5% of patients.9,10 Studies have shown that prolonged elevation of sympathetic activity and increased adrenergic activation, accompanied by parasympathetic withdrawal, is associated with myocyte apoptosis, pathologic remodeling, and dysregulation of calcium handling, leading to myocardial ischemia, a decrement in contractile function, and an increased risk of sudden cardiac death.11,12
In the denervated heart, other factors such as leptin may also play a significant role. Leptin, the protein product of the ob gene, has been linked to increased HR, blood pressure,13 and cardiovascular risk.14 An animal study showed that leptin has a specific effect on T-lymphocyte responses, differentially regulating the proliferation of naive and memory T cells.15 In that study, leptin increased Th1 and suppressed Th2 cytokine production. Winnicki et al assessed the relationship between plasma leptin levels and HR in 32 HTX recipients.16 Endomyocardial biopsy, coronary angiography, and ventriculography were used to exclude acute rejection, significant cardiac allograft vasculopathy (CAV), and impaired systolic function within 7 days before the study. Blood samples were collected from each patient while fasting. The results showed that HR was related to leptin levels and drug effects. In a multivariate analysis, HR was independently and positively associated with leptin levels. The authors speculated that this association may be due to a direct effect of leptin on HR mediated through cardiac leptin receptors. An alternative explanation was that leptin levels may be a surrogate for 1 or more substances that influence HR.
Shear stress (the tangential force exerted on the endothelial surface by the friction of flowing blood) is significantly influenced by HR (Figure 1). The low shear stress associated with faster heart rates stimulates specific mechanosensors located on the surface of endothelial cells, leading to the upregulation of several proatherogenic genes.17 As a result, low shear stress promotes functional changes such as increased low-density lipoprotein (LDL) uptake,18 reduced nitric oxide synthesis and increased degradation,19 expression of adhesion molecules, chemoattractants (eg, monocyte chemoattractant protein-1), and cytokines (tumor necrosis factor-α and interleukin-6),20 as well as structural changes to the endothelial cytoskeleton.21 These changes, together with endothelial cell apoptosis (also promoted by low shear stress), make the endothelium more permeable to circulating inflammatory cells such as monocytes, T cells, and mast cells, and to LDL.22 Tumor necrosis factor-α exerts a major effect on T-cell function after cardiac transplantation and plays an important role in initiating and orchestrating the rejection response; this may therefore have important pathophysiological relevance to graft outcome.
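The qualitative link between flow and endothelial shear can be anchored with 2 standard hemodynamic quantities. These are textbook idealizations (steady, laminar flow in a cylindrical vessel) rather than results from the cited studies, and the symbols are generic:

```latex
% Mean wall shear stress under idealized Poiseuille flow:
%   \tau_w : wall shear stress, \mu : blood viscosity,
%   Q : volumetric flow rate, r : vessel lumen radius
\tau_w = \frac{4\,\mu\,Q}{\pi r^{3}}

% Oscillatory shear index over one cardiac cycle of period T,
% quantifying how strongly the instantaneous shear \tau(t)
% reverses direction (0 = unidirectional, 0.5 = fully oscillatory):
\mathrm{OSI} = \frac{1}{2}\left(1 - \frac{\left|\int_{0}^{T}\tau(t)\,dt\right|}{\int_{0}^{T}\left|\tau(t)\right|\,dt}\right)
```

Regions where the time-averaged shear is low and the oscillatory shear index is high are those classically described as atherosclerosis prone; a persistently faster HR shortens the cycle period T and exposes the endothelium to more oscillatory cycles per unit time, which is 1 plausible route to the proatherogenic endothelial phenotype described above.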
Correlation of Heart Rate With Clinical Outcomes After Cardiac Transplantation
The prognostic importance of HR reduction in HTX recipients has been examined in several small observational studies (Table 1). In a retrospective analysis of 78 patients, Anand et al assessed whether HR predicted survival after heart transplantation.23 Results from this study showed that patients with HR >90 bpm were much more likely to die early than those with HR ≤90 bpm (hazard ratio: 2.8; 95% confidence interval: 1.5-5.1; P < 0.0013). They also found that patients with a net increase in HR over time were 4.7 times more likely to die prematurely than those whose HR did not change or decreased over time. It was concluded that HR ≤90 bpm conferred a significant long-term survival benefit. In another study of 104 patients who had survived at least 3 months after heart transplantation, Scott et al found higher mortality rates in patients whose mean HR was greater than the 95th percentile.24 Yet, it is unclear from these studies how faster HR contributed to decreased survival in HTX recipients. One suggested mechanism is that tachycardia is simply a reflection of poor myocardial function, which in turn is the primary trigger for increased mortality. Another is that tachycardia might reflect higher circulating catecholamines, which carry an elevated risk of arrhythmogenesis, adverse cardiac remodeling, and hypertension. Interestingly, studies that have analyzed the effects of HR on CAV have not shown any obvious adverse findings. CAV is a rapidly progressive form of atherosclerosis mediated by immunologic and nonimmunologic mechanisms related to the recipient or the allograft.25 Although a large number of epidemiological studies have shown that increased resting HR is a predictor for the development of native coronary atherosclerosis and overall cardiovascular mortality in the general population, conflicting data have been observed so far in HTX recipients.
Ambrosi et al26 assessed 143 patients who underwent heart transplantation between 1985 and 2006 and survived at least 2 years after transplantation. Patients who were on HR-lowering medications, who had acute graft rejection, and/or who had acute infectious disease at the time were excluded. Fifty-six patients had coronary lesions and 87 had angiographically normal coronaries during a mean follow-up of 9.5 years. No significant difference in mean basal HR was observed between these 2 groups (96.4/min vs 98.3/min, P = 0.34). The authors concluded that the findings from this series did not support an influence of HR on CAV in heart transplant recipients. One possible explanation for this finding is that CAV is a very specific type of coronary disease, mainly related to immunological processes, and may not be influenced by HR.27 Gullestad and his colleagues from Stanford University analyzed intracoronary ultrasound (ICUS) examinations in 130 HTX recipients at an annual evaluation, on average 3.0 years after transplantation.28 They found that the presence of CAV, defined by a mean coronary artery intimal thickness (MIT) >0.3 mm, was more prevalent in patients with lower, rather than higher, HR (46% vs 32%). The authors hypothesized that CAV itself might be the cause of sinus node dysfunction, and that the lower HR could have been the consequence rather than the cause of CAV development. In a recently published article, Olmetti et al29 assessed the relationship between HR and CAV in approximately 250 patients over a median follow-up of 96 months. On multivariate analysis, donor age, chronic renal failure, and left ventricular end-diastolic wall thickness were found to be significant predictors of CAV. Surprisingly, a mean HR ≥90 bpm was associated with a 45% relative reduction in the risk of CAV. The authors proposed a different explanation, linking sinus node dysfunction to donor age.
In a denervated heart, sinus HR is not modulated by autonomic control and simply reflects the intrinsic HR, which in turn is age related and decreases with advancing age.30
Table 1. Studies Correlating Heart Rate and Clinical Outcomes After Cardiac Transplantation

| Study | Study Focus | Main Finding |
| --- | --- | --- |
| Anand et al23 | HR and survival | Patients with an HR >90 bpm within the first 3 months after HTX were 2.8 times more likely to die than patients with an HR ≤90 bpm. |
| Scott et al, 199324 | HR and late mortality | An inappropriately high resting HR in long-term survivors of cardiac transplantation is an adverse prognostic sign. |
| Ambrosi et al, 201026 | HR and CAV | No significant difference in mean basal HR between patients with coronary lesions and those with normal coronaries; the series did not support a prognostic influence of HR on CAV. |
| Gullestad et al, 199728 (follow-up 3.7 ± 3.0 years) | HR and CAV | CAV was more prevalent in patients with lower rather than higher HR. |
| Olmetti et al, 201129 | HR and CAV | HR <90 bpm, but not ≥90 bpm, was significantly associated with CAV development; sinus tachycardia in a denervated heart was not a risk factor for coronary atherosclerosis. |
Experience of Heart Rate Lowering Pharmacotherapy After Heart Transplantation
Numerous studies have shown improved clinical outcomes with HR-lowering agents (β-blockers, nondihydropyridine calcium antagonists, and If channel blockers) in patients with cardiovascular diseases, including heart failure, angina, and myocardial infarction, but the effect of these medications on the transplanted heart is not well studied due to concerns over the hemodynamic consequences in the denervated heart (Table 2).
Table 2. An Overview of HR-Lowering Pharmacotherapy After Cardiac Transplantation

| Study | n | Main Finding |
| --- | --- | --- |
| Prospective study31 | n = 35 (plus 5 healthy controls) | Acute β-adrenergic blockade accentuates impairment in ventricular performance and appears to be detrimental in HTX recipients. |
| Bexton et al, 198332 | n = 6 | β-Blockade reduces exercise capability in HTX recipients. |
| Kushwaha et al, 198433 | n = 10 | β-Blockade adversely affects exercise tolerance and the cardiovascular response to exercise in HTX recipients. |
| Hall et al, 199534 | n = 26 | In heart failure patients, β-blockade may cause mild systolic impairment initially; improvement in systolic performance does not occur until after 1 month of therapy. |
| Gardner et al, 200235 | n = 1 | β-Blockade improved symptoms and graft function in an HTX recipient with idiopathic left ventricular systolic dysfunction over an 8-month period. |
| Schroeder et al, 199336 | n = 106 (52 received diltiazem, 54 did not) | Diltiazem prevents or slows the decline in coronary artery diameter during the first year after heart transplantation. |
| Delgado et al, 200337 | n = 112 | Diltiazem administration and a cyclosporine level >362 ng/mL in the first month after heart transplantation reduced acute rejection during the first year. |
| Doesch et al, 200938 | n = 30 | Ivabradine reduced HR effectively and caused a significant reduction in left ventricular mass index. |
| Zwicker et al, 201039 | n = 1 | Increasing doses of ivabradine, in contrast to a short-acting β-blocker, controlled HR and supported recovery from cardiogenic shock in an HTX recipient with tachycardia-induced cardiomyopathy. |
| Doesch et al, 200740 | n = 25 | Ivabradine lowered HR effectively and was better tolerated than β-blocker therapy. |
β-Blockers
β-Blockers remain the cornerstone of treatment in heart failure and ischemic heart disease, but detrimental results have been observed in HTX recipients. In 1 prospective study, the response to exercise was assessed before and after β-blockade in 35 clinically stable HTX recipients and 5 healthy subjects.31 After β-blockade, lower maximal HR, ejection fraction, and cardiac index were observed during exercise in both groups. The peak exercise cardiac index was 42% lower in transplant recipients than in control subjects after β-blockade. The authors concluded that acute β-adrenergic blockade accentuates the impairment in ventricular performance and appears to be detrimental in these patients. Similar results were observed in a number of small studies involving transplanted hearts with normal systolic function, in which propranolol led to reduced exercise tolerance, a blunted cardiovascular response, and increased circulating levels of catecholamines.32,33 We should recognize, however, that these studies evaluated only the short-term effects of β-blockade on the transplanted heart.
The effects of longer-term exposure remain relatively undetermined. It is well known, for instance, that β-blockade in the setting of heart failure initially results in significant impairment of cardiac performance before recovery and an eventual increase in performance. In 1995, Hall et al34 analyzed the time course of improvement in left ventricular function and the long-term effects of metoprolol on left ventricular mass and geometry in patients with dilated cardiomyopathy. Patients underwent serial echocardiography at baseline, at day 1, and at months 1 and 3 of drug therapy. Metoprolol was titrated weekly to a maximum dosage of 50 mg twice daily over a period of 1 month. The results suggested that improvement in systolic performance does not occur until after 1 month of metoprolol therapy and that function may be mildly reduced initially. Long-term treatment resulted in left ventricular mass regression and restoration of ventricular geometry by 18 months. Interestingly, Gardner et al35 reported on a 49-year-old male who presented 5.5 years after heart transplantation with heart failure symptoms due to left ventricular systolic dysfunction. CAV was excluded by coronary angiography. He was successfully treated with gradual titration of carvedilol, which resulted in significant improvements in both symptoms and graft function over an 8-month period.
Calcium Channel Antagonists
The usefulness of diltiazem in HTX recipients has been evaluated in several studies. In 1986, Schroeder and his colleagues from Stanford assessed the efficacy of diltiazem in preventing CAV.36 Fifty-two patients were randomly assigned to receive diltiazem and 54 to receive no calcium channel blocker 10 to 14 days after transplantation. The extent of coronary artery disease was analyzed by coronary angiography performed early after transplantation and annually thereafter. The average coronary artery diameter decreased significantly in the group that did not receive diltiazem, from 2.41 ± 0.27 mm at baseline to 2.19 ± 0.28 mm at 1 year and 2.22 ± 0.26 mm at 2 years (P < 0.001). The average diameter in the diltiazem group changed little from baseline to the 1- and 2-year follow-up. Death due to coronary artery disease or retransplantation occurred in 5 patients in the group that did not receive a calcium channel blocker compared with none of those who received diltiazem. The mean oral dose of cyclosporine was significantly lower in the diltiazem group because of the known tendency of diltiazem to elevate blood levels of cyclosporine; the cyclosporine blood levels, however, were similar in the 2 groups. The results of this study suggested that diltiazem prevents or slows the decline in coronary artery diameter during the first year after heart transplantation. The mechanism by which diltiazem attenuates coronary artery luminal narrowing remains unclear. No significant difference in blood pressure between the 2 groups was identified. Interestingly, the investigators did not appear to evaluate HR or change in HR between the 2 groups to determine whether this was a significant factor. We speculate, given the known HR-lowering properties of diltiazem, that HR was lowered in the active study group.
In another study, clinical factors associated with acute rejection in the first year after heart transplantation were analyzed in 112 HTX recipients.37 The results showed that diltiazem administration and the cyclosporine level in the first month after transplantation were both independently associated with the absence of acute rejection, each exerting a large and independent protective effect. Again, this study did not include HR in the risk assessment; although it is therefore impossible to say whether any benefit was due to an HR-lowering effect, it is tempting to speculate that this might be 1 of the potential mechanisms.
If Channel Antagonist
Ivabradine is a selective HR-lowering agent that does not appear to exert any inotropic or antihypertensive action. Doesch et al38 analyzed the effects of ivabradine on HR control and left ventricular mass, as well as its safety, in 30 HTX recipients over a period of 12 months. Mean HR was determined using standard Holter recordings, and left ventricular dimensions were measured by transthoracic echocardiography. In 3 patients, ivabradine had to be discontinued due to nausea (n = 2) or patient preference (n = 1). During follow-up, mean HR was reduced from 96.2 ± 8.6 bpm to 80.9 ± 8.1 bpm (P < 0.0001). Interestingly, a statistically significant reduction in left ventricular mass index was also identified (104.3 ± 22.7 g at baseline vs 95.9 ± 18.5 g at follow-up). The authors concluded that ivabradine may offer a beneficial effect on left ventricular remodeling in HTX recipients. Zwicker et al39 extended these findings in a 37-year-old HTX recipient who developed cardiogenic shock due to tachycardia-induced cardiomyopathy. Administration of the short-acting β-blocker esmolol further aggravated her low-output heart failure. In contrast, HR reduction with increasing doses of ivabradine resulted in recovery from cardiogenic shock as well as improvement in the patient's clinical condition, left ventricular dimensions, and function.
In another study involving 25 HTX recipients,40 HR control, tolerability, short-term safety, and effects on exercise capacity were assessed first with the β-blocker metoprolol succinate and subsequently with ivabradine. Drug discontinuation because of side effects occurred in 5 patients (metoprolol: 4; ivabradine: 1). Mean HR was reduced from baseline (96.5 ± 7.0 bpm) to 84.4 ± 8.8 bpm on the β-blocker (P = 0.0004 vs baseline) and to 76.2 ± 8.9 bpm with ivabradine (P = 0.0001 vs baseline and P = 0.003 vs β-blocker). HR reduction with ivabradine was effective and potentially better tolerated than β-blocker therapy in heart transplant recipients.
HR is being increasingly recognized across different disease boundaries as an important and independent predictor of outcome. There is no clear consensus about what the normal range of HR should be after heart transplantation. Questions remain about the implications of the raised HR that arises after transplantation due to denervation, especially given the complex pathophysiology of disease in this setting. Correlative studies to date are limited and have reached mixed conclusions.
Experience with drugs that lower HR is also limited, but HR lowering looks promising as a potential therapeutic target. There are many reasons to suspect that elevated HR has a detrimental effect, based on experimental evidence from both transplant and nontransplant studies. Overall, the available evidence suggests that HR, and its modulation after transplantation, should be explored further, with the hope that this may improve outcomes. Given the small number of transplantations performed annually in individual transplant centers and the scarcity of prognostic data, a large multicenter study examining the long-term effects of lowering HR with various therapeutic agents is warranted.