Increasingly, health care providers are seeking advantages through integration (mergers). Although market power resulting from “horizontal” mergers between hospitals or insurance companies has received considerable attention from the media (Kowalczyk and Weisman 2012; Pearlstein 2012), health economists (Capps et al. 2002), and antitrust regulators at the federal (Federal Trade Commission and Department of Justice 1996) and state (Coakley 2010) levels, mergers between insurance companies and hospitals have received less scrutiny. However, such plan–provider or “vertical” integration is justified by organizations on the same cost efficiency and quality improvement grounds as horizontal integration of hospitals or insurers.1 Both types of integration raise concerns about excessive market power and consumer welfare.
In this study, we focus on plan–provider integration in the Medicare Advantage (MA) market. This is both convenient and relevant to policy. The data necessary for a study of this kind are publicly available. The same cannot be said of the commercial health insurance market. In addition, integration in the MA market is incentivized by provisions of the 2010 Patient Protection and Affordable Care Act (Public Law 111–148) (hereafter, ACA) and subsequent regulation. Through bonus payments for quality improvement and cost reduction, the ACA encourages the formation of accountable care organizations (ACOs), networks of providers responsible for the care of a defined group of Medicare patients (Frakt and Mayes 2012). ACOs give providers an incentive to consolidate the spectrum of care under one management because bonus payments will be tied to performance on quality measures and a spending target based on the difference between a benchmark and all Medicare spending attributed to beneficiaries associated with the ACO—even when incurred for services provided outside the ACO. In addition, some ACO contracts put providers at financial risk if their costs are above a benchmark. Consequently, ACOs with risk management capabilities will be better positioned to succeed (Fuchs and Schaeffer 2012). Providers can develop these capabilities internally or acquire them by merging with an insurer. Finally, the ACA offers quality bonus payments to MA plans (Jacobson et al. 2011). To the extent that higher quality can be achieved through plan–provider integration, this is another incentive to integrate.
Although our focus is on MA, our work is relevant to the market for commercial insurance plans and providers, where plan–provider integration may be just as common, if not more so. According to our analysis, about 17 percent of MA plans are integrated. Rabin (2012) reports on an industry survey that found that 20 percent of hospital networks offer an insurance product and an equal proportion are considering doing so. Although we make no claims about the generality of our findings beyond Medicare, this suggests that our study is in the context of an integration trend that is considerably broader than the market we examine.
Although plan–provider integration is occurring and encouraged by policy, to our knowledge there have been no studies of its relation to quality and premiums. Our study begins to fill this void. Using data before ACA passage, we find that integrated plan–providers charge higher premiums, controlling for quality. Such plans also have higher quality ratings. We also find no evidence that integrated plans offer more generous benefits. In the concluding discussion, we speculate on what these results might mean for consumers and policy makers.
Data and Methods
We constructed an analytic file for 2009 from publicly available plan- and county-level data, with one exception noted below. We selected this year because it predates any possible anticipation by plans and providers of the changes the ACA, passed in 2010, made to MA plan payments and the health care landscape. In particular, the ACA encourages the formation of ACOs, froze plan payments in 2011, adjusts the payment formula in subsequent years to lower payments and to reduce the geographic variation in the difference between plan payments and Medicare fee-for-service (FFS) cost, and incorporates plan quality ratings ("stars") into the plan payment formula. Some of these changes may have been anticipated by plans as early as 2010, and integrated firms might have responded to quality bonuses more effectively than nonintegrated ones. Our aim was to assess integration without the complication of the bonus program.
To create the analytic file, we began with all coordinated care plans (CCPs) in U.S. states and the District of Columbia. We then excluded nondrug MA plans because they are qualitatively different from plans that offer drug benefits (hereafter, MA-PD plans) and because premium data do not allow us to credibly separate premiums into drug and nondrug components. Although CMS provides both drug and nondrug premiums for MA-PD plans, plans likely subsidize nondrug benefits with drug premium revenue; as a result, neither the reported drug premium nor the nondrug premium is an unambiguous reflection of the true premium. We also excluded special needs plans because they serve different populations than other MA plans (typically, Medicaid-Medicare dual eligibles and institutionalized beneficiaries) and are, effectively, in a different market. The final analytic file contains drug-offering CCPs that are not special needs plans and that have complete data on all variables described below.
We control for market factors related to level of government payment to plans, costs, and demand in 2009. All such data originate from CMS's administrative files at the plan–county level unless otherwise indicated. From CMS's Medicare Advantage Rates & Statistics webpage2 we obtained the county-level benchmark payment rate; FFS cost data; and 2006 diagnosis-based risk scores. We aggregated 2007 MA enrollment data to the firm level to compute the historical HHI in each county.3 From a large insurer, we obtained county-level Medigap premiums for 2005 (our only source of nonpublic data).4 To all these data, we merged other county-level, historical cost and demand correlates from the Area Resource File (ARF).5 Plan service areas and MA product premiums are from the Drug and Health Plan Data and Plans Information by County. Contract-level star quality ratings are from the Plan Ratings Data.6
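The county-level HHI is the sum of squared firm market shares within a county. A minimal sketch of this computation (the county names, firm labels, and enrollment counts are illustrative, not from our data):

```python
import pandas as pd

# Hypothetical 2007 county-level MA enrollment by firm (values illustrative)
enroll = pd.DataFrame({
    "county": ["A", "A", "A", "B", "B"],
    "firm":   ["f1", "f2", "f3", "f1", "f2"],
    "enrollment": [5000, 3000, 2000, 7000, 3000],
})

# Each firm's share of county enrollment
share = enroll["enrollment"] / enroll.groupby("county")["enrollment"].transform("sum")

# HHI per county: sum of squared shares (0-1 scale, matching Table 1)
hhi = (share ** 2).groupby(enroll["county"]).sum()
print(hhi)  # county A: 0.5^2 + 0.3^2 + 0.2^2 = 0.38; county B: 0.7^2 + 0.3^2 = 0.58
```

Note the 0-1 scale: a county HHI of 0.33 (the sample mean in Table 1) corresponds to 3,300 on the 0-10,000 scale sometimes used by antitrust regulators.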
As Song, Landrum, and Chernew (2012) articulated, plans are offered across multicounty service areas. Therefore, it is appropriate to aggregate market factors across the counties in which plans operate. We aggregated all county-level market variables into plan-level variables through enrollment weighting. That is, the value of a market variable assigned to a plan is the enrollment-weighted value across all the counties in which that plan operates. Plan characteristics (e.g., premium, benefits, and star quality ratings) are constant across counties in the service area, so weighting was not required for these variables. Our final unit of analysis is the plan.
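The enrollment weighting described above can be sketched as follows; the plan identifiers, counties, and the benchmark values are hypothetical stand-ins for any county-level market variable:

```python
import pandas as pd

# Hypothetical plan-county rows: one county-level market variable (benchmark)
pc = pd.DataFrame({
    "plan":   ["p1", "p1", "p2"],
    "county": ["A", "B", "A"],
    "enrollment": [1000, 3000, 500],
    "benchmark": [800.0, 900.0, 800.0],
})

# Enrollment-weighted average of the county variable within each plan's service area
num = (pc["benchmark"] * pc["enrollment"]).groupby(pc["plan"]).sum()
den = pc.groupby("plan")["enrollment"].sum()
plan_level = num / den
print(plan_level)  # p1: (800*1000 + 900*3000)/4000 = 875.0; p2: 800.0
```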
The plan “star” quality data consist of quality ratings at various levels of aggregation, derived by CMS from four sources:
- HEDIS (the Healthcare Effectiveness Data and Information Set), developed and maintained by the National Committee for Quality Assurance, measures health care process and intermediate-outcome quality;
- CAHPS (the Consumer Assessment of Healthcare Providers and Systems), an initiative of the Agency for Healthcare Research and Quality, measures patients’ experiences or consumer satisfaction with their health plans (e.g., customer service and getting needed care quickly);
- The Health Outcomes Survey, a CMS survey of self-reported outcomes;
- Other CMS administrative sources (Jacobson et al. 2011; MedPAC 2012b).
To obtain the broadest possible measure of quality, we used the two CMS-provided summary scores reflecting prescription drug plan and health plan quality. These are on a five-point rating scale with 1 star the lowest and 5 stars the highest. We added these two measures together to construct a single, 2–10 stars summary quality score. We imputed missing summary quality scores from a model based on finer level star ratings provided in the Plan Ratings Data.
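The construction of the 2–10 summary score amounts to adding the two CMS summary ratings; a minimal sketch (star values are illustrative, and the imputation model for missing scores is not shown):

```python
import numpy as np
import pandas as pd

# Hypothetical contract-level CMS summary ratings, each on a 1-5 star scale
stars = pd.DataFrame({
    "drug_plan_stars":   [3.5, 4.0, np.nan],  # prescription drug plan summary
    "health_plan_stars": [3.0, 4.5, 3.5],     # health plan summary
})

# Combined 2-10 summary score; rows missing either component remain missing
# (in the paper, these are imputed from finer-level star ratings)
stars["summary"] = stars["drug_plan_stars"] + stars["health_plan_stars"]
print(stars["summary"].tolist())  # [6.5, 8.5, nan]
```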
By reviewing plans’ websites and governing documents, we determined which plan-offering firms had vertically integrated with a hospital or provider group. “Integration” means that the provider and plan are owned by the same firm. We coded plans associated with such firms as “integrated.” All other plans are “not integrated.” We randomly sampled about 30 percent of plans coded as integrated and verified from online sources that they were integrated prior to our study year. Table 1 provides summary statistics for our variables. The appendix provides additional detail on how we ascertained integration status and a table of means by integration status.
Table 1. Variable Definitions and Univariate Statistics
| Variable | Description | Mean (SD) | [Min, Max] | Source |
| --- | --- | --- | --- | --- |
| Premium | Monthly premium, 2009 (dollars) | 50.57 (67.14) | [0, 511] | CMS |
| Quality | Star rating, reported in 2009 based on data from prior years | 6.39 (0.94) | [4.30, 8.99] | CMS |
| *Key independent variable* | | | | |
| Integrated firm | Indicator of vertical integration, 2009 | 0.17 (0.37) | [0, 1] | Web |
| Prop. integrated | Proportion of other firms integrated, 2009 | 0.037 (0.054) | [0, 0.21] | Web |
| HHI | MA firm Herfindahl–Hirschman index, 2007 | 0.33 (0.12) | [0.11, 0.79] | CMS |
| MA enrollment | Thousands of enrollees in MA plans, 2009 | 24.34 (30.89) | [0.17, 126.13] | CMS |
| Benchmark | Benchmark payment rate, 2009 (dollars) | 864.20 (99.28) | [740.82, 1,237.61] | CMS |
| FFS cost | Monthly average FFS cost, 2009 (dollars) | 743.18 (120.56) | [540.74, 1,213.25] | CMS |
| Prop. elderly 75+ | Proportion of elderly 75+ years old, 2000 | 0.47 (0.029) | [0.33, 0.55] | ARF |
| Docs. per capita | General practitioners per capita, 2006 | 0.00028 (0.00010) | [0.000035, 0.00066] | ARF |
| Beds per capita | Hospital beds per capita, 2005 | 0.0033 (0.0013) | [0.00076, 0.011] | ARF |
| Rural county | Rural county, 2003 | 0.014 (0.060) | [0, 1] | ARF |
| Urban county | Urban county, 2003 | 0.88 (0.23) | [0, 1] | ARF |
| Rx Medigap prem. | Monthly drug Medigap premium, 2005 (dollars) | 250.14 (43.03) | [187.66, 465.50] | –* |
| Non-Rx Medigap prem. | Monthly nondrug Medigap premium, 2005 (dollars) | 148.70 (23.47) | [102.95, 263.00] | –* |
| Risk score | Aged/disabled risk score, 2006 | 1.02 (0.091) | [0.84, 1.34] | CMS |
| Prop. elderly in poverty | Proportion elderly in poverty, 1999 | 0.094 (0.036) | [0.036, 0.24] | ARF |
| Per capita income | Per capita income in thousands, 2005 | 33.93 (7.38) | [16.91, 93.37] | ARF |
| Prop. HS diploma | Proportion of population age 25+ with a high school diploma, 2000 | 0.79 (0.056) | [0.52, 0.91] | ARF |
| Prop. 4+ years col. | Proportion of population age 25+ with 4+ years of college, 2000 | 0.23 (0.065) | [0.089, 0.49] | ARF |
| Prop. manufacturing | Proportion of workers in manufacturing, 2000 | 0.12 (0.053) | [0.015, 0.28] | ARF |
| Prop. white collar | Proportion of white-collar workers, 2000 | 0.59 (0.058) | [0.43, 0.79] | ARF |
| Prop. const. | Proportion of workers in construction, 2000 | 0.070 (0.015) | [0.017, 0.12] | ARF |
Based on these data sources, in 2009 there were 1,047 non–special needs, non-PFFS (private fee-for-service), drug-offering CCPs in the U.S. states and the District of Columbia. Of these, 910 had sufficient star quality data to be included in our models. The appendix includes a table comparing means for observations with and without star quality data. Although the means are statistically significantly different for many variables, this is not unexpected: there is a several-year lag between data collection and star rating reporting, and contracts that are too new can lack star data. Also, contracts for plans with enrollment below 1,000 are not required to submit HEDIS data to CMS. Consequently, our results do not necessarily apply to the newest or smallest plans.
We estimated two ordinary least-squares models with firm random effects (because a given firm may offer multiple plans) and state fixed effects:
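In schematic form, the two models can be written as follows (the notation is our own; the source's exact display equations are not reproduced here):

```latex
\text{premium}_{pf} = \alpha_1 + \beta_1\,\text{quality}_{pf} + \beta_2\,\text{integrated}_f
  + \mathbf{X}_p'\boldsymbol{\beta}_3 + \sum_{s}\gamma_{1s}\,\text{state}_{ps} + u_f + \varepsilon_{1pf} \quad (1)

\text{quality}_{pf} = \alpha_2 + \beta_4\,\text{integrated}_f
  + \mathbf{X}_p'\boldsymbol{\beta}_5 + \sum_{s}\gamma_{2s}\,\text{state}_{ps} + v_f + \varepsilon_{2pf} \quad (2)
```

where p indexes plans and f firms, X_p stacks the market structure, benchmark, cost, and demand variables, u_f and v_f are firm random effects, and state_ps equals one if plan p operates in state s.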
α, β, and γ are coefficients to be estimated, where γ is the coefficient on the state fixed effect (indexed by state s); the error terms ε are assumed to be uncorrelated with each other and with the independent variables. The value of a state fixed effect for a given plan is one if the plan offers services anywhere in that state and zero otherwise; therefore, multiple state fixed effects can be associated with a single plan. Market structure, cost, and demand are vectors; the independent variables that compose them, as well as all others, are listed in Table 1. Premium is the beneficiary monthly out-of-pocket premium. Quality is the 10-point star rating described above. Integrated firm is an indicator that the firm with which the plan is associated is integrated with a provider. Market structure includes the proportion integrated (the unweighted proportion of MA plan-offering firms in the county that are integrated with a provider, not including the plan in question), lagged HHI, and MA enrollment in all plans in the market. Benchmark is the MA benchmark payment rate. Cost includes variables correlated with MA plan cost (Frakt, Pizer, and Feldman 2012): average Medicare FFS cost, the proportion of the elderly age 75 years or older, doctors and hospital beds per capita, urban/rural indicators, Medigap premiums, and a diagnosis-based risk score that measures the average health status of the Medicare population in the market (Pope et al. 2004). Demand includes per capita income and the proportions of the population who are elderly, in poverty, have a high school diploma, have four or more years of college, and work in manufacturing, construction, or white-collar jobs. These labor force variables are significant predictors of Medicare plan entry (Cawley, Chernew, and McLaughlin 2005; Pizer, Feldman, and Frakt 2005).
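As a rough sketch of the estimation on simulated data (all names and magnitudes below are illustrative, and this simplified version omits the firm random effects, which in practice require a mixed-effects estimator):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 300

# Simulated plan-level regressors (illustrative only)
quality = rng.normal(6.4, 0.9, n)                  # 2-10 summary stars
integrated = (rng.random(n) < 0.17).astype(float)  # plan-provider integration flag
state_a = (rng.random(n) < 0.5).astype(float)      # 1 if plan operates in state A
state_b = (rng.random(n) < 0.3).astype(float)      # 1 if plan operates in state B

# Premiums generated with a known $28/month integration effect
premium = 20 + 8 * quality + 28 * integrated + 5 * state_a + rng.normal(0, 10, n)

# OLS: premium ~ quality + integrated + state indicators
X = np.column_stack([np.ones(n), quality, integrated, state_a, state_b])
beta, *_ = np.linalg.lstsq(X, premium, rcond=None)
print(round(beta[2], 1))  # coefficient on integration, close to the true 28
```

Because a plan can operate in several states, the state indicators enter as separate 0/1 regressors rather than a single categorical variable.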
Discussion
We examined the relations between plan–provider integration, premiums, and quality in the MA market. We found that integration is associated with an increase of $28 per month in premiums and a monetized increase of just under $8 per month in quality.7 Consequently, about 70 percent of the total premium difference between integrated and nonintegrated plans is not attributable to quality. Some or all of this premium increase could be associated with benefit enhancements by plans, but we did not observe a statistically significant increase in benefit generosity with integration among the subset of benefit variables we examined. An alternative possibility is that the higher premiums for integrated plans reflect greater market power commanded by those plans (whether that market power is a consequence of integration or a cause of it). Because we did not examine all possible benefits, we cannot completely distinguish between these two possibilities; that is a natural focus for future investigation.
Our study has several limitations. First, and most important, we did not estimate causal relationships between integration and premium and quality (or benefits). It is possible that integration causes increased premiums and quality, but it is also possible that causality runs the other way. Consider, for example, if nonintegrated insurers underprovide access to higher quality hospital services, perhaps because consumers do not value them or because of insurer monopoly pricing. In this case, a higher quality, monopoly hospital has an incentive to integrate to address underappreciation of quality and underutilization caused by monopoly pricing of hospital services. For these reasons, plan–provider integration may be more common among higher quality providers than lower quality ones. The only study we could find addressing the causal effect of quality on integration (Fernández-Olmos, Rosell-Martínez, and Espitia-Escuer 2009) found that wineries producing higher quality wines were more likely to vertically integrate grape growing with wine production than those producing lower quality wines. Consequently, our coefficient estimate of integration in the quality Equation (2) may be biased upward under a different causal interpretation. If so, integration would induce a smaller quality increase than suggested by our estimate. If one also interprets Equation (1) causally, the implication is that integration causes premiums to be more than 70 percent higher than warranted by the quality increase alone. However, we stress that this interpretation requires assumptions we are not articulating or defending here.
A second limitation is that we examined only the Medicare market. Although integration is occurring outside that market, and perhaps to a greater extent, we urge caution in generalizing the findings. Third, our results do not reflect plans too new or under contracts too small (in number of enrollees) to have star quality ratings. No star rating data were available for such plans, so they were excluded from our sample. Fourth, to establish a baseline, we deliberately studied a period before the changes to the MA program and the health care landscape brought about by the ACA. Future work should examine the effects of the ACA on integration and hence on premiums and quality, contrasting them with our findings. Fifth, as shown in Table 2, many markets have no integrated plans. It is possible that such markets and the plans therein differ systematically from those with integrated plans. Our analysis assumes that our controls, which include state fixed effects, address all sources of that difference. Finally, as mentioned, a more thorough investigation of the relation between integration and benefits is warranted, but it is beyond the scope of this study.
Despite these limitations, our findings have some implications for policy makers. They demonstrate that plan–provider integration in the MA market is associated with substantially higher premiums. Policy makers considering promoting integration as a means to increase quality and reduce cost should be aware that recent experience does not support an expectation of lower cost. Experience under new initiatives, like accountable care organizations, may be different, especially if the new organizations are created to take advantage of shared savings opportunities. Nevertheless, our research suggests the potential for anticompetitive effects that may be challenging to manage through regulation.