Transforming Mental Healthcare in the Veterans Health Administration: A Model for Measuring Performance to Improve Access, Quality, and Outcomes

Authors


For more information on this article, contact Katherine E. Watkins at kwatkins@rand.org.

Abstract:

In this paper we present the conceptual framework and research design of a national evaluation of the quality of mental healthcare provided to veterans by the Veterans Health Administration, and report results on the reported availability of evidence-based practices. We used the Donabedian paradigm to design a longitudinal evaluation of the quality of mental healthcare. To evaluate the structure of care, we used a combination of survey and administrative data and designed a web-based facility survey to examine the availability and characteristics of 12 evidence-based practices and other mental health services. We identified 138 unique facilities that provided mental healthcare to 783,280 veterans. With the exception of opiate substitution therapies, every evidence-based practice was reported in at least one location in each service network. We used maps to estimate the maximum number of veterans who might benefit from expanding the availability of an evidence-based practice. We demonstrate the feasibility of overcoming several major challenges typically associated with measuring the quality of healthcare systems. This framework for evaluating mental healthcare delivery provides a model upon which other stakeholders can continue to build and expand.

Efforts to improve the quality of mental healthcare have gained momentum over the past decade, fueled by the Institute of Medicine's Quality Chasm report in 2001, and the subsequent adaptation of the chasm framework to improve the quality of care for mental health and addictive disorders (Institute of Medicine, 2001, 2005). Numerous studies have documented the discrepancies between mental healthcare that is known to be effective and what is actually delivered (Bauer, 2002; Kessler et al., 2005; Mechanic & Bilder, 2004; Moos, 2005; National Committee for Quality Assurance, 2007; Olfson et al., 2002; Richardson, Di Guiseppe, Christakis, McCauley, & Katon, 2004; Rushton, Fant, & Clark, 2004; Stein et al., 2004; Substance Abuse and Mental Health Services Administration, 2004; Watkins, Burnam, Kung, & Paddock, 2001) and the significant personal and economic consequences of poor care (Murray & Lopez, 1996; World Health Organization, 2001). Nationally, questions are being raised about access to and quality of mental healthcare, within both the public and private sectors, including the Veterans Health Administration (VHA) (Oliver, 2008a, 2008b). VHA spending has more than doubled in the past decade, rising from US$17 billion in 1996 to US$36 billion in 2007, consistent with the expanding veteran patient population (Oliver, 2008b). US policy makers and medical professionals are increasingly recognizing that quality mental healthcare is a key contributor to better, healthier lives for those with mental illness, and that setting measurable goals, promoting evidence-based processes of care, and monitoring performance play a vital role in improving healthcare delivery and, ultimately, patient outcomes (Bhatia & Fernandes, 2008; Bremer, Scholle, Keyser, Knox Houtsinger, & Pincus, 2008; Cully, Zimmer, Khan, & Petersen, 2008).

In 2006, the US Department of Veterans Affairs (VA) funded an independent evaluation of the quality of mental health services provided to veterans who receive care from the VHA. Authorized by the Government Performance and Results Act of 1993 and Title 38, which require federal agencies to independently evaluate important programs, the evaluation focuses on five mental health diagnoses: (1) schizophrenia; (2) bipolar disorder; (3) posttraumatic stress disorder (PTSD); (4) major depressive disorder; and (5) substance use disorder (SUD). The evaluation began in August 2006 and will continue through November 2010.

In 2004, responding to the US government's New Freedom Commission on Mental Health (2002), the VHA finalized a 5-year Mental Health Strategic Plan (MHSP), which emphasizes mental health as an important part of veterans' overall health. To support the overall initiative, the VA increased annual funding for mental health by US$1.4 billion between FY 2005 and FY 2008, including US$915 million in mental health enhancement initiatives related to MHSP implementation.

In this context, the study team was asked to examine how well the VA is translating the promise of improved mental healthcare into better, healthier lives. We designed a longitudinal evaluation of the structure, process, and outcomes of care for the VHA mental healthcare system using four different data sources: administrative data from FY 2004 to FY 2008; facility survey data from May 2007 to September 2009; medical record data from FY 2007 to FY 2008; and client survey data from November 2008 to July 2009.

In this paper we present the conceptual framework and research design of the overall evaluation. We also provide our methodology and baseline results for our evaluation of the VHA structure of care, with a particular focus on the implementation of evidence-based practices. Our purposes are (1) to offer an innovative model for assessing the performance of mental healthcare systems in the context of change; (2) to describe how the structural component of this model has been applied to the evaluation of the VHA mental healthcare system; and (3) to provide information about the structure of VHA mental health services at an early point in the system's transformation.

Conceptual Framework and Research Design

The evaluation was designed to assess the extent to which VHA was meeting its goal of maximizing the mental and social functioning of veterans by providing high-quality mental healthcare. Quality is demonstrated when a healthcare system is achieving or exceeding its overall goals in improving outcomes (Donabedian, 1966, 1980). Outcomes (e.g., symptoms, quality of life, functional status) are influenced by both the structure of care (e.g., staffing, hours of operation, provider workloads, availability of evidence-based practices) and the process of care (e.g., extent to which evidence-based practices are implemented, frequency, and timing of services). Although the goal of the healthcare system is to improve outcomes, there are significant challenges to using outcomes as indicators of quality: outcomes are determined by many factors, some of which are unrelated to healthcare, and they are more difficult to measure (Lilford, Brown, & Nicholl, 2007). To address these challenges, researchers have proposed using process measures to assess performance when there is clear evidence that desired health outcomes can be linked to a particular process of care (Berwick, 1995; McGlynn, 1998). The link between process and outcomes ensures that outcomes remain a priority without requiring an exclusive focus on outcome measurement. The linkage to process also provides the health system with information about why quality is poor and which specific administrative or clinical approaches improve performance.

Figure 1 illustrates the application of the Donabedian structure-process-outcomes model of quality to our evaluation of the VHA mental healthcare system. Structure is evaluated by describing the continuum of care available to veterans with the five targeted mental health diagnoses using two sources of information: survey data from VHA facilities at two points in time (May 2007 and October 2009), and administrative data between FY 2004 and FY 2008. Process is evaluated by looking at services received by veterans with the five targeted mental health diagnoses, using two sources of information: administrative data and medical record data from FY 2007 to FY 2008. Outcomes are evaluated by asking whether the services received affect veterans' functioning and quality of life, using two sources of information: client survey data collected between November 2008 and July 2009 and medical record data from FY 2007 to FY 2008. This paper relies on two sources of information to assess the structure of care: FY 2004–2006 administrative data and a facility survey fielded in May 2007.

Figure 1.

Application of Structure-Process-Outcomes Model to Evaluation of the Veterans Health Administration Mental Health Care System

Veterans included in the study had at least one inpatient episode or outpatient encounter in FY 2006 with a primary or secondary diagnosis matching one of 38 mental health ICD-9 diagnostic codes. To focus on veterans actively engaged with VHA, we required veterans without an inpatient episode to have at least two outpatient visits on different days. There were 783,280 unduplicated veterans with the five targeted mental health diagnoses in FY 2006, representing approximately 15.1% of all unique veterans using VHA services. On average, there were four inpatient discharges for every 10 veterans in the cohort (1.68 of which were for a mental health diagnosis), and cohort members averaged 3.3 outpatient encounters, of which one was for a mental health diagnosis. For each veteran, the most commonly listed residential zip code was drawn from the inpatient and/or outpatient utilization files.
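To make the cohort-selection logic concrete, the following is a minimal sketch in Python/pandas. The file names, column names, and the three ICD-9 codes shown are illustrative placeholders only, not the actual VHA data dictionary or the full 38-code list used in the study.

```python
# Minimal sketch of the FY 2006 cohort-selection logic described above.
# File names, column names, and codes are illustrative, not the VHA data dictionary.
import pandas as pd

TARGET_ICD9 = {"295.00", "296.20", "309.81"}  # placeholder subset of the 38 study codes

inpt = pd.read_csv("fy2006_inpatient.csv", dtype=str)    # one row per discharge
outpt = pd.read_csv("fy2006_outpatient.csv", dtype=str)  # one row per encounter

def has_target_dx(df):
    # Primary or secondary diagnosis matches a study code.
    return df["primary_dx"].isin(TARGET_ICD9) | df["secondary_dx"].isin(TARGET_ICD9)

# Veterans qualify through any inpatient episode with a target diagnosis...
inpt_ids = set(inpt.loc[has_target_dx(inpt), "veteran_id"])

# ...or through outpatient care, but only with visits on at least two different days.
outpt_dx = outpt[has_target_dx(outpt)]
visit_days = outpt_dx.groupby("veteran_id")["visit_date"].nunique()
outpt_ids = set(visit_days[visit_days >= 2].index)

cohort = inpt_ids | outpt_ids  # unduplicated study cohort

# Assign each cohort member the residential zip code listed most often
# across their inpatient and outpatient records.
all_zip = pd.concat([inpt[["veteran_id", "zip"]], outpt[["veteran_id", "zip"]]])
home_zip = (all_zip[all_zip["veteran_id"].isin(cohort)]
            .groupby("veteran_id")["zip"]
            .agg(lambda s: s.mode().iloc[0]))
```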

Methods: Structure of Care

VHA is organized into 21 regionally oriented Veterans Integrated Service Networks (VISNs), which are designed to pool and align resources in order to better meet local healthcare needs and improve access to care. Each VISN has a nonoverlapping, defined geographic area of responsibility. Within each VISN there are smaller, nonoverlapping defined geographic areas of responsibility, which we define as major facilities. Typically, a major facility includes a Veterans Affairs Medical Center (VAMC) with its associated community-based outpatient clinics (CBOCs), domiciliaries, or nursing homes. We identified 138 unique major facilities and recorded the latitude and longitude of each VAMC and CBOC1 (http://www.research.va.gov/resources/pubs/cboc.cfm).

In consultation with the VA and the VHA, we designed a web-based facility survey to examine the availability and attributes of mental health services provided by VHA as of May 2007. The targeted respondent was the supervising chief of mental health. All major facilities provided data.

The facility survey contained two broad types of questions. The first type asked about availability, hours of operation, and location of basic, specialized, and consumer-oriented mental health services. Basic services include acute inpatient beds, psychotherapy, and pharmacotherapy by mental health providers; walk-in visits; and crisis management services. Specialized services target a specific population, such as women veterans, the homeless, or veterans with PTSD. Consumer-oriented services relate to recovery support and consumer involvement. Basic and specialized services are part of a full continuum of mental health services, and consumer-oriented services were specifically targeted by the MHSP.

The second type of survey question addressed the availability of 12 evidence-based practices, that is, treatments for which there is an empirical base of support linking delivery of the practice to improved mental health outcomes (Table 1). To strengthen the reliability and validity of the self-report, we provided a detailed description of each practice and, where applicable scales existed, asked about the presence or absence of specific characteristics of the practice. We also asked respondents how closely they believed the practice they were reporting on corresponded to an ideal version of the practice. A facility was categorized as not providing a practice only if the survey respondent reported that the practice was not available within VHA, through contracted care, or through non-VA providers paid for by VHA.

Table 1. Evidence-Based Practices Included in the May 2007 Veterans Health Administration (VHA) Facility Survey

  • Medication evaluation and management: Description of the history of the presenting illness, nature of the presenting problem, mental status examination, medical decision making, counseling, and coordination of care.
  • Mental health intensive case management (MHICM) (SAMHSA, 2003a): The goal of MHICM is to help people with serious mental illness stay out of the hospital and develop skills for living in the community. MHICM offers services that are customized to the individual needs of the consumer, delivered by a team of practitioners, and available 24 hr a day.
  • Supported employment (SAMHSA, 2003b): Supported employment is a well-defined approach to helping people with mental illnesses find and keep competitive employment within their communities. The focus is community jobs anyone can apply for that pay at least minimum wage, including part-time and full-time jobs.
  • Family psychoeducation (Murray-Swank & Dixon, 2005): Family psychoeducation is focused on improving the well-being and functioning of the patient and meeting the family members' need for education, guidance, and support as the family participates in the ongoing care of an ill relative.
  • Cognitive behavioral therapy (Kingdon, 2006; Turkington, Kingdon, & Weiden, 2006; VA/DoD Clinical Practice Guideline Working Group, 2003): Cognitive behavioral therapy (CBT) is a structured and time-limited therapy that combines elements of cognitive and behavioral approaches and emphasizes both behavioral activation and changes in negatively biased patterns of cognition.
  • Intensive outpatient treatment for substance use disorders: Intensive outpatient treatment should include outpatient treatment for at least 3 hr a day, 3 days a week.
  • Psychosocial interventions for substance use disorders (American Psychiatric Association, 2006; Rollnick & Miller, 1995): Refers to cognitive behavioral relapse prevention, brief motivational intervention, contingency management therapy, contingency contracting, and practitioner-led 12-step facilitation counseling.
  • Opiate substitution maintenance therapy with methadone or buprenorphine: Refers to the use of methadone or buprenorphine for maintenance rather than detoxification. Combines pharmacotherapy with a full program of assessment, psychosocial intervention, and support services.
  • Integrated dual-diagnosis therapy (SAMHSA, 2003c): Integrated treatment for people who have co-occurring disorders, that is, both a mental illness and a substance use disorder (SUD). This integrated dual diagnosis treatment (IDDT) approach offers services for both mental health and SUD at the same time and in one setting.
  • Specialized therapies for posttraumatic stress disorder (PTSD) (The Committee on Treatment of Posttraumatic Stress Disorder, 2007; VA/DoD Clinical Practice Guideline Working Group, 2003): Includes image rehearsal, exposure therapy, and cognitive processing therapy.
  • Treatment with clozapine: Clozapine is used for treatment of refractory schizophrenia.
  • Electroconvulsive therapy (ECT)a (American Psychiatric Association Task Force on Electroconvulsive Therapy, 2001): ECT for bipolar disorder, major depressive disorder, or schizophrenia.

  a. Not necessarily appropriate to provide at every major facility.

An index score from 0 to 11 was created to summarize the availability of 11 of the 12 identified evidence-based practices listed in Table 1 (excluding electroconvulsive therapy as it is not necessarily appropriate at every major facility).
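As an illustration of how such an index might be computed from the survey responses, the following is a minimal Python sketch. The practice keys and response fields are hypothetical; the sketch simply applies the availability rule described above (a practice counts as available if offered within VHA, through contracted care, or by non-VA providers paid for by VHA).

```python
# Illustrative computation of the 0-11 availability index from one facility's
# survey responses. Practice keys and response fields are hypothetical.
CORE_PRACTICES = [
    "medication_evaluation_management", "mhicm", "supported_employment",
    "family_psychoeducation", "cbt", "intensive_outpatient_sud",
    "psychosocial_sud", "opiate_substitution_maintenance",
    "integrated_dual_diagnosis", "specialized_ptsd", "clozapine",
]  # electroconvulsive therapy is excluded from the index

def practice_available(response: dict) -> bool:
    # Available if offered within VHA, through contracted care,
    # or by non-VA providers paid for by VHA.
    return bool(response.get("within_vha") or response.get("contract")
                or response.get("non_va_paid_by_vha"))

def availability_index(facility_responses: dict) -> int:
    # facility_responses maps practice name -> that practice's survey responses.
    return sum(practice_available(facility_responses.get(p, {})) for p in CORE_PRACTICES)
```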

Based on this information, we created a series of maps illustrating the relationship between the reported availability of the 11 evidence-based practices at major VHA facilities and the size of the target population that might benefit from each practice. The intent was to estimate the maximum number of veterans who might benefit from expanding the availability of an evidence-based practice to locations where it was unavailable as of the May 2007 survey. This analysis does not identify major facilities where an evidence-based practice is offered but capacity is insufficient to meet demand, nor does it address the fidelity with which the intervention was delivered.
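One plausible way to generate the counts behind such maps, assuming each veteran is assigned to the nearest major facility based on a zip-code centroid, is sketched below. The input structures and the great-circle distance shortcut are illustrative assumptions, not the study's exact geocoding procedure.

```python
# Sketch of the gap estimate: count veterans in a diagnostic cohort whose nearest
# major facility did not report offering a given evidence-based practice.
# Input structures (zip-code centroids, facility coordinates, availability flags) are hypothetical.
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 3959 * 2 * asin(sqrt(a))

def potential_beneficiaries(veterans, facilities, practice):
    """veterans: dicts with 'lat', 'lon' (zip centroid) for one diagnostic cohort.
    facilities: dicts with 'lat', 'lon', and an 'offers' set of practice names."""
    gap = 0
    for v in veterans:
        nearest = min(facilities,
                      key=lambda f: haversine_miles(v["lat"], v["lon"], f["lat"], f["lon"]))
        if practice not in nearest["offers"]:
            gap += 1
    return gap
```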

Results

Figure 2 graphically represents the locations of the study cohort, as well as VISN and major facility boundaries, medical centers, and CBOCs. Note that the shading does not reflect population density, because facility boundaries vary greatly in size. The 25% of major facilities with the fewest study veterans are illustrated with the lightest level of shading.

Figure 2.

2006 Veterans Health Administration Study Cohort Veterans (All Diagnoses) by Major Facility

Figure 3 shows the total number of evidence-based practices reported to be available at each major facility as of May 2007. With the exception of opiate substitution maintenance therapies, every evidence-based practice was provided in at least one location in every VISN, and most were offered in the majority of major facilities. Clozapine therapy, psychosocial interventions for SUD, specialized therapies for PTSD, and cognitive behavioral therapy were available in at least one location in nearly all major facilities; opiate substitution maintenance therapies and integrated dual-diagnosis therapy stand out as evidence-based practices that were reported to be provided less consistently. Within a major facility, survey respondents reported that evidence-based practices were primarily offered at medical centers rather than at community-based outpatient clinics.

Figure 3.

Total Evidence-Based Practice Availability, May 2007 (N=138 Major Facilities)

We considered the size of the diagnostic cohorts as an upper estimate of the target population that might benefit from receipt of an evidence-based practice, recognizing that not every veteran with a particular diagnosis will need or want a specific evidence-based practice. We provide results for a single evidence-based practice as an exemplar; additional maps are available upon request. Figure 4 illustrates our analysis for specialized therapies for PTSD and shows the gap between its reported availability (either directly or through contract or non-VHA care paid for by VA) and an upper estimate of the target population of veterans with a diagnosis of PTSD that might benefit from the practice. As of the May 2007 survey, the practice was available in 119 out of 138 major facilities. While most major facilities with large concentrations of the target population are served by specialized PTSD programs, 33,146 veterans with PTSD diagnoses reside closest to the 19 major facilities where the treatment was not reported to be offered as of May 2007.

Figure 4.

Major Facilities With and Without Specialized Therapies for Posttraumatic Stress Disorder (PTSD) by Veterans Health Administration Patient Population in PTSD Cohort, May 2007 (Includes Contract and Fee-Based Care)

Discussion

In this paper we present a comprehensive framework for evaluating the performance of mental healthcare systems. In doing so, we demonstrate that it is feasible to overcome several major challenges typically associated with quality measurement. These include the difficulty of systematically developing quality measures, the limits of administrative databases for providing relevant quality information, and the challenge of simultaneously collecting and interpreting data on structure, process, and outcomes. In addressing all of these challenges, a VHA leadership and culture supportive of performance measurement and measure development, together with ready access to detailed administrative and medical record data through an extensive and sophisticated electronic medical record system, make the VHA an ideal system within which to advance the field of mental health quality.

By applying this evaluation framework to the VHA mental healthcare system, we also provide information about the structure of VHA mental health services at an early point in the system's transformation under the MHSP, which may then be used to evaluate improvements over time. Results from the first facility survey indicate that, with the exception of opiate substitution maintenance therapies, every evidence-based practice was provided in at least one location in every VISN, and most were offered in the majority of major facilities. However, availability of an evidence-based practice within a VISN or a major facility boundary may not be sufficient to ensure access, depending on travel distances to the location where the service is provided. In addition, particularly for the large western VISNs, which are similar in size to large states, the location of a practice at a single site within the VISN is unlikely to result in realized access. Finally, although geographic access to services is an important step toward building a high-performance mental healthcare system, equally important are timely availability and how closely the implementation of the evidence-based practice adheres to empirically developed standards described in the literature, a quality known as fidelity.

The facility survey data are unique in that every eligible respondent completed the survey and the data were nearly complete. However, the validity of the self-reported findings depends on accurate and unbiased responses to the survey questions. In addition, concerns about unintentional errors of omission, confusion, or false information cannot be discounted. Although the survey went through a pilot process, no formal validation assessment was conducted. Thus, it is possible that some of the questions were unclear to respondents or, at the very least, subject to different interpretations.

The main strength of using administrative data was their availability and comprehensive enumeration of the study population (Iezzoni, 1997). Moreover, the databases were sufficiently large to analyze population subgroups and specific geographic areas separately, which was particularly useful, given variation in problems related to access and availability across populations or within areas. Some administrative data were missing and data accuracy could not be guaranteed. This is not uncommon when data are collected and used for different purposes; studies support the use of administrative data combined with chart review as a more accurate way to assess performance (Newschaffer, Bush, & Penberthy, 1997; Parkinson, 2002; Steinwachs et al., 1998).

The results of this structural component of our evaluation suggest some specific short-term actions, including confirming the findings regarding practice availability, increasing efforts to disseminate protocols, clarifying national policies about the target populations, and establishing fidelity standards. With regard to its ongoing efforts to advance the overall quality of the VHA mental health system, the VA might also consider enhancing administrative data collection to include more information about the availability of evidence-based practices at the VISN level and nondiagnostic clinical information relevant to quality measurement (First, Pincus, & Schoenbaum, 2009), and expanding the use of objective, evidence-based tools for program evaluation.

While the structure-process-outcomes evaluation model holds great promise for advancing the science of mental healthcare quality improvement, a few final caveats are in order. First, in any healthcare system, the progression from evidence-based practice guidelines to performance indicators to improved patient outcomes is fraught with complexity. Great care must be taken to improve the validity of measurement through effective and efficient documentation that limits the burden of measurement. In addition, continued awareness of the complicated linkages between evidence-based practice and individual patient preferences and outcomes is essential. As recent studies amply demonstrate (Krumholz & Lee, 2008; The Action to Control Cardiovascular Risk in Diabetes Study Group, 2008), even the most basic of evidence-based practice improvements can result in different outcomes for different patients and for different reasons.

Second, not all mental healthcare systems look or operate like the VHA. Public and private sector mental healthcare functions largely as a cottage industry, with the majority of psychiatrists practicing in solo or two-physician practices; limited information technology; few centralized administrative databases; and no single entity or organization responsible for implementing and monitoring quality improvement strategies. While these differences must be recognized and addressed in the context of ongoing quality improvement, the same high-quality standards should nevertheless apply.

Third, the extent to which this model can be adapted for use in other systems and in other contexts is not clear. Certain components of the model may be more suitable for mental health quality improvement efforts at the national or state level or in large systems (e.g., managed care networks), while others may work well in more localized contexts (e.g., community mental health centers). Lastly, we note that the performance indicators and data collection instruments (e.g., facility and client surveys) developed as part of this study, although developed through systematic evidence-based processes, have not been fully validated.

The VA has undertaken the most extensive, systematic, and rigorous evaluation of mental healthcare delivery ever conducted. The framework, methodology, and preliminary results offer fertile ground upon which other stakeholders in the mental health field can continue to build and expand.

Footnotes

  1. VA facilities in Puerto Rico and the US Virgin Islands were excluded from these analyses.

Acknowledgments

This project was funded by the Department of Veterans Affairs (VA) (Contract Number: GS 10 F-0261K, 101-G67214/101-G67215; Program Evaluation of Veterans Health Administration Mental Health Services).

We would also like to acknowledge the following: the Irving Institute for Clinical and Translational Research at Columbia University (UL1 RR024156) from the National Center for Research Resources (NCRR), a component of the National Institutes of Health (NIH), and the Mental Health Therapeutics CERT at Rutgers, the State University of New Jersey, funded by the Agency for Healthcare Research and Quality (AHRQ) (5 U18 HS016097).

Our Clinical Advisory Group and external experts were Dewleen Baker, Amy Cohen, Bob Gresen, Amy Kilbourne, JoAnn Kirchner, Jim Lohr, Lynnette Nilan, Chris Reist, Craig S. Rosen, Robert Rosenheck, Greer Sullivan, Michael Thase, Rohan Ganguli, Andrea Fagiolini.

Disclosures: None.

Authors' Biographies

Katherine E. Watkins, MD, MSHS, is a Senior Natural Scientist at RAND in Santa Monica, CA. She uniquely combines a research background in substance abuse and mental health treatment services with a clinical background in treatment for substance use and co-occurring disorders, and has focused on training substance abuse providers to deliver evidence-based mental health services to individuals with drug and alcohol problems. She is particularly interested in increasing access to treatment for vulnerable populations, and in improving care for individuals with co-occurring mental disorders and substance abuse.

Donna J. Keyser, PhD, MBA, is a management scientist at the RAND Corporation in Pittsburgh, PA, and associate director of the RAND-University of Pittsburgh Health Institute. She is involved in numerous government- and foundation-funded projects focused on improving healthcare processes and outcomes.

Brad Smith, PhD, is a Senior Analyst with Altarum Institute in San Antonio, TX. He works in the Institute's Health Quality Research group on projects for clients including the US Department of Veterans Affairs and the State of Texas Health and Human Services Commission.

Thomas E. Mannle, Jr., MPA, is an independent consultant specializing in analysis and program evaluation of Departments of Defense and Veterans Affairs healthcare delivery programs. He was formerly a Vice President at The Lewin Group Inc.

Daniel R. Kivlahan, PhD, is Director of the Center of Excellence in Substance Abuse Treatment and Education (CESATE) at VA Puget Sound in Seattle and Clinical Coordinator of the VA Quality Enhancement Research Initiative to implement evidence-based practices in treatment of substance use disorders. He is an Associate Professor in the Department of Psychiatry and Behavioral Sciences at the University of Washington.

Susan M. Paddock, PhD, is a Senior Statistician and Head of the Statistics Group at the RAND Corporation in Santa Monica, CA. Dr. Paddock is the project statistician for the external program evaluation of the VA's care for seriously mentally ill veterans.

Teryn Mattox, MPA (Public Administration, Harvard University), is a member of the research staff at RAND and a Research Analyst with First 5 Los Angeles. Before joining RAND, Ms. Mattox worked with UNICEF India to manage a study assessing the impact of birth registration policy in rural India.

Marcela Horvitz-Lennon, MD, MPH, is an Assistant Professor of Psychiatry in the Department of Psychiatry at the University of Pittsburgh Medical School in Pittsburgh, PA. She maintains significant clinical responsibilities and conducts mental health services research with a focus on quality and outcomes of care for people with severe mental illness.

Harold Alan Pincus, MD, is a senior scientist at the RAND Corporation and Professor and Vice-Chair of the Department of Psychiatry at Columbia University. He is also Director of Quality and Outcomes Research at New York-Presbyterian Hospital.

Objectives

By participating in the independent study offering, the reader will be able to

  • 1. Explain the Donabedian model of health care quality as it applies to mental health
  • 2. Discuss different ways to measure structure of care as it applies to mental health
  • 3. Identify 3 challenges to measuring quality of care

Journal for Healthcare Quality is pleased to offer the opportunity to earn continuing education (CE) credit to those who read this article and take the online posttest at http://www.associationoffice.com/nahq/etools/products/addprodtocart.cfm?primary_id=8036-290&web_text_id=2461&count=0&Subsystem=ORD. This continuing education offering, JHQ 226, will provide 1 contact hour to those who complete it appropriately.

Core CPHQ Examination Content Area

IV. Patient Safety

Questions:

  • 1. What are the advantages of quality measures of outcome over quality measures of process?
  • a. Easier to measure
  • b. It is easier for the health care system to know how to focus quality improvement efforts when outcomes are measured
  • c. The end goal of health care is improved outcomes
  • d. Outcomes are determined by many factors, some of which are unrelated to health care
  • 2. The ‘grandfather’ of the quality of care field was:
  • a. Robert Brook
  • b. National Center for Healthcare Quality
  • c. The Veterans Health Administration
  • d. Avedis Donabedian
  • 3. Quality can be measured by
  • a. Structure of care
  • b. Process of care
  • c. Outcomes of care
  • d. All of the above
  • 4. Which statement is true?
  • a. Process can be measured by looking at whether patients' symptoms have improved
  • b. Data on structure can include both administrative data and survey data
  • c. The VHA is organized into 36 VISNs
  • d. Each VISN is the same size and serves approximately the same number of patients
  • 5. When describing the structure of mental health care it is important to include
  • a. The location of basic and specialized services
  • b. Hours of operation
  • c. Availability of evidence-based practices
  • d. The continuum of care
  • e. All of the above
  • 6. Which one is not an evidence-based practice?
  • a. Clozapine
  • b. Supported employment
  • c. Peer support
  • d. Cognitive behavioral therapy for depression
  • 7. True or False
  • a. Enumerating the number of individuals with a particular diagnosis is an upper estimate of the number of people who might benefit from a given evidence-based practice.
  • 8. Although services may be reported as available
  • a. Capacity may not be sufficient for demand
  • b. The services may be delivered with fidelity
  • c. Providers often choose not to deliver services
  • d. The extensive list of services confuses practitioners
  • 9. Which statement is false?
  • a. The validity of self-report depends on accurate and unbiased responses
  • b. Providing an evidence-based practice will always result in improved outcomes
  • c. Patient preferences are important when assessing the continuum of care
  • d. Availability of a practice may not be sufficient to ensure access
  • 10. Which are challenges typically associated with quality measurement?
  • a. There are too many indicators among which to choose
  • b. Administrative data are the gold standard for determining diagnosis and severity, important factors for risk adjustment
  • c. Most organizations don't care about quality measurement
  • d. Simultaneously collecting and interpreting data on structure, process and outcomes is logistically complex
