Keywords:

  • health services research;
  • quality of care;
  • implementation;
  • organizational change;
  • clinical practice

Abstract

BACKGROUND: The persistence of a large quality gap between what we know about how to produce high-quality clinical care and what the public receives has prompted interest in developing more effective methods to get evidence into practice. Implementation research aims to supply such methods.

PURPOSE: This article proposes a set of recommendations aimed at establishing a common understanding of what implementation research is, and how to foster its development.

METHODS: We developed the recommendations in the context of a translation research conference hosted by the VA for VA and non-VA health services researchers.

IMPACTS: Health care organizations, journals, researchers, and academic institutions can use these recommendations to advance the field of implementation science and thus increase the impact of clinical and health services research on the health and health care of the public.

In recent years, research sponsors, policy makers, and the public have increasingly viewed the “quality chasm,” or gap between what we know medical care should deliver based on principles of safety and efficacy and what patients actually receive, as an imperative for change.1 The Institute of Medicine's Clinical Research Roundtable identified the need for research that completes the cycle from basic scientific discovery to clinical research,2 including the need to invest in research on how to move the results of clinical research into clinical and public health practice. The purpose of this paper is to consider how the field of health services research can best facilitate improvements in the consistency, speed, and efficiency with which sound clinical and health care research evidence is implemented over the next several years. The paper is directed at health services research funders, health care organizations that use or want to use health services research, clinicians and clinician educators who participate in health services research, health care journal editors and reviewers, and implementation researchers themselves.

The recommendations in this paper arose from discussions among the participants in a workgroup convened at an implementation research conference hosted by the VA Quality Enhancement Research Initiative (QUERI) in August 2004 (see Acknowledgments). Any errors of interpretation are those of the authors.

DEFINING IMPLEMENTATION RESEARCH

We first recommend that the health services research community adopt, through expert panel or similar methods, a common definition of implementation research. As a starting point, we propose the following definition:

Implementation research consists of scientific investigations that support movement of evidence-based, effective health care approaches (e.g., as embodied in guidelines) from the clinical knowledge base into routine use. These investigations form the basis for health care implementation science. Implementation science consists of a body of knowledge on methods to promote the systematic uptake of new or underused scientific findings into the usual activities of regional and national health care and community organizations, including individual practice sites.

To build this body of knowledge, implementation researchers focus on understanding and influencing the process of uptake of scientific findings by applying and developing theories on why health care providers do what they do, and on how to improve their performance. In using the term health care provider, we intend to include both health care professionals and nonprofessionals who carry out health promotion or clinical care activities within either health care or community organizations. This science's ultimate goal is to improve the health of the public through equitable, efficient application of rigorously evaluated scientific knowledge.

IMPLEMENTATION RESEARCH IN THE CONTEXT OF SCIENTIFIC TRANSLATION

We recommend that the health services research community emphasize, in its publications, conferences, and strategic plans, the critical role of implementation research within the broader translation framework developed by the Institute of Medicine.2,3 Any efforts to improve translation will ultimately fail if the final step of actually implementing the research across broad populations is not completed. Yet, scientists have been slow to realize that the methods for achieving this step are often substantially different from those that apply earlier in the pathway, and are currently not well understood.

The Institute of Medicine's framework identifies 2 common “translation blocks” requiring translation research. The first represents impeded movement from basic science discoveries into clinical studies. The second represents impeded progress from clinical study results into health systems and medical practice. Close examination of the diversity of potential research strategies applicable to the second block shows the need for more detail. In Figure 1, we have modified the original conceptualization to identify 3 translation blocks. The second block now identifies the need to translate the results of clinical studies into practice standards, or guidelines. This translation process involves ensuring that the available body of clinical studies has addressed enough of the relevant issues, such as applicability to diverse populations and feasibility of use under routine conditions, to support national consensus on quality standards, such as guidelines. It also addresses the process of creating clinical guidelines or standards, including the conduct of meta-analyses and literature syntheses for this purpose.

Figure 1. Translating research into practice.

The third translation block occurs when the progress of scientific clinical evidence into routine practice stalls despite broad consensus on the validity of the evidence. When standards or guidelines diffuse into routine practice without additional research, no third translation block occurs. Implementation science identifies methods for overcoming the third translation block. In doing so, implementation science supports the validity of the health research enterprise as a whole by transmitting that enterprise's benefits directly to the consumers who ultimately fund it.

Methods for overcoming the third translation block can be conceptualized as quality improvement interventions (QIIs). Quality improvement interventions are policies, programs, or strategies that aim to improve quality of care for clinical or community populations and, thus, put guidelines into practice. Implementation research aims to overcome the third translation block by creating new knowledge about how best to design, implement, and evaluate QIIs. Some of this knowledge comes, for example, from descriptive studies that identify determinants of poor care, some from qualitative studies of the change process or barriers to change, and some from studies that implement and evaluate QIIs.

TAKING ACCOUNT OF IMPLEMENTATION SCIENCE

We recommend that health services research funders promote the development of literature and materials that summarize current implementation science. Implementation science draws from a wide theoretical and empirical base. Because it is scattered throughout journals and books from diverse fields, its core literature is difficult to access. Examples of the types of literature needed include technical manuals on how to carry out implementation research,4–6 systematic reviews of empirical studies of implementation,7–13 reviews of relevant theoretical constructs,14–19 and literature on methodologic issues relevant to implementation science.20–24 The technical manuals are accessible works for those planning to design their own implementation studies.4–6 The systematic reviews of empirical studies identify the types of interventions known to have an impact, such as audit and feedback and clinical reminders, and the expected magnitude of that impact.7–13 The reviews of theoretical constructs set the stage for which theories need to be tested in future studies; they cover diffusion of innovations, psychologic theories of behavior change, and organizational culture.14–19 The methodologic papers deal with particularly thorny issues that are often neglected or misunderstood, such as how to design and analyze cluster- or place-randomized trials, how to design studies that specifically test a theoretical construct, and the need to use modeling trials prior to full-blown implementation trials to ensure that the interventions are feasible and actually affect the constructs they are predicted to change.20–24
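
To make one of these methodologic issues concrete, the sketch below illustrates why cluster- or place-randomized QII trials need larger samples than individually randomized trials: outcomes within a clinic are correlated, so the required sample size grows by the standard design effect, 1 + (m - 1) x ICC, where m is the average cluster size and ICC is the intracluster correlation coefficient. The planning numbers are illustrative assumptions, not values from the cited reviews.

  # Hypothetical planning calculation for a cluster-randomized QII trial.
  # The design effect DEFF = 1 + (m - 1) * ICC inflates the sample size
  # needed relative to an individually randomized trial.
  import math

  def design_effect(cluster_size: float, icc: float) -> float:
      """Variance inflation from randomizing clinics rather than patients."""
      return 1 + (cluster_size - 1) * icc

  def clinics_per_arm(n_individual: int, cluster_size: float, icc: float) -> int:
      """Clinics per arm needed to match an individually randomized trial
      that would require n_individual patients per arm."""
      inflated_n = n_individual * design_effect(cluster_size, icc)
      return math.ceil(inflated_n / cluster_size)

  # Illustrative planning values only (assumptions, not from the article):
  # 200 patients per arm if randomizing individuals, 25 patients sampled
  # per clinic, and a within-clinic ICC of 0.05.
  print(round(design_effect(25, 0.05), 2))  # 2.2
  print(clinics_per_arm(200, 25, 0.05))     # 18 clinics per arm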

MAXIMIZING LEARNING FROM PAST AND CURRENT IMPLEMENTATION RESEARCH INITIATIVES

We recommend historical review of the major implementation research initiatives that have been undertaken by funding agencies in the United States, United Kingdom, the Netherlands, and other countries in the past decade and a half, and cross-cutting analysis of studies within current initiatives. Assessing past successes and failures can improve the efficiency of current efforts, and cross-cutting analyses that generate or test hypotheses about implementation across studies within these initiatives can provide new information beyond the results of individual studies. In the United States, major past efforts include the Agency for Healthcare Research and Quality (AHRQ) Patient Outcomes Research Teams (PORTs),25 begun in 1989 and continuing through the 1990s to understand and later to improve quality for a wide variety of conditions; the VA's National Surgical Quality Improvement Program (NSQIP), started in 1991 and continuing to the present to promote the systematic collection, analysis, and feedback of risk-adjusted surgical data26,27; and the series of studies on depression funded over the last decade by the NIMH Division of Services and Intervention Research.28

These large initial efforts have been succeeded by a second generation of ongoing initiatives that should be rigorously analyzed and evaluated over the next decade. In the United States, this group includes the VA's QUERI, begun in 199929,30; AHRQ's Translating Research Into Practice (TRIP) program, begun in 199931; and the Centers for Disease Control's (CDC) Translating Research Into Action for Diabetes (TRIAD), begun in 1999 and including a VA partnership.32–34 Among nongovernmental U.S. funding agencies, the Robert Wood Johnson Foundation's “Pursuing Perfection” initiative, begun in 2002 in partnership with the Institute for Healthcare Improvement (Donald Berwick, MD, MPP, Director), is noteworthy.35 Kaiser Permanente and Group Health Cooperative of Puget Sound are examples of health care organizations with long histories of funding internal health services research centers.36 The RE-AIM framework37 developed at Kaiser Permanente of Colorado and the Chronic Illness Care model developed by Group Health researchers in collaboration with the MacColl Institute and the Institute for Healthcare Improvement are examples of new implementation approaches developed by these organizations and applied in the Breakthrough Series.38

REDUCING DUPLICATION AND PROMOTING PROGRESSION OF EVIDENCE INTO PRACTICE

We recommend active efforts to foster strategic progression of studies within particular topic areas from clinical science toward full implementation23,30 using implementation science and provider behavior theory.29 The progression should occur along 3 dimensions. First, studies should progress along a continuum spanning clinical guidelines or best practices, measurement of quality and quality variations, tests of QII effectiveness, tests of QII spread, and policy development. Second, studies of QIIs should progress from higher-researcher-control tests of QII efficacy or effectiveness to lower-researcher-control tests of QIIs as carried out by clinical and community organizations themselves. Third, studies of QIIs should progress from local studies (e.g., α testing) to regional studies (e.g., β testing) to national studies. Viewed along this third dimension, studies move from a focus on effectiveness to a focus on quality impacts, including business outcomes and performance measures, and from a focus on individuals enrolled in studies to populations. Tracking progression along these dimensions can enable research funders to identify unneeded, repetitive studies of the same techniques as well as continuing gaps in our knowledge base that need research.9,14

Pursuing active progression along these dimensions will ultimately foster better policy development on a national level. Policies based on systematic testing within the targeted political and organizational contexts they aim to influence will be more practical and successful. Such policies can incorporate detailed information on stakeholder costs and values, and more easily avoid unanticipated negative consequences.

INCORPORATING IMPLEMENTATION SCIENCE INTO CLINICAL GUIDELINES

We recommend that clinical guideline developers routinely incorporate implementation research findings into guideline recommendations. While both quality improvement practitioners and researchers have been quick to use clinical guidelines and best practices as a foundation for care improvement, they have not always applied implementation science within this context. For example, if particular care models or change strategies, such as clinical reminders, have been shown to be effective for ensuring higher-quality care for a given health problem, the guidelines for that problem should incorporate the use of clinical reminders into their recommendations.

APPLYING EXISTING IMPLEMENTATION SCIENCE

We recommend that researchers and their funders use existing implementation science to develop policies and information dissemination methods that promote adoption of research findings in routine care. Policies should recognize that research products have different propensities for being adopted outside of research, and should anticipate basic implementation support needs. We discuss below the kinds of challenges that should be addressed by these policies.

At the simplest level, we know that complex QIIs cannot be applied either in future research or in clinical settings without detailed information about what was done. Researchers should therefore be required to document all information and tools necessary for understanding how the product was developed, applied, and evaluated. This information should be made publicly available, for example on the web, in enough detail to support replication and diffusion.
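
As a purely hypothetical sketch of what such publicly posted documentation might contain (the article does not prescribe a format), a minimal structured record for a QII could capture the elements named above; every field name and example value here is an illustrative assumption.

  # Hypothetical minimal structure for publicly documenting a QII in enough
  # detail to support replication; field names and values are illustrative only.
  from dataclasses import dataclass
  from typing import List

  @dataclass
  class QIIRecord:
      name: str                   # what the intervention is called
      target_condition: str       # clinical problem addressed
      guideline_basis: str        # evidence or guideline being implemented
      components: List[str]       # what was actually done
      target_barriers: List[str]  # barriers the components address
      development_notes: str      # how the intervention was developed
      evaluation_design: str      # how it was evaluated
      tools_url: str              # where tools and materials are posted

  example = QIIRecord(
      name="Clinic-level hand-hygiene improvement package",
      target_condition="Hospital-acquired infection",
      guideline_basis="National hand-hygiene guideline",
      components=["audit and feedback", "reminders", "supply redesign"],
      target_barriers=["time pressure", "dispenser placement"],
      development_notes="Adapted with front-line staff input",
      evaluation_design="Interrupted time series across 4 units",
      tools_url="https://example.org/qii/hand-hygiene",  # placeholder URL
  )
  print(example.components)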

Guideline-concordant treatment and management strategies can be thought of as products that may or may not diffuse effectively. Greenhalgh et al.14 identified at least 13 different research traditions related to understanding how innovations diffuse. Among these, Rogers'16 theory of which characteristics make an innovation likely to diffuse is one of the most widely used. Innovations with positive diffusion attributes may have sufficient impact on clinical care through routine dissemination activities, such as journal publication and commercial marketing, without any additional effort from the research community, while those with negative attributes may require substantial implementation support from researchers.

For example, proton pump inhibitors (pills to treat gastroesophageal reflux and ulcers) are effective, easy to prescribe, and relieve bothersome symptoms. Pharmaceutical companies also have financial interests in promoting these products. Not surprisingly, proton pump inhibitors have been widely adopted with little implementation support from researchers. On the other hand, the finding that placing infants on their backs reduces sudden infant death syndrome (SIDS) did not diffuse based on journal articles, despite its low cost and simplicity. It contradicted prior habits and beliefs among many parents and pediatricians about sleeping position, and had no commercial market stakeholders. Successful dissemination required the use of social marketing research and methods39 and the involvement of researchers and community partners in a “Back to Sleep” campaign supported by at least 5 partner organizations, including the National Institutes of Health.40 As evidence of the impact of this campaign, participants cite a 70% decrease in prone sleeping between 1992 and 1996, along with a 38% reduction in SIDS mortality.41
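
One rough way to see the contrast between these two examples is to score each innovation on Rogers' diffusion attributes (relative advantage, compatibility, complexity, trialability, and observability). The sketch below is an illustrative heuristic, not a validated instrument; the scores simply restate the narrative above, and the threshold is an arbitrary assumption.

  # Illustrative heuristic only: rate an innovation on Rogers' five diffusion
  # attributes (1 = unfavorable, 5 = favorable) and flag when active
  # implementation support is likely to be needed.
  ROGERS_ATTRIBUTES = ["relative_advantage", "compatibility",
                       "complexity_inverse", "trialability", "observability"]

  def needs_implementation_support(scores: dict, threshold: float = 3.5) -> bool:
      """Low average favorability suggests routine dissemination won't suffice."""
      mean_score = sum(scores[a] for a in ROGERS_ATTRIBUTES) / len(ROGERS_ATTRIBUTES)
      return mean_score < threshold

  # The scores restate the article's narrative; the numbers are assumptions.
  proton_pump_inhibitors = {"relative_advantage": 5, "compatibility": 5,
                            "complexity_inverse": 5, "trialability": 4,
                            "observability": 4}
  back_to_sleep = {"relative_advantage": 5, "compatibility": 1,
                   "complexity_inverse": 5, "trialability": 3,
                   "observability": 2}

  print(needs_implementation_support(proton_pump_inhibitors))  # False
  print(needs_implementation_support(back_to_sleep))           # True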

Unfortunately, many research findings in need of implementation require significant behavior change, and provide no compelling financial or other advantages to those who must enact the change. For example, studies have repeatedly shown low adherence to hand-washing recommendations. Grol et al.5 predicted, based on implementation science, that a QII approach targeting a variety of specific barriers to change at a variety of levels (professional, team, patient, and organization) will be required to achieve lasting changes in hand-hygiene routines. Multicomponent organizational interventions, such as the hand-washing QII envisioned above, are often the most effective means of achieving quality goals,7 but have negative attributes in terms of ease of diffusion. Interestingly, this issue of the Journal also contains a description of a successful QII based on the multicomponent, organizational Six Sigma approach to implementing hand-hygiene guidelines.42 Research organizations must become proactive in anticipating these implementation needs.

NECESSARY PARTNERS IN INITIATIVES TO INCREASE IMPLEMENTATION

We recommend that the health services research community actively foster clinical/health services research partnerships that focus on implementation through funding such partnerships, through literature on how to make partnerships effective, and through embedding health services researchers within clinical organizations. Policies that provide incentives for participation on both sides of these partnerships should be developed.

Efforts to increase the rapidity of implementation of clinical and health services research findings are most likely to be successful if they involve broadly based organizational partnerships. Purely top-down initiatives based on fiat are rarely successful, unless bottom-up development has already occurred. Bottom-up development often occurs in the context of partnerships. Such partnerships include those between members of the health services research community, such as research funding organizations, academic and educational organizations, and scientific journals. They also include partnerships between the health services research community and nonresearch-centered organizations, such as those concerned with healthcare delivery, community interests, and policy (Fig. 2). As indicated in the figure, communication between and within these groups through, for example, joint conferences, publications, and strategic planning efforts is essential, because each of these types of partners has enormous influence on the progress of implementation.

Figure 2. Reorganizing health services research to influence community health.

Table 1 lists the wide variety of partners in the United States necessary to (1) design, fund, carry out, and replicate implementation research and (2) adopt and sustain these changes in routine practice. The listed clinical governmental agencies that also fund health services research, such as the VA, the Health Resources and Services Administration (HRSA), and the Centers for Medicare & Medicaid Services (CMS), have a particularly strong interest in the application of research findings to improve the health both of their target populations and of the public as a whole.

Table 1. Important Implementation Research Funders and Partners*

*Examples are illustrative only; the lists are not exhaustive.

Research funders
  • Governmental research entities: Agency for Healthcare Research and Quality, Veterans Administration Health Services Research & Development Service, Centers for Disease Control, National Institutes of Health
  • Private foundations: Commonwealth, Rockefeller, Robert Wood Johnson, MacArthur, Hartford, Dana Afar, MacColl Institute, Soros

Health care funders
  • Governmental agencies: Health Resources and Services Administration (HRSA), Centers for Medicare & Medicaid Services (CMS), National Institute on Drug Abuse (NIDA), Substance Abuse and Mental Health Services Administration (SAMHSA)
  • Insurers: Blue Cross, PacifiCare, Aetna
  • Employers: General Motors, Raytheon, General Electric

Health care providers
  • Governmental health care providers: Veterans Health Administration, Indian Health Service, Federally Qualified Health Centers
  • Staff model HMOs: Kaiser Permanente, Group Health Cooperative of Puget Sound

Health care provider educational institutions
  • State and private universities: schools of nursing, medicine, and public health
  • Hospitals involved in postgraduate training: private and public health care systems

Quality improvement and health care accreditation organizations
  • Governmental: Quality Improvement Organizations (partners with CMS)
  • Nongovernmental: Leapfrog, Institute for Healthcare Improvement, National Committee for Quality Assurance, Joint Commission on Accreditation of Healthcare Organizations

Journals/publications
  • Health services research-oriented scientific journals: BMC Implementation Science; Health Services Research; Medical Care; Joint Commission journals; Quality and Safety in Health Care; Journal of General Internal Medicine; Annals of Family Medicine
  • General medical or health journals: Journal of the American Medical Association, New England Journal of Medicine, Annals of Internal Medicine
  • Electronic health information resources: National Library of Medicine (PubMed, Ovid, MEDLINE), UpToDate, MDConsult, PIER, Micromedex, FDA website

Professional societies
  • Involved in promoting health services research: AcademyHealth, the Society of General Internal Medicine, the North American Primary Care Research Group, the Society of Teachers of Family Medicine
  • Potential implementation partners: American Academy of Family Physicians, American College of Physicians, American College of Cardiology

Community partners
  • Advocacy organizations: Alzheimer's Association, Arthritis Foundation, Multiple Sclerosis Society, American Heart Association
  • Nonhealth community organizations: churches, schools, local governments, unions

Finally, clinical organizations should find ways to embed health services researchers as active partners in care delivery. This approach has the potential to change healthcare organizations into “learning organizations.” The VA, for example, which began embedding health services researchers in diverse sites during the 1980s, used this investment to develop information systems, clinical quality monitors, and primary care and economic models through a combined bottom-up and top-down process that produced rapid, large quality improvements when implemented nationally during the late 1990s.36,43 Without the decade-long bottom-up development process fostered by embedding, it is unlikely that any top-down strategy could have produced such major change.

RECOMMENDATIONS

Health services researchers and their sponsors should adopt a systematic approach toward promoting the development and use of implementation science. We identify 8 recommendations that, if carried out, would promote the further development of implementation research over the next several years (Table 2). Health services research organizations and stakeholders should consider identifying strategic goals and plans based on these recommendations, and invest the resources necessary to bring the plans into action.

Table 2. Recommendations to the Health Services Research Community for Promoting the Development of Implementation Research
1. Adopt, through expert panels or similar methods, a common definition of implementation research
2. Emphasize, through publications, conferences, and strategic planning, the critical role of implementation research within the broader translation research framework
3. Promote the development of literature and materials that summarize current implementation science
4. Carry out historical review and cross-cutting analysis of the major implementation research initiatives that have been or are being undertaken by funding agencies in the United States, United Kingdom, the Netherlands, and other countries
5. Foster integration of implementation science into quality improvement and policy development by promoting a strategic progression of studies from research into practice. This progression should minimize duplication and build on existing implementation science and provider behavior theory
6. Ensure routine incorporation of implementation research findings into the clinical guideline development process and final guideline recommendations
7. Develop policies and information dissemination methods, grounded in existing implementation science, that promote adoption of research findings into routine care
8. Foster clinical/health services research partnerships that focus on implementation through funding, literature, and mechanisms that embed health services researchers within clinical organizations

Fostering implementation research will initially cause discomfort among researchers, managers, and policy makers. Increasingly, managers and policy makers are challenged to provide evidence for the validity of their initiatives, and implementation research may encourage such challenges by raising the evidence bar for decision making. Implementation science advocates will need to be sensitive to the practical demands placed on managers and policy makers, and to learn when a research approach will impede action as well as when it will foster improved results. On the research side, pursuing scientific truth into the complexities of clinical and community settings will demand substantial cultural change. To achieve implementation, the relative scientific comfort of clinical trials must ultimately give way to the greater uncertainty of working with organizations and communities. As Greenhalgh comments,

The shifting baseline of context and the multiplicity of confounding variables must be stripped away (“controlled for”) to make the research objective. But herein lies a paradox. Context and “confounders” lie at the very heart of the diffusion, dissemination, and implementation of complex innovations. They are not extraneous to the object of study; they are an integral part of it.14

Implementation science provides the signposts that make translation of research into routine care more efficient and effective. Context becomes a legitimate objective for scientific study, and a legitimate influence on decision making. In the long run, implementation science supports both basic and clinical science by ensuring that research produces observable improvements in the health of the public, who ultimately foot the research bill.

Acknowledgments

We acknowledge the support of the VA HSR&D Center of Excellence for the Study of Healthcare Provider Behavior, and the VA HSR&D Veterans Evidence-based Research, Dissemination and Implementation Center (VERDICT). We also acknowledge the participants in the working group of the VA State-of-the-Art Conference on Implementation Research that reviewed our paper and provided us with their ideas and commentary, including (alphabetically) Jeroan J. Allison, MD, MP; Anna C. Alt-White, RN, PhD; Caryn Cohen, MS; Joseph Francis, MD, MPH; Allen L. Gifford, MD; Brian Mittman, PhD; Julie J. Mohr, MSPH, PhD; Audrey L. Nelson, PhD; Timothy J. O'Leary, MD, PhD; Marjorie L. Pearson, PhD, MSHS; Gary E. Rosenthal, MD; Theodore Speroff, PhD; Mark L. Willenbring, MD.

REFERENCES

  1. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington: National Academy Press; 2001.
  2. Sung NS, Crowley WF Jr., Genel M, et al. Central challenges facing the national clinical research enterprise. JAMA. 2003;289:1278–87.
  3. Rosenberg RN. Translating biomedical research to the bedside: a national crisis and a call to action. JAMA. 2003;289:1305–6.
  4. Veterans Administration, Health Services Research & Development. Guide for implementing evidence-based clinical practice and conducting implementation research. Available at: http://www.hsrd.research.va.gov/queri/implementation/. Accessed April 8, 2005.
  5. Grol R, Wensing M, Eccles M. Improving Patient Care: The Implementation of Change in Clinical Practice. Edinburgh: Elsevier Limited; 2005.
  6. Godfrey M, Nelson E, Batalden P. Improving health care by improving your microsystem, V 2.1. Dartmouth College; 2004. Available at: http://cms.dartmouth.edu/images/PDF%20Files/CMAG040104.pdf
  7. Stone EG, Morton SC, Hulscher ME, et al. Interventions that increase use of adult immunization and cancer screening services: a meta-analysis. Ann Intern Med. 2002;136:641–51.
  8. Effective Practice and Organisation of Care (EPOC) Group, part of the Cochrane Collaboration. Available at: http://www.epoc.uottawa.ca/. Accessed August 11, 2005.
  9. Grimshaw JM, Thomas RE, MacLennan G, et al. Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technol Assess. 2004;8:1–72.
  10. Øvretveit J, Bate P, Cleary P, et al. Quality collaboratives: lessons from research. Qual Saf Health Care. 2002;11:345–51.
  11. Shojania KG, Grimshaw J. Evidence-based quality improvement: the state of the science. Health Aff. 2005;24:138–50.
  12. Shortell SM, Bennett CL, Byck GR. Assessing the impact of continuous quality improvement on clinical practice: what it will take to accelerate progress. Milbank Q. 1998;76:593–624.
  13. Shaw B, Cheater F, Baker R, et al. Tailored interventions to overcome identified barriers to change: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2005;(3):CD005470. DOI: 10.1002/14651858.CD005470.
  14. Greenhalgh T, Robert G, MacFarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82:581–629.
  15. Fleuren M, Wiefferink K, Paulussen T. Determinants of innovation within health care organizations: literature review and Delphi study. Int J Qual Health Care. 2004;16:107–23.
  16. Rogers EM. Diffusion of Innovations. 4th ed. New York: The Free Press; 1995.
  17. Michie S, Johnston M, Abraham C, Lawton R, Parker D, Walker A. Making psychological theory useful for implementing evidence based practice: a consensus approach. Qual Saf Health Care. 2005;14:26–33.
  18. Scott T, Mannion R, Davies H, Marshall M. Implementing culture change in health care: theory and practice. Int J Qual Health Care. 2003;15:111–8.
  19. Wensing M, Bosch M, Foy R, van der Weijden T, Eccles M, Grol R. Factors in theories on behaviour change to guide implementation and quality improvement in healthcare. Technical report for the ReBEQI project. European Commission, Fifth Framework; contract no. QLRT-2001-00657.
  20. Bonetti D, Eccles M, Johnston M, et al. Guiding the design and selection of interventions to influence the implementation of evidence-based practice: an experimental simulation of a complex intervention trial. Soc Sci Med. 2005;60:2135–47.
  21. Boruch R, May H, Turner H, et al. Estimating the effects of interventions that are deployed in many places: place-randomized trials. Am Behav Sci. 2004;47:608–33.
  22. Collins LM, Murphy SA, Bierman KL. A conceptual framework for adaptive preventive interventions. Prev Sci. 2004;5:185–96.
  23. Medical Research Council (UK). A framework for development and evaluation of RCTs for complex interventions to improve health. 2000. Available at: http://www.mrc.ac.uk/pdf-mrc_cpr.pdf. Accessed August 12, 2005.
  24. Eccles M, Grimshaw J, Walker A, Johnston M, Pitts N. Changing the behavior of healthcare professionals: the use of theory in promoting the uptake of research findings. J Clin Epidemiol. 2005;58:107–12.
  25. Freund D, Lave J, Clancy C, et al. Patient Outcomes Research Teams: contribution to outcomes and effectiveness research. Annu Rev Public Health. 1999;20:337–59.
  26. Neumayer L, Mastin M, Vanderhoof L, Hinson D. Using the Veterans Administration National Surgical Quality Improvement Program to improve patient outcomes. J Surg Res. 2000;88:58–61.
  27. Fink AS, Campbell DA Jr., Mentzer RM Jr., et al. The National Surgical Quality Improvement Program in non-Veterans Administration hospitals: initial demonstration of feasibility. Ann Surg. 2002;236:344–53; discussion 353–4.
  28. National Institute of Mental Health, Division of Services and Intervention Research. Available at: http://www.nimh.nih.gov/dsir/dsir.cfm. Accessed April 8, 2005.
  29. Rubenstein LV, Mittman BS, Yano EM, Mulrow CD. From understanding health care provider behavior to improving health care: the QUERI framework for quality improvement. Quality Enhancement Research Initiative. Med Care. 2000;38(Suppl 1):I129–41.
  30. Veterans Administration, Health Services Research & Development. Quality Enhancement Research Initiative (QUERI). Available at: http://www.hsrd.research.va.gov/queri/. Accessed April 8, 2005.
  31. Agency for Healthcare Research and Quality. Translating Research Into Practice (TRIP-II) fact sheet. Available at: http://www.ahrq.gov/research/trip2fac.htm/. Accessed April 8, 2005.
  32. Centers for Disease Control. Diabetes projects: translation research projects (TRIAD). Available at: http://www.cdc.gov/diabetes/projects/research.htm/. Accessed April 8, 2005.
  33. Kerr EA, Gerzoff RB, Krein SL, et al. Diabetes care quality in the Veterans Affairs Health Care System and commercial managed care: the TRIAD study. Ann Intern Med. 2004;141:272–81.
  34. Brown AF, Gerzoff RB, Karter AJ, et al. Health behaviors and quality of care among Latinos with diabetes in managed care. Am J Public Health. 2003;93:1694–8.
  35. Institute for Healthcare Improvement. Pursuing Perfection: raising the bar for healthcare performance. Available at: http://www.ihi.org/IHI/Programs/PursuingPerfection/. Accessed April 8, 2005.
  36. Lomas J. Health services research. Br Med J. 2003;327:1301–2.
  37. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89:1322–7.
  38. Wagner EH, Glasgow RE, Davis C, et al. Quality improvement in chronic illness care: a collaborative approach. Jt Comm J Qual Improv. 2001;27:63–80.
  39. Andreasen AR. Marketing Social Change: Changing Behavior to Promote Health, Social Development, and the Environment. San Francisco: Jossey-Bass; 1995.
  40. McKee M, Fulop N, Bouvier P, et al. Preventing sudden infant deaths—the slow diffusion of an idea. Health Policy. 1996;37:117–35.
  41. The Lewin Group, Inc. Factors influencing effective dissemination of prevention research findings by HHS. Contract HHS 100-97-0005, submitted to the Department of Health and Human Services. October 1, 2001; Final Report, Appendix C.
  42. Eldridge NE, Woods SS, Bonello RS, et al. Using the Six Sigma process to implement the Centers for Disease Control and Prevention guideline for hand hygiene in 4 intensive care units. J Gen Intern Med. 2006;21(Suppl 2):S83–S90.
  43. Asch SM, McGlynn EA, Hogan MM, et al. Comparison of quality of care for patients in the Veterans Health Administration and patients in a national sample. Ann Intern Med. 2004;141:938–45.