Balancing reality in embedded research and evaluation: Low vs high embeddedness

Abstract Embedding research and evaluation into organizations is one way to generate “practice‐based” evidence needed to accelerate implementation of evidence‐based innovations within learning health systems. Organizations and researchers/evaluators vary greatly in how they structure and operationalize these collaborations. One key aspect is the degree of embeddedness: from low embeddedness where researchers/evaluators are located outside organizations (eg, outside evaluation consultants) to high embeddedness where researchers/evaluators are employed by organizations and thus more deeply involved in program evolution and operations. Pros and cons related to the degree of embeddedness (low vs high) must be balanced when developing these relationships. We reflect on this process within the context of an embedded, mixed‐methods evaluation of the Veterans Health Administration (VHA) Diffusion of Excellence (DoE) program. Considerations that must be balanced include: (a) low vs high alignment of goals; (b) low vs high involvement in strategic planning; (c) observing what is happening vs being integrally involved with programmatic activities; (d) reporting findings at the project's end vs providing iterative findings and recommendations that contribute to program evolution; and (e) adhering to predetermined aims vs adapting aims in response to evolving partner needs.


| INTRODUCTION
Embedded research and evaluation programs link rigorous scientific processes and research methods with clinical, business, and operational needs in a healthcare organization. 1,2 This type of partnered linkage is important to the process of implementation science, and allows healthcare systems to function as learning health systems, balancing scientific expertise with quality improvement perspectives in order to identify, evaluate, and implement innovative practices. [3][4][5] An embedded research/evaluation relationship benefits both the healthcare organization and the research/evaluation team. Organizations profit from the methodologic rigor and expertise of research/evaluation teams while producing evidence relevant to their own context and circumstances. Generated evidence is ready for rapid application to operational needs, while the research/evaluation team can have a rapid impact on the services provided to patients.
While embedded research and evaluation has received increased attention in recent years, 6,7 the concept of "embeddedness" has been described in organizational research for over 70 years. Key concepts include the goals leading to embedded relationships, the structures upholding and strengthening these relationships, and the need to develop trust between members of groups who come to the relationship from differing institutional cultures. Additional areas of focus include the process by which organizations and groups learn from each other and maintenance of embedded relationships over time. 8 The potential for embedded research/evaluation has received increasing attention across fields such as education and healthcare. 6,9 The focus on close collaboration between researchers/evaluators and operational leaders within organizations is still an evolving area, and the embedded research/evaluation relationships take many forms.
Researchers and evaluators may have a low degree of embeddedness, with roles akin to outside consultants, or they may be highly embedded, deeply involved in program evolution and operations.

The SHAARK evaluation of the DoE program is a QUERI Partnered Evaluation Initiative. The ongoing evaluation is guided by implementation science frameworks and theories, including the consolidated framework for implementation research (CFIR), 26 the theory of organizational readiness for change, 27 and the theory of diffusion of innovation, 28 and uses a variety of qualitative (structured observations, semi-structured interviews, focus groups) and quantitative data (performance data, systems to track practice implementation, surveys) to triangulate evaluation findings. 14,15 Per regulations outlined in VHA Program Guide 1200.21, the SHAARK evaluation has been designated a non-research quality improvement activity. Although QUERI funds teams of research investigators and staff across VHA facilities nationally, these projects are conducted in direct partnership with operational leaders.

The initial goals of the SHAARK evaluation were developed with the original leaders of the DoE program, whose focus was on rapidly standing up a high-visibility program (the Shark Tank), supporting participants, and institutionalizing the program. Aims focused on explaining program participation, motivations, and processes used for bidding in the Shark Tank, as well as implementation effectiveness in sites receiving facilitated implementation support as a result of having a "winning" Shark Tank bid.
As described above, DoE has expanded its facilitation role over the past 5 years toward promoting spread of successful innovations across the VHA through a combination of direct staff support, training, and active tracking of spread across the healthcare system. As a result, SHAARK has also pivoted from a focus primarily on understanding the Shark Tank and early facilitated replication of innovations toward understanding how to help DoE support innovations once they have completed the initial replication stage. The goal is to consider how to help the VHA as a whole take maximum advantage of DoE Promising Practices as it seeks to serve veterans across the country.

| BALANCING THE DEGREE OF EMBEDDEDNESS
Below, we discuss considerations that must be balanced in order to maximize the benefits of embedded research or evaluation. Figure 1 shows five key points of balance that were highlighted through our experience with the SHAARK evaluation.

| Observing what is happening vs being integrally involved with programmatic activities
The SHAARK team has had to navigate the appropriate level of involvement with programmatic aspects of DoE. Through structured observations and semi-structured interviews, evaluators identified challenges experienced by some local implementation teams that could be traced to poor fit of the practice with local conditions. 14 To help address this challenge, the SHAARK team developed a "QuickView" tool to provide easy-to-read summaries and practice comparisons for the finalist practices in the Shark Tank. This tool uses simple, engaging graphics and color coding with easily understandable icons to help guide Shark bid preparations. The "QuickView" was emailed to Sharks nationwide before the Shark Tank, and hardcopies were broadly available during the Shark Tank. Following positive reception of the "QuickView," a second document, a bid "Wish List," was developed that clearly lays out what needs to be in place at facilities to maximize the likelihood of successful practice implementation.
These documents went from being a product of SHAARK to being fully integrated into DoE operations, an example in which implementation science expertise was applied to the development (and assessment of impact) of a pragmatic tool.

| Reporting findings at the project's end vs providing iterative findings and recommendations that contribute to program evolution
The evaluation team's focus on building strong partnerships with stakeholders such as DoE leadership, VHA program offices, VHA facility leaders, and participants in DoE programs has prompted the delivery of frequent reports, recommendations, and discussions. As illustrated above, the provision of iterative findings has influenced DoE's evolution far more than would have been expected from an evaluation that provided a single report after several years' study.

Lessons for calibrating the degree of embeddedness:

Recognize that aims may change. Embedded research/evaluation is not static. While bias can be reduced by following through on initial aims, there are times when aims and methods must be adjusted to maximize the utility of the evaluative effort. The leadership of DoE has turned over since the beginning of SHAARK, and the program is more mature. As a result, evaluation questions have been renegotiated, moving from a primary focus on the operation of the VHA Shark Tank to a focus on the spread of practices across the VHA.

Negotiate appropriate and reasonable expectations. Care must be taken to ensure that scope is appropriate: flexibility is required to be responsive to partner needs without exceeding the bounds of what is manageable for evaluators and researchers. Partners regularly discuss how changes in operational needs and evaluation questions may require changes in expectations related to both deliverables and timelines; in other words, if we add one thing, what other thing may need to be given up or delayed?

Describe how embedded research/evaluation compares to other options. The embedded research/evaluation team should be prepared to describe the value they bring compared to other options the health system may have for evaluation or consultation. The SHAARK team has a close working relationship with the management consultants supporting DoE. This includes clarifying operations, collaborating on information obtained from innovators and projects (eg, survey development and administration), and feeding back results to enhance operations.

Articulate the added value to all partners (operational and researchers/evaluators). The embedded research/evaluation team should be prepared to explain the ways their work provides value to all partners (eg, identification of strategic partnerships for both groups; evaluation team publications and presentations; positive impact on patients when the program meets with success). SHAARK has: (a) collected stakeholder feedback and analyzed data related to specific elements of the DoE process, with a focus on producing information that can be used to make programmatic adjustments; (b) produced operationally focused reports with information that can be incorporated into briefings provided by DoE to their stakeholders; (c) participated in DoE strategic planning; (d) developed specific products such as the "QuickView" described in this paper; and (e) disseminated results through a variety of presentations and publications so that information on the DoE process can be accessed by different audiences.

| Adhering to predetermined aims vs adapting aims in response to evolving partner needs
There has been continued negotiation as to whether and how evaluation aims should adapt in response to evolving partner needs. In many ways, this is the reality of any lasting relationship of any kind where the parties are not required to participate. SHAARK has had an initial grant and two subsequent extensions. Ultimately, the reason we continue to work together is that there is an opportunity to help the VA enhance innovations that can positively impact veterans' lives while enjoying the process of working together. Through this process, we have summarized a number of lessons, described in more detail above, that have allowed for the calibration of the degree of embeddedness. Sustaining such an alliance requires ongoing trust-building and repeated articulation of the value of the alliance to members of the organization. [30][31][32]

Decisions also need to be made about the degree of embeddedness. Partners in the relationship need to decide where they and their relationship will fall on a continuum from "low" to "high" embeddedness. The answer to this question will impact research/evaluation team tasks and involvement in operations, and will influence the productivity of the alliance over time.
The potential rewards of these interorganizational strategic alliances are considerable. Researchers extend their networks to ensure the relevance and applicability of their work within a learning healthcare system; operational leaders benefit from researchers' methodologic expertise in addressing operational challenges.
Overall, the embedded research/evaluation approach that we describe here has helped to inform VHA's novel DoE program in ways that benefit VHA and that can be shared with outside systems as well.
The strong partnership between the SHAARK evaluation team and