Analysis of FRAME data (A‐FRAME): An analytic approach to assess the impact of adaptations on health services interventions and evaluations

Abstract

Introduction: Tracking adaptations during implementation can help assess and interpret outcomes. The framework for reporting adaptations and modifications-expanded (FRAME) provides a structured approach to characterizing adaptations. We applied the FRAME across multiple health services projects and developed an analytic approach to assess the impact of adaptations.

Methods: Mixed methods analysis of research diaries from seven quality improvement (QI) and research projects during the early stages of the COVID-19 pandemic. Using the FRAME as a codebook, discrete adaptations were described and categorized. We then conducted a three-step analysis: (1) calculated the frequency of adaptations by FRAME categories across projects; (2) qualitatively assessed the impact of adaptations on project goals; and (3) qualitatively assessed relationships between adaptations within projects, thematically consolidating adaptations to generate more explanatory value on how adaptations influenced intervention progress and outcomes.

Results: Between March and July 2020, 42 adaptations were identified across seven health services projects. The majority of adaptations related to training or evaluation (52.4%), with the goal of maintaining the feasibility (66.7%) of executing projects during the pandemic. Five FRAME constructs offered the most explanatory benefit for assessing the impact of adaptations on program and evaluation goals, providing the basis for an analytic approach dubbed the "A-FRAME," analysis of FRAME data. Using the A-FRAME, the 42 adaptations were consolidated into 17 succinct adaptations. Two QI projects were discontinued altogether. Intervention adaptations related to staffing, training, or delivery, while evaluation adaptations included design, recruitment, and data collection adjustments.
Conclusions: By sifting qualitative data about adaptations into the A-FRAME, implementers and researchers can succinctly describe how adaptations affect interventions and their evaluations. The simple and concise presentation of information in the A-FRAME matrix can help implementers and evaluators account for the influence of adaptations on program outcomes.


| INTRODUCTION
Not accounting for the influence of adaptations could potentially lead to misinterpreting evaluation findings and drawing erroneous conclusions about program effectiveness.7 Understanding how adaptations affect interventions and their goals is essential for continuously improving systems to ensure that programs are meeting their desired outcomes in the expected way.
One tool for systematically categorizing adaptations made to interventions is the Framework for Reporting Adaptations and Modifications-Expanded (FRAME).4 The FRAME describes 10 constructs that capture the process of creating adaptations (eg, when the adaptation occurred, who was involved in the decision), the types of adaptations (eg, what is modified), and why adaptations were made (eg, goals, reasons). Researchers and implementers have used the FRAME to classify intervention adaptations to help assess intervention effectiveness in new settings8 and to document and describe implementation strategies.9,10 While systematic description of adaptations using the FRAME is helpful, there is greater potential for harnessing the data to enhance the explanatory value of studies. The purpose of tracking adaptations is not only to identify changes, but to understand which changes influenced intervention outcomes and how. However, there is no guidance for how researchers can analyze data about adaptations to meaningfully contribute to interpreting data from implementation and effectiveness studies.
The unique context of the pandemic provided an opportunity to examine adaptations across multiple health services research and quality improvement (QI) projects using the FRAME and to develop a process for analyzing the documented adaptations.

| Aims and research questions
The aim of this study was to apply the FRAME to characterize adaptations made to health services projects within one academic research center during the early stages of the COVID-19 pandemic and to develop a process to analyze data about adaptations. Our specific research questions were:
1. How were interventions and their evaluations modified in response to the COVID-19 pandemic?
2. How can information captured in the FRAME be utilized to help evaluators make sense of the impact of adaptations on intervention and evaluation outcomes?
Following our analysis, we introduce a novel approach for analyzing the impact of adaptations captured using the FRAME.

| METHODS
We conducted a mixed methods analysis of the project adaptations captured in research diaries during the early stages of the COVID-19 pandemic.

| Sample
We collected research diaries from active QI and research projects within one health services research unit in an academic medical school. As our aim was to identify adaptations made due to the COVID-19 pandemic, we only included projects which were either: preparing for implementation, preparing to collect data, or collecting data. We excluded projects in the analysis or reporting phases. Seven projects were considered active and included in our analysis.

| Data collection
We created research diaries for each project to record daily and/or weekly observations of project events in Microsoft Excel. Entries included the date of the event, a summary of the event, whether the event related to project implementation or evaluation, how the researcher was made aware of the event (eg, email, meeting, observation, or interview), and a link to the information for reference. The research diaries were created specifically to align tracking practices across projects. These research diaries formed the basis for abstracting data related specifically to adaptations.
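The diary fields described above can be sketched as a minimal tabular schema. This is an illustrative sketch only: the field names and helper functions are assumptions, not the actual column headers of the original Excel diaries.

```python
import csv
import io

# Illustrative research-diary schema; field names are assumptions,
# not the projects' actual Excel column headers.
DIARY_FIELDS = ["date", "event_summary", "relates_to", "source", "link"]

def new_entry(date, event_summary, relates_to, source, link=""):
    """Create one diary row; relates_to is 'implementation' or 'evaluation'."""
    if relates_to not in ("implementation", "evaluation"):
        raise ValueError("relates_to must be 'implementation' or 'evaluation'")
    return dict(zip(DIARY_FIELDS, (date, event_summary, relates_to, source, link)))

def write_diary(entries):
    """Serialize diary entries to CSV text (a stand-in for an Excel sheet)."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=DIARY_FIELDS)
    writer.writeheader()
    writer.writerows(entries)
    return buf.getvalue()

entry = new_entry("2020-04-03", "Recruitment paused due to shelter-in-place order",
                  "evaluation", "email")
print(write_diary([entry]).splitlines()[0])  # prints the CSV header row
```

Constraining the relates_to field mirrors the diaries' distinction between implementation and evaluation events, which later becomes the split between intervention and evaluation adaptations.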
Project teams nominated one researcher, all of whom are authors on this paper, from each project to identify adaptations within project diaries and extract adaptations into an Excel matrix for further analysis (referred to as "FRAME project matrices"). The nominated researcher was part of the QI or research project team and thus considered knowledgeable in deciding whether a project change was an adaptation or not. To guide the identification of adaptations, we used Wiltsey-Stirman and colleagues' definition of modifications and adaptations: "Modifications can include adaptations, which are planned or purposeful changes to the design or delivery of an intervention, but they can also include unintentional deviations from the interventions as originally designed" (Wiltsey-Stirman et al, 2013, p. 2).5 For simplicity, we refer to all changes as "adaptations." Researchers also referred to the FRAME codebook (available online: https://med.stanford.edu/fastlab/research/adaptation.html) to guide decision making, as this provided further information on the different elements of adaptations that might be considered. Where researchers were not sure whether to classify an event as an adaptation, they discussed this with the project implementers, when possible, to arrive at a final decision.
Using Wiltsey-Stirman and colleagues' 20194 and 20135 articles, the FRAME figure outline (Figure 1), and the FRAME coding manual, each nominated researcher categorized the adaptations for their respective project using the categories within each FRAME construct and provided a brief explanation for the categorization. When an adaptation did not fit into any of the existing FRAME categories, the researcher created a new category and provided a description.
The researchers representing all seven projects met biweekly during the 4-month period to discuss new project adaptations and align on use and understanding of the FRAME categories. We found some inconsistencies between the figure and manual; these discrepancies were resolved through group discussion. Each row in the FRAME project matrices represented one adaptation, and each FRAME construct was a column heading. Additional columns were added to facilitate cross-project analysis: project name, adaptation name, and a description of the original project component so researchers external to the project could interpret the adaptation.
We also included a column to capture the researcher's assessment of the short-term impact of the adaptation (Rabin et al6), and an open category for researchers to note information not represented in the FRAME (see Table S1 in supplemental material). FRAME project matrices were reviewed at two timepoints by HZM and LMH for reliability in researcher understanding and completeness.
Any discrepancies were discussed with the project researcher to ensure similar descriptions were categorized in the same way, working together to clarify or recategorize the adaptation.

| Data analysis
Our analysis process consisted of three steps.
In step 1, we used Excel to calculate the frequency of adaptation categories by FRAME construct across all projects to see the range of FRAME categories represented by the identified adaptations.
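Step 1's tally is straightforward to reproduce programmatically. The sketch below uses Python's standard library in place of Excel; the construct names and rows are invented illustrative data, not the study's actual adaptations.

```python
from collections import Counter

# Each dict is one row of a FRAME project matrix, reduced to its coded
# category per construct (illustrative data, not the study's matrices).
adaptations = [
    {"project": "A", "what_is_modified": "training/evaluation", "goal": "feasibility"},
    {"project": "A", "what_is_modified": "content", "goal": "feasibility"},
    {"project": "B", "what_is_modified": "training/evaluation", "goal": "fit"},
    {"project": "B", "what_is_modified": "context", "goal": "feasibility"},
]

def frequency_by_construct(rows, construct):
    """Count how often each category of a FRAME construct appears across projects."""
    return Counter(row[construct] for row in rows)

counts = frequency_by_construct(adaptations, "what_is_modified")
print(counts["training/evaluation"])  # 2
```

The same helper can be run per project or across all projects, matching the per-project and total columns the study reports.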
In step 2, we reviewed adaptations within projects individually to qualitatively assess the impact adaptations had on project goals. To assess impact, we examined the categories for each adaptation within projects in relation to the Rabin et al6 construct of perceived short-term impact, including impacts to reach, adoption, and implementation, and the FRAME construct "relationship to fidelity": whether the adaptation preserved or altered the intervention's core elements or functions. The researcher used their depth of knowledge about the intervention to determine whether the adaptation had any short-term impact or impact on fidelity. We looked for patterns in researchers' descriptions to identify whether information captured in any construct provided more or less explanation as to whether an adaptation was rated as having an impact or not; for example, whether a change in the content vs a change in the format was driving the impact on the intervention as a whole.
In step 3, we looked to see whether adaptations within projects were linked or related, and thus could be consolidated into one adaptation with more explanatory value for the project. For example, Project F was a national, multi-site study where multiple sites reported shifting recruitment of patients from in-person to telephone, so the individually reported recruitment adaptations were consolidated into one adaptation. In assessing the cumulative impact of adaptations, we merged and refined the FRAME constructs into the conceptual domains of what was modified, how it was modified, why it was modified, and whether the core function was altered (ie, impact).
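The consolidation in step 3 can be sketched as folding several linked adaptation records into one row keyed by the four conceptual domains. The field and function names here are hypothetical, not the published A-FRAME matrix headers.

```python
from dataclasses import dataclass

# One consolidated A-FRAME row: the four conceptual domains named in the
# text plus the original-plan summary. Field names are illustrative only.
@dataclass
class AFrameRow:
    original_plan: str
    what_modified: str
    how_modified: str
    why_modified: str
    core_function_altered: bool

def consolidate(adaptations, original_plan, core_function_altered):
    """Merge linked adaptations (eg, the same change reported by several
    sites) into one row with more explanatory value."""
    return AFrameRow(
        original_plan=original_plan,
        what_modified="; ".join(sorted({a["what"] for a in adaptations})),
        how_modified="; ".join(sorted({a["how"] for a in adaptations})),
        why_modified="; ".join(sorted({a["why"] for a in adaptations})),
        core_function_altered=core_function_altered,
    )

# Mirrors the Project F example: several sites reported the same shift.
site_reports = [
    {"what": "recruitment", "how": "in-person to telephone", "why": "pandemic"},
    {"what": "recruitment", "how": "in-person to telephone", "why": "pandemic"},
]
row = consolidate(site_reports, "In-person recruitment at multiple sites", False)
print(row.how_modified)  # in-person to telephone
```

Deduplicating with a set before joining is one simple way to collapse identical reports from multiple sites into a single succinct description.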

| RESULTS
Table 1 describes the seven projects included in our analysis. Four projects evaluated local level QI initiatives within one hospital system.

| Analysis of FRAME data: assessing the impact of adaptations within projects (steps 2 and 3)

During step 2, we found that the FRAME constructs that provided the most explanatory details as to why an adaptation was rated as having an impact or not on the project were "What is modified," "What is the nature of the content modification," "What was the goal," "Reasons," and "Relationship to fidelity." The remaining FRAME constructs did not offer any explanatory information for how project goals and core function were impacted. In addition, we found that an essential starting point for assessing the impact of an adaptation was first understanding the original plan, along with the perceived short-term results of the adaptation.
Step 3 combined adaptations within projects which appeared to cumulatively contribute to the same identified impact on a project. This process of synthesizing discrete adaptations into a more complete picture of the nature and impact of the change led us to refine the presentation of the FRAME constructs into what we call the analysis of FRAME data (A-FRAME) matrix, which can also be used to succinctly summarize adaptations and their impacts for use by implementers and researchers (see Table 4).
Sifting FRAME data from the project matrices into the A-FRAME, we identified three thematic groups for the types of adaptations made to the seven projects: (a) project discontinuation, (b) intervention adaptation, and (c) evaluation adaptation. As shown in Table 4, the 42 unique adaptations presented in Table 2 were consolidated into 17 succinct adaptations representing 12 distinct groupings in the A-FRAME matrix within the three thematic groups, described below.

| Project discontinuation
The thematic grouping project discontinuation was used to identify a project that was discontinued and its resources reallocated for a different purpose. Two QI projects (C and D) were deprioritized in favor of COVID-19-related projects. In both instances, the only modification was the decision to stop the project, bringing both the intervention and evaluation to an end. Since the projects were not completed, the intended goals and outcomes of the projects were not realized.

| Intervention adaptation
Changes related to the intervention being implemented were classified under the intervention adaptation thematic group, comprising changes to staffing, training, and delivery. Additional intervention adaptations that did not alter core function were changes made to overall intervention delivery timelines or new safety protocols. For example, Project F's intervention was to provide palliative care to upper gastrointestinal cancer patients scheduled to undergo surgery, but because surgeries were postponed, the corresponding intervention was also paused. The intervention itself and the way it was delivered remained unchanged.

| Evaluation adaptation
As noted above, the FRAME category training and evaluation was divided into two categories, with evaluation adaptation becoming a stand-alone thematic group as evaluation was perceived to be separate from the intervention.Evaluation adaptations included changes to study recruitment, data collection, and study design.
All evaluations used mixed methods, but the one project where the quantitative approach was modified was the only evaluation modification regarded as altering the core function. Project B's evaluation changed from a design assessing effectiveness with a comparison group to an observational study, which was perceived to be a critical change to what stakeholders would be able to conclude from the evaluation. The quantitative components of all other active projects remained unchanged, and the adjustments to the qualitative approaches were deemed not to alter the core function. In fact, the adaptations to the qualitative methods allowed data collection to continue, not only to assess the implementation and effectiveness of interventions but also to document effects of the pandemic that would otherwise be missing if only quantitative data were gathered.
Commonly reported evaluation adaptations included changing the mode of data collection to remote methods (eg, telephone interviews) and pausing recruitment efforts, which typically corresponded to pauses in intervention delivery. Such changes were deemed not to impact the evaluations' goals because data were intended to be triangulated and used with other data sources, and evaluations were able to continue, or continued after a delay. Some projects (A, E, G) added questions to their data collection tools (ie, interview guides, surveys) to explore the impact of the COVID-19 pandemic on their study population. This was perceived to add value by exploring the context of implementation during the pandemic and thus was seen to enhance the explanatory value of evaluations.

| DISCUSSION
The FRAME is comprehensive in capturing the who, what, when, how, and why of project adaptations, but guidance on how to organize that information to understand the impact of changes on projects is missing.6 We described an analysis process used across seven health services QI and research projects to make meaning out of systematically captured data on adaptations; we called the resulting analytic framework the "A-FRAME." This process identified that not all FRAME constructs and categories are necessary for assessing the impact of adaptations on projects and that some adaptations within projects could be combined.
When Haley and colleagues16 utilized the FRAME to describe adaptations of implementation strategies, the authors similarly included and excluded FRAME constructs. We applied a similar strategy and demonstrated that the inclusion and exclusion of selected FRAME constructs during the analysis process worked across seven different projects. Additionally, we incorporated a description of the original program or evaluation to better inform how the plans were impacted, which Miller and colleagues10 also included in their framework for recording adaptations and modifications for implementation strategies. Conceptually consolidating the constructs into the A-FRAME matrix domains of what, how, why, and impact on core function, along with a project summary (as described and shown in Tables 3 and 4), reduced the cognitive burden of information on adaptations, making it easier to explore and assess the impact of those adaptations on each project.
Comparing the impact of COVID-19 on the evaluation plans of seven different projects was illuminating in terms of the value of mixed methods for evaluations. In particular, the inclusion of qualitative approaches appeared to add value to projects, as most were able to pivot their data collection strategy to incorporate data exploring the impact of the pandemic on the intervention and to understand changes in the organizational context of implementation. Qualitative research methods allow for responsive and flexible data collection and help present a more complete and nuanced understanding of program outcomes and implementation.17,18 We found that quantitative designs were less flexible and less able to capture the dynamic nature of the pandemic and its influence on implementation and outcomes.
Qualitative evaluation methods in our sample were more amenable to adapting to the unexpected events that arose during the pandemic.
Qualitative designs may therefore increase learning opportunities in research or QI19 and enable project viability when unexpected hurdles arise.18 In our sample, two of the four QI projects were brought to an early end by the COVID-19 pandemic. This is perhaps unsurprising, as health care systems quickly shifted focus and resources to prepare for surges and the rapid implementation of new policies and procedures.20,21 QI projects are meant to be continuous, iterative, and flexible,22,23 so when the pandemic struck, projects that did not directly address the most urgent health system concerns at the time had their funding and resources reallocated to more immediate needs. While this may seem negative, it likely indicates the flexibility of QI to adapt to the needs of the health system. This is in contrast to the research initiatives, which, though they did ultimately continue as planned, experienced delays due to health systems focusing on managing the COVID-19 pandemic.

| Limitations
We recognize that evaluation adaptations made up a large proportion of the adaptations captured. While we do not know if this is more than other evaluations ongoing at the same time experienced, it is likely a result of having researchers who are attuned to evaluation processes document evaluation and intervention changes. We attempted to capture adaptations in real time, rather than through retrospective interviews as might more often be the case with studies on adaptations (eg, 8,24). This process may have identified more adaptations than if we had waited until the end of the study period, though our third analysis step of consolidating linked adaptations aimed to balance the salience of events due to recency with the importance of changes. We also experienced variability in opportunities to identify adaptations related to the cadence of communication between researchers and implementation teams. Some project researchers worked more closely with implementers, such as the partnered QI projects where researchers and implementers met weekly or more often. For other projects, such as one of the national studies, contact between researchers and implementers was less frequent. We therefore perceive that there may be variation between projects in the depth of knowledge about intervention changes due to the opportunistic observational approach used. The identification of adaptations was determined by individual researchers, and although the definition and adaptation categorization were reviewed and discussed, the nominated researcher on each project team made the final decision of whether an event was an adaptation or not. Ideally, intervention implementers would assist with tracking and/or reviewing adaptations. Adaptations were only systematically tracked during the first wave of the pandemic, and subsequent adaptations would likely provide further insight into project outcomes. Although the projects are from one academic research institution, they reflect local, regional, and national experiences of trying to implement different programs during the COVID-19 pandemic. While the specific adaptations may not be generalizable, the process developed to capture and make sense of adaptations could be utilized by researchers in other settings.

| CONCLUSION
The A-FRAME analytic approach proposed in this paper may provide a useful framework for those conducting QI, evaluation, or research projects who wish to assess how adaptations in their projects may be interpreted in understanding project progress and outcomes. This analytic process enabled us to determine the individual impact of adaptations and to consolidate related modifications to arrive at information which is easily digested and usable by implementers and researchers alike. Adaptations that impact the outcomes of projects are an inevitable part of implementation and evaluation processes, and the A-FRAME offers a practical approach for making sense of those changes.

F I G U R E 1 The framework for reporting adaptations and modifications-expanded (FRAME), from Wiltsey-Stirman et al 2019.4

The FRAME's training and evaluation category was separated, with training regarded as a strategy to improve intervention uptake or a component of interventions, and thus a better fit with 'intervention' as a concept than with evaluation. Adaptations that restricted the hiring and training of care providers impeded the upskilling and capacity of the team to deliver the intervention, and thus were determined to impact implementation and the ability to deliver the intervention successfully. For example, Project G used a learning collaborative to teach dialysis providers to identify seriously ill patients and have quality goals of care discussions with them, but the pandemic precluded participants from traveling for the final learning session, so training was adapted to a virtual format, resulting in a reduction in attendance. In addition, the session attended to the effects of COVID-19 on participating centers rather than the originally planned content, omitting training content to support goals of care discussions, a core function of the intervention. Shifts to telehealth were considered adaptations to intervention delivery, which had mixed impacts on projects. For Project A, since the purpose of the intervention was to provide in-person home care, the adjustment to telehealth altered the core function and goal by stifling the amount of care provided. Furthermore, new patients could not be enrolled, further limiting care access for homebound older adults. However, Project E's shift to telehealth, although challenging, allowed the provision of palliative care, the core function and goal of the intervention, to continue.
T A B L E 1 Project descriptions.
T A B L E 2 Frequency of adaptations across seven projects (A-G) coded to FRAME categories.

Table 2 presents the frequencies of adaptations by FRAME categories by project and the total across projects. Here, we describe the categories of highest frequency per FRAME construct. Most adaptations occurred during implementation (33, 78.6%) and were unplanned/reactive (28, 66.7%). Decisions to make adaptations often involved researchers (20, 47.6%), administrators (17, 40.5%), and/or intervention developers (13, 31.0%), while those whom the adaptations affected were primarily at the clinic or unit level (13, 31.0%) or target intervention recipients. The goal of most adaptations was to improve the feasibility (28, 66.7%) of executing projects. Unsurprisingly, the reason most adaptations were made was the pandemic (20, 47.6%) and resulting mandates (8, 19.0%) (ie, shelter-in-place orders), although this was not previously represented in the FRAME; consequently, the categories of pandemic or other crisis and emergent/new mandates were added to the sociopolitical reasons. Step 1 also indicated that some of the detailed categories were not used, but that those that were selected often appeared multiple times in a project, and single adaptations often had multiple categories selected. Adaptations rated as having an impact were those that altered the ability to deliver the intervention as intended (eg, missed trainings) or altered the methodological strength of evaluations.

Six adaptations, which all related to projects stopping or pausing activity, did not fit into any of the FRAME's four "What is modified" category types; we therefore added a fifth category: project stop or pause. Projects C and D, both QI projects in the pre-implementation phase, stopped altogether due to organizational decisions to focus efforts on pandemic relevant projects. Project C abandoned their plans to reduce operating room case cycle time and instead shifted to testing telehealth visits for surgical cases. Project D stopped the development of a tool to identify patients appropriate for advance care planning to focus on an algorithm to detect clinical deterioration.

Table 3 describes steps 2 and 3 in the development of the A-FRAME matrix, including which FRAME constructs were not included, merged, or refined, and new concepts added. The A-FRAME is the analytic approach used for analyzing the data captured by the FRAME to assess the impact adaptations have on a project's goals and function.

T A B L E 4 Exemplar use of the analysis of framework for reporting adaptations and modifications-expanded data (A-FRAME) matrix for seven projects.