Building to learn: Information technology innovations to enable rapid pragmatic evaluation in a learning health system

Abstract

Background: Learning health systems (LHSs) iteratively generate evidence that can be implemented into practice to improve care and produce generalizable knowledge. Pragmatic clinical trials fit well within LHSs as they combine real-world data and experiences with a degree of methodological rigor that supports generalizability.

Objectives: We established a pragmatic clinical trial unit ("RapidEval") to support the development of an LHS. To further advance the field of LHS, we sought to characterize the role of health information technology (HIT), including innovative solutions and challenges that occur, in improving LHS project delivery.

Methods: During the period from December 2021 to February 2023, eight projects were selected out of 51 applications to the RapidEval program, of which five were implemented, one is currently in pilot testing, and two are in planning. We evaluated pre-study planning, implementation, analysis, and study closure approaches across all RapidEval initiatives to summarize approaches across studies and identify key innovations and learnings, gathering data from study investigators, quality staff, and IT staff, as well as RapidEval staff and leadership.

Implementation (Results): Implementation approaches spanned a range of HIT capabilities, including interruptive alerts, clinical decision support integrated into order systems, patient navigators, embedded micro-education, targeted outpatient hand-off documentation, and patient communication. Study approaches included pre-post with time-concordant controls (1), randomized stepped-wedge (1), cluster randomization across providers (1) and locations (3), and simple patient-level randomization (2).

Conclusions: Study selection, design, deployment, data collection, and analysis required close collaboration between data analysts, informaticists, and the RapidEval team.

This work was completed while all authors were at the University of Minnesota.

| INTRODUCTION
Learning health systems (LHSs) are defined by the Agency for Healthcare Research and Quality (AHRQ) as systems in which "internal data and experience are systematically integrated with external evidence, and that knowledge is put into practice,"1 resulting in higher quality care for patients. Pragmatic clinical trials fit well within LHSs as they combine real-world data and experiences with a degree of methodological rigor that supports generalizability.2 However, the intervention being tested within a pragmatic trial must also be tailored to a given LHS setting to maximize success. This includes obtaining input and buy-in from clinical leaders, health information technology (HIT) teams, and other shared service operations, while also incorporating prospective control groups that allow for hypothesis testing.
Leveraging the electronic health record (EHR) to support recruitment, treatment assignment, and intervention deployment in pragmatic trials is gaining popularity.3,4 Multiple pragmatic trials have used a range of technology tools, for example, best practice alerts (BPAs) and workflows in the EHR, demonstrating the potential to make a large impact on quality, cost, and both patient and care team experience.3,4 While pragmatic trials are a promising method for conducting LHS research, they also present unique challenges. For example, a study by Richesson et al. surveyed 20 teams who used the EHR for pragmatic trials to understand the challenges they faced and the solutions they developed.5 The authors found that 55% of projects had difficulties with IT staff turnover and with integration of data from heterogeneous systems. About 50% of teams reported difficulties with utilizing the EHR for intervention delivery. In response to these challenges, Richesson et al. suggested certain prerequisites that healthcare systems and EHRs should meet before conducting pragmatic trials. These included ensuring adequate IT staffing, standardizing data collection and validation, and building flexible, scalable processes that can be adapted to support additional trials in a standard and replicable fashion.5 Initially deployed in late 2021, the Rapid Prospective Evaluation Unit (RapidEval) at the University of Minnesota's Center for Learning Health Systems Sciences (CLHSS) was designed to drive rapid, iterative learning that melds pragmatic trial design with mixed methods to support innovation in healthcare.6 To further advance the field of LHS, we conducted a detailed characterization of the role of HIT, including innovative solutions and challenges that occur, in improving LHS project delivery. Specifically, this report details not only the challenges that arose in our LHS when implementing pragmatic trials in the EHR, but also the IT innovations that were created and implemented to address them.

| METHODS
This study was conducted by the RapidEval Unit at the University of Minnesota's CLHSS in Minneapolis, Minnesota between December 2021 and February 2023. During this time period, there were four "calls for proposals" for RapidEval projects. Calls for proposals were advertised widely among University and CLHSS staff and consisted of an application and an interview. Thus far, there have been 51 applications, of which eight have been selected based on feasibility of implementation within 2-3 months, potential for positive impact on healthcare delivery or health equity, alignment with health system priorities, the low-risk nature of the interventions, and impact on the science of healthcare delivery. Five projects are ongoing, one is in pilot testing, and two are in the planning phase. During the first year of RapidEval, we developed a unique process to evaluate, plan, and implement RapidEval projects (Figure 1).
In order to support rapid-cycle planning and evaluation, we needed to design infrastructure that standardized data collection, validation, and analysis (Figure 2). The general approach was to leverage predefined logic to process raw EHR data into useful categories, including demographics, comorbidities, structured data, and outcomes, which are continually validated for accuracy by the LHS. For each study, inclusion criteria and a definition of Time-0 were created, in collaboration with clinical leadership and stakeholders, to approximate when a patient may be recruited. For example, this could be the start of a hospitalization or outpatient encounter, or when a specific test was ordered. Then, a time horizon (length of time to collect data) and resolution (frequency of sampling and summarizing objective data) were decided. These could be augmented by study-specific process measures and outcomes. As such, manual validation focused primarily on study-specific measures (inclusion criteria, process measures). Data were censored, dropping observations that fell outside of the study horizon.
Data were then collapsed into average, minimum, maximum, and trajectory across predefined time windows (resolution). Finally, the datasets were deidentified using the safe-harbor approach7 and used to support baseline analysis, including estimating baseline event rates, intra-class correlation, and other parameters needed to support study design. All data processing was directed by health system and LHS analysts within the health system's computing environment. This standardized approach to data collection and validation facilitated timely statistical analysis by maintaining a consistent data structure across widely variable projects. Moreover, it supports subsequent structured dissemination, which includes recommendations to the health system, such as to broadly scale the intervention, adapt it, or de-implement it. In addition to public dissemination, this approach supports continued longitudinal data collection to facilitate additional exploratory analysis outside of the scope of RapidEval and internal development of quality measures within the health system.
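The censoring and collapsing steps described above can be sketched as follows. This is a minimal illustration only, not the RapidEval code: the column names, the `collapse_measures` helper, and the window logic are all assumptions made for demonstration.

```python
import pandas as pd

def collapse_measures(raw, time0, horizon_hours, resolution_hours):
    """Censor raw EHR observations to a study horizon anchored at each
    patient's Time-0, then collapse them into per-window summaries
    (average, minimum, maximum, trajectory)."""
    df = raw.merge(time0, on="patient_id")
    # Hours elapsed since each patient's Time-0 anchor
    df["elapsed"] = (df["time"] - df["time0"]).dt.total_seconds() / 3600
    # Censoring: drop observations outside the study horizon
    df = df[(df["elapsed"] >= 0) & (df["elapsed"] < horizon_hours)].copy()
    # Resolution: assign each observation to a sampling window
    df["window"] = (df["elapsed"] // resolution_hours).astype(int)
    df = df.sort_values(["patient_id", "time"])
    grouped = df.groupby(["patient_id", "window"])["value"]
    summary = grouped.agg(average="mean", minimum="min", maximum="max")
    # Trajectory: change from first to last value within the window
    summary["trajectory"] = grouped.agg(lambda v: v.iloc[-1] - v.iloc[0])
    return summary.reset_index()

# Toy example: one patient, 24-hour horizon, 12-hour resolution
raw = pd.DataFrame({
    "patient_id": [1, 1, 1, 1],
    "time": pd.to_datetime(["2023-01-01 01:00", "2023-01-01 05:00",
                            "2023-01-01 13:00", "2023-01-03 00:00"]),
    "value": [10.0, 20.0, 30.0, 99.0],
})
time0 = pd.DataFrame({"patient_id": [1],
                      "time0": pd.to_datetime(["2023-01-01 00:00"])})
summary = collapse_measures(raw, time0, horizon_hours=24, resolution_hours=12)
# The 48-hour observation (99.0) is censored; windows 0 and 1 remain.
```

Keeping every study's output in this one shape (patient, window, summary columns) is what lets a single statistical workflow serve widely different projects.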
Pre-study planning required a collaborative approach between data analysts, informaticists, the RapidEval team, and clinical stakeholders to support rapid-cycle planning and development of interventions that could be deployed in a manner that maximized statistical rigor (Table 1). Challenges associated with these study approaches and interventions included identifying variations in provider behavior to better provide clinical decision support, embedding educational materials into the EHR, and designing a communication platform to best reach patients after surgery, among others (see Table 1).
For each project, a work group was formed including health system leaders, study team leads, CLHSS leadership, IT leadership, and critical stakeholders. Decisions regarding the underlying study design, including the randomization strategy and the specifics of the intervention, were made by this group, which leaned heavily on health system IT leadership and staff. The overall direction of RapidEval, and large decisions including study selection, funding and support, study closure, and dissemination, were overseen by a steering committee composed of CLHSS and RapidEval leadership.
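Two of the simpler allocation strategies used across RapidEval studies, cluster randomization across locations and simple patient-level randomization, might be sketched as below. This is a hypothetical illustration, not the health system's implementation; the function names, seed, and salt are invented for the example.

```python
import hashlib
import random

def assign_cluster_arms(cluster_ids, seed=2022):
    """Cluster randomization: each location (cluster) is randomized once,
    and every patient seen there inherits its arm."""
    rng = random.Random(seed)      # fixed seed => reproducible allocation
    shuffled = sorted(cluster_ids)
    rng.shuffle(shuffled)
    cutoff = len(shuffled) // 2
    intervention = set(shuffled[:cutoff])
    return {c: ("intervention" if c in intervention else "control")
            for c in cluster_ids}

def assign_patient_arm(patient_id, salt="demo-salt"):
    """Simple patient-level randomization: a stable hash of the patient ID
    yields a reproducible 50/50 assignment with no stored state."""
    digest = hashlib.sha256(f"{salt}:{patient_id}".encode()).hexdigest()
    return "intervention" if int(digest, 16) % 2 == 0 else "control"

arms = assign_cluster_arms(["clinic-A", "clinic-B", "clinic-C", "clinic-D"])
# Half of the clusters land in each arm, deterministically.
```

A deterministic, auditable assignment is useful in pragmatic trials because EHR build teams can reproduce the allocation when configuring alerts or order sets per site.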

| IMPLEMENTATION/RESULTS
We found that integrating health system IT stakeholders was critical for all phases of a project, including proposal evaluation, planning, implementation, and analysis. Throughout this process, IT stakeholders highlighted the importance of operational buy-in (Table 2, "Implementation"). IT personnel were instrumental in identifying barriers, specific implementation strategies, and data acquisition approaches. For example, in the telemetry reduction study, the original plan was to conduct a cluster randomized trial; however, this would have required disentangling over 200 order bundles. This approach was not deemed feasible, and an alternative approach was identified. Another example involved the chemotoxicity study. An initial primary measure was the proportion of patients with a dose reduction after initiation of chemotherapy. IT staff highlighted the difficulty of determining this measure given the complexity of chemotherapy data within the EHR.
Other unique innovations used to overcome encountered challenges, detailed in Table 2, included: creating a dedicated research environment that statisticians could access in order to embed analyses into projects; creating a de-implementation framework to end projects efficiently and free up resources; and allowing data to accrue and be shared so that learning continues.
This work highlights multiple important impacts of a practice-embedded pragmatic trial unit within a learning health system.

| CONCLUSION
This case report details important IT innovations that enable rapid pragmatic trials leveraging the EHR. There were three critical components for success. First, direct collaboration with clinical informatics stakeholders is necessary in all phases of study selection, planning, and implementation; this helps ensure that selected projects are actually feasible and are implemented smoothly. Second, standardized data collection with focused validation facilitates rigorous study planning and rapid analysis. We found that maintaining consistent data structures across projects eased statistical analysis and, furthermore, eased dissemination of data and results to the health system and even the public. Finally, fostering local expertise in implementation and structured deployment that utilizes randomization, when possible, eases future study planning (an outcome we hope to see as RapidEval continues in the years to come). IT infrastructure to support pragmatic trials directly supports innovations in patient care with wide reach and rigorous assessment to guide future planning, while simultaneously enabling the production of generalizable knowledge, meeting the aims of a learning health system.
Implementation approaches spanned a range of EHR capabilities, including: interruptive alerts (e.g., asking patients about medication affordability), clinical decision support integrated into both order systems and patient navigators for inpatient and outpatient encounters (e.g., suggesting providers adhere to cardiac monitoring guidelines), targeted outpatient hand-off documentation (e.g., targeting hand-off to include methods to reduce surgical site infections), and patient messaging (e.g., in an app tailored to enable responsible patient opioid use).

FIGURE 1: RapidEval application and implementation process.

FIGURE 2: RapidEval pre-study evaluation and post-study analysis processes.

TABLE 1: Overview of RapidEval projects and associated challenges.
This includes the development of local expertise to create not only a broad range of process improvement efforts but also methods to improve the evaluation of innovations. The focus on structured data collection allowed the creation of multiple validated measures of clinical performance, health equity, and practice variation to support both research and operations. Finally, due to tight integration with health system IT resources, RapidEval projects have led to the creation of a multitude of structured interventions with broad reach and impact on patient care.

TABLE 2: Challenges and information technology innovations in implementing RapidEval projects at each study phase.