
Piloting a Nationally Disseminated, Interactive Human Subjects Protection Program for Community Partners: Unexpected Lessons Learned from the Field

Erratum

This article is corrected by: Corrigendum to “Piloting a Nationally Disseminated, Interactive Human Subjects Protection Program for Community Partners: Unexpected Lessons Learned from the Field,” Volume 7, Issue 3, p. 285, first published online 30 May 2014.

Abstract

Funders, institutions, and research organizations are increasingly recognizing the need for human subjects protections training programs for those engaged in academic research. Current programs tend to be online and directed toward an audience of academic researchers. Research teams now include many nonacademic members, such as community partners, who are less likely to respond to either the method or the content of current online trainings. A team at the CTSA-supported Michigan Institute for Clinical and Health Research at the University of Michigan developed a pilot human subjects protection training program for community partners that is both locally implemented and adaptable to local contexts, yet nationally consistent and deliverable from a central administrative source. Here, the developers of the program and the collaborators who participated in the pilot across the United States describe 10 important lessons learned that align with four major themes: the distribution of the program, the implementation of the program, community participation in the program, and the content of the program. These lessons are relevant to anyone developing or improving a training program that is created centrally and intended for local implementation.

Introduction

The challenge of Institutional Review Board (IRB) review of community-engaged research has been widely recognized.[1-4] Optimal human subjects training programs in general, and for community partners specifically, need to be locally responsive, skills-based, and face-to-face.[5-7] In response to this need, many universities throughout the country are requiring their community partners to take online training programs either identical or akin to those required of academic researchers.[8] Others recognize the irrelevance and inaccessibility of these trainings for community partners and are creating training programs specifically designed for their local contexts and their community partners.[9, 10]

This is one story of the latter approach, which took place at the University of Michigan. At its CTSA-funded institute, the Michigan Institute for Clinical and Health Research (MICHR), a research ethicist (Solomon) in the Clinical Research Ethics Core and a community liaison (Piechowski) in the Community Engagement Core worked with community partners and their local IRB administrators to develop a human subjects training program for the thriving community partnerships with MICHR that were underserved by existing training programs.[11] Upon receiving positive feedback on the initial training and discussing it with colleagues and community partners throughout the country, the developers decided to refine and adapt it into a package that could be distributed nationally while still being implemented locally. The two initial developers partnered with other CTSA cores at MICHR (the Education Core and the Evaluation Core) to create an adaptable and distributable training program as well as accompanying evaluation tools. Together, this group is referred to as the “developers” of this program.

The purpose of this novel program was to combine the optimal characteristics of both locally delivered programs (i.e., context-dependent, face-to-face, and interactive) and national online programs like the Collaborative Institutional Training Initiative, or CITI (easily accessed, electronically delivered, consistent across institutions, and affordable).1 While human subjects protections programs at most academic institutions may not involve training community partners, any nationally created program can learn from the lessons and challenges of bringing a program that satisfies both of these conditions to life.

A key component of this delivery process was engagement and feedback from the collaborators at each of the CTSA sites that implemented the training program. While the program included an evaluation completed by facilitators, the developers gained even more insight by partnering with the site collaborators, who included facilitators, coordinators, and community partners. Building upon insights from the initial facilitator feedback, the developers and collaborators were able to glean broader lessons rather than relying solely on evaluation documents. In the spirit of community engagement, the developers of the training program and the collaborators from the sites who facilitated and implemented it articulate here the 10 key lessons learned from implementing this novel model of training. These lessons center on four major themes: the distribution of the program materials, the implementation of the program, garnering community participation in the program, and the content of the program (Table 1).

Table 1. Overview
Theme: Distribution
  Lesson 1: Transfer materials efficiently to institutions and end users
  Lesson 2: Consolidate and summarize information
Theme: Implementation
  Lesson 3: Implementation requires practice
  Lesson 4: Timing is unpredictable, so training schedules should be flexible
  Lesson 5: Communication should be constant and consistent
  Lesson 6: Provide supplemental materials online
Theme: Community participation
  Lesson 7: Secure “buy-in” from local contexts
  Lesson 8: Coordinate and integrate the program with existing local practices
  Lesson 9: Choose facilitators wisely
Theme: Content
  Lesson 10: Well-supported activities are crucial

A total of 12 collaborators from six different CTSA sites contributed to this manuscript; they are hereafter referred to simply as “collaborators.”2 Their contributions are reflected indirectly in the text and directly through italicized quotes below. They were:

  • Site #1: University of Rochester: Gail Newton and Sherita Bullock,
  • Site #2: Indiana University: Jere Odell and Emily Hardwick,
  • Site #3: University of Cincinnati: Lori Crosby,
  • Site #4: University of Minnesota: Andrea Leinberger-Jabari,
  • Site #5: Medical College of Wisconsin: Zeno Franco, Ryan Spellecy, and Samuel Holland,
  • Site #6: University of Michigan: Karen Calhoun, Adam Paberzs, and Brenda Eakin.

Lessons Learned

Theme 1: Distribution

Lesson 1: Transfer materials efficiently to institutions and end users

All materials for this program were provided in an electronic format and distributed online. The developers deemed this the most expedient and economical way to provide materials to a large number of geographically diverse pilot test sites. Unfortunately, it was not the most convenient method for many collaborators, as many organizations have limited administrative support for distributing materials locally.

“Some time and expense would have been saved (on our side, at least) if the materials had been printed, packaged, and mailed to us. At the very least, we would have appreciated fewer digital files. This would have made the task of printing a bit easier.” (Site #2, also mentioned by Site #6)

The developers chose to transfer materials electronically because it is less expensive and faster than distributing them in print and allows program content to be updated in a timely and efficient manner. However, materials should be designed to allow for a variety of distribution methods, thus improving the availability and acceptability of the program to collaborators. Suggestions included mailing either paper copies of the materials or a CD containing them.

Lesson 2: Consolidate and summarize information

Although the developers designed all materials for this program to be self-explanatory and user-friendly, collaborators thought they could have been more concise. Facilitator Guides for each module and an Implementation Manual were available in either Word or PDF format. In addition, the videos included in each module were provided as separate files. The developers chose this format so that information would not be presented redundantly. However, having information related to each module spread across multiple documents and files created difficulty for some facilitators.

“Both the Facilitator Guide and the Implementation Manual were comprehensive and very helpful. However, it was difficult to match sections between the two.” (Site #1)

Combining information from the Implementation Manual and Facilitator Guides into one document would increase the clarity of information, improve the flow of the process, and increase flexibility in the presentation of program materials. In addition to consolidated materials, collaborators asked for “cheat sheets” that provided critical information in a one-page format. Other requests included (1) having a materials list for all modules on one sheet; (2) creating an agenda that specifies the time expected for each activity to be used by both facilitators and participants; (3) developing a short document that outlines the background, purpose, and expectations of the program to be used for recruiting facilitators and communicating with the IRB; and (4) providing more information regarding the IRB's roles, responsibilities, and limits. In addition, facilitators asked for certificates of completion and thank you letters for participants.

Theme 2: Implementation

Lesson 3: Implementation requires practice

One of the unique characteristics of this program was that it involved a combination of advanced online technology and “old school” physical space. While most of the materials needed for the training did not require the use of technology (all that was needed were flip pads, markers, signs, etc.), use of the prerecorded lectures (which was optional) required Internet access and audiovisual equipment. As a result, several collaborators experienced technical difficulties on the day they delivered the program. Several recommendations resulted from these challenges.

“Test all technology onsite. Don't expect things to work even if you test them out at another location.” (Site #1)

“Make sure you have a backup for the technology. We tested it, but still had challenges during the training. Luckily we had a backup system available until the original system was fixed.” (Site #3)

“We would recommend that trainers practice each module from start to finish before implementing the workshop. There are a lot of transitions required (e.g., from group activities, to the power point, video, etc.).” (Site #3)

Lesson 4: Timing is unpredictable, so training schedules should be flexible

Collaborators identified issues with the timing of activities within the program. While each module was designed to be delivered in an hour, some collaborators reported that the developers had significantly underestimated the time needed to complete required components. The role-play activity (Module 3) in particular consistently took longer to execute than had been planned.

The great variability between sites demonstrates that the same activities can take very different amounts of time depending on the facilitators, the size of the group, or other local factors such as community partners’ skills and experience engaging with universities. Several recommendations resulted, the most important of which was to build in more time than the developers envisioned.

“Trainers should build in an extra 10 minutes per module.” (Site #3)

Another option is to create alternative and/or modified activities so facilitators can choose the activities that fit with the time they have available. For example, if one activity takes 10 minutes longer than anticipated, facilitators can use a shortened version of the next activity to stay on time.

Lesson 5: Communication should be constant and consistent

While collaborators praised the developers for their availability and helpfulness, they nonetheless felt that communication was conducted primarily on an ad hoc basis. One notable exception was the train-the-trainer webinars. This is an important lesson, as collaborators reported needing consistent communication, especially at the beginning of the project. They wanted to know what expectations they should have about program materials, the amount of preparation time required, and the types of work required.

“Having more instructions for everything and laying it all out up front would have been more helpful… Ongoing, consistent communication with the sites would be very helpful in the future.” (Site #2)

Depending on the amount of resources available, this need could be met in various ways. If full-time staff are part of the training administration, then weekly or biweekly communication with collaborators as they go through the process of setting up, training themselves, and facilitating the trainings would be very helpful. If this is not possible, as is the case for the developers’ own program, alternative methods are required. The developers are currently creating a website with all the materials laid out clearly and up front, along with a list of frequently asked questions (FAQs) and, most importantly, a platform for submitting questions on an ongoing basis. A single staff person can then cull the questions each week and respond.

Lesson 6: Provide supplemental materials online

While collaborators had problems accessing materials electronically, they nevertheless requested that the developers use technology to provide supplemental information more effectively. Many of the implementers requested a website with FAQs, a calendar or timeline, updates about next steps, and backup and background materials. While a website was not available for the initial program, setting one up and providing support for managing materials and keeping track of updates on a consistent basis is a low-burden, low-cost option.

Theme 3: Community participation

Lesson 7: Secure “buy-in” from local contexts

While human subjects protections programs at most academic institutions may not involve training community partners, any nationally created program can learn from the lessons of building trust and buy-in in local contexts. Without such trust, both participation in and commitment to a curriculum will be lacking. One method to overcome these challenges is to secure “buy-in” from leaders in the local community, which can include leaders of the community where the research partners are located and leaders at the local academic institution (Office for Research, IRB chairs and staff, etc.). The more people on board from the outset, the more the program can be adapted with examples relevant to local contexts and needs. Indeed, the difficulty of implementing a training program like this can depend greatly on the level of buy-in from local institutions.

“We developed a letter introducing the Training and inviting community members to participate. The letter was signed by the community member, who was facilitating the workshop, and who is well known and connected in the community. This definitely helped provide credibility and encouragement for community members to sign up.” (Site #2, developed by Site #1 as well)

IRB buy-in is also important. Several of the collaborators worked with their IRBs from the beginning of the process and invited them to attend the training itself. This collaboration increased the likelihood of having the training program endorsed and recognized by their local IRBs.

Lesson 8: Coordinate and integrate the program with existing local practices

When potential participants were personnel of a community organization, collaborators found it helpful to coordinate implementation of the workshop with that organization’s needs and capacities.

“Working with a community organization to co-facilitate and recruit participants was definitely a plus… We relied on our community partner to help us in determining the day, duration and location for the training.” (Site #4)

Collaborators also found it helpful to integrate the new training program into existing programs that involve academic-community partnerships, such as pilot research programs.

“We have . . . conducted the workshop multiple times now with community-academic research teams that have received CTSA pilot funding. . . These workshops were especially useful because there were opportunities to discuss the information in-depth through direct application to their existing project and work through specific potential challenges and strategies. We encouraged pilot teams to bring their consent form and any specific IRB issues/questions to the workshop.” (Site #6)

“We integrated the training into the Community Leaders Institute, a 6-week research training for community partners and a small grant to carry out a research project.” (Site #3)

Any train-the-trainer program would benefit from integration into existing funding or training programs, as well as into the IRB review and oversight process at local universities. Without such integration, the new training program must be completed “in addition to” what the university requires, increasing the burden on community partners. (Site #2)

Lesson 9: Choose facilitators wisely

For the pilot program described here, we intentionally left it up to individual site collaborators to select facilitators. This yielded a great diversity of facilitators: some were experts in research ethics, some IRB staff, some community leaders, and some a combination of these. Facilitators ranged in education level from some college to Ph.D.s, and most had worked with the participants previously.

Collaborators identified key qualities that facilitators should possess. First was experience, either with research or with community work, and ideally with both. Facilitators who had spent time “in the trenches” conducting community-engaged research and facing the types of ethical dilemmas brought up by those participating in the workshop were best able to lead and guide the discussions.

“Community Co-Facilitator reputation, relationships, and knowledge of local area added to the quality of the presentation through specific historical/local/cultural examples that increased participant interest and engagement during the workshop in ways academic co-facilitator could not.” (Site #6)

While the developers provided sufficient information in the program materials (Facilitator Guides and Implementation Manual) for facilitators to use even without their own expertise, the materials were intended to be a guide supplemented by local expertise. To that end, collaborators chose facilitators who were research ethics specialists, IRB staff, or community-based researchers themselves. This allowed facilitators to draw on their own experiences and expertise to enhance and complement the content of the training program.

In addition, collaborators found that leading the training program was much easier with two or more co-facilitators. In this way, one person could lead discussions while another assisted with materials and technology and kept the program moving as scheduled.

“Our site had the benefit of three people to prepare, assist and deliver the training. I think, at the very least, it is a two person job. One person can do the recruiting and convening, but the delivery of the training (which includes props and activities) works best with someone to do most of the talking and another person to keep things moving.” (Site #2)

In the spirit of university-community partnership, implementers found that the ideal facilitation model was a team of two co-facilitators, one with research ethics experience (either IRB staff or research ethics scholars) and one with community-engaged research experience. This type of team offers many benefits, including enhanced buy-in from both the university and the community, broad expertise in the ethics of both research and community engagement, and fruitful power sharing between the two worlds that is then reflected in the training program itself.

Theme 4: Content

Lesson 10: Well-supported activities are crucial

Collaborators appreciated the numerous activities in the training program and found them central to the participants’ learning process. The role-play was the most crucial and time-consuming activity (a module in itself), and while most sites enjoyed it, they found that some participants needed more guidance than the background information the developers supplied.

“Our trainees really enjoyed the interactive portions of the session. Some of these, however, were a bit unsettling to facilitate. For example, we had no idea in what direction the participants would take the role play. As it turned out, the role play was very successful; the participants seemed to enjoy it and it gave us plenty to talk about and to share.” (Site #2)

The developers expected the participants to improvise based on some basic information, but collaborators reported that improvisation was a skill that not all participants possessed or were comfortable demonstrating.

For activities that require a high level of participation, it is important to provide extensive support to facilitators. This support can include optional scripts and prompts so those who are not comfortable improvising can still participate, as well as videos of the activities taking place so facilitators can see the activities before having to lead them.

Conclusion

This training program manifests a novel combination of national distribution and local delivery. Piloting this approach provided many lessons, but perhaps the most important was that delivering training programs locally and face-to-face yields numerous unexpected benefits. Although the developers anticipated and sought to measure increases in knowledge, satisfaction, and skills, they did not anticipate the ancillary benefits of this training. Several collaborators reported that the experience improved participants’ and facilitators’ understanding of the roles and responsibilities of other players in the research process, such as the IRB, the Office of Research Administration, regulatory staff, and academic/community partners.

“For us, one of the major unanticipated benefits was having an IRB chair from the Medical College present and discuss the materials. Especially during [one] activity… [he] was able to engage the participants in thinking through not just the principles themselves but the need to balance them. I think for some of the community partners, this was the first time they really began to see what the IRB is there to do, and because there was a representative from the IRB present who also “speaks” community engagement, the IRB was perceived as less of an impersonal set of hurdles, but a “someone” with whom a relationship could be formed.” (Site #5)

A second unanticipated benefit was the networking opportunities provided by face-to-face interactive training. Participants who were members of different community organizations and academic disciplines were able to meet and bond, and many voiced intentions for future collaborations.

“We agree that the networking was an unanticipated benefit. Some of the community partners connected with others they felt had more experience dealing with ethical issues and/or examples of ethical protection documents/practices.” (Site #3)

A final unanticipated benefit was interest in building an ongoing community around discussing the ethical quality and challenges of research going forward. Collaborators relayed that many participants were interested in reporting results of the pilot back to their communities and requested regular communication with groups that may form as a result of the training.

“I exchanged contact information with one of the participants and met him again at a later event. While this professional networking may or may not amount to anything, it was good to extend our outreach and identify a potential, future collaborator.” (Site #2)

“We presented to the research team of a pilot grant award. They used the training to discuss project issues and implementation as appropriate during the training sections.” (Site #6)

These benefits might not have occurred had the program been delivered individually or solely online. The collaborative and concrete nature of research was reflected in the format of the program, and we hope the avenues of partnership and collaboration among academic researchers, community partners, and research administration will be sustained beyond the program itself.

Acknowledgments

This research was supported by the National Center for Research Resources, Grant UL1RR024986 (now the National Center for Advancing Translational Sciences, Grant UL1TR000433), and by CTSA supplemental support entitled “Development of a nationally implementable, locally deliverable human research participants training workshop for community-based researchers, collaborators and staff.” This research was also supported in part by the National Center for Advancing Translational Sciences of the National Institutes of Health, Grant UL1TR000448 (Washington University in St. Louis). The content of this paper is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

We gratefully acknowledge the important contributions of our collaborators at sites throughout the CTSA consortium that were involved in the piloting of this training along with their community partners.

Notes

  1. Details about the content of the training and the results of participant evaluations are presented in the article “Piloting a nationally disseminated, interactive human subjects protection program for community partners: Design, content and evaluation” in this issue.

  2. Most collaborators also served as facilitators of the training, but all received the materials and worked to deliver them locally.
