• Open Access

Faculty Perceptions of How Community-Engaged Research Is Valued in Tenure, Promotion, and Retention Decisions

Authors

  • Kathleen M. Nokes Ph.D., R.N., F.A.A.N.,

    Corresponding author
    • Hunter College, CUNY, Hunter-Bellevue School of Nursing, Community Core Co-Director, Weill Cornell CTSA, New York, New York, USA
  • David A. Nelson Ph.D., M.S.,

    1. Department of Family and Community Medicine, Clinical and Translational Science Institute of Southeast Wisconsin, Milwaukee, Wisconsin, USA
  • Mary Anne McDonald Dr.P.H., M.A.,

    1. Department of Community and Family Medicine, Division of Community Health, Duke University, Durham, North Carolina, USA
  • Karen Hacker M.D., M.P.H.,

    1. Harvard Medical School and Harvard School of Public Health, Director Safety Net Initiative, Harvard Catalyst, Boston, Massachusetts, USA
    2. Institute for Community Health, Cambridge Health Alliance, Cambridge, Massachusetts, USA
  • Jacquelyn Gosse M.A.,

    1. External Collaborations Coordinator, Mayo Clinic, Rochester, Minnesota, USA
  • Becky Sanford A.A.,

    1. Community Engagement, Mayo Clinic, Rochester, Minnesota, USA
  • Shannon Opel M.P.H.

    1. Clinical and Translational Science Institute of Southeast Wisconsin, Milwaukee, WI, USA

Correspondence: KM Nokes (knokes@hunter.cuny.edu)

Abstract

Purpose

We assessed the perceptions of community core faculty in academic medical center institutions that received Clinical and Translational Science Awards (CTSA) about how these institutions consider community-engaged scholarship (CES) when tenure, promotion, and retention decisions are made.

Method

An assessment tool was adapted to create an 18-item survey that was sent during November and December 2011 via the Internet to the 369 members of the community-engagement core mailing list of the CTSA.

Results

Fifty-nine responses were received, representing 37 of the 60 funded institutions. The mean score was 48.14 (SD = 11.18; range 23–74), and Cronbach's alpha was 0.91. About half reported that support for CES and its inclusion in the academic decision process had increased since the institution was awarded a CTSA. Open-ended responses indicated some confusion with terminology, although a definition of CES had been provided in the instrument instructions.

Conclusion

Respondents overall agreed that there was moderate support for CES in tenure, promotion, and retention decisions, which may have been influenced by the CTSA application requirements. This survey could be used to identify differences across institutions and departments and to measure changes over time.

Introduction

Launched in 2006 by the National Institutes of Health's National Center for Research Resources (NCRR), the Clinical and Translational Science Awards (CTSA) program creates academic homes for clinical and translational science at academic medical center research institutions.[1] By 2011, 60 institutions across the United States had been funded to achieve a variety of goals including: “enhancing public trust by reaching out to underserved populations, including racially and ethnically diverse groups as well as those living in rural and inner city areas”.[2]

The phrase “from bench to bedside to curbside” has been used to describe the variety of research areas, outreach, and dissemination that are supported through the CTSA mechanism.[3] The current increase in interest in community engagement in research (CEnR) is partly a response to demands from community leaders, policymakers, and funders to support community involvement in addressing health issues facing communities,[4] particularly as they relate to the elimination of health disparities. CEnR encourages researchers to work collaboratively with a variety of nonacademic partners, including government agencies, community organizations, local nonprofits, and other traditional and nontraditional community partners, to ensure sustainable accomplishments.[5] Despite growing interest in CEnR and recognition of its benefits, there are unique research challenges, including motivating faculty members[6, 7] to work in partnership with communities.

There is increasing interest among researchers and healthcare providers in CEnR, but most academic institutions have yet to invest in the infrastructure needed to ensure that faculty members have the training, skills, and resources to conduct this type of research effectively.[8, 9] Even when faculty are interested in engaging communities in research, faculty roles and rewards, along with the release time necessary to create, maintain, and sustain equitable partnerships, present formidable challenges.[10, 11] Developing trusting relationships with communities is not only an iterative process but also one that can take years, an important consideration when the tenure “clock” is ticking. Practices and policies that support academic investigators interested in CEnR, such as recognizing this work in the faculty tenure and promotion processes, need to be identified and addressed.[12, 13]

Despite growing support for CEnR from national bodies and funders, the majority of academic institutions have not kept pace in recognizing the work faculty do with communities.[14] Despite recommendations to recognize CEnR, the realities of how promotion and tenure work at institutions make generating support for this work a challenge.[15] Medical school faculties and deans of faculty affairs have struggled with and questioned the evaluation criteria used in the promotion and retention of medical school faculty.[14, 16] Policies are needed that support and recognize the value of community-engaged scholarship (CES), achieved through CEnR,[6] when faculty tenure, promotion, and retention decisions are made. The current state of faculty tenure and promotion policies with respect to recognition or support of CES is not well understood, and infrastructure change may be needed to develop improved methods to evaluate promotion and tenure practices.[17] The literature demonstrates that faculty who conduct CEnR desire greater recognition and support from their institutions.[6, 18, 19] While much of the literature focuses on academic barriers and challenges to CEnR, what is absent is a focused assessment of faculty perceptions of how institutions value and reward CES when tenure, promotion, and retention decisions are being made.[20]

Academic institutions funded through the CTSA mechanism include a community core that promotes the principles of CEnR. Our work was conducted as part of a cross-institutional collaboration among CTSA members of the Education, Scholarship, and Engagement Work Group of the Community Engagement Key Function Committee (CE KFC) of the CTSA Consortium at Hunter College/Weill Cornell, Duke School of Medicine, Harvard University, Mayo Clinic, and the Medical College of Wisconsin. We surveyed faculty members at CTSA-funded institutions to learn their perceptions of current policies related to promotion and tenure and CEnR. In addition, we built on the institutional self-assessment work developed by Community-Campus Partnerships for Health (CCPH) to develop a valid and reliable instrument that could be completed online by different constituencies interested in assessing perceptions of the value of CEnR in promotion, retention, and tenure decisions in academic settings.[20, 21] The purpose of this survey research was to assess the perceptions of CTSA community core faculty about how CTSA-funded institutions consider CES when tenure, promotion, and retention decisions are being made.

Methods

Setting and sample

Within the CTSA Consortium, there are 14 thematic groups called Key Function Committees (KFCs) that “serve as a venue for developing and sharing best practices and ideas between members to promote the CTSA consortium aims. KFC membership is composed of CTSA institution representatives and federal staff.”[2] For example, there are KFCs for Informatics, Clinical Research Ethics, Evaluation, and Community Engagement. The purpose of the CTSA Community Engagement KFC is “to implement a successful broad plan of community and practice engagement among the CTSA sites by sharing knowledge, expertise, and resources.”[22] We sent the survey to the Community Engagement KFC e-mail distribution list after removing the email addresses of federal government members.

Instrument

We developed a cross-sectional, confidential, Web-based survey to be completed by eligible CTSA faculty members. A review of the literature and extensive conversation within the national CTSA structures yielded one quantitative assessment that had been used, in part, to identify the capacity and support of higher education institutions for CEnR and CES. Although originally designed by Community-Campus Partnerships for Health (CCPH) as a self-assessment and evaluation tool to be used by groups of faculty and administrators at academic institutions, the questions were appropriate for our use as well.[14, 23] The instrument was divided into dimensions covering different aspects of conducting CEnR within academic institutions, with four levels articulated to represent increasing commitment to CEnR and CES. We drew our items from Dimension VI (CES) and used all 12 items from level 4, which indicates the greatest commitment to CEnR. The Executive Director of CCPH agreed that we could use Dimension VI as a foundation for further development of our survey instrument (Seifer, personal correspondence, 2010). Some of the original items asked about two distinct areas, so we separated those concepts into two items, resulting in a total of 18 items. To illustrate, Item 6.11 stated: “Community partners are regularly invited to participate in the review, tenure, or promotion processes in ways that go beyond writing letters of support (e.g., serving on a faculty review committee). In practice, these community partner contributions to the process are seriously considered and valued.” This was changed to two items: “Community partners are regularly invited to participate in the review, tenure, or promotion processes beyond writing letters of support (e.g., serving on a faculty review committee)” and “In practice, community partner contributions to the review, tenure, or promotion processes are seriously considered and valued.” We added one additional CTSA-specific item: “Support for community-engaged scholarship and its inclusion in the review, tenure, and promotion process has increased since my institution was awarded a CTSA.”

We used a six-point Likert-type scale, with 5 indicating strongly agree and 0 indicating no basis to respond; higher scores reflected greater agreement with an item, and the possible total score ranged from 0 to 90. We included the “no basis to respond” option because we anticipated that some respondents would be unable to answer certain items; these responses were not included in further analysis. To ensure that all respondents were using the same framework and definitions, we placed these instructions at the top of each survey:

According to the National Review Board for the Scholarship of Engagement, the Scholarship of Engagement captures scholarship in the areas of teaching, research, and/or service. It engages faculty in academically relevant work that meets both the mission and goals of the college or university and community needs. Engagement is a scholarly agenda that incorporates communities' issues and can be within or integrative across teaching, research, and service. Community is broadly defined to include audiences external to the campus that are part of a collaborative process to contribute to the public good.[21] Community-engaged scholarship refers to the publications, presentations, and other products of community-engaged research. You are being asked to respond to questions about how different components of CES are valued in tenure, promotion, and retention decisions in your school, college, or institution.

We also developed a cover letter for the survey that explained its purpose, who was eligible to respond, that the survey had received IRB approval, and whom to contact with questions. The cover letter was reviewed by members of the Education, Scholarship and Engagement (ESE) Work Group of the Community Engagement KFC, revised, and then reviewed by the CTSA Strategic Goal Four Committee. A member of that committee, Dr. Linda Cottler, reviewed the instrument and suggested revisions. After further revisions, the cover letter and instrument were pilot tested with three faculty members at the Medical College of Wisconsin. After the pilot test, we made final revisions. Table 2 presents the final survey used in this research.

In addition to the survey, we asked three demographic questions (age, race/ethnicity, & gender), seven faculty-role-related questions (tenure status, if on tenure track, current academic rank, length of time at current rank, length of time at current institution, degrees held, name of school, university, or institution being considered when responding to the questions), and one question about whether they were currently conducting CES. We provided space for open-ended responses after each item and also at the end of the survey.

Data collection

The Promotion and Tenure Survey received IRB approval from each of the five institutions represented on the survey team. We collected data through a secure Web application, REDCap (Research Electronic Data Capture), supported by the REDCap Consortium, a group composed of 228 active institutional partners from CTSAs, General Clinical Research Centers, Research Centers in Minority Institutions, and other academic medical centers. Three hundred sixty-nine emails were sent; 10 were returned as undeliverable, leaving 359 distinct email addresses. Instructions asked that only faculty members complete the survey, since we believed that they would have interest in and insight into the tenure, promotion, and retention processes. We distributed the survey from the Mayo Clinic on November 14, 2011, and later that day CCPH, a nonprofit based at the University of Washington, sent the questionnaire to one of its interest groups: CCPH members who also worked in CEnR at CTSA institutions. This mailing was done independently, but because that list overlapped the group we wanted to reach and did not expand the sample, it was not a problem. The survey was open from November 14 through December 19, 2011, and four reminders were sent on November 21 and 28 and December 9 and 16, 2011. Completed surveys were returned to Mayo Clinic and, because the survey was confidential, email addresses were removed from the respondents' information by staff who were not members of the research team.

Data analysis

Data were exported from REDCap to an Excel database and then SPSS (SPSS Inc., Chicago, IL, USA) and SAS 9.2 (SAS Institute, Cary, NC, USA) for data cleaning and analysis.

We were unable to compute a return rate for the survey because we had asked that only faculty members respond and we did not know how many recipients were eligible. Instead, we used the number of returned surveys as the denominator for calculations. We computed a total score for each survey by adding all 18 items (possible range 0–90).
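As a minimal sketch of this scoring step (not the authors' code), the snippet below assumes the 18 responses are stored in pandas columns named item_1 through item_18, coded 5 (strongly agree) through 1 (strongly disagree), with 0 for “no basis to respond”; the column names and data layout are assumptions for illustration.

```python
# Hypothetical sketch of the scoring described above; column names are assumed.
import numpy as np
import pandas as pd

def total_score(df: pd.DataFrame) -> pd.Series:
    items = [f"item_{i}" for i in range(1, 19)]   # assumed names for the 18 survey items
    responses = df[items].replace(0, np.nan)      # treat "no basis to respond" (0) as excluded
    return responses.sum(axis=1, min_count=1)     # per-respondent total score
```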

Additional responses written in the space after each item and at the end of the survey were compiled as a text document and thematically analyzed using NVivo9 (version 9, QSR International, Burlington, MA, USA), a qualitative text analysis program. We identified themes from the text and coded them, enabling the range of themes, and the relationships among them, to be examined. This methodology, thematic analysis, is commonly used in qualitative research.[24, 25]

Findings

Of the 369 surveys sent to email addresses, we received 59 responses from individuals representing 37 different CTSA institutions (Figure 1). Table 1 shows the characteristics of the respondents, who were almost evenly divided between male and female, overwhelmingly white, 63% at the rank of full professor, and 66% tenured. For the 55 of the 59 respondents who included this information, the mean age was 54 years (SD = 8.7, range 31–68 years) and the mean time at their institution was 13.2 years (SD = 8.3, range 1.5–35 years). The overwhelming majority of respondents (92%, n = 54) reported that they conducted CEnR.

Figure 1. Survey distribution and return.

Table 1. Characteristics of the sample

Characteristic | Number | Percentage | Missing
Female | 28 | 49% | 2
Male | 29 | 51% | —
Race
Non-Caucasian | 9 | 15% | 0
Caucasian | 50 | 85% | —
Education
M.D. | 28 | 48% | 0
Ph.D. | 31 | 53% | —
Rank
Assistant Professor | 6 | 11% | 2
Associate Professor | 15 | 26% | —
Full Professor | 36 | 63% | —
Tenured
Yes | 31 | 66% | 12
No | 16 | 34% | —
Tenure Track
Yes | 22 | 65% | 25
No | 12 | 35% | —
Table 2. Survey table

Item | Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree | No basis to respond | Mean (SD)
1. In this institution, there is a formal, universally accepted definition for CES that is used consistently and is distinct from community engagement. | 0 | 7 (12%) | 8 (14%) | 28 (48%) | 14 (24%) | 2 (3%) | 2.14 (0.93)
2. Definitions of CES are used consistently to describe a variety of community-based teaching, research, and service activities. | 0 | 19 (32%) | 3 (5%) | 28 (48%) | 9 (15%) | — | 2.54 (1.10)
3. CES is recognized for all categories of appointments regardless of tenure and/or clinical, teaching, and/or practice emphasis. | 2 (3%) | 17 (29%) | 11 (19%) | 18 (31%) | 9 (15%) | 1 (2%) | 2.73 (1.15)
4. Almost all of the community-engaged faculty are in tenure or tenure track positions. | 1 (2%) | 8 (14%) | 8 (14%) | 26 (44%) | 12 (20%) | 4 (7%) | 2.27 (1.02)
5. There is a mix of seniority and rank among the community-engaged faculty. | 15 (25%) | 29 (49%) | 3 (5%) | 5 (9%) | 3 (5%) | 4 (7%) | 3.87 (1.08)
6. CES is substantially recognized during the review, tenure, or promotion process. | 3 (5%) | 14 (24%) | 9 (15%) | 18 (31%) | 9 (15%) | 5 (9%) | 2.69 (1.20)
7. CES is explicitly included in the review, tenure, and promotion policies and procedures. | — | 7 (12%) | 12 (20%) | 23 (39%) | 13 (22%) | 4 (7%) | 2.23 (0.96)
8. The president, chief academic officer, trustees, and deans visibly support CES as an integral form of scholarship and demonstrate this support through their words and their actions. | 2 (3%) | 17 (29%) | 15 (25%) | 17 (29%) | 6 (10%) | 2 (3%) | 2.85 (1.07)
9. Review, promotion, and tenure policies support and encourage dissemination of CES through multiple venues (peer-reviewed publications, books, media, reports, etc.). | 3 (5%) | 15 (25%) | 12 (22%) | 25 (42%) | 2 (3%) | 1 (2%) | 2.86 (1.01)
10. In practice, review, promotion, and tenure decisions value the dissemination of CES through multiple venues. | 3 (5%) | 15 (25%) | 8 (14%) | 27 (46%) | 1 (3%) | 4 (7%) | 2.81 (1.05)
11. The review, promotion, and tenure decisions actively support and encourage interdisciplinary scholarship. | 13 (22%) | 21 (36%) | 12 (22%) | 8 (14%) | 1 (2%) | 1 (2%) | 3.66 (1.04)
12. Published articles in journals that are interdisciplinary or outside of the faculty member's discipline are given at least equal weight to those published in disciplinary journals. | 4 (7%) | 23 (39%) | 7 (12%) | 17 (29%) | 5 (9%) | 2 (3%) | 3.07 (1.17)
13. The review, promotion, and tenure policies recognize and value funding of CES from a wide variety of sources (foundations, NIH, HRSA, etc.). | 10 (17%) | 24 (41%) | 13 (22%) | 6 (10%) | 3 (5%) | 2 (3%) | 3.57 (1.07)
14. In practice, faculty are recognized and valued for receiving funding from a wide variety of sources. | 14 (24%) | 19 (32%) | 9 (15%) | 14 (24%) | 2 (3%) | 1 (2%) | 3.50 (1.20)
15. There is mandatory training for members of review, promotion, and tenure committees to ensure a broad understanding of the definition, nature, documentation, and assessment of CES. | — | 2 (3%) | 2 (3%) | 22 (37%) | 25 (42%) | 8 (14%) | 1.62 (0.74)
16. Community partners are regularly invited to participate in the review, tenure, or promotion processes beyond writing letters of support (e.g., serving on a faculty review committee). | 0 | 2 (3%) | 3 (5%) | 17 (29%) | 33 (56%) | 4 (7%) | 1.52 (0.76)
17. In practice, community partner contributions to the review, tenure, or promotion processes are seriously considered and valued. | 1 (2%) | 3 (5%) | 7 (12%) | 14 (24%) | 24 (41%) | 9 (15%) | 1.83 (1.02)
18. Community impact of CES is valued and rewarded in the review, promotion, and tenure process with at least equal emphasis placed upon the local community impact as that placed on regional, national, and/or international impact. | 0 | 9 (16%) | 10 (17%) | 20 (34%) | 12 (20%) | 7 (12%) | 2.31 (1.02)

The item mean was imputed for all missing data values. The total mean score on the 18-item survey was 48.14 (SD = 11.18) with a range of 23–74, and the total alpha was computed as 0.91; Table 2 provides the responses and means for each item. About half of the respondents (mean = 3.2, SD = 1.11) reported that “Support for community-engaged scholarship and its inclusion in the review, tenure, and promotion process has increased since my institution was awarded a CTSA.” The Spearman correlation between the survey total score and this item was 0.35 (p = 0.0088).
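As an illustrative sketch only (not the authors' code), the following shows how item-mean imputation, Cronbach's alpha, and the Spearman correlation with the CTSA-specific item could be computed, assuming the same hypothetical item_1 through item_18 layout as above plus an assumed ctsa_item column.

```python
# Illustrative only; item_1..item_18 and ctsa_item are assumed column/series names.
import pandas as pd
from scipy.stats import spearmanr

def cronbach_alpha(items: pd.DataFrame) -> float:
    # alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def summarize(items: pd.DataFrame, ctsa_item: pd.Series):
    imputed = items.fillna(items.mean())      # impute the item mean for missing values
    totals = imputed.sum(axis=1)              # total score per respondent
    alpha = cronbach_alpha(imputed)
    rho, p = spearmanr(totals, ctsa_item, nan_policy="omit")
    return totals.mean(), totals.std(), alpha, rho, p
```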

The three items with the highest agreement were “There is a mix of seniority and rank among the community-engaged faculty” (3.87); “The review, promotion, and tenure policies recognize and value funding of community-engaged scholarship from a wide variety of sources (foundations, NIH, HRSA, etc.)” (3.57); and “In practice, faculty are recognized and valued for receiving funding from a wide variety of sources” (3.50). The three items with the least agreement were “Community partners are regularly invited to participate in the review, tenure, or promotion processes beyond writing letters of support (e.g., serving on a faculty review committee)” (1.52); “There is mandatory training for members of review, promotion, and tenure committees to ensure a broad understanding of the definition, nature, documentation, and assessment of community-engaged scholarship” (1.62); and “In practice, community partner contributions to the review, tenure, or promotion processes are seriously considered and valued” (1.83).

Because of the small sample size and nonnormally distributed data, nonparametric tests were applied. The association of each of the 18 survey questions with each of the promotion and tenure demographics was analyzed using the Wilcoxon rank-sum test (gender, race, tenure status), the Kruskal–Wallis test (current academic rank), and Spearman correlation analysis (age, time at institution). Running multiple statistical tests on the same data set increases the risk of false-positive findings; to reduce this risk, we applied a Bonferroni adjustment and lowered the significance level from 0.05 to 0.001, so that results were considered significant only if the p-value fell below 0.001. Median and range are usually reported with nonparametric tests (Wilcoxon rank-sum and Kruskal–Wallis); however, in this study the mean is more meaningful and helpful for describing differences between the groups.
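A minimal sketch of this testing strategy (not the authors' code) is shown below, using SciPy's Mann–Whitney U implementation of the Wilcoxon rank-sum test; the demographic column names are assumed for illustration.

```python
# Illustrative sketch; demographic column names and the item columns are assumed.
import pandas as pd
from scipy.stats import kruskal, mannwhitneyu, spearmanr

BONFERRONI_THRESHOLD = 0.001   # adjusted significance level described above

def item_pvalues(df: pd.DataFrame, item: str) -> dict:
    pvals = {}
    # Wilcoxon rank-sum (Mann-Whitney U) for the two-group demographics
    for var in ["gender", "race", "tenure_status"]:
        a, b = [grp[item].dropna() for _, grp in df.groupby(var)]
        pvals[var] = mannwhitneyu(a, b).pvalue
    # Kruskal-Wallis test across academic ranks
    ranks = [grp[item].dropna() for _, grp in df.groupby("academic_rank")]
    pvals["academic_rank"] = kruskal(*ranks).pvalue
    # Spearman correlations for the continuous demographics
    for var in ["age", "years_at_institution"]:
        pvals[var] = spearmanr(df[item], df[var], nan_policy="omit").pvalue
    return {var: (p, p < BONFERRONI_THRESHOLD) for var, p in pvals.items()}
```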

None of the p-values was below 0.001. Based on the available data, there were no differences by gender, race, tenure status, current academic rank, age, or time at institution. The power of these tests is low because of the relatively small sample size and the multiple-testing problem; a larger sample or fewer tests would have provided more power to detect differences in participant responses between groups. Factor analysis could not be conducted because of the small sample size.[26, 27]

Findings from qualitative data from open-ended responses

The open-ended responses, or comments, provided qualitative data that added another dimension to the survey data. Twenty-seven of the 59 respondents (45%) provided at least one comment, and some respondents commented on multiple items. For analysis, we compiled comments with the item to which they responded; summary comments from the end of the survey were compiled and labeled item 20. We read these data to identify preliminary themes, and additional themes emerged during the coding process. Because the majority of these short comments responded to specific questions, they were easily categorized into thematic groups. Three main themes emerged: 1) perceived barriers to the integration of CES into the tenure, promotion, and retention processes; 2) evidence of change in the tenure, promotion, and retention processes to integrate CES; and 3) persistence of historical tenure standards. Table 3 shows the main themes and supporting subthemes.

Table 3. Main themes and supporting coded data

Main theme | Supporting themes | Coded data groups
1. Perceived barriers | Confusion and misunderstanding re CES; lack of tenure system; CES not recognized across institution | R01-Not well defined, misunderstood; R03-No CES definition; R09-No tenure system at institution; R10-CES faculty not on tenure track or tenured, different policies in SPH, med school, arts, and sciences; R05-CES not recognized across institution
2. Evidence of change to integrate CES | CES well defined; institution addresses CES & review, promotion, and tenure (RPT) on policy level | R04-Have definition of CES; R11-CES recognized in RPT; R14-Admin visibly supports CES; R20-Support for CES since CTSA
3. Historic tenure standards | Publications: quality, impact, discipline; NIH funding valued most by RPT; tradition & existing policy | R08-Some CES activity counts for RPT, some not; R15-Role of interdisciplinary articles in RPT; R16-NIH funding valued most by RPT; R17-All extramural funding valued alike for RPT; R12 & R18-CES explicitly included, or not, in RPT policies and procedures; R13-Against exception for CES ("Research is Research")

Note: Two data nodes, R06-Question unclear to respondent and R21-Comments on survey, are not included in the table above.

The first theme, perceived barriers to the integration of CES into the tenure, promotion, and retention processes, is supported by three subthemes: a) lack of established tenure systems in most medical schools; b) confusion and misunderstanding about what CEnR and CES actually are; and c) CES not being recognized across the institution. The perceived barriers due to the lack of an established tenure system in medical schools are explained in the comment below.

Most medical schools are not set up with a promotion and tenure system that is applicable to academic work by nonphysicians. In order to change this system, the CTSA needs to start an initiative to change tenure and promotion systems at medical schools to include the work of nonphysicians. Without an initiative for change from the top, it will only happen in very small bits and pieces and may not be effective.

This situation is referenced in other comments. The second subtheme, confusion and misunderstanding about what CEnR and CES are, is supported by comments that CES is: not defined; not well defined and misunderstood; and the perception that CES is a liability for promotion. Representative comments include:

There is neither a CES definition nor any real understanding of community engagement except that it might be considered “service.”

Definitions are used for teaching, research, and service; but not consistently.

The lack of definitions of CES at most institutions may contribute to the perception that CES is problematic when applying for tenure, as this comment suggests:

To my knowledge, community-engaged scholarship is perhaps a liability in the promotion process, because it slows work down and may result in fewer publications. Publications, by the number, still reign supreme here.

Even if CES is defined consistently within a department or school, it is often not recognized across other departments or institutions within a CTSA-funded setting. For example:

Our CTSA is composed of several academic institutions. [It] would be extremely difficult to get all to agree to one definition.

Because of the variety of how disciplines work, there may be accepted terms specific to some departments that capture the essence of CES, such as public anthropology. There are also terms that are used more broadly that actually confuse the concept, such as service-learning.

These comments demonstrate the breadth of the perceived barriers to the explicit integration of CES into institutional tenure, promotion, and retention processes.

The second main theme, evidence of change in the tenure, promotion, and retention processes, showed the steps some institutions had taken to address the perceived barriers described above. The two subthemes are 1) CES is well defined and 2) the institution is addressing CES on a policy level. A few respondents' institutions do have established definitions, for example:

The definition our school uses is from the Kellogg Commission on CES in the Health Professions, the definition you are using above.

Respondents did not comment on how well understood the established definitions are. Several reported that their institutions were addressing CES and tenure, promotion, and retention decisions on a policy level, for example:

Community engagement and engaged scholarship are very strongly articulated in our new strategic plan which was approved (Month) 20XX.* The implementation is in its early stages but the President and Provost are speaking about community engagement regularly and very publicly.

Another reported that CES may be included in these policies and procedures but that:

This is dependent on the school/departmental guidelines. Some do (Government, Public Health, Dentistry, and fixed term Medicine). Others are working on it, along with other recommendations from the Task Force report, at the request of the Provost.

The third theme consists of comments on historical tenure standards and their relationship to CES and tenure, promotion, and retention policies. The subthemes are predictable: 1) publication number, quality, impact, and discipline; 2) NIH and other Federal funding valued most by institutional structures; and 3) tradition and existing policy. The survey included four questions (9, 10, 11, and 12) concerning how CES publications were valued, and the replies revealed:

The process is informally biased toward journals with large distribution and high impact factors. Journals with more specialized focus that are responsive to community engaged work are not viewed as highly as journals with a broader reputation.

The only dissemination recognized and counted in promotions is published articles in biomedical journals.

Items 13 and 14 concerned the funding source preferences of tenure committees. Responses to those items and others showed the high value placed on NIH and other Federal sources, although some institutions valued any extramural funding.

ONLY NIH or NSF funding is recognized.

If it were all NIH, everyone would be happy.

Extramural funding is considered valuable, regardless of the type of project.

The third subtheme, the role of tradition and existing policy, revealed two different approaches to the question of how to integrate CES into existing institutional systems. One approach viewed CES as already included in existing policy, or as becoming part of policy:

Interdisciplinary and translational research is explicitly recognized in our P&T policies. Community-engaged research certainly is part of that.

Policies don't distinguish community from other forms of scholarship.

Policies don't explicitly address CE scholarship. It is not precluded but not handled explicitly. “Case law” (i.e., experience promoting and tenuring [sic] folks who do this) is evolving slowly but successfully in this area.

Another approach, found in three comments, showed strong support for the traditional tenure policies and saw no need for any accommodation for CES in institutional policies.

I'd say that there is a disagreement between your implicitly recommended policies and procedures and the way we do it. You seem to want us to have a definition for CES that is separate from the definition of other kinds of scholarship. I, and my institution, disagree. Scholarship is well-defined at our institution. The same definition is applied whether the scholarship is community-engaged or laboratory-engaged or clinical-engaged. This works well for us.

Our tenure and promotion policies support scientific excellence regardless of research area. No one research area is named and so to insert CES would seem odd. While I was tenured and promoted with CES as part of my package, I did not call it such. I called it research and it was recognized as valid and important. I do not think it is recognized as a separate type of scholarship, and in our university, outreach or service cannot be the primary means for tenure.

These statements demonstrate a strong preference for the long-standing institutional traditions and policies and reject the need for modifications to integrate CES.

Limitations

We took an existing assessment tool and adapted it by separating some questions into two items to make each item more specific and to ensure understanding; however, some respondents felt that some items were redundant. Analysis of the qualitative data revealed respondents' questions about, and possible misunderstandings of, the survey questions. While the sample size was small, respondents represented a wide variety of CTSA-funded institutions with greatly differing policies and practices regarding CES and tenure, promotion, and retention. Institutions also worked with a wide range of partner institutions that may have had very different relationships with communities; these settings ranged from CTSAs focused on laboratory research to state colleges actively working with their agricultural extension services on CEnR, and their perspectives could be very different. Many respondents did not know whether we were asking about the entire institution or only the medical or other health professional school, so we do not know whether they answered for their department, school, the entire institution, or the entire CTSA.

Despite our inclusion of a definition of CES in the introduction to our survey, some of the responses did not demonstrate the same understanding of the meaning of CEnR and CES that we had provided. This was especially evident in questions concerning promotion and tenure as many respondents appeared to confuse CES with community outreach, service-learning, or general community work unrelated to scholarship products. Several responded that their institutions did not have a “community track” for tenure and that projects for students would not “count” for tenure. We did not mean to suggest that a “community track” was an option or that clinical work be counted as an academic product for tenure. The majority of our respondents (66%) had tenure; our survey did not collect data on whether CES had been included in the application for tenure.

Discussion

This research explores faculty views of how CES is evaluated for promotion, retention, and tenure among institutions with medical schools receiving CTSA funding. These findings should be viewed as part of a larger movement among medical schools to reassess the definition of scholarship and the system of faculty rewards, building on the work of Boyer,[28] Glassick,[29] and others.[16] The four areas of scholarship, as defined by Boyer and Glassick, are the scholarship of discovery, integration, application, and teaching. In 2000, a special issue of Academic Medicine examined these different forms of scholarship, and how they are valued, to “stimulate continuing discussions that will define equitable methods for the continued assessment of the scholarly accomplishments of medical school faculty” (p. 872).[16] Our research on faculty perceptions of how CES is valued within the faculty reward system fits within this larger discussion.

While the great majority of respondents reported being engaged in CES, which we saw as a positive, the qualitative data revealed confusion in the field about the definition of CES, making it difficult to draw conclusions about the value of CES in tenure, retention, and promotion decisions. For example, some respondents did not distinguish between service-learning, outreach, and community clinic work and the long-term, equitable community partnerships that characterize CEnR. There appears to be no consistent, agreed-upon definition of CES among the surveyed institutions. This lack of consensus about the meaning of CES terminology indicates a need for faculty and institutional education, especially as it relates to the promotion, retention, and tenure processes.

Several findings about item agreement are noteworthy. First, the high agreement with the item about the mix of senior and junior faculty engaged in CEnR is an unexpected finding, as we expected that most CEnR work was being done by junior researchers given its more recent acceptance in academic circles. The presence of senior faculty in this pursuit, however, suggests that there is opportunity for mentoring in most institutions, which would support the growth of CES investigators in future years. Second, the items with the lowest agreement concerned active engagement of community partners on university committees during the tenure, promotion, and retention deliberations. This is likely explained by the fact that the tenure and promotion process is private and confidential, and the inclusion of nonacademics in this process is rare in most institutions. Lastly, the association between the total survey score and the item about increased support since receipt of CTSA funding suggests that institutions receiving these awards are moving toward the NIH goal of integrating CEnR into the translational research process. Given the required focus on CEnR in the CTSA application, this is evidence of the increasing acceptance of CEnR.

The qualitative responses from the survey revealed two views on the incorporation of CES into the standards and processes of promotion, tenure, and retention. While many comments supported the need for institutional changes to specifically include elements of CES, others viewed their existing systems as adequately rewarding strong scholarship, a standard they believed could be applied across all types of research, including CES. Without further research to obtain the specific perspectives of CES investigators across different institutions, it is difficult to ascertain whether current policies are adequate. These two views represent the core positions that will be debated as faculties challenge institutions' tenure, promotion, and retention processes in the coming years: do current standards address and reward CES or not? For institutions that want to ensure the inclusion of CES in their definitions of scholarship and faculty reward systems, as challenged by Boyer, the work of the Faculty for an Engaged Campus initiative provides extensive resources.[14, 20, 21, 30]

Conclusion

Although there is a widespread belief that CES is not valued in academic settings, our findings indicate that faculty members at institutions with CTSA funding overall agreed that there was moderate support for CES in tenure, promotion, and retention decisions. We suspect that this perception has been greatly influenced by the inclusion of CEnR in the CTSA application requirements. Future researchers can use this tool to conduct institutional needs assessments, as the original assessment tool developed by CCPH was designed to do. The Carnegie Community Engagement Classification allows academic institutions that are engaged with their communities to elect this designation. This survey could be used to identify whether institutions holding that elective designation value CES differently from other settings. As CES grows in popularity, this survey provides an instrument to measure these perceptions and to compare departments and academic institutions over time.

Acknowledgments

Sarena Seifer, M.D. and Community-Campus Partnerships for Health (CCPH); Leslie Boone, M.P.H.; Linda Cottler, Ph.D., M.P.H.; Allison Heiser; Valerie Kokai; Lloyd Michener, M.D.; Sergio Aguilar-Glaxiola, M.D., Ph.D.; Donna Jo McCloskey, Ph.D.; Program Officer, National Center for Research Resources and members of the CTSA Consortium Strategic Goal Four Committee.

The CTSA Community Engagement Education, Scholarship and Engagement Workgroup wishes to thank CCPH for permission to use and edit the instrument.

The authors wish to thank Alison Heiser, the Project Manager for the CTSA Community Engagement Key Function Committee during 2011, for her help in the development of this research.

They thank and acknowledge Qun Xiang, M.S., and Jessica Pruszynski, Ph.D., for statistical consultation. They are supported, in part, by grant 1UL1RR031973 from the Clinical and Translational Science Award (CTSA) program of the National Center for Research Resources, National Institutes of Health.

Sources of Funding

This work was conducted with the support from the National Institutes of Health (NIH), the National Center for Research Resources, and the National Center for Advancing Translational Sciences at each of the following institutions:

  • Mayo Clinic and CTSA Award UL1 TR000135.
  • Harvard Catalyst – The Harvard Clinical and Translational Science Center, NIH Grant Award 8UL1TR000170–05 and financial contributions from Harvard University and its affiliated academic health care centers.
  • The Clinical and Translational Science Institute of Southeast Wisconsin, NIH Grant Award UL1RR031973 and by Advancing a Healthier Wisconsin Research and Education Initiative Fund, a component of Advancing a Healthier Wisconsin endowment at the Medical College of Wisconsin.
  • The Duke Translational Medicine Institute, NIH Grant Award 5UL1RR024128-05 and support from Duke University School of Medicine.
  • The Weill Cornell Medical College with funding from NIH Grant Award UL1 RR024996 and support from Hunter College, CUNY.

The content is solely the responsibility of the authors and does not necessarily represent the official views of the institutions listed above and their affiliated academic health care centers or the National Institutes of Health.

This project has been funded in whole or in part with Federal funds from the National Center for Research Resources and National Center for Advancing Translational Sciences (NCATS), National Institutes of Health, through the Clinical and Translational Science Awards Program (CTSA). The manuscript was approved by the CTSA Consortium Publications Committee.

  1. *Date deleted to protect confidentiality.
