
Keywords:

  • competency-based education;
  • curriculum development;
  • graduate education;
  • clinical and translational science

Abstract

  1. Top of page
  2. Abstract
  3. Introduction
  4. Background
  5. Developing a CBE Framework
  6. Conclusion
  7. Acknowledgments
  8. References

In the emerging field of clinical and translational science (CTS), where researchers use both basic and clinical science research methodologies to move discoveries to clinical practice, establishing standards of competence is essential for preparing physician–scientists for the profession and for defining the field. The diversity of skills needed to execute quality research within the field of CTS has heightened the importance of an educational process that requires learners to demonstrate competence. Particularly within the more applied clinical science disciplines, where research is conducted through a multi- or interdisciplinary approach, defining and articulating the unique role and associated competencies of the physician–scientist is necessary. This paper describes a systematic process for developing a competency-based educational framework within a CTS graduate program at one institution.


Introduction


In the emerging field of clinical and translational science (CTS), where researchers use both basic and clinical science research methodologies to move discoveries to clinical practice, establishing standards of competence is essential for preparing physician–scientists for the profession and for defining the field.[1, 2] To produce quality research and maintain the public's trust in the research process, it is vital that CTS research be conducted to the highest ethical and intellectual standards. As part of their education, physician–scientists are expected to develop knowledge and skills in research methods from an array of disciplines including epidemiology, public health, biostatistics, bioethics, and measurement. This, combined with the diversity of skills, from data manipulation to grant writing, needed to execute quality research within the field, has heightened the importance of an educational process that requires learners to demonstrate competence. Developing and evaluating competence within the CTS field has become not only an educational need but an ethical imperative as well. This paper describes the systematic process used to create a competency-based education (CBE) curricular framework within one CTS graduate program.

Background


Competence refers to a professional's suitability for a discipline and is focused on the knowledge, skills, and attitudes of the professional.[3] Achieving competence is developmental and requires a gradual progression toward the integration of knowledge, skills, and attitudes.[4, 5] Competency-based education (CBE) is a learning paradigm focused on describing and measuring what learners need to know and be able to do (outcomes), given the goals and mission of the program.[4, 6] In this light, competencies define the knowledge, skills, and attitudes needed to function successfully within the discipline.[4, 6]

Originally steeped in the disciplines of education and psychology, CBE has been implemented across several disciplines that strive to define the “requisite knowledge, skills, and attitudes necessary for professional functioning.”[4, 7] CBE is not a new concept; it has its roots in the educational reform movements of the 1960s and 1970s that called for a structured system of accountability in which behavioral outcomes of learners were clearly defined.[4] Drawing on CBE's fundamental call for continuous assessment and monitoring of a learner's progress, graduate programs can develop a system of accountability demonstrating that their graduates are qualified to perform their professional responsibilities.[8] This type of accountability system fulfills the educational commitment between institutions of higher learning and the public, helping to ensure that the philosophy and goals of the program are aligned with the needs of the broader community.[4] It enables CTS graduate programs to develop curricula, design experiential research opportunities, assess learning, and track the development of learners in order to document the competence of their graduates.

Developing a CBE Framework


The Clinical and Translational Science Institute (CTSI) offers a comprehensive physician–scientist training experience for learners enrolled in a certificate or degree-granting program. Learners from a variety of educational backgrounds (e.g., medicine, dentistry, rehabilitation sciences, public health), ranging from medical students to established professionals, are enrolled in the programs. In 2009, the education core was awarded a 2-year supplemental grant to develop and implement a CBE framework for the CTSI's certificate and degree-granting programs. To implement this type of educational program, three steps were necessary. First, the competencies that outline the knowledge, skills, and attitudes of a CTS researcher were defined and operationalized (curriculum development). Next, because the CTSI had existing educational programs, the existing course objectives were aligned with the new curricular objectives (curriculum alignment). Finally, instructors were made aware of the instructional and assessment methods that would support a CBE framework (curricular support).

Prior to beginning curriculum development, several comprehensive literature reviews focused on CBE-related domains were conducted. Understanding the history of CBE and how CBE was implemented in disciplines already using the paradigm (e.g., medicine, dentistry, psychology, engineering) helped inform a similar structure within CTS. In addition, it was necessary to review the course requirements of other CTSIs and research-based graduate programs in order to develop program-specific curricular models.

Curriculum development

Beginning with the end goal was fundamental to outlining the competencies that would underlie the curriculum. For instance, when one envisions a clinical and translational scientist, what type of person comes to mind? Ideally, this is an individual with the requisite knowledge and skills needed to function optimally within the profession. The complexity of this end product, however, necessitated the development of a hierarchical competency structure[9] in which the lowest level operationally defined the competencies in measurable ways, translating the competencies into a functional curriculum (Figure 1).

Figure 1. (A) Theoretical planning model. (B) Theoretical competency model.

Starting with the ideal end goal defined above, two broad competency clusters were delineated (Figure 2). The requisite knowledge and skills that define the field (Foundational Skills) form one broad competency cluster of the CTS researcher. The other competency cluster outlines the skills a researcher needs to function as a professional within the field (Functional Skills). The next lower level in the hierarchical structure is the competency domain. Competency domains begin to define the range of knowledge, skills, and attitudes of a CTS researcher by specifying fundamental aspects of the discipline. For Foundational Skills, the CTS researcher relies on applied knowledge and skills in data collection (Research Design) and data use (Data Analysis). For Functional Skills, the competency domains not only outline communication and norms within the profession (Professional Skills) but also include the additional skills deemed necessary to work in a multi- or interdisciplinary context (Leadership and Teamwork).

Figure 2. Domain-specific competency framework for the Master's program.

It is important to note that at this level, the competency structure could equally apply to an undergraduate or graduate program, or to a master's or doctoral program at the graduate level. The structure is therefore too broad to serve as a curriculum. In fact, all educational programs at the CTSI share the same competency clusters of Foundational Skills and Functional Skills and the same competency domains of Research Design, Data Analysis, Professional Skills, and Leadership and Teamwork. It is at the specific competency level, the next level down, that the expectations of the certificate and degree-granting programs begin to be differentiated.

Most specific competencies, however, are still almost identical regardless of program. For instance: Research Design encompasses Problem Formulation, Methodology, Sampling, and Measurement; Data Analysis encompasses Data Management and Bioinformatics and Applied Analytic Techniques; and Professional Skills encompasses Oral Communication, Written Communication, and Ethics and Professional Norms. The specific competencies defining Leadership and Teamwork, however, differ by program. The certificate program requires only that learners develop teamwork skills. The master's level program also includes the development of teamwork skills but adds the specific competency of project management, on the premise that at this level the learner should be working as part of a research team and developing management skills during that time. The doctoral program's expectations for this specific competency, by contrast, are leadership and teaching: a doctoral learner is expected to develop the skills necessary to lead a research project and to develop skills as an instructor in order to further the discipline.
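The hierarchy just described (competency clusters at the top, competency domains beneath them, and specific competencies that vary by program at the bottom) can be sketched as a nested mapping. The sketch is illustrative only: the names come from the text and Figure 2, but the data layout itself is an assumption, not the CTSI's actual format.

```python
# Illustrative sketch of the hierarchical competency structure described in
# the text (competency cluster -> competency domain -> specific competency).
# Names come from the text and Figure 2; the dict layout is an assumption.
FRAMEWORK = {
    "Foundational Skills": {
        "Research Design": ["Problem Formulation", "Methodology",
                            "Sampling", "Measurement"],
        "Data Analysis": ["Data Management and Bioinformatics",
                          "Applied Analytic Techniques"],
    },
    "Functional Skills": {
        "Professional Skills": ["Oral Communication", "Written Communication",
                                "Ethics and Professional Norms"],
        # The one domain whose specific competencies differ by program:
        "Leadership and Teamwork": {
            "certificate": ["Teamwork"],
            "masters": ["Teamwork", "Project Management"],
            "doctoral": ["Leadership", "Teaching"],
        },
    },
}


def specific_competencies(program):
    """Flatten the hierarchy into one program's specific competencies."""
    out = []
    for domains in FRAMEWORK.values():
        for comps in domains.values():
            # Program-dependent domains are nested one level deeper.
            out.extend(comps[program] if isinstance(comps, dict) else comps)
    return out
```

For example, `specific_competencies("masters")` includes Project Management while the certificate list does not, mirroring the differentiation described above.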

All specific competencies are operationalized through associated learning objectives that specify how the competencies are to be measured. Learning objectives that were too broad to be measurable were further defined by behavioral objectives, which then served as the measurable outcomes. Additionally, it is the learning objectives that define levels of mastery, with the certificate, master's, and doctoral degrees requiring progressively greater levels of mastery in the objectives associated with each specific competency. Defining learning objectives for achieving mastery by specifying the amount of foundational knowledge (breadth) and specific knowledge (depth) outlines the process of moving from novice to expert as competence is achieved. For instance, while a master's level learner might be expected to identify the elements and rationale involved in constructing data security plans, a doctoral learner would be expected to actually construct a feasible data security plan. When developing learning objectives, several philosophical questions were considered:

  1. Since everything cannot be accomplished in the finite period of the degree program, what is a reasonable amount of content and experience?
  2. Is there a minimum amount of knowledge that learners should know about all methodologies?
  3. At what point is it appropriate for learners to focus only on one methodology of interest?

Additionally, learning objectives were written in such a way as to accommodate the diversity of specialization within the field of CTS. Writing a separate objective for each specialization would be overwhelming. For instance, consider the following two objectives: “Learners will evaluate the inclusion and exclusion criteria for a clinical trial and implications for recruitment and generalizability” versus “Learners will identify appropriate study populations, control groups, and comparison groups for research problems.” Writing the objective in the second way allows the first objective to fit within it; thus, each specialization could fit within the same curricular structure. This was particularly important to the CTSI because it offers four specialization tracks. Occasionally an objective was too specific to a particular area of specialization and deemed not appropriate for all learners, such as writing behavior modification plans or using laboratory-based technologies to conduct basic science research. Therefore, track-specific curricula were developed for each track despite the substantial overlap among them.

An additional consideration when writing objectives is the desired scope of the program. For instance, programs could decide that their graduates need to understand a variety of research methodologies regardless of their specialization. Alternately, programs could expect their graduates to understand research methodologies within their area of specialization only. Of course, there are many gradations between these two extremes.

One final set of competencies was delineated: Ethics and Cultural Sensitivity. These competencies were considered to be of such high importance and of such a different nature that they were labeled Super-Ordinate Competencies.[4] These super-ordinate competencies permeate the entire educational structure and are interwoven throughout the learning objectives at all levels of mastery, just as they are integrated into all aspects of CTS research. For instance, within the specific competency of Measurement, learners are asked to identify the need to adapt measuring instruments based on the cross-cultural diversity of research participants.

Curriculum alignment

Because the CTSI has an existing program, the next step involved mapping the new CBE objectives to existing courses. Additionally, the National Center for Research Resources (NCRR) offers a comprehensive outline of the field as currently defined, and the curriculum should overlap with this document as much as possible. While the alignment with the NCRR document was a straightforward matching exercise, the curriculum alignment was more complicated. To complete it, four aspects of each course were catalogued: level of content coverage, instructional method used, level of assessment coverage, and assessment method used. This was done through a comprehensive review of course materials including videotaped lectures, lecture slides, and assessment materials. The project team created a matrix to document content and assessment coverage, as well as the methods used, in order to visually reveal gaps in coverage (Figure 3). From an instructional perspective, questions to consider regarding the comprehensiveness of the curriculum are:

  1. Where do learners have an opportunity to demonstrate knowledge and practice skills?
  2. Is there sufficient overlap in content between courses to allow reinforcement of content and practice of skills in different contexts?
  3. Do learners have enough opportunities to practice the integration of knowledge and skills?
  4. What teaching strategies are used to deliver content? Are these appropriate methods to foster the development of competence?
  5. What assessment strategies are used? Are these appropriate methods to capture the complexity of competence?

Figure 3. Curriculum alignment matrix.
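The alignment matrix of Figure 3 can be thought of as a grid with curricular objectives on one axis and courses on the other, where each cell records whether the course covers the objective in content and in assessment. The following is a minimal sketch of how such a matrix reveals coverage gaps; the course names, objective names, and flag encoding are hypothetical placeholders, not the CTSI's actual matrix.

```python
# Minimal sketch of an alignment matrix. Each course maps an objective to a
# string of flags: "C" = covered in content, "A" = covered in assessment.
# Course and objective names here are hypothetical placeholders.
coverage = {
    "Research Methods I": {
        "formulate a research question": "CA",
        "construct a data security plan": "C",   # taught but never assessed
    },
    "Biostatistics I": {
        "select an appropriate analytic technique": "CA",
    },
}

objectives = {
    "formulate a research question",
    "construct a data security plan",
    "select an appropriate analytic technique",
    "prepare a grant proposal",                  # covered by no course
}


def find_gaps(coverage, objectives):
    """Return (content gaps, assessment gaps): objectives no course covers."""
    content_gaps, assessment_gaps = set(objectives), set(objectives)
    for course in coverage.values():
        for objective, flags in course.items():
            if "C" in flags:
                content_gaps.discard(objective)
            if "A" in flags:
                assessment_gaps.discard(objective)
    return content_gaps, assessment_gaps
```

Objectives left in either set after the scan are exactly the gaps the matrix is meant to make visible.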

The alignment matrix for each course was discussed with the course instructor, and the percent agreement between the project team and the instructor was calculated. The percent agreement exercise was important because it revealed how instructors saw the elements of their course and whether the project team perceived those elements in the same way. Reports were generated that outlined recommended additions to content and assessment strategies for closing the identified gaps. Recommendations also included teaching and assessment techniques better suited to the new curricular framework. In addition, the alignment matrix was designed to serve as a discussion point for program directors and faculty to evaluate their program with respect to the first three questions outlined earlier.
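The percent agreement between the project team's coding of a course and the instructor's can be computed as simple proportion agreement over the matrix cells. This is a sketch under that assumption; the paper does not specify the exact formula used.

```python
# A sketch of the percent agreement calculation, assuming simple proportion
# agreement over matrix cells; the paper does not give the exact formula.
# Each coding maps a (course, objective) cell to the coverage recorded there.
def percent_agreement(team_coding, instructor_coding):
    """Percentage of cells coded identically by the team and the instructor."""
    cells = set(team_coding) | set(instructor_coding)
    if not cells:
        return 0.0
    matches = sum(team_coding.get(c) == instructor_coding.get(c) for c in cells)
    return 100.0 * matches / len(cells)
```

A cell coded by one party but not the other counts as a disagreement, which is one reasonable convention; a chance-corrected statistic such as Cohen's kappa would be a natural refinement.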

Curricular support materials

The final step was the compilation of teaching and assessment resources for faculty. Instructional methods were chosen for their potential to be interactive and engaging and to encourage self-directed learning. The project team wrote information sheets on several instructional methods, including guided practice, questioning, motivation, adult learning theory, simulation, case study, and small groups. Each information sheet contained a description of the method, its strengths and weaknesses, and best practices for its use.

Assessment is a critical aspect of CBE, providing evidence of competence and establishing that learning has been achieved at the desired standard of performance.[10] Because of the nature of CBE, assessment methods that allow for the integration of knowledge, skills, and attitudes and have the potential to mimic problems encountered in everyday situations were given priority for inclusion. The assessment toolbox features educational modules on general aspects of assessment and on specific assessment methods such as tests, performance assessments, peer assessment, and self-assessment. In addition, scoring rubrics, designed for the types of assignments typically used, were included in the assessment toolbox to allow for immediate use by instructors.
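A scoring rubric of the kind included in the toolbox can be represented as a set of criteria, each with leveled point values; scoring a learner's work is then a lookup and a sum. The criteria, levels, and point values below are hypothetical placeholders, not the CTSI's actual rubrics.

```python
# Hypothetical scoring rubric: each criterion maps a performance level to
# points. Criteria, levels, and point values are illustrative placeholders.
RUBRIC = {
    "problem formulation":   {"novice": 1, "competent": 2, "proficient": 3},
    "analytic technique":    {"novice": 1, "competent": 2, "proficient": 3},
    "written communication": {"novice": 1, "competent": 2, "proficient": 3},
}


def rubric_score(ratings):
    """Total score for one learner, given a rated level per criterion."""
    return sum(RUBRIC[criterion][level] for criterion, level in ratings.items())
```

Encoding a rubric this way makes the standard of performance explicit and lets the same instrument be reused across assignments and instructors.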

Conclusion


The goal of this portion of the project was to develop a CBE curricular framework for the CTSI while contributing to the national discussion about the field of CTS by attempting to define and outline the unique characteristics of the discipline. Drawing on the course and degree requirements of other CTSIs and research-based graduate programs, a curriculum was developed. This curriculum supports CBE in that it specifies what learners should know and be able to do upon graduation. While this step was critical, the resulting curriculum document does not in itself constitute a CBE program; to become one, the curriculum must be implemented. Implementation is the more difficult step. It requires that faculty examine their teaching and assessment critically and determine whether their instructional and evaluation practices support the desired outcomes. For some instructors, this may mean doing things differently than in the past. Implementation also requires that program directors advocate for the changes that must take place and support their instructors as they make the changes necessary to complement the structure. Optimally, all instruction and assessment should support a continuum of experiences that moves the learner toward the outcomes specified in the curriculum. This requires that all participants understand the continuum, where they fit in it, and what they need to do to support it so that all learners achieve competence.

Acknowledgments


This publication was made possible by Grant Number 3 UL1 RR024153-04S2 from the NCRR, a component of the National Institutes of Health (NIH) and NIH Roadmap for Medical Research. Its contents are solely the responsibility of the authors and do not necessarily represent the official view of the NCRR or NIH.

References

  1. Teo A. The development of clinical research training: past history and current trends in the United States. Acad Med. 2009; 84: 433–438.
  2. Forrest CB, Martin DP, Holve E, Millman A. Health services research doctoral core competencies. BMC Health Serv Res. 2009; 9: 107–111.
  3. Mentkowski M & Associates. Learning that Lasts: Integrating Learning, Development, and Performance in College and Beyond. San Francisco: Jossey-Bass; 2000.
  4. Rubin NJ, Leigh IW, Nelson PD, Smith IL, Bebeau M, Lichtenberg JW. The competency movement within psychology: an historical perspective. Professional Psychol: Res Pract. 2007; 38: 452–462.
  5. Zane TW. Domain definition: the foundation of competency assessment. Assessment Update. 2008; 20: 3–4.
  6. Lichtenberg JW, Bebeau MJ, Nelson PD, Smith IL, Portnoy SM, Leigh IW. Challenges to the assessment of competence and competencies. Professional Psychol: Res Pract. 2007; 38: 474–478.
  7. Bourg EF, Bent RJ, McHolland JD, Stricker G. Standards and evaluation in the accreditation and training of professional psychologists: the National Council of Schools of Professional Psychology Mission Bay Conference. Am Psychologist. 1989; 44: 66–72.
  8. Kaslow NJ, Bebeau MJ, Lichtenberg JW, Portnoy SM, Rubin NJ, Leigh IW. Guiding principles and recommendations for the assessment of competence. Professional Psychol: Res Pract. 2007; 38: 441–451.
  9. Whiddett S, Hollyforde S. A Practical Guide to Competencies: How to Enhance Individual and Organisational Performance. 2nd edn. London: CIPD Publishing; 2003.
  10. Searle J. Defining competency: the role of standard setting. Med Educ. 2000; 34: 363–366.