Abstract


Medical Education 2012: 46: 409–416

Context  Video-based observational practice can extend simulation-based learning outside the training space. This study explores the value of collaborative feedback provided during observational practice to the acquisition of clinical skills.

Methods  Nursing students viewed a video demonstrating the proper ventrogluteal injection technique before performing a videotaped pre-test trial on a simulator. They were then assigned randomly to one of three observational practice groups: a group that observed the expert demonstration (EO group); a group that viewed the expert demonstration, self-assessed their individual pre-test and contrasted their self-assessments with expert feedback (ESO group); and a group that observed the expert demonstration, self-assessed and contrasted their assessments with those of an expert, and formed a community that engaged in peer-to-peer feedback (ESPO group). The observation of all videos, the provision of assessments and all networking occurred via an Internet-mediated network. After 2 weeks, participants returned for post-tests and transfer tests.

Results  The pre-test–post-test analyses revealed significant interactions (global rating scale: F(2,22) = 4.00 [p = 0.033]; checklist: F(2,22) = 4.31 [p = 0.026]), which indicated that post-test performance in the ESPO group was significantly better than pre-test performance. The transfer analyses revealed main effects for both the global rating scale (F(2,23) = 6.73; p = 0.005) and validated checklist (F(2,23) = 7.04; p = 0.004) measures. Participants in the ESPO group performed better on the transfer test than those in the EO group.

Conclusions  The results suggest that video-based observational practice can be effective in extending simulation-based learning, but its effectiveness is mediated by the amount of time the learner spends engaged in the practice and the type of learning activities the learner performs in the observational practice environment. We speculate that increasing collaborative interactivity supports observational learning by increasing the extent to which the educational environment can accommodate learners’ specific needs.


Introduction


Simulation-based education in the health professions involves replicating a task-dependent event for the purposes of training and assessment. It provides learners with a comprehensive practice environment in which to make exploratory efforts, receive feedback and consolidate knowledge, all without incurring the normal costs of errors. Because of its intuitive appeal to the health professions educator concerned with developing student expertise while maintaining patient safety, the simulation approach has been embraced widely. Consequently, institutions have dedicated considerable resources to outfitting clinical training spaces with standardised patient programmes, high-fidelity manikins, virtual reality systems, part-task trainers and hybrid simulators.1 This shift to a simulation approach demands that health professions education researchers explore new methods of teaching with the hope of optimising the learning return on these investments.

Video-based observational practice is one way to extend simulation-based learning beyond the spatial and scheduling confines of the simulation laboratory.2 This form of practice operates on the premise that the functional architecture of the human central nervous system changes with experience. With respect to movement, the idea is that our central nervous system is organised to include internal representations of action. These representations are fundamental to the way we perceive and act within our environment, and when we practise a skill they are refined to incorporate specific information about the timing, magnitude and motor impulse combinations that are required to achieve a desired outcome.3,4 Interestingly, neuroimaging research has shown that some of the same neural reorganisation that occurs when we perform a skill also occurs when we observe that skill being performed.5,6 As such, observation has been theorised as an educational augment to physical skill practice and many studies have produced results in support of its positive learning impact.7–10 Given that modern simulation laboratories are typically outfitted with video-recording equipment at each skill practice station, it seems that an infrastructure is already in place for clinical trainees to reap the well-documented learning benefits associated with video-based, observational practice. However, video-based observational practice is not common in health professions education.

We see two barriers that interact to contribute to the reluctance to implement video-based observational practice into health professions education curricula. Firstly, simply watching video performances is not enough to achieve all the benefits of observational practice. Rather, like all feedback, the optimal delivery of observational practice videos is largely dependent on the learner’s current level of performance.11 As such, effective video-based observational practice requires that instructors provide learner-specific feedback and challenges throughout the learning process. Thus, simply providing trainees with video demonstrations to watch is not sufficient. The second barrier arises when one considers administering video-based observational practice in a setting in which instructors arrange individual times to watch video footage of skill performances with students. In this context the instructor can provide the learner with the specific feedback necessary for his or her development, but this one-to-one interaction is extremely time-consuming and resource-intensive for instructors and course administrators and is untenable in most educational situations.

Modern Internet environments may provide an effective way to circumvent these barriers and create an efficient means for providing video-based observational practice to clinical trainees outside the simulation-based training laboratory. Today, the Internet provides an easy and accessible way for individuals to browse, view and add to an astonishingly vast library of video-recordings in a manner that allows for remote and asynchronous collaboration among multiple users. The current study explores how manipulating the level of feedback delivered to trainees impacts the learning benefits they garner from observing video-based simulation laboratory performances via a collaborative Internet-mediated educational environment. To do so, we enlisted nursing students interested in learning the skill of ventrogluteal injection.12 These students participated in a laboratory-based learning session before they were assigned to groups that differed with respect to the level of collaboration they were permitted with experts and peers in a custom-developed, online, observational practice educational network. The collaborative levels served to create three groups, each of which had progressively more access to learning stage-specific feedback that might augment observational practice.

Methods


Participants

Twenty-six nursing students from the Lawrence S Bloomberg Faculty of Nursing at the University of Toronto participated in this study. All the participants had already received formal institutional training for all injection sites within the 12 months prior to the start of the study. This research was conducted in accordance with the Declaration of Helsinki (1964) and with the requirements of the University of Toronto Research Ethics Board. All participants provided informed consent prior to participating and were compensated with Can$15.00.

Simulation-based learning exercise

In order to standardise a functional level of competence,10,13,14 each participant began by viewing an instructional video that demonstrated the proper technique for ventrogluteal injection. Following the video, each participant performed a warm-up ventrogluteal injection on an inanimate, bench-top simulator (Laerdal Medical Canada Ltd, Toronto, ON, Canada) in a realistic clinical setting. The warm-up trial was permitted to ensure that the participants gained some experience with the simulator prior to the first experimental trial. In this way, we can be confident that the means associated with the participants’ pre-test performances are indicative of their initial level of expertise rather than the process of their becoming familiar with the novel features of the simulator (i.e. the thickness of the simulated skin).

Pre-test

Immediately following the warm-up trial, participants performed another ventrogluteal injection upon the inanimate simulator. This second performance was video-recorded and is referred to as the pre-test trial. No feedback was provided to the participants during or following either the warm-up or pre-test performances.

Intervention

Following the pre-test, each participant was assigned randomly to one of three experimental observational practice groups. The members of these groups were released to the community for a 14-day intervention period in which they were instructed to interact within an educational networking site until they had completed their group-specific responsibilities, and then as much as they liked after that. None of the participants physically performed the ventrogluteal injection, either in the simulation laboratory or as part of any clinical duty, during this period.

All three groups used the educational networking site for their observational practice and were differentiated by the level of collaborative interaction they were permitted within the Internet-mediated environment. However, members of each group were not privy to the networking content of participants assigned to the other groups. All collaborative activity was controlled by an administrator at the system’s server level. All participants were sent standard e-mails every 2 days throughout the intervention period to remind them of their involvement in the study and the responsibilities associated with their particular group assignment.

The expert observation group (EO group, n = 6) accessed the educational networking environment and viewed the instructional video from the introductory learning session. This level of interaction allowed learners to observe repeatedly an expert performance of the skill; in essence, it represented a model demonstration.

The expert and self observation group (ESO group, n = 10) accessed the educational networking environment to view the instructional video. In addition, each member of this group observed his or her pre-test performance, and assessed that performance using a standard checklist and global rating scale (GRS) for the ventrogluteal injection. When each member of the group had completed a self-assessment, he or she received a checklist and GRS assessment of the pre-test performance from an expert. In this way, each member of this group was provided with an expert model demonstration, a viewing of his or her own pre-test performance, an opportunity to assess that performance, and explicit didactic information from an expert regarding that performance.

The members of the expert, self and peer observation group (ESPO group, n = 10) viewed the expert performance and their own performances, and were provided with self-assessment and expert feedback in the same manner as the ESO group. However, each member of the ESPO group also interacted within the network to provide asynchronous checklist and GRS assessments of the performances of other group members. Thus, each member of this group was able to observe an expert model demonstration and his or her own pre-test performance, receive self-assessment feedback and explicit feedback from an expert and peers on his or her own performance, and observe and assess the pre-test performances of other group members.

Post-test

Following the intervention period, participants returned to the simulation laboratory and took part in a videotaped trial of the ventrogluteal injection skill on the bench-top simulator.

Transfer test

Immediately after the post-test trial, participants’ knowledge of the ventrogluteal injection skill was tested under conditions of greater contextual fidelity15 and increased attentional demand. To accomplish this, an actor was employed as a friendly but inquisitive parent during a second video-recorded attempt at performing this skill. The participant was informed before this trial that the simulator represented an adolescent and that the simulated patient’s parent would be present during the injection. The actor followed a script in which she asked questions about the materials, sterile technique and expected pain sensation during specific events in the procedure (i.e. removing the needle packaging, landmarking and cleaning the area, needle insertion). Figure 1 shows the intervention design.


Figure 1.  The study design


Analysis

Two independent reviewers assessed each of the videotaped performances using the validated standard checklist and GRS (Table 1). The summed total score for each assessment served as a dependent measure for each video performance. Inter-rater reliability was established using intraclass correlations. The GRS and checklist measures were compared in independent three-group (EO, ESO, ESPO) by two-test (pre-test, post-test) analyses of variance (ANOVAs). The checklist and GRS scores associated with the transfer test assessments were subjected to a one-way ANOVA with group as the only factor. Effects significant at an alpha set at p < 0.05 were further analysed using Tukey’s Honestly Significant Difference post hoc methodology.

Table 1.  The global rating and checklist scales used to evaluate ventrogluteal injection performance

Global rating scale (each item scored 1–5; anchors given for scores of 1, 3 and 5)

  Respect for tissue: 1 = Frequently used unnecessary force on tissue or caused damage by inappropriate use of instruments; 3 = Careful handling of tissue but occasionally caused inadvertent damage; 5 = Consistently handled tissues appropriately with minimal damage
  Instrument handling: 1 = Repeatedly made tentative, inappropriate or awkward moves with instruments; 3 = Competent use of instruments but occasionally appeared stiff or awkward; 5 = Fluid movements with instruments and no stiffness or awkwardness
  Time and motion: 1 = Many unnecessary moves; 3 = Efficient time/motion but some unnecessary moves; 5 = Clear economy of movement and maximum efficiency
  Flow of operation: 1 = Frequently stopped operating and seemed unsure of next move; 3 = Demonstrated some forward planning with reasonable progression of procedure; 5 = Obviously planned course of operation with effortless flow from one move to the next
  Overall performance: 1 = Very poor; 3 = Competent; 5 = Clearly superior

Checklist scale (each item scored 1–7; anchors range from ‘Below expectations for clinical duty’ through ‘Borderline for clinical duty’ and ‘Meets expectations for clinical duty’ to ‘Above expectations for clinical duty’)

  Introduction/establish rapport
  Explanation of intervention including patient’s consent to proceed
  Assessment of patient’s needs before procedure
  Preparation for procedure
  Technical performance of procedure
  Maintenance of asepsis
  Awareness of patient’s needs during procedure
  Closure of the procedure including explanation of follow-up care
  Clinical safety
  Professionalism
  Overall ability
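The transfer-test analysis described above (a one-way ANOVA with group as the only factor) can be sketched in a small, self-contained example. The function below computes the F statistic from first principles; the group scores are hypothetical illustrations, not the study’s data, although the group sizes (6, 10 and 10) reproduce the study’s degrees of freedom.

```python
# Minimal pure-Python one-way ANOVA, as applied to the transfer-test scores.
# The group scores below are hypothetical illustrations, not study data.

def one_way_anova(groups):
    """Return (F, df_between, df_within) for a one-way ANOVA."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand_mean = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: weighted squared deviation of group means
    ss_between = sum(len(g) * ((sum(g) / len(g)) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares: squared deviations from each group's mean
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    df_between, df_within = k - 1, n - k
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, df_between, df_within

# Hypothetical summed GRS transfer scores for the three groups
eo = [14, 15, 13, 16, 14, 15]                    # n = 6
eso = [17, 18, 16, 19, 17, 18, 16, 17, 18, 17]   # n = 10
espo = [20, 21, 19, 22, 20, 21, 19, 20, 21, 20]  # n = 10

f_stat, df_b, df_w = one_way_anova([eo, eso, espo])
print(f"F({df_b},{df_w}) = {f_stat:.2f}")  # degrees of freedom match the reported F(2,23)
```

A significant F would then be followed by pairwise comparisons (the study used Tukey’s Honestly Significant Difference), which require the studentised range distribution and are omitted here for brevity.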

Participants also provided estimates of the time they had spent engaged in the observational practice exercises during the 2-week intervention period.

Results


Inter-rater reliability

Intraclass correlations between the two reviewers revealed a significant positive relationship between the scores generated by the two reviewers (r = 0.42; p < 0.01; Cronbach’s α = 0.970).
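For illustration, two common rater-agreement statistics can be computed directly in a few lines. This is a minimal sketch, not the study’s analysis pipeline: it shows a Pearson correlation between two raters’ scores and Cronbach’s α treating each rater as an ‘item’; the rating vectors are hypothetical, not the study’s data.

```python
# Two illustrative agreement statistics for a pair of raters.
# The rating vectors below are hypothetical examples, not the study's data.

def pearson_r(x, y):
    """Pearson correlation between two raters' scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def cronbach_alpha(raters):
    """Cronbach's alpha treating each rater as an 'item' (population variances)."""
    k = len(raters)

    def var(v):
        m = sum(v) / len(v)
        return sum((x - m) ** 2 for x in v) / len(v)

    totals = [sum(scores) for scores in zip(*raters)]
    return (k / (k - 1)) * (1 - sum(var(r) for r in raters) / var(totals))

rater1 = [14, 18, 21, 15, 19, 22, 16, 20]  # hypothetical summed GRS scores
rater2 = [15, 17, 22, 14, 20, 21, 17, 19]
print(round(pearson_r(rater1, rater2), 3))
print(round(cronbach_alpha([rater1, rater2]), 3))
```

A full intraclass correlation (e.g. a two-way random-effects ICC) additionally partitions rater and subject variance, but these two statistics convey the same basic idea of between-rater consistency.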

Pre-test and post-test

The analysis of the global rating scale dependent measure revealed a significant group by test interaction (F(2,22) = 4.00; p = 0.033). Post hoc comparison of this interaction indicated that all groups were equivalent at pre-test. Although injections trended towards being better at post-test than pre-test in all groups, only the ESPO group achieved significant differences between their post- and pre-test performances. The post-test performances of the ESPO group were rated as significantly better than the post-test performances of the ESO and EO groups.

Similarly, the checklist score analysis revealed a group × test interaction (F(2,22) = 4.31; p = 0.026) in which the pre-test performances of all groups were equivalent and the post-test performances of the ESPO group were rated as significantly better than all the performances compared (Fig. 2). Collectively, these results indicate that the group permitted the most collaborative interactivity within the online educational network derived the most learning benefit from the observational practice.


Figure 2.  (a) Global rating scale scores (mean ± standard error [SE]) and (b) checklist assessment data (mean ± SE) plotted as a function of group (EO = expert observation; ESO = expert and self observation; ESPO = expert, self and peer observation) for the pre-test, post-test and transfer test


Transfer test

The analysis of the GRS assessments of the transfer performances revealed a significant main effect (F(2,23) = 6.73; p = 0.005). Post hoc analysis indicated that the ESPO group performed better than the EO group. The ESO group’s performance was intermediary to, but not significantly different from, those of the other two groups. The checklist assessment analysis revealed a significant main effect (F(2,23) = 7.04; p = 0.004), whereby both the ESPO and ESO groups outperformed the EO group (Fig. 2). Again, these findings suggest that increasing interactivity with the educational network has a positive benefit on learning and, in particular, allows learners to apply knowledge acquired to new, attention-demanding situations.

Estimated observational practice time

Individuals in the EO group reported spending a mean of 42.5 ± 17.3 minutes engaged in observational practice activity. Individuals in the ESO group estimated they had spent a mean of 53.5 ± 8.5 minutes engaged in observational practice. Learners in the ESPO group reported an estimated mean of 82 ± 12.6 minutes engaged in observational activity. Because these values represent participant estimates (i.e. time spent in practice was not measured rigidly), they were not subjected to statistical analysis.

Discussion


The present study assessed the ventrogluteal injection performances of nursing students before and after they interacted within a customised educational network. This design was used to highlight the benefits that interactivity within an educational networking environment can have on the acquisition of procedural skills through observation. To this end, the present findings suggest that increasing the amount of interaction within an observational practice environment may have a positive effect on the learning benefits to be garnered from observational practice. However, we must first recognise that the conditions of the present educational manipulation demanded that individuals in the more interactive groups (the ESO and ESPO groups) spent more time engaged in learning activities within the observational practice environment. Specifically, members of the ESPO group reported spending 82 minutes interacting within the observational learning environment over the 2-week intervention period, whereas participants in the ESO and EO groups spent 53.5 minutes and 42.5 minutes, respectively. Given the established impact of the amount of time a learner engages in deliberate practice on skill acquisition,16 we cannot deny that this is likely to have factored into the ESPO group’s improved post-test performance. However, we are compelled to acknowledge that the types of activities performed by the three groups may also have contributed to the differences displayed in the post-test and transfer performances.

Most succinctly, the independent variable manipulated in this study can be expressed as the level of observational feedback provided to the learners: the EO group was able to observe passively an expert demonstration video, whereas the ESO and ESPO groups received additional feedback from expert reviewers, and ESPO group members were also provided with opportunities to discuss their performances and exchange feedback with peers. Importantly, we cannot deny that one or all of these particular modes of feedback may contribute uniquely to the explanation of the study’s results. However, we maintain that educational interventions are often complex and, although the results leave us to speculate about the particular features of the intervention most responsible for the outcomes, we feel confident that organising our experimental groups with respect to the possibilities for feedback afforded by the learners’ collaborative engagement within the network reflects most clearly the ecological possibilities associated with self-directed and group-based learning.

We consider that the learning advantages highlighted by those who engaged in observational practice embedded within collaborative activities might also be described as a function of the learning processes targeted by their condition-specific practice.11,17 We speculate that the scaffolding nature of the observational practice conditions permitted those privy to more interactivity to engage in feedback, practice activities and challenges in a manner that could be flexibly tailored to each participant’s specific level of learning.11 More specifically, the primary benefits of observing correct skill demonstrations and receiving explicit directions come very early in learning, when learners are acquiring the basic timing and movement patterns of the skill.18 However, as learners progress, they need to be directed to the skill’s most critical elements. As such, their video-based observational practice benefits from straightforward didactic information that directs their attention to the most essential cues for success.19,20 Once learners have achieved a level of expertise through which they can develop connections between specific techniques and movement outcomes, observational practice is most effective when it engages them in exercises that take advantage of the inherent variability of the human neuromuscular system. Because of this variability, error detection and correction exercises boost learning by offering an avenue for consolidating information about the costs associated with different types of error into their representations of action. Such error information helps performers learn how to strategise movements against worst-case outcomes, coordinate the inputs from various feedback sources, and identify actions that are off-course early in their trajectory.4 Thus, watching flawed performances can be more beneficial to the advanced learner than watching flawless performances.21 These ideas are all very relevant to the learners compared here. 
Although each participant may have acquired the basic timing and movement patterns from the observational practice activity that constituted the initial instructional session, only the ESO and ESPO groups received direct feedback from an expert regarding the skill’s most critical elements. Furthermore, only the ESPO learners engaged in the process of observing and assessing the flawed performances of their peers.

Importantly, the present findings support further inquiry into learning through models of observational practice and educational networking.22,23 The idea that a collaborative Internet environment can help learners to shape their own programmes,24 become better self-assessors25 and engage in error-centric exercises while also communicating with other learners, experts and important sources of information20 is extremely promising. If trainees can use online educational networking to enhance observational practice, construct knowledge together and develop communal expertise,21,23 a new and widely accessible way to augment simulation-based practice is close at hand. However, we aim to subject these ideas to further empirical tests of learning with the hope of identifying the particular features of collaborative observational practice that are most influential in skill acquisition. By holding practice times and network interactivity constant while manipulating the amounts and types of feedback activities systematically, we can examine the many facets of collaborative education, including the relative impacts of peer and expert feedback,26,27 the value of observing errorful performances,21 interprofessional arrangements,28 and the relative benefits of effective or ineffective self-assessment.29 Furthermore, future research on collaborative observational practice may also address its application to the range of skills – from highly complex precision manual skills to the psychosocial skills associated with professionalism and communication – that are important to health professionals.

In the interim, we hope that the present evidence encourages curriculum developers to consider how observational practice modalities can increase the learning return from simulation teaching laboratory investments. Because collaborative Internet-based environments can be used by students in remote locations without direct expert presence, they are extremely flexible with regard to availability and are easily tailored to a variety of individual learning and teaching styles. The asynchronicity and modular operation of collaborative networks make them easy to implement; they integrate readily with other online systems and are often familiar to the current generation of health professions students.30,31

Contributors:  LEMG designed and performed the experiments, analysed the data and wrote the manuscript. MB supervised the data collection. BK contributed to the design of the collaborative networking tool. HC contributed to the experimental design of the research. AD planned and supervised all features of the project. All authors contributed to the critical revision of the paper and approved the final manuscript for publication.

Acknowledgments


Acknowledgements:  the authors would like to acknowledge Shawn Meng, Lawrence S. Bloomberg Faculty of Nursing, University of Toronto, Toronto, Ontario, Canada, for his work in developing and managing the educational networking environment.

Funding:  this work was generously supported by the Network for Excellence in Simulation for Clinical Teaching and Learning, Toronto, Ontario, Canada.

Conflicts of interest:  none.

Ethical approval:  this study was approved by the University of Toronto Research Ethics Board.

References

  1. Reznick RK, MacRae H. Teaching surgical skills – changes in the wind. N Engl J Med 2006;355:2664–9.
  2. Liebermann DG, Katz L, Hughes MD, Bartlett RM, McClements J, Franks IM. Advances in the application of information technology to sport performance. J Sports Sci 2002;20:755–69.
  3. Miall RC, Wolpert DM. Forward models for physiological motor control. Neural Networks 1996;9:1265–79.
  4. Elliott D, Hansen S, Grierson LEM, Lyons J, Bennett SJ, Hayes SJ. Goal-directed aiming: two components but multiple processes. Psychol Bull 2010;136:1023–44.
  5. Rizzolatti G. The mirror neuron system and its function in humans. Anat Embryol 2005;210:419–21.
  6. Iacoboni M, Dapretto M. The mirror neuron system and the consequences of its dysfunction. Nat Rev Neurosci 2006;7:942–51.
  7. Carr MM, Hewitt J, Scardamalia M, Reznick RK. Internet-based otolaryngology case discussions for medical students. J Otolaryngol 2002;31:197–201.
  8. Greenhalgh T. Computer-assisted learning in undergraduate medical education. BMJ 2001;322:40–4.
  9. Rogers DA, Regehr G, Howdieshell TR, Yeh KA, Palm E. The impact of external feedback on computer-assisted learning for surgical technical skill training. Am J Surg 1998;179:341–3.
  10. Xeroulis GJ, Park J, Moulton CA, Reznick RK, LeBlanc V, Dubrowski A. Teaching suturing and knot-tying skills to medical students: a randomised controlled study comparing computer-based video instruction and (concurrent and summary) expert feedback. Surgery 2007;141:442–9.
  11. Guadagnoli MA, Lee TD. Challenge point: a framework for conceptualising the effects of various practice conditions in motor learning. J Mot Behav 2004;36:212–24.
  12. Potter P, Perry A, Ross-Kerr J, Wood M, eds. Canadian Fundamentals of Nursing, 4th edn. Toronto, ON: Mosby Elsevier 2009;744–55.
  13. Moulton CAE, Dubrowski A, MacRae H, Graham B, Grober E, Reznick R. Teaching surgical skills: what kind of practice makes perfect? A randomised, controlled trial. Ann Surg 2007;244:400–9.
  14. Brydges R, Carnahan H, Backstein D, Dubrowski A. Application of motor learning principles to complex surgical tasks: searching for the optimal practice schedule. J Mot Behav 2007;39:40–8.
  15. Kneebone R. Evaluating clinical simulations for learning procedural skills: a theory-based approach. Acad Med 2005;80:549–53.
  16. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med 2004;79:70–81.
  17. Wulf G, Shea C, Lewthwaite R. Motor skill learning and performance: a review of influential factors. Med Educ 2010;44:75–84.
  18. Bandura A. Recycling misconceptions of perceived self-efficacy. Cognit Ther Res 1984;8:231–55.
  19. Boyce BA, Markos NJ, Jenkins DW, Loftus JR. How should feedback be delivered? J Phys Educ Recreation Dance 1996;67:18–22.
  20. Darden GF. Videotape feedback for student learning and performance: a learning stages approach. J Phys Educ Recreation Dance 1999;70:40–62.
  21. McCullagh P, Meyer KN. Learning versus correct models: influence of model type on the learning of a free-weight squat lift. Res Q Exerc Sport 1997;68:56–61.
  22. De Laat M, Lally V. Complexity, theory and praxis: researching collaborative learning and tutoring processes in a networked learning community. Instr Sci 2003;31:7–39.
  23. Veldhuis-Diermanse AE, Biemans HJA, Mulder M, Mahdizadeh H. Analysing learning processes and quality of knowledge construction in networked learning. J Agr Educ Ext 2006;12:41–57.
  24. Chiviacowsky S, Wulf G. Self-controlled feedback is effective if it is based on the learner’s performance. Res Q Exerc Sport 2005;76:42–8.
  25. Bund A, Wiemeyer J. Self-controlled learning of a complex motor skill: effects of the learners’ preferences on performance and self-efficacy. J Hum Movement Stud 2004;47:215–36.
  26. Topping KJ. Trends in peer learning. Educ Psychol 2005;25:631–45.
  27. Magill RA, ed. Motor Learning and Control: Concepts and Applications, 7th edn. New York, NY: McGraw Hill 2001;248–304.
  28. Zwarenstein M, Reeves S, Barr H, Hammick M, Koppel I, Atkins J. Interprofessional education: effects on professional practice and health care outcomes. Cochrane Database Syst Rev 2001;1:CD002213.
  29. Eva KW, Regehr G. Self-assessment in the health professions: a reformulation and research agenda. Acad Med 2005;80 (Suppl):46–54.
  30. Mangold K. Educating a new generation: teaching baby boomer faculty about millennial students. Nurse Educ 2007;32:21–3.
  31. Villeneuve M, MacDonald J. Toward 2020: Visions for Nursing. Ottawa, ON: Canadian Nurses Association 2006.