
Keywords:

  • Scotland
  • educational measurement/methods
  • education, medical, undergraduate
  • students, medical/psychology
  • feedback
  • perception
  • teaching/methods
  • attitude of health personnel

Abstract


Objectives  The objectives of this study were to identify and analyse students’ attitudes to the portfolio assessment process over time.

Methods  A questionnaire containing statements and open questions was used to obtain feedback from students at the University of Dundee Medical School, Scotland. The responses to each statement were compared over 4 years (1999, 2000, 2002 and 2003).

Results  Response rates were 83% in 1999, 70% in 2000, 89% in 2002 and 88% in 2003. A major finding is that students perceived that portfolio building heightened their understanding of the exit learning outcomes and enabled reflection on their work. Student reactions to the portfolio process were initially negative, although they appreciated that senior staff took time to become familiar with their work through reviewing their portfolios. Student attitudes became more positive over the 4 years as the process evolved. Although portfolio assessment was recognised as supporting student learning, portfolio building was perceived to interfere with clinical learning as a result of the excessive amounts of paper evidence required.

Conclusions  Paperwork should be kept within manageable limits. A student induction process that highlights the importance of providing evidence for achieving all learning outcomes, not just theoretical knowledge and skills, may be helpful in allaying student concern over portfolio building and assessment and support preparation for lifelong learning and reflective clinical practice.


Introduction


The educational value of portfolios is accepted with regard to the promotion of student-centred learning,1 deep learning2 and reflective learning.3 The potential of portfolios to drive student learning in an educationally desirable direction and the importance of identifying individual strengths and weaknesses with regard to limits of competence as part of professional clinical practice may be some of the reasons for the current wave of enthusiasm for portfolios in the health care professions.4 The place of portfolios in a summative assessment scheme at undergraduate level, however, is still unclear, although portfolios are used to assess aspects of the curriculum such as personal and professional development5 and communication skills.6 There are concerns regarding portfolio assessment relating to its reliability,7,8 practicability9 and student acceptance of the process.10 However, O’Sullivan et al.11 showed that, in postgraduate training in psychiatry, portfolio assessment results achieved reliability levels appropriate for high-stakes assessment, and Driessen et al.12 found that reliability and validity could be achieved using a structured portfolio to assess reflective competence at undergraduate level.

The research questions addressed in this study were:

  • What were the attitudes of students at the University of Dundee Medical School to the introduction of a portfolio assessment process as a major component of their final examination?
  • Did student attitudes change over the first 4 years of the evolution of the portfolio assessment process?

A paper dealing with staff attitudes to the portfolio assessment process is in preparation. These articles will be of interest to those wishing to implement a portfolio assessment system.

Context


The Dundee Medical School curriculum and the student assessment process

The undergraduate medical programme at the University of Dundee, Scotland, UK, is a 5-year course with an integrated, systems-based curriculum.13 The framework for the curriculum design refers to 12 exit learning outcomes14 in a three-circle model, comprising:

  • the inner circle, representing the tasks of the doctor, including: clinical skills; practical procedures; investigating a patient; patient management; health promotion and disease prevention; communication skills, and the handling and retrieval of information;
  • the middle circle, representing the approach to the tasks, including: understanding of the basic and clinical sciences and their underlying principles; appropriate attitudes, ethical stance and legal responsibilities, and appropriate decision making, clinical reasoning and judgement, and
  • the outer circle, representing the professionalism of the doctor, embracing: understanding of the doctor’s role in the health service and aptitude for personal development.

Portfolio building for Phase 3 (Years 4 and 5) began in 1997. The first portfolio assessment took place in 1999 and was part of the final examination, designed to assess the 12 exit learning outcomes. The final examination is in two parts. Part 1 takes place at the end of Year 4 and comprises a multiple-choice paper of extended matching items to examine knowledge and its application to clinical practice, a constructed response question paper to assess higher-order thinking (e.g. problem solving, critical analysis) and knowledge, and an objective structured clinical examination (OSCE) to assess clinical and communication skills. Only those students who pass each examination in Part 1 progress to Part 2, which takes place towards the end of Year 5. Part 2 comprises a portfolio assessment process, which is an exemption examination. Those who pass go on to graduate. Those who fail re-sit the portfolio and go through a diagnostic OSCE, both of which they must pass to graduate. Those who fail the re-sit portfolio and diagnostic OSCE repeat the final year. The steps in the portfolio assessment process were described by Davis et al.9 and Friedman Ben David et al.15

The students construct their portfolios during Years 4 and 5 by including course work submitted at pre-arranged times over the 2 years. The work is marked and graded and returned to the students with feedback. At least 15 staff grade each student’s work over the 2 years. The marking system gives the student a grade (A–G) for each relevant learning outcome of the course and includes constructive feedback. Students reflect on their learning and on the feedback returned to them, and are expected to provide evidence of subsequent learning in relation to any weaknesses identified. Student grades, reflection on the grades and learning, and remedial work are all included in the portfolio content. All students undertake the portfolio exemption assessment process, in which two examiners independently judge the student’s strengths and weaknesses in terms of the learning outcomes by reviewing the evidence contained in the portfolio. Then, in a structured interview session, they meet with the student to confirm or refute their initial judgements. A final grade is awarded to each student for every outcome. Students who receive a passing grade in every outcome are exempt from further assessment and have passed their final examinations. Two or more borderline (D) grades or any one grade below D results in non-exemption from Part 2 of final examinations. Non-exempt students are counselled and undertake appropriate remediation before resubmitting their enhanced portfolios for the final assessment.
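As a minimal illustration of the exemption rule described above, the sketch below encodes the decision logic in Python (the function name and grade encoding are hypothetical and do not form part of the School’s assessment software):

```python
# Hypothetical sketch of the exemption rule described above.
# Grades run A-G; D is borderline and E-G fall below the borderline.

BELOW_D = {"E", "F", "G"}

def is_exempt(outcome_grades):
    """True if the student is exempt from further assessment:
    fewer than two borderline (D) grades and no grade below D."""
    d_count = sum(1 for grade in outcome_grades if grade == "D")
    any_below_d = any(grade in BELOW_D for grade in outcome_grades)
    return d_count < 2 and not any_below_d

# One borderline grade across the 12 outcomes: still exempt
print(is_exempt(["A", "B", "C", "D"] + ["B"] * 8))  # True
# Two borderline grades: non-exempt; counselling and remediation follow
print(is_exempt(["D", "D"] + ["B"] * 10))           # False
```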

The portfolio content is shown in Table 1.

Table 1.  Portfolio content in 1999

Portfolio content | Number of items | Marked by
Completed PRHO learning plans and assessment forms: Medicine | 1 | Educational supervisor
Completed PRHO learning plans and assessment forms: Surgery | 1 | Educational supervisor
Clinical SSC assessment form | 1 | Course tutor
Practical procedures form: procedures competent to practise | 49 | Signed off by ward staff
Practical procedures form: procedures observed by the student | 13 | Not graded
Case discussions – students’ written essays relating curriculum outcomes to an individual patient seen by the student | 19 | Theme co-ordinator
Year 4 assignment – student project report, feedback and mark | 1 | Project supervisor
Theme-based SSC – assessment form following a 4-week course | 1 | Supervisor of individual course
General practice SSC – written report and assessment form | 1 | GP tutor
Elective assessment form from on-site supervisor | 1 | Individual on-site supervisor
Elective report (written report by student) | 1 | One of two elective co-ordinators in Dundee Medical School
Patient presentations: Year 4 | 48 | Not marked
Patient presentations: Year 5 | 8 | Not marked
Psychiatry examination mark (included in relation to interim year arrangements) | 1 | Marked by department
Obstetrics and gynaecology examination mark (included in relation to interim year arrangements) | 1 | Marked by department
Evidence of reflection presented using proforma | With every portfolio component | Not marked
Any additional material illustrating progress towards the curriculum outcomes in Phase 3 | Variable | Not marked

PRHO = pre-registration house officer; SSC = student-selected component

Context of study

The exit learning outcomes, portfolio content and portfolio assessment process all changed during the course of this study.

Changes in outcomes

In 1999, 11 exit learning outcomes were identified. In 2000, the outcome that related to ‘patient investigation and management’ was split to produce two outcomes: ‘patient investigation’ and ‘patient management’.

Changes to portfolio content

Changes were made to the portfolio content for each cohort of students on the basis of evaluation feedback from both staff and students. These are summarised in the first part of Table 2, together with the reason for each change.

Table 2.  Changes in portfolio content and process from 1999 to 2003

Year | Portfolio content | Change | Reasons for change
2000 | Clinical SSCs | The number increased from one to two | Move from interim arrangement
2000 | Practical procedures | Slight variation in the number of procedures occurred on an annual basis | Practice requirements
2000 | Theme SSCs | The number increased from one to two | Move from interim arrangement
2000 | Case discussions | The number decreased from 19 to 15 in Year 4 | Student evaluation feedback relating to overload
2000 | Patient presentations | Year 5 presentations were removed | Student evaluation feedback relating to overload
2000 | Analysis of each learning opportunity in terms of the 12 learning outcomes | Replaced reflection proforma | Proforma provided repetitive details; students tended to omit middle and outer circle outcomes when completing the proforma
2000 | Year 4 patient presentations | The number decreased to 10 | Student/staff evaluation feedback relating to overload
2000 | Anaesthetic workbook | A new addition | To enhance the importance of this 1-week course
2001 | Outcome summary sheet | Replaced analysis of each learning opportunity | Reduction of repetition
2002 | Year 5 progress test results | A new addition | In response to staff feedback and concerns regarding student levels of knowledge
2002 | Curriculum vitae | A new addition | To improve student career planning

Year | Portfolio process | Change | Reasons for change
1999 | Use of prompt questions by portfolio examiners in the structured interview | Questions designed to probe strengths and weaknesses of individual students, since 1999 | Prompt questions ignored by examiners because seen as too artificial
1999 | Two pairs of examiners interview each student for 25 minutes | One pair of examiners interviews each student for 40 minutes, since 2000 | Staff feedback indicated the time was too short for in-depth discussion

SSC = student-selected component
Changes to process

Changes were also made to the process and are summarised in the second part of Table 2.

Methods


Six sources of evidence were used to evaluate the portfolio assessment process: analysis of student results; observer documentation; examiner evaluation questionnaire; student evaluation questionnaire; verbal report from student representatives, and external examiner reports.

The first five of these six sources were reported by Davis et al.9 This article focuses on one of these sources, the student evaluation questionnaire, which was administered in 1999, 2000, 2002 and 2003.

The initial student questionnaire was developed using periodic feedback from students during their Phase 3 studies and student group discussions conducted by an external observer.9 The initial questionnaire underwent changes each year to reflect changes in portfolio content and the assessment process. The full questionnaire and responses for each year are available from the first author on request. The questionnaire items focused broadly on the process of building the portfolio and on the assessment process itself.

In each year, the questionnaire was distributed after all students had been informed of the outcome of the examination and the non-exempt candidates had received counselling. Students completed the questionnaire anonymously. Participation in the study was voluntary. No record of participation or non-participation was made. No individuals were identified in the study.

The questionnaire for all 4 years contained a series of statements and open questions. The students were asked to rate the statements using a 5-point Likert scale where 1 = strongly disagree and 5 = strongly agree. The responses were analysed using SPSS Version 10.1 (SPSS, Inc., Chicago, IL, USA). The responses for each statement for all years were compared using the Kruskal–Wallis test.
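For readers wishing to reproduce this kind of comparison, a minimal sketch is given below; it uses Python’s scipy rather than SPSS, and the ratings are invented for illustration:

```python
# Minimal sketch of the per-statement comparison across cohorts
# (invented ratings; the study itself used SPSS).
from scipy.stats import kruskal

# 5-point Likert ratings for one questionnaire statement, by cohort
ratings = {
    "1999": [2, 3, 3, 2, 4, 3, 2],
    "2000": [3, 4, 3, 4, 3, 4, 3],
    "2002": [4, 4, 3, 4, 5, 4, 3],
    "2003": [4, 3, 4, 4, 5, 4, 4],
}

h_statistic, p_value = kruskal(*ratings.values())
print(f"Kruskal-Wallis H = {h_statistic:.2f}, P = {p_value:.3f}")
```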

As there was no suitable method to estimate the effect size of non-parametric comparisons involving more than two variables, the ANOVA-based partial eta squared was calculated using SPSS to estimate the effect that ‘year of administration’ had on any change in student perceptions.
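A minimal sketch of that calculation, assuming a one-way ANOVA with year of administration as the single factor (statsmodels in place of SPSS; the data are invented):

```python
# Partial eta squared from a one-way ANOVA on 'year':
# eta_p^2 = SS_effect / (SS_effect + SS_error).
# Invented data; the study used SPSS.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({
    "rating": [2, 3, 3, 2, 3, 4, 3, 4, 4, 4, 3, 4, 4, 3, 4, 4],
    "year":   ["1999"] * 4 + ["2000"] * 4 + ["2002"] * 4 + ["2003"] * 4,
})

model = ols("rating ~ C(year)", data=df).fit()
table = sm.stats.anova_lm(model, typ=2)
ss_effect = table.loc["C(year)", "sum_sq"]
ss_error = table.loc["Residual", "sum_sq"]
print(f"partial eta squared = {ss_effect / (ss_effect + ss_error):.3f}")
```

With a single between-subjects factor, SS_total = SS_effect + SS_error, so the partial eta squared coincides with the ordinary eta squared.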

The closed statements were also subjected to principal component analysis. Only questionnaire items administered for ≥ 2 years were analysed as below this there were insufficient responses. Items that had correlation coefficients of ≥ 0.9 were discarded to eliminate multicollinearity and singularity (i.e. high or perfect correlations). The remaining items were subjected to principal component analysis with varimax rotation.
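The screening and rotation steps can be sketched as follows (illustrative only: the study used SPSS, whereas here the varimax rotation is implemented directly in numpy and the ratings are randomly generated):

```python
# Illustrative sketch of the item screening and rotated principal
# component analysis (the study used SPSS; data here are random).
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-6):
    """Classic varimax rotation of a p x k loading matrix."""
    p, k = loadings.shape
    rotation = np.eye(k)
    criterion = 0.0
    for _ in range(max_iter):
        rotated = loadings @ rotation
        u, s, vt = np.linalg.svd(
            loadings.T @ (rotated ** 3
                          - rotated @ np.diag((rotated ** 2).sum(0)) / p)
        )
        rotation = u @ vt
        criterion, old = s.sum(), criterion
        if old != 0 and criterion / old < 1 + tol:
            break
    return loadings @ rotation

rng = np.random.default_rng(0)
X = rng.integers(1, 6, size=(100, 20)).astype(float)  # fake Likert data

# Discard one item from any pair correlated at |r| >= 0.9
corr = np.corrcoef(X, rowvar=False)
keep = [i for i in range(corr.shape[0])
        if all(abs(corr[i, j]) < 0.9 for j in range(i))]
X = X[:, keep]

# Principal components from the correlation matrix, five retained
eigvals, eigvecs = np.linalg.eigh(np.corrcoef(X, rowvar=False))
top = np.argsort(eigvals)[::-1][:5]
loadings = eigvecs[:, top] * np.sqrt(eigvals[top])
print(varimax(loadings).shape)  # (items kept, 5)
```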

The responses to the open questions were read independently by the first and second authors. Themes were identified for each year and a list of themes common to all years was agreed by both authors. For the purposes of comparison between the years, responses which appeared with a frequency of > 10 within a single year or which appeared in all 4 years are reported here.
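A minimal sketch of this reporting rule, using invented counts and abbreviated theme names:

```python
# Sketch of the reporting rule for open-question themes: report a
# theme if it occurs > 10 times in any single year or appears in
# all 4 years. Counts here are invented.
YEARS = ("1999", "2000", "2002", "2003")

theme_counts = {
    "sense of achievement": {"1999": 15, "2000": 15, "2002": 22, "2003": 26},
    "too much paperwork":   {"1999": 58, "2000": 12, "2002": 17, "2003": 18},
    "one-off comment":      {"1999": 0,  "2000": 2,  "2002": 0,  "2003": 1},
}

reported = [
    theme for theme, counts in theme_counts.items()
    if max(counts[y] for y in YEARS) > 10
    or all(counts[y] > 0 for y in YEARS)
]
print(reported)  # ['sense of achievement', 'too much paperwork']
```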

Results


The questionnaire response rates were 83% (107/129) in 1999, 70% (102/146) in 2000, 89% (116/131) in 2002 and 88% (141/160) in 2003.

Principal component factor analysis extracted seven factors. Based on the eigenvalues and the scree plot, however, most items loaded to five main factors, which accounted for 52.5% of the total variance. The items were then classified according to this five-factor solution. Table 3 shows the five main factors that the factor analysis produced, together with the median (with means and standard deviations in brackets), P-value and effect size of each factor.

Table 3.  The median (mean ± standard deviation) for each factor in each year, and P-values and partial eta squared values for each factor compared over 4 years

Factor | 2003 | 2002 | 2000 | 1999 | P-value* | ηp²
1 Portfolio assessment | 4 (3.3 ± 0.4) | 4 (3.4 ± 0.4) | 3 (3.2 ± 0.5) | 3 (2.9 ± 0.4) | 0.04 | 0.05
2 Potentially contentious issues | 3 (3.1 ± 0.7) | 3 (3.2 ± 0.7) | 3 (3.1 ± 0.7) | 4 (3.5 ± 0.9) | 0.78 | 0.05
3 Portfolio content | 4 (3.6 ± 0.2) | 4 (3.4 ± 0.2) | 4 (3.4 ± 0.1) | 4 (3.3 ± 0.2) | 0.54 | 0.02
4 Achievement of curriculum outcomes | 3 (2.9 ± 0.2) | 3 (2.9 ± 0.2) | 3 (3.1 ± 0.1) | 2 (2.5 ± 0.5) | 0.06 | 0.05
5 Portfolio building | 4 (3.7 ± 0.2) | 4 (3.8 ± 0.2) | 4 (3.7 ± 0.2) | 3 (2.9 ± 0.3) | 0.01 | 0.12

* P ≤ 0.005; ηp² = partial eta squared

The main questionnaire items that fell into each factor are given below.

Portfolio assessment process

  • I felt my performance was judged against acceptable standards.
  • The portfolio examination identified my strengths and weaknesses.
  • Examiners’ expectations were clear.
  • The combination of an OSCE and written examination in fourth year and a portfolio examination in fifth year is an effective approach to assessing my competence to graduate.
  • The portfolio examination was well organised.
  • Portfolio examination allowed me to reflect on my Phase 3 work.
  • I appreciated senior staff taking time to become familiar with my work through reviewing my portfolio.
  • I have positive feelings towards the portfolio.

Potentially contentious issues

  • Building the portfolio interfered with my clinical learning.
  • I was petrified at the prospect of the examination.
  • I would have liked more freedom to select what went into the portfolio to demonstrate that I achieved the outcomes.
  • I would have liked more advance information about building the portfolio.
  • There was too much paperwork.
  • Different examiners applied different standards.
  • Portfolio examination should be introduced earlier in the course.

Portfolio content

  • My grades in the clinical student-selected component (SSC) represented my true ability.
  • My grades in the theme-based SSC represented my true ability.
  • What I wrote in my elective report represented my true ability.
  • My grades in the pre-registration house officer learning plan represented my true ability.

Achievement of curriculum outcomes

  • The 12 curriculum outcomes provide a good basis for deciding who should graduate.
  • My grades in the case discussions represented my true ability.
  • Building the portfolio helped me to achieve the 12 curriculum outcomes through my daily work.

Building the portfolio

  • Building the portfolio was a useful learning experience.
  • Building the portfolio gave me a sense of achievement.
  • Building the portfolio heightened my understanding of the 12 outcomes.

Medians (with means and standard deviations in brackets), P-values and effect sizes for the student ratings of each questionnaire statement included in the factor analysis are shown in Appendix S1, with the questionnaire statements categorised by the five main factors. The statements that were not included in the factor analysis are shown in Appendix S2.

Most of the significant change occurred between 1999 and 2000.

The responses to the open questions for the years from 1999 to 2003, grouped under broad headings, are shown in Table 4.

Table 4.  Student feedback for open-ended questions: the main comment categories and the number of responses (percentages of the total number of students in each year)

Comment category | 2003 | 2002 | 2000 | 1999

Positive findings
1 Gained a sense of achievement/satisfaction | 26 (18.4) | 22 (16.8) | 15 (10.3) | 15 (11.6)
2 Improved my ability (e.g. organisational, IT [information technology]) | 21 (14.9) | 10 (7.6) | 5 (3.4) | 6 (4.7)
3 Provided a collection of my work/an overview/covered a variety of aspects | 20 (14.2) | 5 (3.8) | 6 (4.1) | 2 (1.6)
4 Examiners created a relaxed atmosphere and encouraged discussion | 34 (24.1) | 18 (13.7) | 15 (10.2) | 25 (19.4)
5 A chance was provided to represent oneself and one’s work and justify it | 16 (11.3) | 14 (10.7) | 4 (2.7) | 1 (0.8)

Negative findings
6 Little guidance/difficulties in writing reflective summary sheets/not certain what to include | 30 (21.3) | 6 (4.6) | 10 (6.8) | 11 (8.5)
7 Not standardised/subjective/unfair | 71 (50.4) | 69 (52.7) | 59 (40.4) | 31 (24.0)
8 Little emphasis on clinical competence | 24 (17.0) | 3 (2.3) | – | 6 (4.7)
9 Time-consuming (at the expense of clinical experience) | 36 (25.5) | 16 (12.2) | 14 (9.6) | 32 (24.8)
10 Too much paperwork or work in general | 18 (12.8) | 17 (13.0) | 12 (8.2) | 58 (45.0)

Discussion


The study identified students’ attitudes to the introduction of the portfolio assessment process at Dundee Medical School and significant changes in attitudes over a 5-year period. Student perceptions are discussed under the following headings, based on the factor analysis: portfolio building and learning; achievement of outcomes and portfolio building; potentially contentious issues, and portfolio assessment and reflection. Finally, the possible reasons for changes in the students’ perceptions are discussed.

Portfolio building and learning

An important finding of the study is that Dundee Medical School students perceived that the portfolio process supported their learning and heightened their understanding of the institutional learning outcomes (item 32, Appendix S1). Although the effect sizes were generally low, the highest result was for this item, accounting for 22% of the variability of student perception over the years. A significant trend in assessment is its use to support learning.16 Previously, portfolios have been shown to support individually set learning goals at continuing professional development level17 and reflective practice in undergraduate medical education.18 Weiner10 recommended encouraging the student to make the connection between his or her individual responsibility to meet the learning outcomes and the portfolio process.

Achievement of outcomes and portfolio building

That the Dundee Medical School portfolio process was explicitly structured around the 12 exit learning outcomes of the curriculum may have been responsible for heightening students’ understanding of the learning outcomes. Driessen et al.19 identified a clear portfolio structure as one of the key ingredients contributing to the effectiveness of portfolio assessment of undergraduate medical students in Maastricht, the Netherlands. The utility of developing a portfolio assessment directly related to learning outcomes was also recognised by Gallagher20 at the Faculty of Nursing, Universal College of Learning, New Zealand. Robinson,21 although not in a health care setting, reported similar findings. Gordon5 produced a ‘non-prescriptive list of prompts’ for the contents of the portfolio used to assess Year 1 medical students at the University of Sydney, Australia. These prompts, although not explicitly outcomes in nature, provided the structure for this portfolio.

Potentially contentious issues

The students were equivocal about the clarity of examiner expectations (item 5, Appendix S1). They were uncertain whether the examination was fair (item 1, Appendix S1) and perceived that different examiners applied different standards (item 20, Appendix S1). The examiners probed individual student strengths and weaknesses during the portfolio discussion with the student, resulting in inevitable variation in the questions put to the students. The individual nature of the questioning, however, led to perceptions of unfairness and the application of different standards in the eyes of students accustomed to objective, standardised examinations such as the OSCE and multiple-choice question formats.

Schuwirth et al.22 justified adherence to subjective judgements, but recommended sampling across error sources by collecting independent judgements from many different judges. This sampling has been achieved in the Dundee Medical School portfolio assessment, with a minimum of 15 staff grading each student. Studies by Rees and Sheard23 and Boenink et al.,25 which achieved such sampling, have produced acceptable inter-rater reliability. Driessen et al.12 used a portfolio analysis scoring inventory and suggested that inter-rater reliability is enhanced when criteria are open and shared.

Portfolio assessment and reflection

Reflection is a crucial component of safe medical practice and is an important prerequisite for producing self-directed learners. The portfolio assessment facilitated student reflection on their Phase 3 work (item 9, Appendix S1). This finding confirms the evidence in the literature that portfolio assessment leads to reflective learning.3,15,18,25,26 The increasing recognition of reflection as a key factor influencing the practice of medicine has led to the introduction of specific assessment tools for reflection in portfolio assessment.6,24,27 In feedback to the open-ended questions, some students indicated that they needed more guidance on the reflective component (item 6, Table 4). The student induction process and changes in the mentoring programme have been introduced to tackle these issues.

Changes in student attitudes

The students in the present study felt that portfolio building interfered with clinical learning (item 12, Appendix S1). The students’ perception that there was too much paperwork (item 18, Appendix S1) in the portfolio process and that the process was time-consuming (item 9, Table 4) may be pertinent and may have contributed to a significant increase in student anxiety over the years (item 14, Appendix S1). Reduction in portfolio content over the years may be responsible for improvements in student attitudes to the portfolio process over time. Further refining of portfolio content should be undertaken, although it is well recognised that the portfolio process is time-consuming for both students and examiners.28 There is, however, another explanation: students may not place the same importance on all the groups of outcomes. Students tended to focus on inner circle outcomes at the expense of middle and outer circle outcomes (Table 2). Student induction that emphasises attitudes and professionalism as well as the more technical tasks of the doctor is crucial to student understanding of the process. The student induction process was also used to emphasise that Part 1 of the final examination includes an OSCE to assess clinical competence, in order to address student concerns over a perceived lack of emphasis on clinical competence in the portfolio building and assessment process.

Several statements (statements 2, 3, 7, 9, 11, 12, 16, 24, 27, 28, 30, 31, 32, 34, 35 in Appendix S1), however, indicate that student attitudes towards the portfolio process improved, particularly after the initial year. The reasons for these changes in student attitudes may partly reflect developments in the external world, leading to familiarity with the portfolio process. In 1997 when the portfolio assessment process was introduced at Dundee Medical School, the notion of a portfolio was unfamiliar to both staff and students. By 2003, many doctors were building portfolios as evidence of good standing for postgraduate awards, revalidation purposes or postgraduate assessment. After 1999, students who had undergone the first portfolio assessment process were asked to contribute to the student induction in the subsequent year, which may have contributed to the improved ratings in 2000.

Initial student uncertainty and resentment followed by a positive attitudinal change towards portfolio assessment is a finding common to several portfolio assessment studies. Weiner10 found that trainee teachers expressed initial reservations about the portfolio assessment process. Robinson21 stated that although some frustration was exhibited by several students at the beginning, this tapered off quickly as students became familiar with the portfolio development process. Mathers et al.,29 in relation to portfolio assessment for postgraduate general practice trainees, found that ‘once the principles were understood and the requirements clearly explained, the process became easier and less threatening’.

The growing student appreciation of the portfolio process in the present study may reflect the willingness of the medical school to allow the portfolio assessment to evolve in line with evaluation feedback. One of the seven strategies recommended by the World Health Organization for overcoming resistance to change is to ‘test and modify the innovations frequently’.30

We acknowledge, however, the limitations associated with self-perception questionnaires. As this study is based only on student perceptions, these results should be interpreted in conjunction with evidence from other studies on staff perceptions, external examiner reports, and student results evaluation.

Conclusions


The students perceived that portfolio building heightened their understanding of exit learning outcomes and enabled reflection on their work. Initially there were negative attitudes to the portfolio assessment process. Attitudes improved over the 5 years of the study, but concerns about the amount of paperwork remained, as did anxiety about the examination.

Contributors:  MHD designed and administered the questionnaires and devised the framework of the article. GGP carried out the data analysis and prepared the initial manuscript. MHD and JSK contributed to and edited the draft manuscript. All authors approved the final manuscript for publication.

Acknowledgements:  this study was carried out as part of the course evaluation procedure at Dundee Medical School, Dundee, UK.

Funding:  none.

Conflicts of interest:  none.

Ethical approval:  not required.

References


Supporting Information


Appendix S1. Comparison over the 4 years of student responses to closed statements, categorised by factor analysis

Appendix S2. Comparison over the 4 years of student responses to closed statements, excluded from factor analysis


Please note: Wiley Blackwell is not responsible for the content or functionality of any supporting information supplied by the authors. Any queries (other than missing content) should be directed to the corresponding author for the article.