Keywords:

  • dental education;
  • interactive learning;
  • radiographic anatomy;
  • dental radiology curriculum

Abstract

Introduction:  Studies reporting a high number of diagnostic errors made from radiographs suggest the need to improve the learning of radiographic interpretation in the dental curriculum. Given studies that show student preference for computer-assisted or digital technologies, the purpose of this study was to develop an interactive digital tool and to determine whether it was more successful than a conventional radiology textbook in assisting dental students with the learning of radiographic anatomy.

Materials and methods:  Eighty-eight dental students underwent a learning phase of radiographic anatomy using an interactive digital tool alongside a conventional radiology textbook. The success of the digital tool, when compared to the textbook, was assessed by quantitative means using a radiographic interpretation test and by qualitative means using a structured Likert scale survey, asking students to evaluate their own learning outcomes from the digital tool.

Results:  Student evaluations of the digital tool showed that almost all participants (95%) indicated that the tool positively enhanced their learning of radiographic anatomy and interpretation.

Discussion:  The success of the digital tool in assisting the learning of radiographic interpretation is discussed in the broader context of learning and teaching curricula, and of student preference for this digital format over the conventional print format of the textbook.

Conclusion:  Whilst traditional textbooks are still valued in the dental curriculum, computer-assisted learning of oral radiographic anatomy appears to enhance the learning experience by enabling students to interact and better engage with the course material.


Introduction

The responsibilities of a dentist regarding radiological examinations are clearly defined in a clause of the Code of Practice for Radiation Protection in Dentistry (1). This clause covers the responsibility of ‘interpretation of radiographs’, which means that the dentist must be able to identify normal anatomical landmarks and distinguish these from signs of pathological change. Radiology in dentistry, as in other health disciplines, is an important and commonly utilised diagnostic aid. However, accuracy in radiographic interpretation appears to be lacking, with studies reporting a high number of diagnostic interpretation errors made by health professionals, even by radiology residents who are regarded as the ‘gold standard’ in the measure of competency (2, 3). In searching for a resolution to this issue, it is important to identify the underlying cause of these interpretation errors, which in turn suggests the need to review the learning and teaching of radiology.

One of the mandatory skills required at graduation is that each dental graduate has the ability to carry out and interpret radiological examinations common in general dental practice. With the emergence of digital radiography, film-processing errors are virtually eliminated. The next obvious obstacle is to define a means by which interpretation errors can be eliminated, or at least reduced. Studies show that the ability to interpret radiographs is not correlated with the level of medical education (4), and this suggests that emphasis should be placed less on the quantity of radiology learning and teaching, and more on the quality. The issue of curriculum crowding in tertiary education further emphasises this need for improved quality rather than quantity in learning and teaching. Hence, a more specific educational approach is required to address the issue of error in radiographic interpretation.

With the emergence of new teaching methods such as action learning, competency-based education, contextual learning, life-long learning, problem-based learning and self-directed learning, there has been a wealth of educational literature on how these methods should be implemented (5–10). Despite these proposals, the curriculum for undergraduate education in oral and maxillofacial radiology is characterised by a strongly didactic viewpoint (5), with emphasis on detailed facts. This approach, however, places the student at risk of merely accumulating isolated facts (surface learning) rather than making the connection between excellence in radiography and quality of diagnosis (deep learning).

A recent study by Gutmark et al. (11) highlighted the preference for computer-assisted learning and reference tools, when compared to reference books, even by physicians in a radiology department. Computer-assisted learning methods have already been designed and used in other health professions to aid decision-making in interpretation and diagnosis (12–14). The purpose of this study, therefore, was to develop an interactive digital tool that could be made available online to assist undergraduate dental students with their learning of radiography and the identification of radiographic skeletal and soft tissue anatomy.

The aims of this study were threefold: (i) to determine whether the digital tool was more successful than a conventional radiology textbook as a learning and teaching resource in assisting dental students with their learning and understanding of radiographic anatomy; (ii) to determine whether the effect of the digital tool in learning radiographic interpretation was superior in novice students (second-year dental students with no prior learning in radiographic interpretation) when compared to experienced students (fifth-year dental students with prior learning in radiographic interpretation); (iii) to measure the visual-spatial ability of dental students and to determine whether there was a correlation between visual-spatial ability and radiographic interpretation ability.

Materials and methods

A digital interactive learning tool was constructed as an online resource to assist dental students with their learning of radiographic anatomy. Digital maps of anatomical hard tissue structures and soft tissue structures were created using a digital photograph of a volunteer and the drawing functions of Adobe Photoshop™ CS4 (Adobe Systems Incorporated, San Jose, CA, USA). These maps were then superimposed on extraoral radiographs of the same volunteer using Adobe Photoshop™ CS4. Ten transparency levels were then made of these superimposed images (at 10% intervals ranging from 0% to 100%) and exported into Microsoft PowerPoint 2004 for Mac, Version 11.5.1 (Microsoft Corporation, Redmond, WA, USA). Hence, the tool was essentially a PowerPoint presentation consisting of numerous slides of varying transparency of hard and soft tissue anatomy superimposed on the same radiographic image (Figs 1–3). The interactive function built into the digital tool was the ability to scroll through these transparencies, allowing addition or subtraction of soft tissue structures from the hard tissue structures displayed on the radiographs. Text and labels were also incorporated into the tool, indicating the various anatomical features generally discernible in radiographs.
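
The overlays for the tool were produced manually in Photoshop and assembled in PowerPoint. As a minimal sketch of the same graded-transparency idea, the following Python example (using the Pillow imaging library, with hypothetical file names rather than files from the study) blends an anatomy map over a radiograph at 10% opacity steps.

```python
from PIL import Image

# Load a radiograph and a same-sized anatomy map drawn over it
# (file names are placeholders, not files from the study).
radiograph = Image.open("lateral_cephalogram.png").convert("RGBA")
anatomy_map = Image.open("soft_tissue_map.png").convert("RGBA")

# Blend the map over the radiograph at 10% opacity steps, from the plain
# radiograph (0%) to the fully opaque anatomy map (100%), saving one frame
# per step; each frame can then be placed on a consecutive slide.
for step in range(0, 101, 10):
    frame = Image.blend(radiograph, anatomy_map, step / 100.0)
    frame.convert("RGB").save(f"overlay_{step:03d}.png")
```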

Figure 1.  Representative images from the digital tool showing maps of anatomical hard tissue structures (top row) and soft tissue structures (bottom row) superimposed on an extraoral radiograph, the lateral cephalogram. The superimpositions (of tissue structures onto radiographs) are shown at transparency levels of 100% (A1 and A2), 50% (B1 and B2), 20% (C1 and C2) and 0% (D1 and D2). The interactive function built into the digital tool enables the user to scroll through these transparencies (ranging from 0% to 100%) at their own pace.

Figure 2.  Representative images from the digital tool showing maps of anatomical hard tissue structures (top row) and soft tissue structures (bottom row) superimposed on an extraoral radiograph, the posterior-anterior cephalogram. The superimpositions (of tissue structures onto radiographs) are shown at transparency levels of 100% (A1 and A2), 50% (B1 and B2), 20% (C1 and C2) and 0% (D1 and D2). The interactive function built into the digital tool enables the user to scroll through these transparencies (ranging from 0% to 100%) at their own pace.

Figure 3.  Representative images from the digital tool showing maps of anatomical hard tissue structures (left) and soft tissue structures (right) superimposed on an extraoral radiograph, the orthopantomogram. The superimpositions (of tissue structures onto radiographs) are shown at transparency levels of 100% (A1 and A2), 50% (B1 and B2), 20% (C1 and C2) and 0% (D1 and D2). The interactive function built into the digital tool enables the user to scroll through these transparencies (ranging from 0% to 100%) at their own pace.

Aim 1

Sixty-four second-year undergraduate dental students (39 men and 25 women) attending the School of Dentistry at the University of Queensland in 2009 participated in the study. Students were informed of the study via participant information sheets, and participation was voluntary after signing informed consent forms. Second-year dental students specifically were invited to participate in this part of the study as they had no prior learning in radiographic anatomy, but had prior learning in head and neck anatomy as part of the dental curriculum to that point in their education.

The participants were randomly allocated to one of two groups, namely Group A (31 students) and Group B (33 students). Each group underwent a 1-h intervention phase involving the learning of radiographic anatomy, using one of two resources: Group A used a conventional oral radiology textbook (15) and Group B used the digital tool. Both resources were accessed online via the University of Queensland Online Learning Management System, Blackboard Learning System™ (Blackboard Academic Suite™; Blackboard Inc., Washington, DC, USA). All participants were then assessed on their understanding of radiographic anatomy. The assessment consisted of 12 multiple-choice questions (MCQ) based on radiographic anatomy as seen on an orthopantomogram (OPG) (Fig. 4). The MCQ test was provided to each participant in hard copy format, whereas the OPGs corresponding to each question were provided online (via Blackboard Learning System™) in the form of a PowerPoint presentation, with all OPGs presented in the same order as the questions, and projected on a dual screen. The total time allocated was 24 min (2 min per question), and each student was required to circle the one correct answer (out of four alternatives) on the hard copy provided.
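
For illustration only, the following Python sketch mirrors the allocation and scoring logic described above (a random split into two groups, anonymised participant codes and a 12-question MCQ marked as a percentage). All names and answer data are invented, not the study records.

```python
import random

# Hypothetical participant identifiers (the study used 64 second-year students).
participants = [f"student_{i:02d}" for i in range(1, 65)]

# Assign each participant an anonymous code, then split randomly into
# Group A (textbook first) and Group B (digital tool first).
random.shuffle(participants)
codes = {name: f"ID{n:03d}" for n, name in enumerate(participants, start=1)}
group_a, group_b = participants[:31], participants[31:]

def score_mcq(answers, key):
    """Return a percentage score for a 12-question multiple-choice test."""
    correct = sum(a == k for a, k in zip(answers, key))
    return 100.0 * correct / len(key)

# Example: a hypothetical answer sheet marked against a hypothetical key.
key = list("ABCDABCDABCD")
answers = list("ABCDABCDABBA")
print(score_mcq(answers, key))  # -> 83.33 (10 of 12 correct)
```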

Figure 4.  Representative multiple-choice questions asked on the radiographic interpretation test (MCQ Test). The correct answers are shown in bold font.

After completing the MCQ test, the two groups underwent a second 1-h intervention phase of radiographic anatomy learning. However, in this case the groups were reversed; Group A used the digital tool as the resource and Group B used the conventional oral radiology textbook. All participants were then assessed again using another MCQ test, provided in the same manner as the first test. The two MCQ tests were standardised such that the questions asked were of equal relevance and difficulty, and a select number of OPGs used in the first test were re-used in the second test. The MCQ tests were marked by one of the authors (JV) in a blinded manner, and the scores from each test were compared for each student. To ensure anonymity whilst being able to compare the scores from the two MCQ tests, each participant used a randomly allocated number, anonymous to the marker. The aim of comparing the scores from the two MCQ tests was to determine, quantitatively, whether the digital tool was successful in assisting student learning and understanding of radiographic anatomy, and whether the digital tool was more successful than the conventional radiology textbook at assisting student learning.

After completing the intervention phase, each participant was asked to complete a questionnaire to assess (in qualitative terms) their impressions regarding the effect of the digital tool on their learning of oral radiographic anatomy, when compared to the conventional oral radiology textbook. The questionnaire was designed with a four-point Likert scale to determine whether this digital tool enhanced student learning of oral radiographic anatomy.

Aim 2

The study was repeated with 24 fifth-year undergraduate dental students (17 men and seven women), in exactly the same manner, utilising the same MCQ tests and questionnaires. However, instead of undergoing two intervention phases of learning radiographic interpretation, students underwent one intervention, using the digital tool only. The intervention phase utilising the conventional oral radiology textbook was omitted, given that fifth-year dental students had prior learning of radiographic interpretation utilising this conventional radiology textbook as part of their curriculum. Statements in the questionnaire relating to the ‘textbook’ were regarded as pertaining to whichever conventional radiology textbooks were used by the students throughout their undergraduate programme to assist them with their learning of radiographic interpretation. The results obtained from the MCQ test (performed pre- and post-intervention) and the questionnaire evaluations were compared with those obtained from the second-year students to determine whether the effect of the digital tool in learning radiographic interpretation was superior in novice students when compared to experienced students.

Aim 3

All 88 participants underwent assessment of their visual-spatial ability, using the redrawn Vandenberg and Kuse mental rotations test version A (MRT-A) (16). This is the most commonly used version of the MRT, which examines the ability to mentally rotate figures around the vertical axis. The test consisted of 24 items, and for each item a target figure was presented on the left and four stimulus figures on the right (Fig. 5). For all items, two of the four stimulus figures were rotated versions of the target figure. Participants had to mentally rotate the figures and identify the two rotated versions of the target figure. A score of ‘1’ was given only if both choices were correct. The maximum score was 24. The instructions, procedures and scoring methods were identical to those reported by Peters et al. (16). The score obtained on the MRT was compared to the score obtained on the MCQ tests to determine a putative correlation between visual-spatial ability and the ability to interpret radiographs.
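
As an illustration of the MRT-A scoring rule described above (an item counts only when both rotated versions of the target are identified), the following short Python sketch scores an invented response set; the answer key and responses are hypothetical.

```python
# Each MRT-A item presents four stimulus figures, exactly two of which are
# rotated versions of the target; an item scores 1 only when both are chosen.
def score_mrt(responses, answer_key):
    return sum(1 for chosen, correct in zip(responses, answer_key) if chosen == correct)

# Invented example data: the key says figures 1 and 3 are correct for every
# item; the participant answers 20 items fully correctly and misses one
# figure on the remaining 4 items, so those items score 0.
answer_key = [{1, 3}] * 24
responses = [{1, 3}] * 20 + [{1, 2}] * 4
print(score_mrt(responses, answer_key))  # -> 20 (maximum possible score is 24)
```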

Figure 5.  Example diagram of the redrawn Vandenberg and Kuse mental rotations test version A (MRT-A). The target figure is on the left and four stimulus figures are on the right. The first and third figures are rotated versions of the target figure.

Statistical analysis

Statistical analysis of groups was performed using the Student’s t-test for paired data, and statistical significance was accepted at the level of P < 0.05. Associations between variables were determined using Pearson’s correlation, where r represents the correlation coefficient. All data are presented as mean ± standard deviation (SD), where N represents the number of participants.
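
For readers who wish to reproduce this style of analysis, a minimal Python sketch using SciPy is shown below; the score arrays are placeholders rather than the study data.

```python
import numpy as np
from scipy import stats

# Placeholder score arrays (percentages) for the same five students on the
# two MCQ tests and on the MRT-A; these are not the study data.
test1 = np.array([50.0, 58.3, 41.7, 66.7, 50.0])
test2 = np.array([58.3, 66.7, 50.0, 66.7, 58.3])
mrt_a = np.array([87.5, 91.7, 79.2, 95.8, 83.3])

# Paired Student's t-test comparing each student's two MCQ scores.
t_stat, p_value = stats.ttest_rel(test2, test1)
print(f"Test 2: {test2.mean():.1f} ± {test2.std(ddof=1):.1f}%, paired P = {p_value:.3f}")

# Pearson's correlation between visual-spatial ability and MCQ performance.
r, p_corr = stats.pearsonr(mrt_a, test2)
print(f"r = {r:.2f} (P = {p_corr:.3f})")
```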

Results

A total of 88 students (64 second-year and 24 fifth-year students) enrolled in the Bachelor of Dental Science (BDSc) at the University of Queensland participated in the intervention study, providing 88 quantitative measures of the intervention, whilst 86 participated in the questionnaire evaluation element of the intervention. Questionnaire evaluations were designed with a four-point Likert scale with the following options: Strongly Agree, Agree, Disagree and Strongly Disagree. All aspects of the study reported here were approved and performed in accordance with the guidelines of the ethical review process of the University of Queensland.

The scores of the radiographic interpretation MCQ tests for second-year dental students are presented in Table 1. The results show that it did not matter whether the students used the textbook or digital tool in the first learning intervention phase (mean score of 51.3% for textbook vs. 50.0% for digital tool). In fact, the scores did not change significantly after the two groups repeated the learning intervention phase with the alternative resource. The scores improved marginally, but this was the case for both groups (mean score of 58.6% for Group A and 54.6% for Group B), suggesting that the content of material learnt by the students was similar, despite the use of different methods.

Table 1.  Radiographic interpretation scores (presented as percentages) for second-year dental students, obtained from the multiple-choice question (MCQ) tests following learning intervention phase 1 (MCQ Test 1) and learning intervention phase 2 (MCQ Test 2)

             MCQ Test 1 (%)                  MCQ Test 2 (%)
Group A      51.34 ± 17.08 (textbook)        58.60 ± 17.39* (digital tool)
Group B      50.00 ± 12.34 (digital tool)    54.57 ± 14.94 (textbook)

Group A (N = 31) underwent intervention phase 1 with the radiology textbook and intervention phase 2 with the digital tool. Group B (N = 33) underwent intervention phase 1 with the digital tool and intervention phase 2 with the radiology textbook. Values are mean ± standard deviation, and N represents the number of participants in the group.
*P = 0.049 for Test 2 vs. Test 1, Group A; P > 0.05 for Test 2 vs. Test 1, Group B; P > 0.05 for Group B vs. Group A, Test 1; P > 0.05 for Group B vs. Group A, Test 2.

The scores of the radiographic interpretation MCQ tests for fifth-year dental students were higher than those of second-year dental students, both pre-intervention (mean score of 79.9%) and post-intervention (mean score of 82.6%). Similarly to second-year students, there was a slight improvement in the average score post-intervention with the digital tool; however, this was not significant and could be explained by the effect of repetition on learning and performance.

It is evident from the questionnaire evaluation results of second-year dental students (Table 2) that the large majority of students (94%) preferred using the digital tool to the textbook. All of the participants (100%) found the digital tool easy to use, and the majority (85%) felt more confident learning with the digital tool than with the textbook. There was an overall agreement that learning radiographic anatomy was easier (90%) and faster (95%) with the digital tool, and the majority of students (75%) felt that learning with the digital tool was more beneficial to them than learning with a textbook. Most pertinent to this study, the results show that a remarkable 95% of students felt the digital tool positively enhanced their learning of radiographic interpretation, with 97% agreement that it positively affected their performance on the radiographic interpretation MCQ test. Equally outstanding, the results reveal that almost all of the students felt the digital tool positively affected what they learnt (95%) and how they learnt (98%) radiographic interpretation. Only 21% of the students thought that the digital tool alone was sufficient to learn radiographic anatomy, with the majority (88%) indicating the need for both the digital tool and the textbook. Importantly, however, only 15% of the students thought that the textbook alone was sufficient to learn radiographic interpretation.

Table 2.  Second-year student evaluation of the learning intervention using the digital tool when compared to the conventional radiology textbook, and overall agreement comparison with fifth-year student evaluations

Question | Strongly agree (%) | Agree (%) | Disagree (%) | Strongly disagree (%) | Overall agreement, Year 2 (%) | Overall agreement, Year 5 (%)
Digital tool was easy to use | 69 | 31 | 0 | 0 | 100 | 100
Preferred using digital tool over textbook | 65 | 29 | 2 | 0 | 94 | 83
Navigating through images of digital tool was more effective than using textbook in learning oral radiographic anatomy | 68 | 26 | 2 | 2 | 94 | 79
Learning oral radiographic anatomy was easier with the digital tool than with the textbook | 55 | 35 | 8 | 2 | 90 | 84
Learning to interpret radiographs was faster with the digital tool than with a textbook | 53 | 42 | 5 | 0 | 95 | 82
Learning to interpret pathological structures on radiographs was easier with the digital tool than with the textbook | 26 | 39 | 27 | 5 | 65 | 50
Felt more confident learning with digital tool than with textbook | 43 | 42 | 15 | 0 | 85 | 63
Learning with the digital tool is more beneficial than learning from textbook | 31 | 44 | 13 | 2 | 75 | 50
Digital tool was superior to textbook in helping identify radiographic anatomy | 40 | 48 | 10 | 2 | 88 | 79
Digital tool was superior to textbook in helping to understand and interpret radiographs | 26 | 39 | 27 | 2 | 65 | 46
AFTER using the digital tool, felt more confident in radiographic interpretation without additional help | 24 | 52 | 22 | 2 | 76 | 75
AFTER using the digital tool, found identification of radiographic anatomy easier | 48 | 52 | 0 | 0 | 100 | 83
AFTER using the digital tool, found radiographic interpretation easier | 35 | 58 | 3 | 0 | 93 | 79
Believe that performance on radiographic interpretation test was better after using digital tool when compared with textbook | 29 | 55 | 0 | 15 | 84 | 67
Digital tool positively enhanced learning of radiographic interpretation | 34 | 61 | 0 | 0 | 95 | 100
Feel that digital tool positively affected grade on radiographic interpretation test | 29 | 68 | 2 | 0 | 97 | 88
Using digital tool positively affected WHAT was learnt in radiographic interpretation | 29 | 66 | 2 | 0 | 95 | 88
Using digital tool positively affected HOW radiographic interpretation was learnt | 42 | 56 | 2 | 0 | 98 | 87
Using the digital tool ALONE is sufficient to learn radiographic interpretation | 3 | 18 | 61 | 18 | 21 | 12
Using the textbook ALONE is sufficient to learn radiographic interpretation | 2 | 13 | 47 | 39 | 15 | 58
It is necessary to use BOTH digital tool and textbook in learning radiographic interpretation | 35 | 53 | 11 | 0 | 88 | 100
Ability to access digital tool via Blackboard at any time would be advantageous to learning radiographic interpretation | 68 | 31 | 0 | 0 | 99 | 100
Digital tool should be made available on Blackboard for ALL dental students to learn radiographic interpretation | 73 | 27 | 0 | 0 | 100 | 96
Digital tool should be included in Bachelor of Dental Science curriculum for teaching radiographic interpretation | 60 | 40 | 0 | 0 | 100 | 92

Values are percentages of students that strongly agree, agree, disagree or strongly disagree with a particular statement. Overall agreement refers to the total percentage of students that indicated strongly agree or agree with a particular statement. For questions where the total percentage (cumulative of strongly agree, agree, disagree and strongly disagree) does not equal 100, the remaining percentage was assumed to be undecided.

Results obtained from the fifth-year student evaluation questionnaires were similar to those obtained from the second-year students (Table 2). The main differences observed between the experienced (fifth year) and novice (second year) students were that experienced students did not find the digital tool as beneficial to their learning of radiographic interpretation (50% agreement) as the novice students did (75% agreement), and more than half of the experienced students (58%) felt that the textbook alone was sufficient for their learning. This is partly because the experienced students had gained their knowledge and confidence in radiographic interpretation with the use of a conventional textbook, and evidently viewed the use of the digital tool as merely a revision guide. This observation is in agreement with the scores obtained for the radiographic interpretation tests, in which all students scored highly on the MCQ test, both pre- and post-intervention.

Responses from open-ended questions provided numerous reasons for student preference for the digital tool, including ‘ease of use’, ‘better visualisation’, ‘more engaging’, ‘fun’, ‘more interactive’, ‘more effective’ and ‘not as boring as the textbook’. Similar reasons were given for wanting to continue to use the digital tool and included that ‘it is a great revision guide’. On the other hand, those who preferred using the textbook did so only because it ‘provides more detailed explanations’. Overall, the majority of students (65%) gave the digital tool a rating of ‘very good’ (four of five), 24% gave it a rating of ‘excellent’ (five of five), 8% gave it a rating of ‘good’ (three of five) and only 3% rated the tool as ‘satisfactory’ (two of five).

The results obtained from the MRT-A show that regardless of their performance on the radiographic interpretation tests, all participants scored highly on the visual-spatial ability test (score of 89.39 ± 16.33% for second-year students and 85.59 ± 15.83% for fifth-year students). Although there was a gender difference in visual-spatial ability, with men (score of 93.27 ± 8.53%) performing significantly better (P < 0.05) on the MRT-A than women (score of 83.33 ± 22.85%), direct comparison of the MRT-A score with the MCQ score for each participant demonstrated a very weak correlation between visual-spatial ability and radiographic interpretation (correlation coefficient <0.2).

Discussion

A computer-oriented solution was chosen to address the need for improved dental radiology learning at an undergraduate level. With ongoing implementation of digital technologies across a variety of curricula, it is not surprising that the current generation of students show immediate preference for computer-assisted learning and reference tools (11). However, there is no advantage in utilising these new technologies just for the sake of student preference unless they provide an educational benefit. With the introduction of any new learning and teaching modality, one must be aware of its effects on student learning and student interaction with the technology (17–19).

Student learning, as discussed by Marton and Saljo (20), is related to how students learn, with learning outcomes being strongly related to the learning process. The ‘deep’ approach is thought to promote better (or ‘life-long’) learning, as opposed to the ‘surface’ approach in which students try to memorise text (20). Deep learning is related to intrinsic motivation (learning out of interest) (20), and this motivation is something one finds or develops, not creates (21). Hence, to utilise students’ intrinsic motivation, one must focus on what they are interested in and link the course material to that.

Results from the evaluation questionnaire clearly show a strong student preference for the digital tool as a learning resource. Observations made by the teacher (author CSF) also showed student preference for the digital tool over the textbook, for reasons such as ease of use, clarity in overlaid soft and hard tissue structures, and flexibility in moving between overlaid images. The data show that the digital tool positively affected how the students learnt radiographic interpretation, but did not significantly affect what they learnt. The students were able to learn radiographic interpretation and perform similarly regardless of the method used (i.e. textbook or digital tool). This suggests that the content of material learnt was consistent for all students; however, there was an obviously different impact of the methods used to deliver that content. Using the digital tool, when compared to the textbook, positively enhanced the learning process by enabling students to interact and better engage with the course material. This may be related to students’ intrinsic motivation for computer-based learning. Furthermore, students commented on the use of the digital tool as being ‘interesting’, ‘fun’ and ‘not as boring as the textbook’. These findings coincide with previous analysis showing that the main effect on the approach to learning comes from experience, i.e. whether students feel interested (20). This is an important consideration in attempting to promote a deep learning approach in students, and hence in improving the quality of learning and teaching.

Amongst the current institutional changes in education is the move from analogue to digital technologies, a transition sometimes referred to as a shift ‘from literacy to electracy’ (22). This contemporary form of communication in learning and teaching allows the incorporation of not only text, but also visual images and sound, placing greater importance on ‘multimodality’ (23), as opposed to the single traditional mode of print. This preference for a ‘multimodal’ approach to learning is clearly demonstrated in the study presented here, with students commenting that they preferred using and would continue using the digital tool rather than the textbook because of the additional ‘visual’ form of presentation of the course material. Comments such as ‘I prefer learning with visual aids’, ‘visual learning is better for me’ and ‘I like visual learning, it makes it stick in my mind better’ reiterate the positive effect of the visual mode of presentation on student learning. It is likely that a ‘multimodal’ approach is more effective because it engages a wider variety of the senses in the learning process (24). The results obtained from this study suggest the same. The majority of students felt that both the visual presentation of the digital tool and the conventional textual format of the reference book were needed for learning radiographic anatomy. Furthermore, students provided suggestions for improving the digital tool by ‘incorporating more text with some of the visual images’. These comments further accentuate the preference for a ‘multimodal’ approach to learning by the majority of participant students, and it may be beneficial to further investigate the potential benefit of incorporating additional modes, such as the auditory mode.

The learning of radiographic anatomy, and identification and interpretation thereof, can be difficult in general, especially from radiographic images such as the OPG, where technical factors of projection and superimposition constantly interfere. This study proposes a new and more interactive approach to improving the quality of such learning and teaching. In contrast to suggestions made by other authors (25, 26), this study showed weak interrelations between radiographic interpretation ability and individual visual-spatial ability. This may be explained by the fact that the majority of students scored very high on the MRT-A. There was not a large distribution in either the MRT-A scores or the radiographic interpretation test scores, thereby limiting the chance of observing a putative association between these two abilities. Even for individuals with high visual-spatial ability, which is no doubt beneficial, the learning and teaching process is fundamental to achieving excellence in radiographic interpretation and diagnosis. It is difficult to speculate whether students with high visual-spatial ability would generally utilise a different learning approach to students with low visual-spatial ability, i.e. one that is more effective in promoting deep learning.

Learning tools and resources considered by students as ‘fun’, ‘interactive’, ‘engaging’ and ‘easy to use’ are important, not only in student-centred forms of education, but also in self-directed and continuing education settings. Results obtained from the fifth-year dental students clearly demonstrate this versatile applicability of the digital tool. The data show that although the fifth-year (experienced) dental students have significantly greater radiographic interpretation ability than second-year (novice) students, experienced students gave a similarly positive evaluation of the digital tool as the novice students.

Taking into consideration the students’ comments, improvements to the digital tool can be made to achieve an electronically available learning resource in the oral radiology curriculum that is even more interactive, more interesting and engaging, and with the incorporation of greater ‘multimodality’ aimed at promoting deep and life-long learning. Personal reflections made from the study point to the importance of investigating time as a factor in determining the effectiveness of the digital tool in assisting students with their learning of radiographic interpretation. Incorporating the digital tool into the curriculum as an official learning and teaching resource may provide a more accurate measure of the effectiveness of the tool on students’ ability to interpret radiographs.

Conclusion

Although the newly constructed digital tool was not quantitatively superior to the conventional textbook in assisting dental students with their learning of radiographic interpretation, qualitative measures indicated a strong preference for the digital tool as a learning and teaching resource in radiographic interpretation.

Whilst traditional reference books are still valued in the dental curriculum, the preference for computer-assisted learning of oral radiographic anatomy appears to enhance the learning experience by enabling students to interact and better engage with the course material. The implementation of electronic learning tools and resources made easily accessible and aimed at promoting deep learning is an important consideration in improving the quality of learning and teaching in student-centred forms of education, as well as in self-directed and continuing education settings.

Acknowledgements

The authors thank Professor Michael Peters (University of Guelph, Ontario, Canada), for allowing the use of his version of the MRT-A.

References

 1. Australian Radiation Protection and Nuclear Safety Agency. Code of practice for radiation protection in dentistry. Radiation Health Committee, 2005. http://www.ag.gov.au/cca
 2. Stheeman SE, Mileman PA, van ’t Hof M, van der Stelt PF. Room for improvement? The accuracy of dental practitioners who diagnose bony pathoses with radiographs. Oral Surg Oral Med Oral Pathol Oral Radiol Endod 1996: 81: 251–254.
 3. Halsted MJ, Kumar H, Paquin JJ, et al. Diagnostic errors by radiology residents in interpreting pediatric radiographs in an emergency setting. Pediatr Radiol 2004: 34: 331–336.
 4. Margolis SA, Nilsson KA, Reed RL. Performance in reading radiographs: does level of education predict skill? J Contin Educ Health Prof 2003: 23: 48–53.
 5. Rohlin M, Hirschmann PN, Matteson S. Global trends in oral and maxillofacial radiology education. Oral Surg Oral Med Oral Pathol Oral Radiol Endod 1995: 80: 517–526.
 6. Marchese TJ. Contexts for competency-based curricula in dental education. J Dent Educ 1994: 58: 339–341.
 7. Coles CR. Helping students with learning difficulties in medical and health-care education. Med Educ 1990: 24: 300–312.
 8. Savage R. Continuing education for general practice: a life long journey. Br J Gen Pract 1991: 41: 311–312.
 9. Norman GR, Schmidt HG. The psychological basis of problem-based learning: a review of the evidence. Acad Med 1992: 67: 557–565.
10. Mann KV. Educating medical students: lessons from research in continuing education. Acad Med 1994: 69: 41–47.
11. Gutmark R, Halsted MJ, Perry L, Gold G. Use of computer databases to reduce radiograph reading errors. J Am Coll Radiol 2007: 4: 65–68.
12. White SC. Computer-aided differential diagnosis of oral radiographic lesions. Dentomaxillofac Radiol 1989: 18: 53–59.
13. D’Orsi CJ, Getty DJ, Swets JA, Pickett RM, Seltzer SE, McNeil BJ. Reading and decision aids for improved accuracy and standardization of mammographic diagnosis. Radiology 1992: 184: 619–622.
14. Abe H, MacMahon H, Engelmann R, et al. Computer-aided diagnosis in chest radiography: results of large-scale observer tests at the 1996–2001 RSNA scientific assemblies. Radiographics 2003: 23: 255–265.
15. White SC, Pharoah MJ. Oral radiology: principles and interpretation. St Louis, MO: Mosby, 2009.
16. Peters M, Laeng B, Latham K, Jackson M, Zaiyouna R, Richardson C. A redrawn Vandenberg and Kuse mental rotations test: different versions and factors that affect performance. Brain Cogn 1995: 28: 39–58.
17. Farah CS, Maybury T. Implementing digital technology to enhance student learning of pathology. Eur J Dent Educ 2009: 13: 172–178.
18. Farah CS, Maybury TS. The e-evolution of microscopy in dental education. J Dent Educ 2009: 73: 942–949.
19. Maybury T, Farah CS. Perspective: electronic systems of knowledge in the world of virtual microscopy. Acad Med 2009: 84: 1244–1249.
20. Marton F, Saljo R. Approaches to learning. In: Marton F, Hounsell D, Entwistle N, eds. The experience of learning. Edinburgh: Scottish Academic Press, 1984: 36–55.
21. Fransson A. On qualitative differences in learning: IV – effects of intrinsic motivation and extrinsic test anxiety on process and outcome. Br J Educ Psychol 1977: 47: 244–257.
22. Ulmer GL. Internet invention: from literacy to electracy. Boston, MA: Longman, 2003.
23. Kress GR, Van Leeuwen T. Multimodal discourse: the modes and media of contemporary communication. London: Arnold, 2001.
24. Mitchell WJ. The reconfigured eye: visual truth in the post-photographic era. Cambridge, MA: MIT Press, 1994.
25. Berbaum KS, Smoker WR, Smith WL. Measurement and prediction of diagnostic performance during radiology training. AJR Am J Roentgenol 1985: 145: 1305–1311.
26. Nilsson T, Hedman L, Ahlqvist J. Visual-spatial ability and interpretation of three-dimensional information in radiographs. Dentomaxillofac Radiol 2007: 36: 86–91.