Methodological issues surrounding ‘The mind's scalpel in surgical education: a randomised controlled trial of mental imagery’
Article first published online: 9 APR 2013
© 2013 The Authors. BJOG: An International Journal of Obstetrics and Gynaecology © 2013 RCOG
BJOG: An International Journal of Obstetrics & Gynaecology
Volume 120, Issue 6, page 776, May 2013
How to Cite
Geoffrion, R. (2013), Methodological issues surrounding ‘The mind's scalpel in surgical education: a randomised controlled trial of mental imagery’. BJOG: An International Journal of Obstetrics & Gynaecology, 120: 776. doi: 10.1111/1471-0528.12168
- Issue published online: 9 APR 2013
- Manuscript Accepted: 3 JAN 2013
I am pleased to respond to the invitation by Sevdalis et al. to participate in a debate on improving research in mental imagery (MI) surgical training. This field of research is in its infancy, and I admire the tenacity of their group in investigating the scientific validity of MI. In response to concerns about the methodology of our recently published negative trial of MI for teaching hysterectomy: we carefully considered both imagery ability and script compliance.
Our trial was designed in 2006/07; the Mental Imagery Questionnaire (MIQ), which investigates differences in MI skill, was only partially validated and published in 2010. The MIQ is not ready for widespread use; we do not yet have pass/fail scores on the MIQ to establish whether a learner has achieved sufficient MI skill. As for administering the MIQ to the controls in our trial, I think the suggestive nature of the MIQ questions would have encouraged them to perform MI and would have biased our results. Our learners had two sessions with trained investigators following a standard script, as well as individual sessions in between until they were comfortable with MI. We trained all MI investigators in a standard MI technique. Limiting the trial to one centre, or to one investigator performing MI with learners, would have significantly reduced the feasibility and generalisability of our trial.

Even with a questionnaire like the MIQ, it is difficult to prove that a dynamic motor neural network has been established. Other motor imagery measurement tools are currently in development: the Motor Imagery Index combines psychometric, behavioural and psychophysiological tools, and functional magnetic resonance imaging has also been used to determine brain activity during MI. Our MI sessions as administered did not improve motor performance, and I wonder whether novice learners are indeed able to perform appropriate imagery for complex surgery. Perhaps MI is a strategy reserved for expert surgeons, who have an established motor neural network for each procedure and need only make adjustments specific to each surgical case. This would correspond to elite golf players successfully using MI, compared with someone being handed a golf club for the first time. Rather than concentrating on novices, we should perhaps study how MI may benefit intermediate or expert surgeons.
Regarding MI script compliance, we considered introducing an MI session logbook for learners. We decided against the idea, because of individual learning variations (some would need few and some many MI sessions to achieve the same effect) and because we did not know how many MI sessions to recommend for optimal skill acquisition.
In summary, although the MIQ shows promise in the assessment of MI practice, and MI has helped to improve simulation skills, the ultimate test of validity for a new surgical teaching intervention is the transferability of skills from the MI ‘laboratory’ to the real operating room. MI may be useful for other cohorts, such as more advanced learners initiating a new procedure or novice learners performing simpler surgeries. The ongoing practice of MI may improve surgical performance by enhancing self-confidence and reducing stress.
- 5. The mind's eye: functional MR imaging evaluation of golf motor imagery. AJNR Am J Neuroradiol 2003;24:1036–44.