Original Article
An integrated approach to endoscopic instrument tracking for augmented reality applications in surgical simulation training
Version of Record online: 25 JAN 2013
DOI: 10.1002/rcs.1485
Copyright © 2013 John Wiley & Sons, Ltd.
The International Journal of Medical Robotics and Computer Assisted Surgery
Volume 9, Issue 4, pages e34–e51, December 2013
How to Cite
Loukas, C., Lahanas, V. and Georgiou, E. (2013), An integrated approach to endoscopic instrument tracking for augmented reality applications in surgical simulation training. Int. J. Med. Robotics Comput. Assist. Surg., 9: e34–e51. doi: 10.1002/rcs.1485
Publication History
- Issue online: 16 DEC 2013
- Version of Record online: 25 JAN 2013
- Manuscript Accepted: 14 DEC 2012
- Manuscript Revised: 1 NOV 2012
- Manuscript Received: 19 JUL 2012
Keywords:
- surgical simulation;
- laparoscopic training;
- augmented reality;
- instrument tracking;
- occlusion handling
Abstract
Background
Despite the popular use of virtual and physical reality simulators in laparoscopic training, the educational potential of augmented reality (AR) has not received much attention. A major challenge is the robust tracking and three-dimensional (3D) pose estimation of the endoscopic instrument, which are essential for achieving interaction with the virtual world and for realistic rendering when the virtual scene is occluded by the instrument. In this paper we propose a method that addresses these issues, based solely on visual information obtained from the endoscopic camera.
Methods
Two different tracking algorithms are combined to estimate the 3D pose of the surgical instrument with respect to the camera. The first tracker builds an adaptive model of a colour strip attached to the distal part of the tool (close to the tip). The second algorithm tracks the instrument's shaft, using a combined Hough–Kalman approach. The 3D pose is then estimated with perspective geometry, using measurements extracted by the two trackers.
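The ingredients named above (Hough-based line detection, Kalman smoothing across frames, and a perspective-geometry depth estimate) can be illustrated with a minimal pure-Python sketch. This is not the authors' implementation: the vote accumulator, the scalar filter, and the `shaft_depth` pinhole relation are generic textbook versions, and all function names and parameter values are illustrative assumptions.

```python
import math

def hough_peak(points, width, height, n_theta=180, rho_res=1.0):
    """Return (rho, theta) of the strongest line through the given edge
    points, using the standard rho = x*cos(theta) + y*sin(theta) voting."""
    diag = math.hypot(width, height)
    acc = {}
    for x, y in points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            r = int(round((rho + diag) / rho_res))  # shift so bins are >= 0
            acc[(r, t)] = acc.get((r, t), 0) + 1
    (r, t), _ = max(acc.items(), key=lambda kv: kv[1])
    return r * rho_res - diag, math.pi * t / n_theta

class Kalman1D:
    """Scalar Kalman filter (random-walk model) for smoothing one line
    parameter, e.g. the shaft angle, across consecutive frames."""
    def __init__(self, x0, p0=1.0, q=1e-3, r=1e-1):
        self.x, self.p, self.q, self.r = x0, p0, q, r
    def update(self, z):
        self.p += self.q                 # predict: inflate uncertainty
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z - self.x)       # correct with measurement z
        self.p *= (1.0 - k)
        return self.x

def shaft_depth(focal_px, shaft_diameter_mm, apparent_width_px):
    """Pinhole-model depth of a cylinder of known diameter whose image
    is apparent_width_px pixels wide: Z = f * D / d."""
    return focal_px * shaft_diameter_mm / apparent_width_px

# Example: edge points on the line y = x vote for theta = 135 deg, rho ~ 0.
pts = [(i, i) for i in range(0, 50, 5)]
rho, theta = hough_peak(pts, 64, 64)
smooth_theta = Kalman1D(theta)
smooth_theta.update(theta + 0.01)  # noisy measurement from the next frame
```

The same structure extends to a constant-velocity state vector per line parameter, which is what makes the Hough measurements usable during brief detection failures: the filter's prediction stands in for the missing measurement.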
Results
The method has been validated on several complex image sequences for its tracking efficiency, pose estimation accuracy and applicability in AR-based training. Using a standard endoscopic camera, the average absolute error of the tip position was 2.5 mm for working distances commonly found in laparoscopic training. The average error of the instrument's angle with respect to the camera plane was approximately 2°. The results are also supplemented by video segments of laparoscopic training tasks performed in a physical and an AR environment.
Conclusions
The experiments yielded promising results regarding the potential of applying AR technologies for laparoscopic skills training, based on a computer vision framework. The issue of occlusion handling was adequately addressed. The estimated trajectory of the instruments may also be used for surgical gesture interpretation and assessment. Copyright © 2013 John Wiley & Sons, Ltd.