Keywords:

  • robot;
  • surgical education;
  • training;
  • laparoscopy;
  • minimally invasive surgery

Abstract


Study Type – Therapy (case series); Level of Evidence 4

OBJECTIVE

• To assess the content validity of an early prototype of the Robotic Surgical Simulator (RoSS), a novel virtual reality simulator for the da Vinci Surgical System, given the challenges that minimally invasive surgery poses for training future surgeons.

PATIENTS AND METHODS

• Participants attending the 2010 International Robotic Urology Symposium were invited to experience RoSS. Afterwards, participants completed a survey regarding the appropriateness of the simulator as a teaching tool.

RESULTS

• Forty-two subjects, including surgeons experienced with robotics (n = 31) and novices (n = 11), participated in this study.

• Eighty per cent of the entire cohort had an average of 4 years of experience with robot-assisted surgery.

• Eleven (26%) novices lacked independent robot-assisted experience. The expert group comprised 17 (41%) surgeons averaging 881 (range 160–2200) robot-assisted cases. Experts rated the ‘clutch control’ virtual simulation task as a good (71%) or excellent (29%) teaching tool.

• Seventy-eight per cent rated the ‘ball place’ task as good or excellent but 22% rated it as poor.

• Twenty-seven per cent rated the ‘needle removal’ task as an excellent teaching tool, 60% rated it good and 13% rated it poor.

• Ninety-one per cent rated the ‘fourth arm tissue removal’ task as good or excellent.

• Ninety-four per cent responded that RoSS would be useful for training purposes.

• Eighty-eight per cent felt that RoSS would be an appropriate training and testing format before operating room experience for residents.

• Seventy-nine per cent indicated that RoSS could be used for privileging or certifying in robotic surgery.

CONCLUSION

• Results based on expert evaluation of RoSS as a teaching modality illustrate that RoSS has appropriate content validity.


Abbreviations

RoSS, Robotic Surgical Simulator; dVSS, da Vinci Surgical System; RPCI, Roswell Park Cancer Institute; SSS, Simulated Surgical Systems; MdVT, Mimic dV-Trainer

INTRODUCTION


Traditionally, surgical education has been based on the Halstedian methodology of ‘see one, do one, teach one’ [1]. However, with the expeditious growth of minimally invasive surgery, the academic world is faced with the challenge of training future generations of surgeons while integrating new technology into the curriculum. In addition to the financial impact of teaching residents and the increasing cost of the operating room, training constraints including the length of residency programmes and working-hour regulations need to be overcome [2–4]. Moreover, pressures to minimize medical errors and the ethics of training on patients are at the forefront of this debate [3].

Virtual reality simulators may provide the solution to these obstacles. Training with simulators shortens the learning curve of complex procedures by allowing one to learn and practice in a controlled, risk-free setting [5,6]. Acquiring skills that are transferable to the operating room while possibly decreasing the incidence of future complications in a reliable and cost-effective manner is invaluable [7]. In fact, the American College of Surgeons and the Residency Review Committee recognized the importance of simulation by mandating that all programmes implement skills training curricula in 2008 [8,9].

Before a surgical simulator can be used to assess the competency of surgeons, the simulator must undergo rigorous validation testing. Subjective assessments include face and content validity. Content validity is based on experts’ evaluation of the appropriateness of the simulator as a teaching modality. In contrast, construct, concurrent and predictive validities are based on objective data and are also necessary [10].

The Robotic Surgical Simulator (RoSS) is a novel virtual reality simulator for the da Vinci Surgical System (dVSS) [11,12]. Immersion is achieved through RoSS’s interface (see Appendix), which replicates the current robotic surgical system with a kinematic chain design. The purpose of this study was to assess the initial content validity of version 1.0 of the simulator (Simulated Surgical Systems, Williamsville, NY, USA).

PATIENTS AND METHODS


RoSS is a virtual reality surgical simulator for the dVSS created in collaboration between Roswell Park Cancer Institute (RPCI) and the University at Buffalo (Fig. 1). RoSS has been released as a commercial product by SSS and is currently undergoing beta testing. It contains a mock-up of the dVSS-like master controls and renders a surrogate interface with simulation-based practice modules. The console of the trainer comprises two six-degrees-of-freedom input devices, a stereo head-mounted display, pedals for clutch and camera controls, and custom-designed pinch components that simulate the EndoWrist™ of the dVSS.

Figure 1. Prototype of the Robotic Surgical Simulator (RoSS) trainer.

All participants attending the 2010 International Robotic Urology Symposium in Las Vegas, organized by the Vattikuti Urology Institute at Henry Ford Health System, were invited to attend a demonstration and hands-on experience with RoSS. After completing the task modules, all participants completed a survey covering demographic data, past robotic surgical experience and video game experience, in addition to their opinions regarding the content validity of RoSS. The questionnaire assessed the appropriateness of the simulator as a teaching tool. Participants could choose to complete up to four tasks, based on preference. The first module involved contacting an object with alternating arms using the clutch control (Fig. 2A). The second task was an advanced module that required ball acquisition, precision control and positioning of the ball (Fig. 2B). The third was a needle removal module, which required needle acquisition with precise placement of the suture end into a grasper for retrieval; incorrectly placed needles would not be grasped for removal (Fig. 2C). The fourth module required the use of the fourth arm to remove a piece of tissue (Fig. 2D). The tasks were arranged from basic modules, such as clutch control, to more advanced modules, such as using the fourth arm to remove tissue.

Figure 2. Screen images of task modules: (A) Clutch Control; (B) Ball Place; (C) Needle Removal; (D) Fourth Arm Tissue Removal.
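The four modules and their ordering from basic to advanced can be summarized programmatically. The following Python sketch is purely illustrative; the list structure and field names are ours and are not part of the RoSS software.

# Illustrative summary of the four RoSS task modules used in this study,
# ordered from basic to advanced. Structure and labels are ours, not RoSS's.
TASK_MODULES = [
    {"name": "Clutch Control",
     "skill": "contacting an object with alternating arms using the clutch"},
    {"name": "Ball Place",
     "skill": "ball acquisition, precision control and positioning of the ball"},
    {"name": "Needle Removal",
     "skill": "needle acquisition and precise placement of the suture end into a grasper"},
    {"name": "Fourth Arm Tissue Removal",
     "skill": "using the fourth arm to remove a piece of tissue"},
]

for difficulty, module in enumerate(TASK_MODULES, start=1):
    print(f"{difficulty}. {module['name']}: {module['skill']}")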

A research fellow at RPCI and graduate students of the University at Buffalo Virtual Reality Laboratory administered the study. For analysis purposes, the participants were divided into three groups based on their operative experience and surgical volume. The novice group had no previous experience performing independent robot-assisted procedures. The expert group consisted of participants who had performed at least 150 robot-assisted surgical cases independently, whereas the intermediate group consisted of participants who had performed at least one but fewer than 150 robot-assisted cases. The intermediate group included surgeons categorized as advanced beginner, competent and proficient according to the Dreyfus model of skill acquisition [13]. The Department of Biostatistics at RPCI performed the statistical analysis.
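A minimal sketch of this grouping rule in Python, assuming the thresholds described above; the function name is ours and purely illustrative.

def classify_experience(independent_robotic_cases: int) -> str:
    """Assign a study group from the number of independent robot-assisted cases.

    Thresholds follow the grouping described above: novices performed no
    independent cases, experts performed at least 150, and all others are
    intermediate.
    """
    if independent_robotic_cases == 0:
        return "novice"
    if independent_robotic_cases >= 150:
        return "expert"
    return "intermediate"


# Example: a surgeon with 40 independent robot-assisted cases is intermediate.
print(classify_experience(40))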

Statistical analyses comparing groups with regard to categorical data were performed using Fisher’s exact test. Values for continuous variables are given as mean (SD). Values for categorical data are specified as frequency (%). Statistical analysis was performed using the Statistical Analysis System version 9.2 (SAS Institute Inc., Cary, NC, USA). A nominal significance level of 0.05 was used.
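For illustration, the kind of between-group comparison described here can be reproduced with Fisher’s exact test on a 2 × 2 contingency table. The study used SAS 9.2; the Python/SciPy sketch below is ours, and the counts in it are hypothetical rather than the study data.

# Hedged sketch of Fisher's exact test on a hypothetical 2x2 table:
# rows are expert vs novice participants, columns are whether a task was
# rated good/excellent vs poor. The counts are invented for illustration.
from scipy.stats import fisher_exact

table = [
    [15, 2],  # experts: good/excellent, poor (hypothetical)
    [8, 3],   # novices: good/excellent, poor (hypothetical)
]

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")

# A nominal significance level of 0.05 was used in the study.
if p_value < 0.05:
    print("groups differ significantly in how they rated the task")
else:
    print("no statistically significant difference between groups")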

RESULTS


Forty-two subjects, including robotic surgeons (n = 31) and novices (n = 11), participated in this study. Six (14%) had left-hand dominance and 36 (86%) had right-hand dominance. Training levels included 31 (74%) attendings, five (11%) fellows, two (5%) residents, two (5%) nurses, and two (5%) physician assistant/operating room technicians. Participants reported their sub-specialties to include bladder (48%), prostate (79%), kidney (60%) and other (19%). Nine (21%) had no video game experience, 14 (33%) had played during childhood and 19 (46%) had played within the last 6 months. Thirty-two (78%) participants had performed a mean of 181 pure laparoscopic cases (range 10–1000). Eighty per cent of the entire cohort had an average of 4 years of experience with robot-assisted surgery. Eleven (26%) participants were novices with no independent robotic experience. The intermediate group, defined as participants with at least one but fewer than 150 robot-assisted cases, comprised 14 (33%) participants with an average of 40 robot-assisted procedures (range 2–100). The expert group comprised 17 (41%) surgeons experienced with robotics who had performed an average of 881 (range 160–2200) robot-assisted cases. Thirty-eight participants completed the clutch control task, 37 the ball place task, 32 the needle removal task, and 29 the fourth arm tissue removal task.

The content assessment was based on the 17 (41%) expert surgeons experienced with robotics. The experts indicated that RoSS showed content validity. All experts rated the ‘clutch control’ virtual simulation task as a good (71%) or excellent (29%) teaching tool. Seventy-eight per cent rated the ‘ball place’ task as good or excellent, while 22% rated it as poor. Twenty-seven per cent rated the ‘needle removal’ task as excellent, 60% rated it good and 13% rated it poor. Ninety-one per cent rated the ‘fourth arm tissue removal’ task as good or excellent, while only 9% rated it as poor.
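As a worked illustration, 71% and 29% of 17 experts correspond to approximately 12 and 5 respondents. The Python sketch below tallies a rating list into percentages in the same way; the response split is reconstructed from the reported percentages, not taken from the raw survey data.

# Tally Likert-style ratings into percentages. The list below is a hypothetical
# reconstruction consistent with 71% good / 29% excellent among 17 experts.
from collections import Counter

expert_clutch_ratings = ["good"] * 12 + ["excellent"] * 5  # 12/17 ~ 71%, 5/17 ~ 29%

counts = Counter(expert_clutch_ratings)
total = len(expert_clutch_ratings)
for rating in ("excellent", "good", "poor"):
    print(f"{rating}: {100 * counts[rating] / total:.0f}%")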

Ninety-four per cent of the experts responded that RoSS would be useful for training residents or medical students. Seventy-five per cent recommended acquiring RoSS for training. One hundred per cent reported that RoSS is relevant for robotic surgery. Eighty-eight per cent of the experts felt that RoSS would be an appropriate training and testing format before operating room experience for residents. Lastly, 79% responded that RoSS could be used as part of the privileging or certifying process for robotic surgery.

Content validity was then analysed in terms of robotic experience, as shown in Figs 3 and 4. Statistical analysis did not reveal any significant difference among the expert, intermediate and novice groups’ opinions of RoSS as a teaching tool.

Figure 3. Assessment of task modules for training.

Figure 4. Assessment of the Robotic Surgical Simulator (RoSS) for training.

DISCUSSION


Few would dispute the importance of simulation in medical education and surgical training; how best to implement simulation into programmes and define its role in training, however, remains somewhat controversial. A study surveying accredited urological training programmes across the USA revealed unanimous agreement among programme directors that simulation has a role in residency training [14]. In a randomized study during a paediatric robotic surgery course, 100% of those surveyed believed that robots are a valuable tool for surgery, that there is a role for simulation in robotic training, and that it should be implemented into the residency curriculum [15]. Ninety-three per cent of participants believed that a training simulator would be useful in learning to use the dVSS, showing that opinion on the importance of robotics in surgical education has changed. Furthermore, a multi-national survey showed that 78% of responders felt that robot-assisted surgical training was required or beneficial, and 83% would consider a robot-assisted fellowship [16]. However, despite widespread interest in incorporating surgical simulation into robot-assisted training, there remains considerable debate as to whether simulators are an adequate substitute for hands-on instruction [14]. Moreover, validation requirements, along with financial and technical barriers, continue to limit their immediate adoption [17,18].

With the advent of minimally invasive surgery, surgeons are left with the daunting task of shortening the learning curve and minimizing any potential harm associated with it. Surgical simulation may be the means by which novice surgeons can cultivate the necessary surgical skills in a safe environment. Just as aviation mandates the use of simulators to assess proficiency and maintain expertise, medicine could incorporate simulation technology into assessing competency and possibly even standardizing the medical curriculum [19]. As Le et al. [14] rightly pointed out in a study from the Mayo Clinic, there is significant variation among urological training programmes in operative experience, curriculum and standards for a given trainee.

Several studies have shown the utility of virtual reality simulators in residency training by improving operative performance [20–22]. In a prospective randomized blinded study of surgical residents performing laparoscopic cholecystectomy with an attending surgeon, residents without virtual reality training were nine times more likely to transiently fail to make progress and five times more likely to injure the gallbladder or burn non-target tissue; mean errors were six times less likely to occur in the virtual reality trained group [20]. Similar results were seen in another randomized study by Grantcharov et al. [21], which showed that surgeons receiving virtual reality simulator training had a significantly greater improvement in operative performance than controls. Furthermore, Banks et al. [22] showed that a surgical skills simulator improved not only operative performance, but also resident knowledge, in laparoscopic tubal ligation.

Robotic simulators lead the next wave of technological innovation in surgical training; however, like all simulators, they too must be subjected to the necessary validation testing [10]. Two recent studies attempted to validate the Mimic dV-Trainer (MdVT) virtual reality surgical simulator developed by Mimic Technologies, Inc. (Seattle, WA, USA) [23,24]. These studies included experienced surgeons who had completed at least 30–50 cases each. Their results suggest that the MdVT is realistic; however, the investigators did not compare it with the dVSS. This contrasts with the recent face validation study of RoSS, which showed that RoSS was realistically close to the dVSS console in terms of virtual simulation and instrumentation, regardless of robotic experience [25]. In fact, the expert group in that study had performed a mean of 740 robotic cases.

Our study investigated the ability of RoSS, a novel robotic surgical simulator, to serve as a training device. Traditionally, content validity entails a formal assessment by experts in the field [10]. Given the variable definition of expertise in the literature, we chose 150 cases as our cut-off, as we had done previously for the face validation of RoSS [25–28]. This was based on Herrell and Smith’s [28] definition of proficiency as 150 prostatectomies. Our data appear to support the theory that RoSS’s virtual simulation tasks are effective teaching tools. The task modules train the robot-naive surgeon to work the clutch, use the fourth arm, manipulate the camera and properly remove a needle, among many other necessary skills. Moreover, from novice to expert, most participants reported that RoSS would be useful for training residents and medical students, recommended acquiring RoSS for training, felt that RoSS was relevant to robotic surgery, and agreed that it could provide appropriate training and testing before reaching the operating room. Additionally, RoSS-based real-time anatomy lessons have proved instructive: in a pilot anatomy study, medical students trained on RoSS exhibited accelerated learning and made fewer mistakes [29].

RoSS affords trainees the educational opportunity to immerse themselves in a robotic interface similar to dVSS at a fraction of the cost. Purchasing and maintaining RoSS will probably cost less than 10% of the dVSS expense [25]. Furthermore, it can be placed in an environment that is more accessible than dVSS such as in a training centre or library as opposed to the operating room. Additional costs associated with dVSS that must be considered include training staff, space, materials and the limited lives of dVSS instruments [23]. Training residents in the operating room and overcoming the learning curves of robot-assisted surgery pose financial limitations that simulation could alleviate [30–32].

Although the current study is limited by a sample size of 42 participants, this is the first study in which the experts had performed an average of over 800 independent robot-assisted surgical procedures. Having such an experienced cohort judge the effectiveness of RoSS as a teaching device strengthens the case that surgical simulation is warranted. Integrating RoSS would not only help novice robotic surgeons develop basic surgical skills and track their progress, but would eventually provide instruction in surgical procedures for the advanced trainee.

In conclusion, RoSS has appropriate content validity, confirmed by experts’ evaluation of RoSS as an effective teaching modality. Further validation testing is underway at multiple institutions. Incorporating surgical simulation into a structured curriculum is the next step in strengthening surgical education and training.

CONFLICT OF INTEREST


Khurshid A. Guru is a speaker for Intuitive Surgical Inc., has received honoraria, and is a founder of Simulated Surgical Systems. Thenkurussi Kesavadas is also a founder of Simulated Surgical Systems.

REFERENCES

1 Barnes RW, Lang NP, Whiteside MF. Halstedian technique revisited. Innovations in teaching surgical skills. Ann Surg 1989; 210: 118–21
2 Vick LR, Vick KD, Borman KR et al. Face, content, and construct validities of inanimate intestinal anastomoses simulation. J Surg Educ 2007; 64: 365–8
3 Fried GM, Feldman LS, Vassiliou MC et al. Proving the value of simulation in laparoscopic surgery. Ann Surg 2004; 240: 518–28
4 Zorn KC, Gautam G, Shalhav AL et al. Training, credentialing, proctoring and medicolegal risks of robotic urological surgery: recommendations of the Society of Urologic Robotic Surgeons. J Urol 2009; 182: 1126–32
5 Albani JM, Lee DI. Virtual reality-assisted robotic surgery simulation. J Endourol 2007; 21: 285–7
6 Stefanidis D, Korndorffer JR, Sierra R et al. Skill retention following proficiency-based laparoscopic simulator training. Surgery 2005; 138: 165–70
7 Meier AH, Rawn CL, Krummel TM. Virtual reality: surgical application – challenge for the new millennium. J Am Coll Surg 2001; 192: 372–84
8 Scott DJ, Cendan JC, Pugh CM et al. The changing face of surgical education: simulation as the new paradigm. J Surg Res 2008; 147: 189–93
9 American College of Surgeons, Division of Education. Accredited Education Institutes™: enhancing patient safety through simulation. Available at: http://www.facs.org/education/accreditationprogram/. Accessed January 2010
10 McDougall EM. Validation of surgical simulators. J Endourol 2007; 21: 244–7
11 Kesavadas T, Seshadri S, Srimathveeravalli G et al. Design and development of RoSS: a virtual reality simulator for da Vinci Surgical System. Int J Comput Aided Radiol Surg 2008; 3: 281–2
12 Baheti A, Kumar A, Srimathveeravalli G et al. RoSS: virtual reality robotic surgical simulator for the da Vinci Surgical System. In Proceedings of the IEEE Haptics Symposium, 2007
13 Dreyfus S, Dreyfus H. A Five Stage Model of the Mental Activities Involved in Directed Skill Acquisition, ORC 80-2. Berkeley: Operations Research Center, University of California, 1980
14 Le CQ, Lightner DJ, Vanderlei L et al. The current role of medical simulation in American urological residency training programs: an assessment by program directors. J Urol 2007; 177: 288–91
15 Lendvay TS, Casale P, Sweet R et al. VR robotic surgery: randomized blinded study of the dV-Trainer robotic simulator. Stud Health Technol Inform 2008; 132: 242–4
16 Guru K, Hussain A, Chandrasekhar R et al. Current status of robot-assisted surgery in urology: a multi-national survey of 297 urologic surgeons. Can J Urol 2009; 16: 4568–73
17 Guru K, Kuvshinoff B, Pavlov-Shapiro S et al. Impact of robotics and laparoscopy on surgical skills: a comparative study. J Am Coll Surg 2007; 204: 96–101
18 Katsavelis D, Siu K, Brown-Clerk B et al. Validated robotic laparoscopic surgical training in a virtual-reality environment. Surg Endosc 2009; 23: 66–73
19 Heintzman RJ. Determination of Force Cueing Requirements for Tactical Combat Flight Training Devices (ASC-TR-97-5001). Wright-Patterson AFB: Aeronautical Systems Center, Training Systems Product Group, 1996
20 Seymour NE, Gallagher AG, Roman SA et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg 2002; 236: 458–64
21 Grantcharov TP, Kristiansen VB, Bendix J et al. Randomized clinical trial of virtual reality simulation for laparoscopic skills training. Br J Surg 2004; 91: 146–50
22 Banks EH, Chudnoff S et al. Does a surgical simulator improve resident operative performance of laparoscopic tubal ligation? Am J Obstet Gynecol 2007; 197: 541
23 Sethi AS, Peine WJ, Mohammadi Y et al. Validation of a novel virtual reality robotic simulator. J Endourol 2009; 23: 503–8
24 Kenney PA, Wszolek MF, Gould JJ et al. Face, content, and construct validity of dV-Trainer, a novel virtual reality simulator for robotic surgery. Urology 2009; 73: 1288–92
25 Seixas-Mikelus SA, Kesavadas T, Srimathveeravalli G et al. Face validation of a novel robotic surgical simulator. Urology 2010; 76: 357–60
26 Narazaki K, Oleynikov D, Stergiou N. Objective assessment of proficiency and bimanual inanimate tasks in robotic laparoscopy. J Laparoendosc Adv Surg Tech 2007; 17: 47–52
27 Ahlering T, Skarecky D, Lee D et al. Successful transfer of open surgical skills to a laparoscopic environment using a robotic interface: initial experience with laparoscopic radical prostatectomy. J Urol 2003; 170: 1738–41
28 Herrell SD, Smith JA. Robotic-assisted laparoscopic prostatectomy: what is the learning curve? Urology 2005; 66: 105–7
29 Seixas-Mikelus SA, Adal A, Kesavadas T et al. Can image-based virtual reality help teach anatomy? J Endourol 2010; 24: 629–34
30 Steinberg PL, Merguerian PA, Bihrle W et al. The cost of learning robot-assisted prostatectomy. Urology 2008; 72: 1068–72
31 Lee SL, Sydorak RM, Applebaum H. Training general surgery residents in pediatric surgery: educational value vs time and cost. J Pediatr Surg 2009; 44: 164–8
32 Bridges M, Diamond DL. The financial impact of teaching surgical residents in the operating room. Am J Surg 1999; 177: 28–32

Appendix


A variety of validation assessments must be employed before the widespread adoption of any simulation device. Such assessments are conducted at beta sites: testing sites, in an environment not controlled by the developer, that evaluate the new instrument before the product is finalized.

Validity assesses the degree to which the simulator measures what it is intended to measure. The assessment is divided into subjective and objective evaluations. Face and content validity are two subjective measurements and are often obtained through questionnaire-type methods. Face validation is classically measured by the novice and assesses how realistic the simulator is. Content validation, on the other hand, examines the appropriateness of the simulator as a teaching modality and is based on the experts’ evaluation of the simulator. Objective assessments include construct, concurrent and predictive validity. Construct validity assesses whether the simulator is able to differentiate the experienced surgeon from the novice: metrics such as the time required to complete a task, the total distance travelled and accuracy are recorded for both experienced and novice surgeons, enabling the simulator to distinguish between them. Concurrent validity is another objective measurement that analyses simulator performance in relation to a gold standard; the Objective Structured Assessment of Technical Skills (OSATS) is one such assessment, based on a compilation of checklists and global rating scores. Finally, predictive validity objectively evaluates whether simulator performance scores predict future subject performance, using, for instance, OSATS in the operating room.
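To make the construct-validity metrics concrete, the Python sketch below computes two commonly reported measures, task completion time and total instrument path length, from a hypothetical series of timestamped tool-tip positions. The function names and the sample trace are our own illustrations, not output from RoSS or any particular simulator.

import math

def path_length(positions):
    """Total distance travelled by the tool tip over a list of (x, y, z) samples."""
    return sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))

def completion_time(timestamps):
    """Elapsed time from the first to the last sample, in the input's time units."""
    return timestamps[-1] - timestamps[0]

# Hypothetical trace: three samples of tool-tip position (cm) and time (s).
trace = [(0.0, 0.0, 0.0), (1.0, 0.5, 0.0), (1.5, 1.0, 0.2)]
times = [0.0, 1.2, 2.5]
print(f"path length = {path_length(trace):.2f} cm, time = {completion_time(times):.1f} s")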

This appendix serves to provide a quick review of the terminology employed in the validation of surgical simulators. We refer the reader to the manuscript by McDougall [10] for a complete review of the validation process.