SA-CME credits are available for this article.
Recent advancements in computer-generated graphics have enabled new technologies such as augmented and virtual reality (AR/VR) to simulate and recreate realistic clinical environments. Their utility has been validated in integrated learning curricula and surgical procedures. Radiation oncology has opportunities for AR/VR simulation in both training and clinical practice.
A systematic review was performed to query the literature using a combination of the search terms “virtual,” “augmented,” “reality,” “medical student,” and “education” to find articles that examined AR/VR for learning anatomy and for first-time training of procedural tasks in surgery-naïve participants. Studies were excluded if nonstereoscopic VR was used, if they were not randomized controlled trials, or if resident-level participants were included.
For learning anatomy and procedural tasks, the studies we found suggested that AR/VR was noninferior to current standards of practice.
These studies suggest that AR/VR programs are noninferior to standards of practice with regard to learning anatomy and training in procedural tasks. Radiation oncology, as a highly complex medical specialty, would benefit from the integration of AR/VR technologies, as they can be cost-effective methods of enhancing training in a field with a narrow therapeutic ratio.
Healthcare providers strive for cost-effective, easily accessible methods to train and practice medicine in this changing landscape. Virtual reality/augmented reality (VR/AR) systems are readily available programs that can realistically simulate clinical environments. These immersive technologies lie on a reality-virtuality continuum.1 A real environment is the reality we live in, filled with real objects. A virtual environment fills a display device with virtual objects.1 Everything between these two environments can be called mixed reality or extended reality (XR). One platform within XR is AR, in which a display device overlays a digital image onto the field of view of a real environment. Google Glass is considered a “nonimmersive” version of AR, as it projects a computer monitor display into the upper right corner of the field of view. Several factors must be considered when assessing XR technology, and it encompasses several devices that will not be discussed further in this paper. These platforms are typically used with either a head-mounted display (HMD) or a monitor-based display device.
The most basic VR programs remain “nonimmersive,” displaying traditional content, such as watching a movie on a computer screen. However, the most advanced VR programs try to emulate 3 sensory modalities to provide a truly immersive environment: sight, sound and touch. HMD-based devices use stereoscopic animations and surround sound, re-creating sight with depth perception and sound with distance localization.2 Haptic feedback, or touch sensation, is on the horizon as well.3-5
AR-based devices work with some form of optic modulation through a medium such as glasses, a smartphone, or, possibly in the future, contact lenses. Some of the simplest nonmedical AR uses include smartphone applications that use a smartphone’s gyroscope, internet connection and global positioning system (GPS) to triangulate and display astronomical constellations on the phone when its camera lens is pointed at the night sky. Regardless of their level of immersion, one aim of these technologies is to help us see things that are difficult to visualize.
Previous iterations of immersive console experiences were unsophisticated, with clunky, pixelated graphics; however, the latest graphics cards can produce photorealistic virtual environments.6,7 In medicine, this advantage can translate to simulating procedures that require precise dexterity and could otherwise harm a patient. The experience required to attain deft procedural ability would previously have come at the expense of real patients. Our surgical colleagues have already recognized the utility of simulated environments using the da Vinci Surgical Simulator (dVSS) (Intuitive Surgical Inc.; Sunnyvale, California),8,9 which is of particular interest to radiation oncology residency programs that train young physicians not only in external-beam techniques,10 but also in internal brachytherapy delivery.11 In radiation oncology practice, ensuring the safe delivery of implanted dose is of the highest significance due to the proximity of adjacent normal tissues and the potential for long-term radiation-induced late complications. Indeed, quality assurance programs in radiation oncology aim not only to ensure that the graduating physician possesses the technical ability to perform external-beam and brachytherapy delivery, but also that such competent skill is safely maintained over the lifetime of the practitioner.
The entire practice of radiation oncology is predicated on the individual practitioner’s successful deployment of specific technologies. From contouring anatomical structures, to selecting beam angles for treatment, to the technical insertion of permanent radioactive seeds or temporary catheters for high-dose-rate (HDR) brachytherapy, opportunities for AR/VR technology integration are numerous.10,12-14 Clinical application of this new technology will be a challenge, as randomized controlled trials are needed to prevent unnecessary patient harm. A safer method of examining the utility of this technology in preliminary studies is by comparing noninferiority with traditional means of training.
The aim of this review is to determine whether AR/VR is a suitable surrogate for training clinically naïve radiation oncology healthcare practitioners. We hypothesized that the main advantage of AR/VR’s immersive environment is that it helps healthcare professionals understand 3-dimensional (3D) visuospatial representations better than, or at least as well as, traditional textbook learning. Therefore, this study sought articles in which visuospatial learning would be most utilized: anatomy education and simple procedures requiring an understanding of anatomy.
An initial literature search was performed for English-language articles, published from 1997 to 2017, on the use of AR/VR for education at the medical student level as a surrogate for the entry-level radiation oncology resident. Specifically, articles that dealt strictly with anatomy education and surgery-naïve procedural skills were sought. A combination of the terms “virtual reality,” “augmented reality,” “VR,” “AR,” “medical student,” and “education” was queried.
A diagrammatic flow chart of the search algorithm is depicted in Figure 1. The initial search of the literature yielded 612 articles. After screening of article titles, 127 were selected for abstract review. Exclusion criteria included resident-level anatomy topics or participants; use of nonstereoscopic 3D models; trials that were not randomized and controlled or were not adequately powered; studies that did not explicitly test a procedural task in a randomized controlled trial; AR studies in which the final test was not a 2-dimensional (2D) laparoscopic procedure; and articles for which the full text was unavailable.
After eliminating 86 studies, 42 articles were reviewed in full text. Finally, 19 articles were left that met inclusion criteria and form the basis for this review. Meta-analysis was not performed due to heterogeneity in measured outcomes, controls, and randomized controlled trial arms.
We identified 7 articles that used VR/AR to supplement anatomy courses at the pre-clerkship medical student level (Table 1).15-21 Most of the studies found that AR/VR did not significantly differ in standardized testing scores when compared with traditional anatomy lectures that included cadaveric dissection. A variety of VR programs were used, with no two studies using the same program for anatomy teaching. Participants included first- and second-year medical students, with one study including graduate-level students taking a medical anatomy course.19 Controls across the studies varied, but all were randomized controlled trials. Outcomes measured were similarly heterogeneous, ranging from 10- to 30-question multiple-choice exams to practical exams requiring cadaveric identification of structures.
Twelve studies22-33 were identified that sought to evaluate AR/VR training vs. box training for improving procedural tasks in surgery-naïve medical students (Table 2). Box trainers are the current standard of laparoscopic training. They consist of an enclosed box with a minimum of 2 laparoscopic port sites for instrument entry, a camera that displays the inside of the box, and a variety of objects inside for training procedural skills. Among the most common tasks is peg transfer, in which trainees must use laparoscopic tools to pick up porous silicone objects impaled on vertical pegs and place them in a targeted area. Most of the studies found that AR/VR did not significantly differ from traditional learning methods. The most common AR/VR programs used included LAP Mentor (3D Systems; Valencia, California), Minimally Invasive Surgical Training-Virtual Reality (MIST-VR), and dVSS. Participant demographics varied from first-year medical students to surgery-naïve surgical interns. As with anatomy education, procedural learning control groups were highly variable, consisting of box training, didactic lectures, online training modules, and 3D videos. Standardized outcome measures included objective structured assessment of technical skill (OSATS), global rating scales (GRS), and various subcomponents such as time to task completion, errors committed, and economy of motion.
The studies identified in this review suggest that AR/VR is a suitable surrogate for acquiring the visuospatial skills necessary to become proficient in anatomy and simple procedural tasks,15-36 topics with high relevance for radiation oncology residency training and, potentially, ongoing maintenance of certification requirements. While the majority of U.S. medical schools use prosections, cadaveric dissections, and didactic lectures to teach anatomy, a standardized methodology does not exist; instead, anatomy curricula are created at the discretion of each medical school and accredited by the Accreditation Council for Graduate Medical Education (ACGME). Interestingly, 2 of 134 medical schools were able to maintain their accreditation even without traditional cadaveric dissections, suggesting that nontraditional means of producing functional anatomy curricula are practical and already in existence.37 This review specifically sought articles using medical students as participants to examine the largest possible benefit from AR/VR-naïve training, and the results are promising. With traditional learning requiring live porcine models or expensive cadavers, the medical education community stands to benefit from AR/VR’s scalability and cost-effectiveness.
Kucuk et al and Nicholson et al showed that when the control group was taught using 2D lectures without cadaveric dissection, the AR/VR group performed significantly better.18,23 This suggests that the ability to create 3D anatomical representations is adequately learned through AR/VR training. Interestingly, Moro et al used a control group given a tablet-based 3D representation of neuroanatomical structures, and neither the VR nor the AR group performed significantly better than the tablet group.19 All studies controlled for prior anatomy experience, but only 3 controlled for previous experience with AR/VR. Time spent with AR/VR supplementation varied significantly across studies, from as little as 24 minutes to as much as 12 hours. Peterson’s study falls in the latter group, and its data suggest that AR-supplemented training increased standardized scores, even against traditional cadaveric dissection.21
Outcomes measured among the procedural studies consisted of multiple-choice exams and practical exams comprising standardized scores for procedural effectiveness via time to task completion, errors made, and economy of motion. The results were heterogeneous. Time allotted for AR/VR training varied drastically, from 2 to 12 hours. Overall, VR training did not significantly differ from box training in terms of mean time to task completion, errors made, or economy of motion. Rather, it improved participants’ procedural task abilities similarly to box trainers when participants were allowed to train for equal amounts of time. Standard learning curves for procedural tasks are expected to have a steep slope early on with eventual plateauing, indicative of diminishing returns on time invested.38-41 However, determining the time to proficiency is critical in creating an effective educational course, an outcome not readily measured in these studies. The advantage of a stereoscopic training environment is that it assists in visualizing a 3D world; however, all studies were tested in 2D laparoscopic view and were still found to be noninferior to laparoscopic box training. Most of the studies used live porcine models, although Tanoue et al and Chien et al tested their participants on box transfer.24,32
AR/VR research in healthcare is in its infancy. Unfortunately, this means that the available studies are single-center, industry-backed projects with small study populations and heterogeneous measured outcomes. Even the definition of virtual reality remains ambiguous, as many nonstereoscopic 3D image-based studies from the last decade used the term in their titles. Formalized AR/VR training procedures could eliminate this problem by standardizing the time required to reach proficiency in anatomy education and simple procedural tasks. Additionally, a gold standard for outcome measures based on a standardized time to proficiency needs to be established.
Understanding accurate 3D visualization of tumor volumes, treatment dose distributions,12 and radiation damage to healthy tissue on computed tomography (CT), MRI, ultrasound and/or positron emission tomography (PET)/CT is necessary for radiation oncologists, who typically have no formalized radiology training. VR has already been used to teach patients, residents, and radiation therapists about patient positioning using a projector-based virtual reality program.10 Pilot studies have also used AR to help guide the placement of brachytherapy needles.11 Moreover, intraoperative delivery of radiation treatment, precise positioning of permanent seeds, and outpatient HDR insertion techniques all require technical expertise that can be difficult to measure during residency and in medical practice. Standardization and practice with procedural techniques could potentially improve safety in high-risk but necessary procedures such as brachytherapy. As brachytherapy fellowships are typically few and rely on an apprenticeship training model, the democratization of high-quality patient care will be limited by the quantity of cases at high-volume cancer centers. Because AR/VR is an incredibly versatile and scalable technology, training can be systematically improved and adjusted based on current standards of practice, with the potential to measure individual proficiency; corrective training and real-time peer review then become possible. In addition, treatment can be simulated without causing any patient harm, providing a safe and effective method of training next-generation radiation oncologists and ensuring the ongoing competence of existing practitioners. AR/VR technology is ready to be integrated into radiation oncology training programs, although research is needed into how best to optimize both initial training and ongoing competency assessment.
As healthcare shifts toward cost-effective practices, healthcare education can benefit from the scalable nature of AR/VR. All of the studies we reviewed demonstrated noninferiority to the current standard of practice for training clinically naïve participants. For radiation oncology residents, this translates into a more immersive learning environment in a field that requires proficient visuospatial and technical abilities. Future integration opportunities may extend far beyond residency education, offering practicing radiation oncologists AR/VR immersion for demonstrating procedural proficiency during ongoing maintenance of certification, ultimately enhancing patient safety and ensuring the highest standards of quality of care.
Jin W, Birckhead B, Perez B, Hoffe S. Augmented and virtual reality: Exploring a future role in radiation oncology education and training. Appl Rad Oncol. 2017;6(4):13-20.
Mr. Jin is a 4th-year medical student at the University of South Florida Morsani College of Medicine, Tampa, FL. Dr. Birckhead is a radiation oncologist at Medical College of Wisconsin, Department of Radiation Oncology, Milwaukee, WI. Dr. Perez is a radiation oncologist, and Dr. Hoffe is section head of Gastrointestinal Radiation Oncology, Moffitt Cancer Center, Tampa, FL.
Disclosure: The authors have no conflicts of interest to disclose. None of the authors received outside funding for the production of this original manuscript and no part of this article has been previously published elsewhere.