VasAR

AR-guided Minimally Invasive Medical Procedures


Providing AR-based guidance to physicians performing time-critical medical procedures.

2016–2021

Augmented reality (AR) can improve a number of medical procedures by providing 3D visualization of patient anatomy, with the potential to reduce contrast use and procedure time. We present a novel AR guidance system that displays virtual, patient-specific 3D anatomic models, and assess its intraprocedural impact during transcatheter aortic valve replacement (TAVR) with cerebral embolic protection (CEP) in six patients.


Presented at IEEE ISMAR 2019, the NYC Media Lab Summit 2018, and ACM SIGGRAPH 2018.


Publications


Systems and methods for augmented reality guidance

Steven Feiner, Gabrielle Loeb, Alon Grinshpoon, Shirin Sadri, Carmine Elvezio
US Patent Application #16796645
[US Patent]
Certain embodiments include a method for assisting a clinician in performing a medical procedure on a patient using augmented reality guidance. The method can include obtaining a three-dimensional model of an anatomic part of the patient. The method can also include aligning the three-dimensional model with data to form augmented reality guidance for the medical procedure. In addition, the method can include presenting the augmented reality guidance to the clinician during the medical procedure using an augmented reality three-dimensional display.
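
The patent does not prescribe a specific alignment method. As one illustration of the "aligning the three-dimensional model with data" step, the sketch below uses the standard Kabsch/Procrustes solution for point-based rigid registration; the function names and the choice of algorithm are assumptions, not the patented implementation.

```python
# Illustrative sketch only: rigid point-based alignment of a 3D anatomic
# model to corresponding intraoperative landmarks, via the Kabsch algorithm.
# The patent text does not specify this (or any) particular algorithm.
import numpy as np

def align_model_to_patient(model_pts: np.ndarray, patient_pts: np.ndarray):
    """Return rotation R and translation t minimizing ||R @ m + t - p||
    over corresponding (N, 3) landmark arrays model_pts and patient_pts."""
    mc = model_pts.mean(axis=0)                  # model centroid
    pc = patient_pts.mean(axis=0)                # patient centroid
    H = (model_pts - mc).T @ (patient_pts - pc)  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = pc - R @ mc
    return R, t
```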

Manipulating 3D Anatomic Models in Augmented Reality: Comparing a Hands-Free Approach and a Manual Approach

Shirin Sadri, Shalva Kohen, Carmine Elvezio, Shawn Sun, Alon Grinshpoon, Gabrielle Loeb, Naomi Basu, Steven Feiner
IEEE ISMAR 2019
[IEEE Xplore]
Many AR and VR task domains involve manipulating virtual objects; for example, to perform 3D geometric transformations. These operations are typically accomplished with tracked hands or hand-held controllers. However, there are some activities in which the user's hands are already busy with another task, requiring the user to temporarily stop what they are doing to perform the second task, while also taking time to disengage and reengage with the original task (e.g., putting down and picking up tools). To avoid the need to overload the user's hands this way in an AR system for guiding a physician performing a surgical procedure, we developed a hands-free approach to performing 3D transformations on patient-specific virtual organ models. Our approach uses small head motions to accomplish first-order and zero-order control, in conjunction with voice commands to establish the type of transformation. To show the effectiveness of this approach for translating, scaling, and rotating 3D virtual models, we conducted a within-subject study comparing the hands-free approach with one based on conventional manual techniques, both running on a Microsoft HoloLens and using the same voice commands to specify transformation type. Independent of any additional time to transition between tasks, users were significantly faster overall using the hands-free approach, significantly faster for hands-free translation and scaling, and faster (although not significantly) for hands-free rotation.
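
To make the head-motion mapping concrete, here is a minimal sketch of the first-order (rate) control described above: a small head offset from a neutral pose drives the velocity of the selected transformation, so the model keeps moving while the head holds a comfortable angle. The dead zone, gain, and names are illustrative assumptions, not the study's implementation.

```python
# Illustrative sketch of first-order (rate) control driven by head motion.
# Constants and names are assumptions, not values from the paper.
import numpy as np

DEAD_ZONE_DEG = 2.0   # ignore small involuntary head jitter
GAIN = 0.5            # output rate per degree of head offset

def rate_from_head_offset(offset_deg: float) -> float:
    """Map a head-angle offset from neutral (degrees) to a signed rate."""
    if abs(offset_deg) < DEAD_ZONE_DEG:
        return 0.0
    return GAIN * (offset_deg - np.sign(offset_deg) * DEAD_ZONE_DEG)

def step_rotation(model_angle_deg: float, head_yaw_deg: float, dt: float) -> float:
    """Per-frame update: the head's yaw offset rotates the model about one axis."""
    return model_angle_deg + rate_from_head_offset(head_yaw_deg) * dt
```

The dead zone keeps natural head sway from drifting the model, while the rate mapping avoids requiring large, sustained head rotations to produce large transformations.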

Augmented reality guidance for cerebral embolic protection (CEP) with the sentinel device during transcatheter aortic valve replacement (TAVR): First-in-human study

Shirin Sadri, Gabrielle Loeb, Alon Grinshpoon, Carmine Elvezio, Poonam Velagapudi, Vivian G Ng, Omar Khalique, Jeffrey W Moses, Robert J Sommer, Amisha J Patel, Isaac George, Rebecca T Hahn, Martin B Leon, Ajay J Kirtane, Tamim M Nazif, Susheel K Kodali, Steven K Feiner, Torsten P Vahl
Circulation 2019
[AHA Circulation]
Augmented reality (AR) can improve transcatheter interventions by providing 3D visualization of patient anatomy, with potential to reduce contrast and procedure time. We present a novel AR guidance system that displays virtual, patient-specific 3D anatomic models and assess intraprocedural impact during TAVR with CEP in six patients.

Hands-free augmented reality for vascular interventions

Alon Grinshpoon, Shirin Sadri, Gabrielle J Loeb, Carmine Elvezio, Samantha Siu, Steven K Feiner
ACM SIGGRAPH 2018
[ACM DL]
During a vascular intervention (a type of minimally invasive surgical procedure), physicians maneuver catheters and wires through a patient's blood vessels to reach a desired location in the body. Since the relevant anatomy is typically not directly visible in these procedures, virtual reality and augmented reality systems have been developed to assist in 3D navigation. Because both of a physician's hands may already be occupied, we developed an augmented reality system supporting hands-free interaction techniques that use voice and head tracking to enable the physician to interact with 3D virtual content on a head-worn display while leaving both hands available intraoperatively. We demonstrate how a virtual 3D anatomical model can be rotated and scaled using small head rotations through first-order (rate) control, and can be rigidly coupled to the head for combined translation and rotation through zero-order control. This enables easy manipulation of a model while it stays close to the center of the physician's field of view.
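
The zero-order coupling can be pictured as locking the model's world pose to the head pose through a fixed offset captured at the moment coupling is engaged, so the model translates and rotates rigidly with the head. This is a minimal sketch using 4x4 homogeneous matrices; the names are assumptions, not the demo's code.

```python
# Illustrative sketch of zero-order control: rigidly couple the model to the head.
import numpy as np

def begin_coupling(head_pose: np.ndarray, model_pose: np.ndarray) -> np.ndarray:
    """On engage: capture the model's offset in head space (both poses 4x4)."""
    return np.linalg.inv(head_pose) @ model_pose

def coupled_model_pose(head_pose: np.ndarray, offset: np.ndarray) -> np.ndarray:
    """Each frame while coupled: re-derive the model pose from the head pose."""
    return head_pose @ offset
```

Because the offset is captured at engagement, the model does not jump when coupling begins, and releasing the coupling simply stops the per-frame update, leaving the model wherever the head carried it.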

Augmented reality guidance for cerebral angiography

G Loeb, S Sadri, A Grinshpoon, J Carroll, C Cooper, C Elvezio, S Mutasa, G Mandigo, S Lavine, J Weintraub, A Einstein, S Feiner, P Meyers
Journal of Vascular and Interventional Radiology 2018
[ResearchGate]
Augmented reality (AR) holds great potential for interventional radiology (IR) by integrating virtual 3D anatomic models into the real world. In this pilot study, we developed an AR guidance system for cerebral angiography, evaluated its impact on radiation, contrast, and fluoroscopy time, and assessed physician response.

Hands-free interaction for augmented reality in vascular interventions

Alon Grinshpoon, Shirin Sadri, Gabrielle J Loeb, Carmine Elvezio, Steven K Feiner
IEEE VR 2018
[IEEE Xplore]
Vascular interventions are minimally invasive surgical procedures in which a physician navigates a catheter through a patient's vasculature to a desired destination in the patient's body. Since perception of relevant patient anatomy is limited in procedures of this sort, virtual reality and augmented reality systems have been developed to assist in 3D navigation. These systems often require user interaction, yet both of the physician's hands may already be busy performing the procedure. To address this need, we demonstrate hands-free interaction techniques that use voice and head tracking to allow the physician to interact with 3D virtual content on a head-worn display while making both hands available intraoperatively. Our approach supports rotation and scaling of 3D anatomical models that appear to reside in the surrounding environment through small head rotations using first-order control, and rigid body transformation of those models using zero-order control. This allows the physician to easily manipulate a model while it stays close to the center of their field of view.
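
One way to picture the voice side of this interaction is a small mode dispatcher: a recognized phrase selects which transformation the subsequent head motion drives. The vocabulary and names below are assumptions for illustration; the paper does not list its command set.

```python
# Illustrative sketch of voice-driven mode selection for hands-free interaction.
from enum import Enum, auto

class Mode(Enum):
    IDLE = auto()
    ROTATE = auto()   # first-order control of rotation
    SCALE = auto()    # first-order control of scale
    MOVE = auto()     # zero-order rigid coupling to the head

VOICE_COMMANDS = {
    "rotate": Mode.ROTATE,
    "scale": Mode.SCALE,
    "move": Mode.MOVE,
    "stop": Mode.IDLE,
}

def on_voice_command(current: Mode, phrase: str) -> Mode:
    """Map a recognized phrase to the next interaction mode; ignore unknowns."""
    return VOICE_COMMANDS.get(phrase.strip().lower(), current)
```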