

Image-guided Surgical Navigation

We have developed intraoperative surgical navigation software that allows the surgeon to accurately localize the tumor and the surrounding sensitive structures. The surgical navigation software is like a “GPS for surgery”: it localizes the instruments with respect to a virtual display consisting of 3D surface-rendered models generated from patient-specific imaging. Fast intraoperative segmentation and registration algorithms are also implemented as part of the module so that the navigation software can be used with intraoperative imaging. The navigation software has been used to perform lung wedge resection, MR-guided parathyroidectomy, PET/CT-guided pheochromocytoma resection, C-arm CT-guided lymphadenectomy, and MR-guided cryotherapy.
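
A common way to implement the registration step in such a system is to estimate the rigid transform that maps tracked positions (e.g., fiducials touched with a tracked pointer) into image space, so that instrument positions can be rendered over the 3D patient models. The Python sketch below shows a generic least-squares (Kabsch/Horn-style) version of this step with made-up point coordinates; it illustrates the idea and is not the software's actual implementation.

# Minimal sketch of a patient-to-image registration for navigation: estimate the
# rigid transform that maps tracker-space fiducial positions into image space.
# Generic least-squares (Kabsch/Horn-style) solution; all coordinates are illustrative.
import numpy as np

def rigid_registration(tracker_pts, image_pts):
    """Return rotation R and translation t such that R @ tracker + t ~= image."""
    c_trk = tracker_pts.mean(axis=0)
    c_img = image_pts.mean(axis=0)
    H = (tracker_pts - c_trk).T @ (image_pts - c_img)            # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid a reflection
    R = Vt.T @ D @ U.T
    t = c_img - R @ c_trk
    return R, t

# Example: fiducial positions touched with a tracked pointer vs. their image-space positions.
tracker_pts = np.array([[0.0, 0, 0], [50, 0, 0], [0, 50, 0], [0, 0, 50]])
image_pts   = np.array([[10.0, 5, 2], [10, 55, 2], [-40, 5, 2], [10, 5, 52]])
R, t = rigid_registration(tracker_pts, image_pts)

tool_tip_tracker = np.array([25.0, 25, 10])
tool_tip_image = R @ tool_tip_tracker + t   # position shown on the virtual display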

Rigid surface reconstruction of the anatomy

One of the difficulties of providing accurate navigation for general surgical applications is that the anatomy is distorted during surgery and no longer corresponds to the virtual map created from the volumetric CT, C-arm CT, or MR images. To compensate for this tissue deformation, we have developed novel algorithms based on Simultaneous Localization and Mapping (SLAM) to reconstruct the tissue surface in real time using a stereo laparoscope. These algorithms have been tested on images acquired during robotic partial nephrectomy, liver surgery, urethroplasty, and spine surgery.
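
As a rough illustration of the stereo measurement underlying this kind of reconstruction, the sketch below triangulates a matched left/right image feature from a calibrated stereo laparoscope into a 3D surface point. A full SLAM pipeline would also track the camera pose and fuse points across frames; the camera parameters and pixel coordinates here are illustrative and not taken from the actual system.

# Minimal sketch of the stereo measurement step: linear (DLT) triangulation of one
# matched left/right feature into a 3D surface point, given calibrated projection
# matrices. Calibration values and pixel matches below are illustrative only.
import numpy as np

def triangulate(P_left, P_right, uv_left, uv_right):
    """Triangulate one 3D point from two projection matrices and a pixel match."""
    u_l, v_l = uv_left
    u_r, v_r = uv_right
    A = np.stack([
        u_l * P_left[2] - P_left[0],
        v_l * P_left[2] - P_left[1],
        u_r * P_right[2] - P_right[0],
        v_r * P_right[2] - P_right[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]                      # homogeneous -> Euclidean coordinates

# Illustrative calibration: identical intrinsics, right camera offset 5 mm along x.
K = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])
P_left  = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P_right = K @ np.hstack([np.eye(3), np.array([[-5.0], [0], [0]])])

# A matched feature on the tissue surface in the left and right images.
point_3d = triangulate(P_left, P_right, (352.0, 260.0), (312.0, 260.0))
# ~= [4.0, 2.5, 100.0] (mm, in the left camera frame)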

Non-rigid surface reconstruction of the anatomy

One of the fundamental assumptions of the SLAM algorithm is that the environment being mapped is rigid and static. This assumption does not hold when mapping the anatomy during surgery, which deforms due to physiological motion such as breathing and pulsation, as well as tissue manipulation. We have developed a non-rigid version of the SLAM algorithm, called EMDQ-SLAM, to reconstruct the tissue map in the presence of significant tissue deformation. The algorithm can also estimate, in real time, the tissue deformation caused by tool manipulation.
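
To give a feel for the non-rigid mapping idea, the sketch below warps surface points by smoothly blending local displacements estimated at sparse feature matches. EMDQ-SLAM itself blends dual quaternions recovered from the matches using an expectation-maximization scheme; this simplified version blends plain displacement vectors with Gaussian weights just to show the interpolation step, and all coordinates and parameters are made up.

# Simplified sketch of non-rigid warping: instead of one global rigid transform,
# each surface point is deformed by a smooth, distance-weighted blend of local
# displacements estimated at sparse feature matches. (EMDQ-SLAM blends dual
# quaternions rather than plain displacement vectors.)
import numpy as np

def warp_points(points, control_pts, control_disp, sigma=10.0):
    """Warp points by Gaussian-weighted blending of control-point displacements."""
    # Squared distances between each surface point and each control (feature) point.
    d2 = ((points[:, None, :] - control_pts[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    w /= w.sum(axis=1, keepdims=True) + 1e-12        # normalize weights per point
    return points + w @ control_disp                 # blended displacement field

# Sparse feature matches give local displacements (e.g., tissue pushed by a tool).
control_pts  = np.array([[0.0, 0, 0], [20, 0, 0], [40, 0, 0]])
control_disp = np.array([[0.0, 0, 0], [0, 5, 0], [0, 0, 0]])   # bulge near x = 20

surface = np.array([[10.0, 0, 0], [20, 0, 0], [30, 0, 0]])
deformed = warp_points(surface, control_pts, control_disp)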

Mixed and Augmented Reality for Surgery

Typically in image-guided surgery, a tremendous amount of information is provided to the surgeon. It is imperative to present context-specific information in order to minimize the cognitive load on the surgeon. To this end, we have explored novel methods of presenting this information to the surgeon using mixed and augmented reality headsets. The headsets can be used for diagnosis, surgical planning, or intervention.
