Pattern Recognition and Mixed Reality for Computer-Aided Maxillofacial Surgery and Oncological Assessment

Antonio Pepe, Gianpaolo F Trotta, Christina Gsaxner, Dieter Schmalstieg, Jürgen Wallner, Jan Egger, Vitoantonio Bevilacqua

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review

Abstract

Maxillofacial tumor resection often depends on the expertise of the operating surgeon. This study is a first exploration of the role that commercial mixed reality headsets could play in this field. With this purpose in mind, a mixed reality head-mounted display (HMD) application is proposed to ease the task and to serve as a training tool. Given the invasiveness of the operation, markerless registration was considered advantageous, and a pattern recognition algorithm was therefore adopted to place a segmented PET-CT scan correctly over the target face. To document the validity and acceptance of such a system, groups of physicians and engineers were asked to evaluate and assess the resulting prototype according to the ISO 9241-110 standard. Owing to intrinsic limitations of the device, the application exhibited an alignment error on the order of millimeters, consistent with other biomedical studies. Nonetheless, the remarkably positive feedback collected from both groups suggests high interest in further work.
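The paper's registration code is not included here, so the following is only a minimal sketch of one common markerless approach consistent with the abstract's description: detect a small set of facial landmarks on the patient (e.g., from the HMD's camera stream) and compute the least-squares rigid transform that maps the corresponding landmarks of the segmented PET-CT model onto them, here via the Kabsch algorithm. The landmark names, coordinate values, and the choice of the Kabsch algorithm are illustrative assumptions, not the authors' implementation.

```python
import numpy as np


def rigid_transform(source: np.ndarray, target: np.ndarray):
    """Kabsch algorithm: least-squares rigid transform (R, t) such that
    R @ source_i + t approximates target_i. Both arrays are (N, 3)."""
    src_mean = source.mean(axis=0)
    tgt_mean = target.mean(axis=0)
    src_c = source - src_mean
    tgt_c = target - tgt_mean
    # Cross-covariance matrix and its SVD.
    H = src_c.T @ tgt_c
    U, _, Vt = np.linalg.svd(H)
    # Guard against an improper rotation (reflection).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = tgt_mean - R @ src_mean
    return R, t


if __name__ == "__main__":
    # Hypothetical facial landmarks (nose tip, eye corners, mouth
    # corners) in the segmented scan's coordinate frame, in mm.
    scan_pts = np.array([
        [0.0, 0.0, 0.0],        # nose tip
        [-30.0, 25.0, -20.0],   # left eye corner
        [30.0, 25.0, -20.0],    # right eye corner
        [-20.0, -30.0, -15.0],  # left mouth corner
        [20.0, -30.0, -15.0],   # right mouth corner
    ])
    # Simulated "detected" landmarks: the same points under a known
    # rotation and translation, standing in for live face tracking.
    a = np.deg2rad(10.0)
    R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                       [np.sin(a), np.cos(a), 0.0],
                       [0.0, 0.0, 1.0]])
    t_true = np.array([5.0, -3.0, 40.0])
    face_pts = scan_pts @ R_true.T + t_true

    R, t = rigid_transform(scan_pts, face_pts)
    residual = np.linalg.norm(scan_pts @ R.T + t - face_pts, axis=1).max()
    print(f"max alignment error: {residual:.6f} mm")
```

In practice, the source landmarks would come from the segmented PET-CT surface and the target landmarks from face detection on the headset's sensors; the few millimeters of residual error reported in the abstract would then reflect landmark detection noise and the device's tracking limits rather than the solver itself.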

Original language: English
Title of host publication: BMEiCON 2018 - 11th Biomedical Engineering International Conference
ISBN (Electronic): 9781538657249
Publication status: Published - 2019

Keywords

  • Head and Neck Cancer
  • Human-Computer Interaction
  • Maxillofacial Surgery
  • Medical Imaging
  • Microsoft HoloLens
  • Mixed Reality
  • PET-CT

ASJC Scopus subject areas

  • Artificial Intelligence
  • Instrumentation
  • Biomedical Engineering
