Real-Time Workflow Detection using Video Streams in Craniosynostosis Surgery

  1. L. García-Duarte Sáenz 1
  2. D. García-Mato 1
  3. S. Ochandiano 2
  4. J. Pascau 2
  1. Universidad Carlos III de Madrid, Madrid, Spain
  2. Instituto de Investigación Sanitaria Gregorio Marañón, Madrid, Spain
Book:
XXXVIII Congreso Anual de la Sociedad Española de Ingeniería Biomédica. CASEIB 2020: Libro de actas
  1. Roberto Hornero Sánchez (ed.)
  2. Jesús Poza Crespo (ed.)
  3. Carlos Gómez Peña (ed.)
  4. María García Gadañón (ed.)

Publisher: Grupo de Ingeniería Biomédica; Universidad de Valladolid

ISBN: 978-84-09-25491-0

Year of publication: 2020

Pages: 384-387

Conference: Congreso Anual de la Sociedad Española de Ingeniería Biomédica, CASEIB (38th edition, 2020, Valladolid)

Type: Conference contribution

Abstract

Real-time automatic surgical tool detection is a demanding deep learning task aimed at recognizing and optimizing the surgical workflow. This work applies such technology to a navigation-based craniosynostosis surgical procedure, which is highly complex and whose outcomes depend strongly on the surgeon's expertise. A total of four neural networks were trained for automatic tool detection and phase estimation: three state-of-the-art networks (VGG-16, MobileNetV2 and InceptionV3) trained with transfer learning, and a purpose-built convolutional neural network named CranioNet. The four networks were tested for automatic tool classification and surgical phase estimation on video streams recorded in two different scenarios: a surgical simulation performed on a 3D-printed phantom, and a real surgical procedure for the correction of craniosynostosis. The classification models were integrated into a fully customized software application, developed to perform tool recognition in real time and to provide information for estimating the duration of each surgical phase. The results of this study demonstrate that deep learning can accurately classify surgical tools and estimate surgical phases in video streams of craniosynostosis surgeries, with VGG-16 and CranioNet showing the best performance for this task. Integrating this technology into clinical practice could provide objective feedback and performance evaluation of surgical skills, and could lead to a reduction of surgical errors and complications in craniosynostosis surgery.
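The abstract does not include implementation details of how per-frame tool classifications are turned into surgical phase information. As a purely illustrative sketch, the pipeline could map each recognized tool to a phase and smooth the per-frame predictions before accumulating phase durations; all tool names, phase names, the tool-to-phase mapping, and the smoothing step below are assumptions, not taken from the paper:

```python
from collections import Counter, deque

# Hypothetical mapping from detected tool to surgical phase; the actual
# tools and phases used in the study are not listed in the abstract.
TOOL_TO_PHASE = {
    "scalpel": "incision",
    "drill": "osteotomy",
    "forceps": "remodeling",
}

def smooth_predictions(frame_tools, window=5):
    """Majority-vote smoothing over a sliding window of per-frame labels,
    to suppress spurious single-frame misclassifications."""
    smoothed = []
    buf = deque(maxlen=window)
    for tool in frame_tools:
        buf.append(tool)
        smoothed.append(Counter(buf).most_common(1)[0][0])
    return smoothed

def phase_durations(frame_tools, fps=30, window=5):
    """Estimate the duration (in seconds) of each surgical phase from
    per-frame tool classifications in a video stream."""
    phases = (TOOL_TO_PHASE[t] for t in smooth_predictions(frame_tools, window))
    return {phase: n / fps for phase, n in Counter(phases).items()}

# Example: two seconds of 'scalpel' frames at 30 fps, with a couple of
# spurious 'drill' misclassifications that the smoothing removes.
frames = ["scalpel"] * 30 + ["drill"] * 2 + ["scalpel"] * 28
print(phase_durations(frames, fps=30))  # → {'incision': 2.0}
```

In a real-time setting the same logic would run incrementally on each incoming classified frame rather than over a recorded list.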