Image-guided intervention training platform (PerkTutor)

Language of Percutaneous Surgery

We are interested in modeling and discovering the underlying structure of minimally invasive surgery. We build mathematical models to evaluate surgeons' dexterity in handling surgical tools and performing procedures. Each procedure is automatically broken down into a series of gestures, and the statistical relations among these gestures are estimated from recorded motion data of the surgical tools, robotic arms, and the surgeon's hands and eyes. The surgery we are currently analyzing is Functional Endoscopic Sinus Surgery (FESS), which is high-risk because the operative field lies close to the brain, major arteries, and other critical tissue. Our long-term goal is to develop quantitative surgical skill assessment methods that reduce procedural errors and mitigate clinical risk.
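As a minimal sketch of the kind of statistical relation described above, the sequence of gestures in a procedure can be modeled as a first-order Markov chain, with transition probabilities estimated from labeled gesture sequences. The gesture labels and demonstration sequences below are hypothetical, not taken from our FESS data:

```python
from collections import defaultdict

def gesture_transition_matrix(sequences):
    """Estimate first-order Markov transition probabilities between
    gestures from a list of labeled gesture sequences."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        # Count each observed transition (gesture a followed by gesture b).
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    # Normalize counts into per-gesture transition probabilities.
    matrix = {}
    for a, nexts in counts.items():
        total = sum(nexts.values())
        matrix[a] = {b: n / total for b, n in nexts.items()}
    return matrix

# Hypothetical gesture sequences from two recorded procedures.
demos = [
    ["reach", "insert", "adjust", "insert", "withdraw"],
    ["reach", "insert", "withdraw"],
]
transitions = gesture_transition_matrix(demos)
```

In practice the gesture labels themselves would come from a segmentation step over the recorded motion data (e.g., a hidden Markov model over tool kinematics), but the transition statistics are computed in the same spirit.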

The research is a joint collaboration between the Laboratory for Percutaneous Surgery at Queen's University, the Computational Interaction and Robotics Laboratory (CIRL) and the Laboratory for Computational Sensing and Robotics (LCSR) at Johns Hopkins University, and the Johns Hopkins Medical Institutions (JHMI).