Automatic Recognition of Surgical Motions Using Statistical Modeling for Capturing Variability

Carol E. Reiley1, Henry C. Lin1, Balakrishnan Varadarajan2, Balazs Vagvolgyi1, Sanjeev Khudanpur2, David D. Yuh3, Gregory D. Hager1

1 Engineering Research Center for Computer-Integrated Surgical Systems and Technology, The Johns Hopkins University
2 Center for Speech Language Processing, The Johns Hopkins University
3 Division of Cardiac Surgery, The Johns Hopkins Medical Institutions

MMVR, January 31st, 2008
Introduction • Our Goal • Automatically segment and recognize core surgical motion segments (surgemes) • Capture the variability of a surgeon’s movement techniques using statistical methods
Introduction • Given a surgical task, a single user tends to use similar movement patterns (Lin et al., MICCAI 2005)
Introduction • Different users show greater variability when completing the same surgical task • Our goal is to distinguish core surgical motions from erroneous or unintentional motions
Related Work • Prior work focuses on surgical metrics for skill evaluation • High level (applied force and motion): University of Washington – Blue Dragon • Low level (motion data): MIST-VR, Imperial College – ICSAD • Our work aims to automatically identify fundamental motions
Our Approach • Surgeme: an elementary portion of surgical motion • Examples: reaching for needle, positioning needle, pulling suture with left hand
Motion Vocabulary

Label | Description
A | End of Trial, Idle Motion
B | Reach for Needle (gripper open)
C | Position Needle (holding needle)
D | Insert Needle/Push Needle Through Tissue
E | Move to Middle With Needle (left hand)
F | Move to Middle With Needle (right hand)
G | Pull Suture With Left Hand
H | Pull Suture With Right Hand*
I | Orient Needle With Two Hands
J | Right Hand Assisting Left While Pulling Suture*
K | Loosen Up More Suture*

*Added based on observed variability of technique
Our Approach • Signal Processing → Feature Processing → Classification/Modeling → Extraction of Structure
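The four-stage pipeline above can be sketched as a chain of functions. This is an illustrative toy only: the function bodies (a pass-through filter, value-and-delta features, a threshold classifier) are hypothetical stand-ins, not the paper's actual processing.

```python
# Hypothetical sketch of the four-stage pipeline; all function bodies
# are illustrative stand-ins, not the methods used in the paper.

def signal_processing(raw):
    # Stage 1: condition the raw kinematic stream (identity here).
    return list(raw)

def feature_processing(signal):
    # Stage 2: derive per-sample feature vectors (value and delta here).
    prev = [signal[0]] + signal[:-1]
    return [(x, x - p) for p, x in zip(prev, signal)]

def classify(features):
    # Stage 3: placeholder classifier (sign threshold on the raw value).
    return ["A" if f[0] < 0 else "B" for f in features]

def extract_structure(labels):
    # Stage 4: collapse per-sample labels into (label, run-length) segments.
    segments = []
    for lab in labels:
        if segments and segments[-1][0] == lab:
            segments[-1] = (lab, segments[-1][1] + 1)
        else:
            segments.append((lab, 1))
    return segments

raw = [-1.0, -0.5, 0.2, 0.4, -0.3]
print(extract_structure(classify(feature_processing(signal_processing(raw)))))
```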
Data Collection • The da Vinci Surgical Robot System (image courtesy of Intuitive Surgical) • Recorded parameters at 23 Hz (patient and master side): • Joint angles, velocities • End effector position, velocity, orientation • High-quality stereo vision • With the increasing use of robotics in surgical procedures, a new wealth of data is available for analysis.
Experimental Study • Users had varied levels of experience • Each user performed five trials • Each trial consisted of a four-throw suturing task

Subject | Medical Training | Da Vinci Training | Hrs
1 | - | - | 10-15
2 | - | - | 100+
3 | X | X | 100+
4 | - | X | 100+
5 | - | X | <10
6 | - | X | <10
7 | - | - | <1
Classification Methods • Linear Discriminant Analysis (LDA) with Single Gaussian • LDA + Gaussian Mixture Model (GMM) • 3-state Hidden Markov Model (HMM) • Maximum Likelihood Linear Regression (MLLR) • Supervised • Unsupervised
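The simplest of the listed classifiers can be sketched as follows: model each surgeme class with a single Gaussian and pick the class with the highest log-likelihood. This is a minimal 1-D illustration of the idea only; the paper operates on LDA-projected multivariate kinematic features, and the GMM, HMM, and MLLR variants are not shown.

```python
import math
from collections import defaultdict

# Minimal sketch of a single-Gaussian-per-class classifier. The 1-D
# features and two-class toy data are illustrative assumptions.

def fit_gaussians(samples):
    """samples: list of (feature, label) pairs -> {label: (mean, var)}."""
    by_label = defaultdict(list)
    for x, lab in samples:
        by_label[lab].append(x)
    params = {}
    for lab, xs in by_label.items():
        mean = sum(xs) / len(xs)
        var = sum((x - mean) ** 2 for x in xs) / len(xs) or 1e-6  # floor at tiny variance
        params[lab] = (mean, var)
    return params

def log_likelihood(x, mean, var):
    # Log of the univariate Gaussian density.
    return -0.5 * (math.log(2 * math.pi * var) + (x - mean) ** 2 / var)

def classify(x, params):
    # Maximum-likelihood class decision.
    return max(params, key=lambda lab: log_likelihood(x, *params[lab]))

train = [(0.1, "B"), (0.2, "B"), (1.9, "C"), (2.1, "C")]
model = fit_gaussians(train)
print(classify(0.15, model))  # → B
```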
Results • Percent classifier accuracy (average) • Leave-one-trial-out cross-validation, per user • MLLR not applicable in this setting
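The leave-one-trial-out protocol can be sketched as: for each of a user's trials, hold that trial out for testing and train on the remaining ones. The function name and trial IDs below are illustrative.

```python
# Hypothetical sketch of leave-one-trial-out cross-validation splits.

def leave_one_trial_out(trials):
    """trials: list of trial IDs -> list of (train_trials, held_out_trial)."""
    splits = []
    for i, held_out in enumerate(trials):
        train = trials[:i] + trials[i + 1:]  # all trials except the held-out one
        splits.append((train, held_out))
    return splits

for train, test in leave_one_trial_out(["trial1", "trial2", "trial3"]):
    print("train on", train, "- test on", test)
```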
Results • Example comparison of classifier output against manual segmentation
Results • We repeated the analysis, this time leaving one user out • Supervised: Surgeme start/stop events manually defined • Unsupervised: Surgeme start/stop events automatically derived
Conclusions • Preliminary results show the potential for identifying core surgical motions • User variability has a significant effect on classification rates • Future work: • Use contextual cues from video data • Filter class decisions (e.g., majority vote) to eliminate class jumping • Apply to data from live surgery (e.g., prostatectomy)
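The majority-vote filtering mentioned above could be realized as a sliding-window smoother over the per-frame label sequence, suppressing brief "class jumps" between surgemes. A minimal sketch, assuming a frame-wise label list and a window size chosen for illustration:

```python
from collections import Counter

# Sketch of majority-vote label smoothing. The window size (5 frames)
# is an assumed parameter, not a value from the paper.

def majority_vote(labels, window=5):
    """Replace each label with the most common label in a centered window."""
    half = window // 2
    smoothed = []
    for i in range(len(labels)):
        lo, hi = max(0, i - half), min(len(labels), i + half + 1)
        smoothed.append(Counter(labels[lo:hi]).most_common(1)[0][0])
    return smoothed

# A spurious one-frame jump to "G" inside a run of "F" is voted away.
noisy = ["F", "F", "G", "F", "F", "F", "G", "G", "G", "G"]
print(majority_vote(noisy))
```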
Acknowledgements • Intuitive Surgical • Dr. Chris Hasser • This work was supported in part by: • NSF Grant No. 0534359 • NSF Graduate Research Fellowship